How Does Linear And Logistic Regression Work In Machine Learning?

  • Rohit Dwivedi
  • Apr 26, 2020
  • Machine Learning

Linear regression and logistic regression are both supervised machine learning algorithms; as supervised models, they make use of labelled data for making predictions.


Linear regression is used for regression, i.e. predicting continuous values, whereas logistic regression can be applied to both classification and regression problems but is most widely used as a classification algorithm. Regression models aim to predict a value based on independent features.


The main difference between the two lies in the dependent variable: when the dependent variable is binary, logistic regression is used, and when the dependent variable is continuous, linear regression is used.


Linear Regression


Most people first come across linear models at school, in mathematics class. It is the same model that is now widely used in predictive analysis. It describes the relationship between a target (dependent) variable and one or more predictors using a straight line. Linear regression comes in two basic types: simple and multiple regression.

Figure shows the regression line, with Experience on the X-axis and Salary on the Y-axis

In the above plot, Salary is the dependent variable (Y-axis) and Experience is the independent variable (X-axis): more experience means a higher salary. The regression line can be written as:


Y = a0 + a1X + ε

where a0 and a1 are the coefficients and ε is the error term.


The independent variables in linear regression may be continuous or discrete in nature, but the dependent variable must be continuous. The best-fit line is found by minimising the mean squared error between the dependent variable Y and the predictions made from the independent variable X. Linear regression assumes a linear relationship between the two.


Simple linear regression has only one independent variable, whereas multiple regression can have more than one.


Let us go through a regression problem: building a model that can predict ‘mpg’ on the basis of the independent features. The dataset, auto-mpg, can be downloaded from the UCI Machine Learning repository.


Figure shows the view of dataset



  • Imported the necessary libraries and the dataset.
  • Performed exploratory data analysis, during which the ‘car name’ column was found to be useless for prediction and was dropped. ‘Mpg’ is the target column and all the remaining columns are independent variables.

Figure shows the splitting of the data and initialisation of the algorithm


  • Defined the independent and dependent features, X and y respectively.

  • Imported train_test_split from sklearn.model_selection.

  • Divided the data in a 75:25 ratio: 75% for training and 25% for testing. Imported LinearRegression() from the scikit-learn library.

  • Created an object, regression_model, for LinearRegression().

  • Fitted the training data to the model.


Figure shows the coefficients and the training and test scores

  • Printed the coefficients for the independent features.
  • Printed the model score on the training data, which came out to 81%.
  • Printed the model score on the test data, which came out to 84%.
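The steps above can be sketched in code roughly as follows. Since the auto-mpg file itself is not bundled with this article, the sketch uses synthetic stand-in features; the variable names mirror the walkthrough, but the data (and hence the scores) are placeholders, not the 81%/84% results shown in the figure.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for the auto-mpg features (the real data would be
# loaded from the UCI repository and 'car name' dropped during EDA).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # e.g. horsepower, weight, acceleration
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=200)  # 'mpg'

# 75:25 train/test split, as in the walkthrough
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1)

regression_model = LinearRegression()
regression_model.fit(X_train, y_train)

print(regression_model.coef_)                    # one coefficient per feature
print(regression_model.score(X_train, y_train))  # R^2 on the training data
print(regression_model.score(X_test, y_test))    # R^2 on the test data
```

Note that LinearRegression's score() returns the R² coefficient of determination rather than a classification accuracy, which is why the article's 81%/84% figures are best read as R² scores.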


Logistic Regression


Logistic regression is an algorithm that can be used for regression as well as classification tasks, but it is most widely used for classification. The response variable is binary and belongs to one of two classes. It is used to predict a categorical dependent variable with the help of the independent variables.


Consider two classes and a new data point whose class is to be determined. The algorithm computes a probability value that ranges between 0 and 1.


For example, whether it will rain today or not. In logistic regression, the weighted sum of the inputs is passed through the sigmoid activation function, and the curve obtained is called the sigmoid curve.


Figure shows the graph of the sigmoid function

The logistic (sigmoid) function is an ‘S’-shaped curve that takes any real value and maps it to a value between 0 and 1. If the output of the sigmoid function is greater than 0.5, the input is classified as 1, and if it is less than 0.5, it is classified as 0. Towards the negative end of the curve the predicted y approaches 0, and vice versa.


If the sigmoid output is 0.75, it tells us there is a 75% chance of the event happening, for example the outcome of a coin toss.
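The sigmoid and the 0.5 threshold described above can be sketched in a few lines; the input value 1.1 below is an arbitrary example, chosen because it maps to roughly the 0.75 probability mentioned.

```python
import math

def sigmoid(z):
    """Map any real number into the (0, 1) interval."""
    return 1.0 / (1.0 + math.exp(-z))

# Large negative inputs approach 0, large positive inputs approach 1
print(sigmoid(-5))   # close to 0
print(sigmoid(0))    # exactly 0.5
print(sigmoid(5))    # close to 1

# Thresholding the probability at 0.5 turns it into a class label
p = sigmoid(1.1)     # roughly 0.75
label = 1 if p > 0.5 else 0
```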


Figure shows binary classification

The above figure shows the inputs and the probabilities that the outcome falls into one of the two categories of a binary dependent variable, based on one or more independent variables that can be continuous as well as categorical.


Let us now explore a classification problem using LogisticRegression from sklearn. The dataset is the Pima Indians diabetes dataset, taken from the Kaggle website, where we need to classify patients as diabetic or not.


Code implementation of importing data and necessary libraries.




  • Imported all the necessary libraries and the dataset.

  • Performed EDA, which can be seen in the Python file uploaded on GitHub.

(Figure shows how the independent and target variables, X and y, are initialised)


It was found that the ratio of diabetic to non-diabetic patients is 1:2, which means the model is more likely to predict that a person is non-diabetic than diabetic.


  • Defined the independent features and the dependent feature, ‘Class’.

  • Divided the data into training and test sets in a 70:30 ratio.

  • Created a model object and initialised LogisticRegression() from the sklearn library.

  • Fitted the training data to the model.


Figure shows how the model performed on the test data


  • Predicted the classes on the test data and stored them in the y_predict variable.

  • Printed the model score on the test data, which came out to 76%.

  • Printed the confusion matrix between y_predict and y_test.
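The logistic regression steps can be sketched in the same way as the linear example. Since the Pima Indians CSV is not bundled here, synthetic stand-in data is used, so the article's 76% score and exact confusion matrix are not reproduced by this sketch.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

# Synthetic stand-in for the Pima Indians features; the real data would
# be read from the Kaggle CSV, with the 'Class' column as the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)

# 70:30 train/test split, as in the walkthrough
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, random_state=1)

model = LogisticRegression()
model.fit(X_train, y_train)

y_predict = model.predict(X_test)
print(model.score(X_test, y_test))          # mean accuracy on the test data
print(confusion_matrix(y_test, y_predict))  # rows: actual, columns: predicted
```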


The low accuracy may be due to a data sampling issue: the classes are imbalanced. These results can be improved by using upsampling and downsampling techniques in machine learning. You can refer here for the Python files and datasets for both problem statements.


Differences Between Linear And Logistic Regression


  • Linear regression is used to predict a continuous dependent variable from a given set of independent features, whereas logistic regression is used to predict a categorical one.

  • Linear regression is used to solve regression problems, whereas logistic regression is used to solve classification problems.

  • In linear regression, the approach is to find the best-fit line to predict the output, whereas in logistic regression the approach is to fit an S-shaped curve that separates the two classes, 0 and 1.

  • The estimation method in linear regression is least squares, whereas in logistic regression it is maximum likelihood estimation.

  • In linear regression the output should be continuous, like price or age, whereas in logistic regression the output must be categorical, like Yes/No or 0/1.

  • Linear regression requires a linear relationship between the dependent and independent features, whereas logistic regression does not.

  • Linear regression can tolerate some collinearity between the independent features, whereas logistic regression should not have collinearity between them.
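The two estimation criteria contrasted above can be written out concretely. Below is a minimal hand-rolled sketch (not the sklearn internals) of the least-squares cost that linear regression minimises and the negative log-likelihood, i.e. log loss, whose minimisation is equivalent to maximum likelihood estimation in logistic regression.

```python
import numpy as np

def mse(y_true, y_pred):
    # Least-squares criterion: linear regression picks the coefficients
    # that minimise the mean of the squared residuals.
    return np.mean((y_true - y_pred) ** 2)

def neg_log_likelihood(y_true, p_pred):
    # Maximum-likelihood criterion: logistic regression picks the
    # coefficients that maximise the likelihood of the observed labels,
    # which is the same as minimising this negative log-likelihood.
    return -np.mean(y_true * np.log(p_pred)
                    + (1 - y_true) * np.log(1 - p_pred))

# Continuous targets: a perfect fit gives zero squared-error cost
print(mse(np.array([3.0, 5.0]), np.array([3.0, 5.0])))

# Binary targets: confident, correct probabilities give a low log loss
y_bin = np.array([1, 0])
print(neg_log_likelihood(y_bin, np.array([0.9, 0.1])))
print(neg_log_likelihood(y_bin, np.array([0.6, 0.4])))  # less confident, higher loss
```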




In this blog, I have tried to give you a brief idea of how linear and logistic regression differ from each other, with hands-on problem statements. I discussed the linear model, how the sigmoid function works, how logistic regression classifies between 0 and 1, and how predictions are made for continuous values. I worked through two problem statements, one classification and one regression, and lastly discussed the differences between the two algorithms.