Regression techniques
Linear regression:
In statistics, linear regression is a linear approach to modelling the relationship between a dependent variable (y) and one or more independent variables (X). The relationships are modeled using linear predictor functions whose unknown parameters are estimated from the data. Linear regression is one of the most popular algorithms in machine learning, largely due to its relative simplicity and well-known properties.
ref:https://medium.com/@gilberttanner/linear-regression-explained-8e45f234dc55
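To make this concrete, here is a minimal sketch of fitting a linear regression by ordinary least squares with NumPy. The data values are made up for illustration (y = 2x + 1), and the intercept column trick is one common way to estimate both the slope and the intercept together.

```python
import numpy as np

# Toy data generated from y = 2x + 1 (hypothetical, for illustration only)
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])

# Prepend a column of ones so the intercept is estimated as a coefficient
X_design = np.hstack([np.ones((X.shape[0], 1)), X])

# Solve the ordinary least-squares problem min ||X_design @ coef - y||
coef, *_ = np.linalg.lstsq(X_design, y, rcond=None)
intercept, slope = coef

print(intercept, slope)  # should recover intercept ≈ 1.0, slope ≈ 2.0
```

On noisy real-world data the recovered parameters would only approximate the underlying relationship; here the data is exact, so the fit is essentially perfect.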
Logistic regression:
It's a classification algorithm, used when the response variable is categorical. The idea of logistic regression is to find a relationship between the features and the probability of a particular outcome.
E.g., when we have to predict whether a student passes or fails an exam, given the number of hours spent studying as a feature, the response variable has two values: pass and fail.
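The pass/fail example above can be sketched with a simple logistic regression fit by gradient descent on the log-loss. The study-hours data below is hypothetical, and the learning rate and iteration count are arbitrary choices, not tuned values.

```python
import numpy as np

# Hypothetical data: hours studied vs. outcome (1 = pass, 0 = fail)
hours = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0, 6.0])
passed = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Fit weight w and bias b by gradient descent on the logistic loss
w, b, lr = 0.0, 0.0, 0.1
for _ in range(5000):
    p = sigmoid(w * hours + b)            # predicted probability of passing
    w -= lr * np.mean((p - passed) * hours)
    b -= lr * np.mean(p - passed)

# The model outputs a PROBABILITY, which we can threshold to get a class
print(sigmoid(w * 1.0 + b))   # low probability of passing after 1 hour
print(sigmoid(w * 5.0 + b))   # high probability of passing after 5 hours
```

Note that the raw output is a probability between 0 and 1; the pass/fail label comes from applying a threshold (commonly 0.5), which is exactly the distinction drawn in the comparison below.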
Linear Regression is used to predict continuous variables.
Logistic Regression is used to predict categorical variables (mostly binary)
Linear Regression outputs the value of the variable as its prediction
Logistic Regression outputs the PROBABILITY of occurrence of an event as its prediction
Linear Regression’s accuracy and goodness of fit can be measured by loss, R squared, adjusted R squared, etc.
Logistic Regression: measuring the accuracy of categorical predictions can become tricky due to class imbalance, so we have to use a range of metrics to measure the model’s fit. Some of them are -
Accuracy, Precision, Recall, F1 score (harmonic mean of precision and recall), ROC curve (for determining probability threshold for classification), Confusion Matrix, Concordance, Gini and the list goes on…
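As a small illustration of the first few metrics in that list, here is how accuracy, precision, recall, and F1 score fall out of a binary confusion matrix. The labels and predictions are made-up values for demonstration.

```python
# Hypothetical ground-truth labels and classifier predictions (binary)
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

# Confusion-matrix cells: true/false positives and negatives
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

accuracy = (tp + tn) / len(y_true)
precision = tp / (tp + fp)                         # of predicted positives, how many were right
recall = tp / (tp + fn)                            # of actual positives, how many were found
f1 = 2 * precision * recall / (precision + recall) # harmonic mean of precision and recall

print(accuracy, precision, recall, f1)
```

Accuracy alone can look good on imbalanced data (a model that always predicts the majority class scores high), which is why precision, recall, and F1 are reported alongside it.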
There are many other comparisons that can be drawn beyond these. I have tried to keep this list “practitioner friendly”.