Learn Before
Logistic Regression
Logistic Regression - Regularization
As with ridge and lasso regression, a regularization penalty on the model coefficients can also be applied to logistic regression; in scikit-learn it is controlled with the parameter C.
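For reference, one standard way to write the L2-penalized objective in this C-parameterized form (the formulation scikit-learn's documentation uses, with labels y_i in {-1, +1}) is:

\min_{w,\, b} \; \frac{1}{2} \|w\|_2^2 + C \sum_{i=1}^{n} \log\!\left(1 + e^{-y_i (w^\top x_i + b)}\right)

Note that C multiplies the data-fit term rather than the penalty, so it acts as the inverse of the regularization strength.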
In fact, the same L2 regularization penalty used for ridge regression is turned on by default for logistic regression, with a default value of C = 1. With large values of C, logistic regression tries to fit the training data as well as possible, while with small values of C, the model tries harder to find coefficients that are closer to 0, even if that fits the training data a little worse.
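As an illustration, here is a minimal sketch (synthetic data, arbitrary example values of C, not from the original card) of how C trades off fit against coefficient shrinkage in scikit-learn:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic binary classification data for illustration only.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

for C in (100.0, 1.0, 0.01):
    # penalty="l2" is the default; smaller C means stronger regularization.
    clf = LogisticRegression(C=C).fit(X, y)
    print(f"C={C:>6}: mean |coef| = {np.abs(clf.coef_).mean():.3f}, "
          f"train accuracy = {clf.score(X, y):.3f}")
```

With settings like these, the mean coefficient magnitude shrinks as C decreases, typically at the cost of slightly lower training accuracy.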
Adding an L1 or L2 norm penalty to logistic regression enables feature selection and helps prevent overfitting. The reason: the L1 (lasso) penalty produces sparse solutions, driving some coefficients exactly to zero and thereby selecting features; the L2 (ridge) penalty constrains the model parameters, which limits overfitting and yields smoother weights. A comparison sketch follows below.
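A minimal sketch (assumed setup, synthetic data) contrasting the two penalties: L1 zeroes out some coefficients, while L2 only shrinks them toward zero.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic data where only a few features are informative.
X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=5, random_state=0)

# The default lbfgs solver supports only L2; liblinear handles both.
l1 = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
l2 = LogisticRegression(penalty="l2", solver="liblinear", C=0.1).fit(X, y)

print("zero coefficients with L1:", int(np.sum(l1.coef_ == 0)))  # sparse
print("zero coefficients with L2:", int(np.sum(l2.coef_ == 0)))  # usually 0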
Tags
Data Science
Related
OLS fitting cannot be used for classification
Using LDA vs Logistic Regression
Logistic Regression Videos
Binary Classification Metrics
Hypothesis
Hypothesis function
Logistic Regression Formulation
Logistic Regression Mathematical Equation
Logistic Regression - Regularization
Linear Regression vs Logistic Regression
Softmax Regression (Activation)
Learn After
L1 regularization and L2 regularization