Learn Before
Concept
Why Does Ridge Regression Improve Over Least Squares?
- Bias-variance trade-off: as λ increases, the flexibility of the ridge regression fit decreases, leading to decreased variance but increased bias (the ridge criterion is sketched after this list).
When the number of variables p is almost as large as the number of observations n, the least squares estimates are extremely variable. If p > n, the least squares estimates do not even have a unique solution, whereas ridge regression can still perform well by trading a small increase in bias for a large decrease in variance.
Hence, ridge regression works best in situations where the least squares estimates have high variance (a simulation illustrating this appears after the list).
- Substantial computational advantages over best subset selection: for any fixed λ, ridge regression fits only a single model, whereas best subset selection must search through 2^p models.
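For reference, here is a minimal statement of the ridge criterion in standard notation (the symbols are not defined in this note, so take them as assumptions): ridge regression chooses the coefficient estimates that minimize

```latex
% Ridge criterion: residual sum of squares plus an L2 shrinkage penalty.
% lambda >= 0 is the tuning parameter referenced in the first bullet;
% lambda = 0 recovers ordinary least squares.
\min_{\beta_0,\,\beta}\;
  \sum_{i=1}^{n} \Bigl( y_i - \beta_0 - \sum_{j=1}^{p} \beta_j x_{ij} \Bigr)^2
  + \lambda \sum_{j=1}^{p} \beta_j^2
```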
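And a minimal simulation sketch of the variance argument, assuming scikit-learn is available (Ridge's `alpha` parameter plays the role of λ; the dimensions, seed, and alpha value here are illustrative assumptions, not part of the original note):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

# Illustrative setup: p is almost as large as n, so least squares is highly variable.
rng = np.random.default_rng(0)
n, p = 50, 45                      # nearly as many predictors as observations
beta_true = np.zeros(p)
beta_true[:5] = 1.0                # only a few predictors actually matter

ols_coefs, ridge_coefs = [], []
for _ in range(200):               # repeat over freshly simulated training sets
    X = rng.normal(size=(n, p))
    y = X @ beta_true + rng.normal(scale=1.0, size=n)
    ols_coefs.append(LinearRegression().fit(X, y).coef_)
    ridge_coefs.append(Ridge(alpha=10.0).fit(X, y).coef_)  # alpha = lambda

# Average per-coefficient variance across training sets:
# ridge should come out far smaller than least squares.
print("OLS   coef variance:", np.var(ols_coefs, axis=0).mean())
print("Ridge coef variance:", np.var(ridge_coefs, axis=0).mean())
```

Across repeated training sets, the ridge estimates vary much less than the least squares estimates, at the cost of some bias toward zero, which is exactly the trade-off described above.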
Updated 2020-06-19
Tags
Data Science