Overfitting a supervised statistical model
Increasing the flexibility of a supervised statistical model may result in overfitting: the trained model fits the noise or errors in the observations (the training data) rather than approximating the true underlying function. So, when choosing a functional form and training a supervised statistical model, we should always consider the trade-off between flexibility and overfitting.
Models that are too complex for the amount of training data available are said to overfit and are not likely to generalize well to new examples.
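As a minimal sketch of this effect, the following example (illustrative only, not from the original article) fits polynomials of increasing degree to a small noisy sample drawn from a sine function. The flexible high-degree model drives the training error toward zero by chasing the noise, while its test error stays high; the names `f`, `mse`, and the chosen degrees are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# True underlying function that the models should approximate.
def f(x):
    return np.sin(2 * np.pi * x)

# Small noisy training set and a larger held-out test set.
x_train = rng.uniform(0, 1, 20)
y_train = f(x_train) + rng.normal(0, 0.3, x_train.size)
x_test = rng.uniform(0, 1, 200)
y_test = f(x_test) + rng.normal(0, 0.3, x_test.size)

def mse(degree):
    # Fit a polynomial of the given degree to the training data,
    # then report mean squared error on train and test sets.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

for deg in (1, 3, 15):
    tr, te = mse(deg)
    print(f"degree {deg:2d}: train MSE {tr:.3f}, test MSE {te:.3f}")
```

With only 20 training points, the degree-15 polynomial achieves a near-zero training error but a noticeably larger test error, which is exactly the overfitting gap the text describes.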