Concept

Overfitting a supervised statistical model

Increasing the flexibility of a supervised statistical model may result in overfitting: the trained model $\hat{f}$ starts fitting the noise or errors in the observations (training data) rather than approximating the true function $f$. So, when choosing a functional form and training a supervised statistical model, we should always consider the trade-off between the flexibility of $\hat{f}$ and overfitting.

Models that are too complex for the amount of training data available are said to overfit and are not likely to generalize well to new examples.
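A minimal sketch of this trade-off, assuming a hypothetical cubic true function $f$ with Gaussian observation noise: least-squares polynomial fits $\hat{f}$ of increasing degree drive the training error down monotonically, while an overly flexible fit (degree 15 on 20 training points) chases the noise and typically does worse on held-out data than a well-matched one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup for illustration: the true f is a cubic; training data are noisy.
def f(x):
    return x ** 3 - x

x_train = rng.uniform(-1.5, 1.5, 20)
y_train = f(x_train) + rng.normal(0.0, 0.3, x_train.size)
x_test = rng.uniform(-1.5, 1.5, 200)
y_test = f(x_test) + rng.normal(0.0, 0.3, x_test.size)

def fit_and_score(degree):
    # f_hat: least-squares polynomial fit of the given flexibility (degree)
    f_hat = np.poly1d(np.polyfit(x_train, y_train, degree))
    train_mse = float(np.mean((f_hat(x_train) - y_train) ** 2))
    test_mse = float(np.mean((f_hat(x_test) - y_test) ** 2))
    return train_mse, test_mse

for degree in (1, 3, 15):
    train_mse, test_mse = fit_and_score(degree)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

Because the polynomial families are nested, training MSE can only decrease as the degree grows; the gap between training and test MSE is what signals overfitting.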


Updated 2026-05-06

Tags

Data Science

D2L

Dive into Deep Learning @ D2L