Concept
Overfitting
Overfitting occurs when a model performs significantly better on its training data than on validation or holdout data, producing a large generalization gap. This usually indicates that the model is overly complex and has begun to memorize noise rather than learn generalizable patterns. In deep learning, however, even severe overfitting is not inherently detrimental: the best predictive models often perform far better on training data than on holdout data. What ultimately matters is driving the overall generalization error lower; the gap is a problem only insofar as it obstructs that goal.
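A minimal sketch of how the generalization gap shows up in practice, using polynomial regression on a small synthetic dataset (all data and degree choices here are hypothetical, for illustration only): training error keeps falling as model complexity grows, while validation error eventually rises.

```python
import numpy as np

# Hypothetical synthetic data: a noisy sine curve.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1.0, 1.0, 60))
y = np.sin(3.0 * x) + rng.normal(0.0, 0.2, 60)

# Interleaved split so train and validation cover the same range.
x_train, y_train = x[0::2], y[0::2]
x_val, y_val = x[1::2], y[1::2]

def errors(degree):
    """Fit a polynomial of the given degree; return (train MSE, val MSE)."""
    p = np.poly1d(np.polyfit(x_train, y_train, degree))
    train = float(np.mean((p(x_train) - y_train) ** 2))
    val = float(np.mean((p(x_val) - y_val) ** 2))
    return train, val

results = {d: errors(d) for d in (1, 3, 12)}
for d, (tr, va) in results.items():
    print(f"degree {d:2d}: train MSE {tr:.3f}  val MSE {va:.3f}  gap {va - tr:+.3f}")
```

The degree-12 fit drives training error lowest but opens the widest gap to validation error; whether that gap matters depends on whether its validation error is still the lowest of the three.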
Updated 2026-05-03
Tags
Bayesian Statistics
Statistics
Data Science
D2L
Dive into Deep Learning @ D2L