Learn Before
Formula
Generalization Gap
The generalization gap is the difference between a model's expected generalization error ($\epsilon_\text{gen}$) and its empirical training error ($\epsilon_\text{train}$), expressed as $\epsilon_\text{gen} - \epsilon_\text{train}$. If a model achieves a training error of exactly zero, the generalization gap equals the generalization error itself, so any further improvement in predictive performance can only come from reducing this gap.
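The gap can be estimated empirically by comparing error on the training set with error on held-out data. A minimal sketch, using a hypothetical 1-D threshold classifier on synthetic data (all names and the data-generating setup are assumptions for illustration, not from the source):

```python
import random

random.seed(0)

def make_data(n):
    # Synthetic 1-D data: true label is 1 when x > 0.5, with 10% label noise.
    data = []
    for _ in range(n):
        x = random.random()
        y = int(x > 0.5)
        if random.random() < 0.1:
            y = 1 - y
        data.append((x, y))
    return data

def error_rate(threshold, data):
    # Fraction of misclassified examples under 0-1 loss.
    return sum(int(x > threshold) != y for x, y in data) / len(data)

train, test = make_data(200), make_data(10_000)

# "Train" by picking the threshold that minimizes empirical (training) error.
threshold = min((t / 100 for t in range(101)),
                key=lambda t: error_rate(t, train))

eps_train = error_rate(threshold, train)  # empirical training error
eps_gen = error_rate(threshold, test)     # estimate of generalization error
gap = eps_gen - eps_train                 # estimated generalization gap
print(f"train={eps_train:.3f}  test={eps_gen:.3f}  gap={gap:.3f}")
```

The large held-out set stands in for the expectation over the data distribution; in practice the gap is only ever estimated this way, since the true generalization error is not directly observable.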
Updated 2026-05-06
Tags
D2L
Dive into Deep Learning @ D2L