Learn Before
Formula

Generalization Gap

The generalization gap is the difference between a model's expected generalization error ($R$) and its empirical training error ($R_\textrm{emp}$), expressed as $R - R_\textrm{emp}$. If a model achieves a training error of exactly zero, the generalization gap equals the generalization error itself, so further predictive progress can only come from reducing this gap.
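As a minimal sketch of the formula, the gap can be estimated by comparing training error with held-out test error; the data, names, and 0-1 loss here are illustrative assumptions, not from the source:

```python
def error_rate(y_true, y_pred):
    """Fraction of misclassified examples (0-1 loss)."""
    return sum(t != p for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical toy labels: the model fits the training set perfectly
# but makes one mistake on the held-out test set.
train_true = [0, 1, 1, 0, 1]
train_pred = [0, 1, 1, 0, 1]
test_true  = [1, 0, 1, 1]
test_pred  = [1, 0, 0, 1]

r_emp = error_rate(train_true, train_pred)  # empirical (training) error
r_hat = error_rate(test_true, test_pred)    # estimate of expected error R
gap = r_hat - r_emp                          # estimated generalization gap

print(r_emp, r_hat, gap)  # 0.0 0.25 0.25
```

With zero training error, the estimated gap coincides with the estimated generalization error, matching the statement above.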

Updated 2026-05-06

Tags

D2L

Dive into Deep Learning @ D2L

Learn After