Generalization Error Bound via Vapnik-Chervonenkis Dimension
A key result in statistical learning theory bounds the generalization gap between the empirical error and the true error as a function of the Vapnik-Chervonenkis (VC) dimension $d$ and the dataset size $n$. With probability at least $1 - \delta$, the generalization gap is strictly less than $c\sqrt{(d + \log(1/\delta))/n}$, provided that $d < n$, where $c$ is a constant depending on the loss scale. While theoretically profound, this bound often provides a pessimistic estimate for complex models, whose VC dimension can be so large that the bound becomes vacuous.
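The scaling behavior of the bound can be sketched numerically. The helper below is hypothetical (the function name, the choice $c = 1$, and the precondition $d < n$ are illustrative assumptions, not part of the original text); it shows how the bound tightens as $n$ grows and loosens as the VC dimension grows.

```python
import math

def vc_generalization_bound(vc_dim, n, delta, c=1.0):
    """Illustrative upper bound on the generalization gap.

    With probability at least 1 - delta, the gap is assumed bounded by
    c * sqrt((vc_dim + log(1/delta)) / n), valid when vc_dim < n.
    The constant c (here defaulted to 1) depends on the loss scale.
    """
    if vc_dim >= n:
        raise ValueError("bound assumed to require vc_dim < n")
    return c * math.sqrt((vc_dim + math.log(1 / delta)) / n)

# More data shrinks the bound; a richer hypothesis class inflates it.
print(vc_generalization_bound(vc_dim=10, n=1_000, delta=0.05))
print(vc_generalization_bound(vc_dim=10, n=100_000, delta=0.05))
print(vc_generalization_bound(vc_dim=1_000, n=100_000, delta=0.05))
```

Note the $O(1/\sqrt{n})$ decay: quadrupling the dataset only halves the bound, which is one reason the guarantee is loose in practice for large models.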
Updated 2026-05-03
Tags
D2L
Dive into Deep Learning @ D2L