Learn Before
Formula

Generalization Error Bound via Vapnik-Chervonenkis Dimension

A key result in statistical learning theory bounds the generalization gap between the empirical error $R_\textrm{emp}$ and the true error $R$ as a function of the Vapnik-Chervonenkis (VC) dimension and the dataset size $n$. With probability at least $1-\delta$, the generalization gap is strictly less than $\alpha$, provided that $\alpha \geq c\sqrt{(\textrm{VC} - \log \delta)/n}$, where $c > 0$ is a constant depending on the loss scale. While theoretically profound, this bound often provides a pessimistic estimate for complex models.
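The threshold $\alpha$ from the bound above can be evaluated numerically. A minimal sketch, assuming $c = 1$ for illustration (the true constant depends on the loss scale and is not fixed by the formula); the function name is hypothetical:

```python
import math

def vc_generalization_bound(vc_dim, n, delta, c=1.0):
    """Smallest alpha satisfying alpha >= c * sqrt((VC - log(delta)) / n),
    so that |R - R_emp| < alpha with probability at least 1 - delta.
    c=1.0 is an illustrative choice; the constant depends on the loss scale."""
    return c * math.sqrt((vc_dim - math.log(delta)) / n)

# The gap bound shrinks at rate O(1/sqrt(n)) as the dataset grows:
small = vc_generalization_bound(vc_dim=100, n=10_000, delta=0.05)
large = vc_generalization_bound(vc_dim=100, n=1_000_000, delta=0.05)
```

Note that $-\log\delta$ grows only logarithmically as $\delta \to 0$, so demanding higher confidence costs relatively little; the dominant terms are the VC dimension and the sample size.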


Updated 2026-05-03


Tags

D2L

Dive into Deep Learning @ D2L