
Neural Networks as Nonparametric Models

Although deep neural networks possess millions of parameters that are updated and stored during training, their behavior is often better understood as nonparametric. Because these networks are massively over-parameterized, with many more parameters than are needed to fit the training data, they tend to interpolate the training data, driving the training error to zero. Whereas a parametric model has fixed capacity, a nonparametric model's effective capacity grows with the amount of data; a network large enough to memorize any dataset it is given behaves the same way, so its complexity scales like that of nonparametric methods.
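The interpolation phenomenon can be illustrated without a neural network at all. The sketch below (an illustrative analogy, not from the source) fits a polynomial whose number of coefficients equals the number of training points, so it can match even pure-noise labels exactly, achieving zero training error just as an over-parameterized network does:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
x = np.linspace(-1.0, 1.0, n)
y = rng.normal(size=n)  # pure-noise labels: nothing to "learn"

# Over-parameterized model: a degree-(n-1) polynomial has n coefficients,
# enough to interpolate all n training points exactly.
coeffs = np.polyfit(x, y, deg=n - 1)

# Training error vanishes (up to floating-point round-off).
train_error = np.max(np.abs(np.polyval(coeffs, x) - y))
print(train_error)
```

A model family whose parameter count is allowed to grow with the dataset in this way memorizes any training set, which is the capacity behavior the paragraph attributes to nonparametric methods.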

Updated 2026-05-06

Tags

D2L

Dive into Deep Learning @ D2L