Theory

Neural Tangent Kernel

The Neural Tangent Kernel (NTK) establishes a formal mathematical connection between large, overparameterized neural networks and nonparametric kernel methods. Theoretical research demonstrates that in the limit, as multilayer perceptrons with randomly initialized weights grow infinitely wide, their training dynamics become mathematically equivalent to those of a kernel method with a specific, fixed kernel—the NTK. While the NTK may not fully explain all behaviors of modern deep learning, it serves as a successful analytical tool that underscores the usefulness of nonparametric modeling for understanding deep networks.
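The finite-width (empirical) NTK behind this equivalence can be computed directly: for a network f(x; θ), the kernel entry for inputs x and x' is the inner product of the parameter gradients, K(x, x') = ⟨∇θ f(x), ∇θ f(x')⟩. Below is a minimal sketch for a one-hidden-layer ReLU network with the standard 1/√m scaling; the network architecture, initialization, and dimensions are illustrative assumptions, not part of the original text.

```python
import numpy as np

def init_params(d, m, rng):
    # Standard Gaussian initialization, as assumed in NTK analysis
    W = rng.standard_normal((m, d))
    v = rng.standard_normal(m)
    return W, v

def param_grads(x, W, v):
    # f(x) = v . relu(W x) / sqrt(m); return the gradient of f w.r.t. (W, v)
    m = v.shape[0]
    pre = W @ x
    act = np.maximum(pre, 0.0)
    dv = act / np.sqrt(m)                                   # df/dv
    dW = ((pre > 0).astype(float) * v)[:, None] * x[None, :] / np.sqrt(m)  # df/dW
    return np.concatenate([dW.ravel(), dv])

def empirical_ntk(X, W, v):
    # K[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>
    G = np.stack([param_grads(x, W, v) for x in X])
    return G @ G.T

rng = np.random.default_rng(0)
d, m = 3, 4096          # a wide hidden layer approximates the infinite-width limit
W, v = init_params(d, m, rng)
X = rng.standard_normal((5, d))
K = empirical_ntk(X, W, v)
print(K.shape)          # (5, 5)
```

At large width m, this random-initialization kernel concentrates around its infinite-width limit and stays nearly constant during gradient-descent training, which is what makes the kernel-method correspondence exact in the limit.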


Updated 2026-05-06


Tags: D2L

Dive into Deep Learning @ D2L
