Relationship Between Dataset Size and Model Complexity

The amount of available training data dictates the appropriate level of model complexity. For a fixed task and data distribution, model complexity should not increase more rapidly than the dataset size. With fewer training samples, a model is highly susceptible to overfitting, making simpler models difficult to beat. However, as dataset size increases, generalization error typically decreases, allowing for the successful training of more complex architectures like deep neural networks.

Updated 2026-05-03

Tags: D2L

Dive into Deep Learning @ D2L