Concept

Memory Footprint of Neural Network Training

The memory required to retain intermediate values (activations) during neural network training grows roughly in proportion to the number of network layers and the batch size. Because these values must be stored until backpropagation completes, training deeper networks with larger batches substantially increases memory consumption and makes the system more susceptible to out-of-memory errors.
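
To make that scaling concrete, below is a minimal back-of-the-envelope sketch in plain Python. The uniform hidden width of 1024, the batch sizes, and the 4-byte (float32) element size are illustrative assumptions, not values from the text above; real frameworks retain somewhat more or less depending on the operations involved.

```python
# A rough estimate of the activation memory a framework must keep alive
# until the backward pass finishes, for a fully connected network.
# All sizes below are hypothetical, chosen only to show the scaling.

def activation_bytes(batch_size, layer_widths, bytes_per_element=4):
    """Memory for the per-layer outputs retained for backpropagation.

    Each layer's output has shape (batch_size, width) and is held until
    the backward pass consumes it, so the total grows linearly with both
    depth (len(layer_widths)) and batch size.
    """
    return batch_size * sum(layer_widths) * bytes_per_element

for batch_size in (32, 256):
    for depth in (4, 16):
        widths = [1024] * depth  # assumed uniform hidden width
        mib = activation_bytes(batch_size, widths) / 2**20
        print(f"batch={batch_size:4d}, layers={depth:2d} -> "
              f"~{mib:6.1f} MiB of retained activations")
```

Running the sketch shows the multiplicative effect: going from a batch of 32 with 4 layers to a batch of 256 with 16 layers multiplies the retained-activation estimate by 32x, which is why deep networks trained with large batches hit out-of-memory errors first.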

Tags

D2L

Dive into Deep Learning @ D2L