Concept

Retention of Intermediate Variables in Neural Networks

To avoid duplicate computation, backpropagation reuses the intermediate values produced during the forward pass. Consequently, these intermediate variables must be kept in memory until the backward pass has completed. This retention requirement is one of the main reasons training a neural network demands significantly more memory than inference (plain prediction), where each intermediate value can be discarded as soon as it has been consumed.
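
As a concrete illustration, here is a minimal PyTorch sketch (PyTorch is assumed here; the D2L book covers several frameworks). It contrasts a training-style forward pass, where autograd records the computation graph and keeps intermediates alive for backward(), with an inference-style pass under torch.no_grad(), where no graph is recorded:

```python
import torch

x = torch.randn(512, 512, requires_grad=True)

# Training-style forward pass: autograd records every operation and keeps
# the intermediate results alive so that backward() can reuse them.
h = torch.relu(x @ x.T)    # intermediate value, retained by the graph
loss = (h ** 2).sum()      # scalar loss; its grad_fn chain references h

print(loss.grad_fn is not None)  # True: a backward graph was built

loss.backward()            # reuses the stored intermediates, then frees them

# Inference-style forward pass: with no_grad(), no graph is recorded, so
# each intermediate can be discarded as soon as nothing else references it.
with torch.no_grad():
    h = torch.relu(x @ x.T)
    loss = (h ** 2).sum()
    print(loss.grad_fn is None)  # True: nothing is retained for backward
```

This is also why wrapping prediction code in torch.no_grad() (or an equivalent inference mode) noticeably reduces memory use: the framework is freed from the obligation to retain intermediates for a backward pass that will never happen.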

Tags: D2L

Source: Dive into Deep Learning @ D2L