Learn Before
Concept

Storage of Intermediate Variables in Backpropagation

During the backpropagation algorithm, the network is traversed in reverse order, from the output layer to the input layer, to compute gradients. To obtain the gradients with respect to parameters closer to the input, the algorithm stores intermediate variables in memory: the activations from the forward pass and the partial derivatives already computed for later layers. These cached values are then reused according to the chain rule, which avoids recomputing them but is also why training requires more memory than inference.
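A minimal sketch of this idea, using a hypothetical two-layer ReLU network with NumPy: the forward pass caches the intermediate variables (`z`, `h`) that the backward pass later reuses, and each backward step reuses the upstream gradient per the chain rule. The network, loss, and variable names are illustrative assumptions, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer network (illustrative): x -> h = relu(W1 x) -> y = W2 h
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

# Forward pass: store the intermediate variables needed later.
z = W1 @ x            # pre-activation, cached for the ReLU derivative
h = np.maximum(z, 0)  # hidden activation, cached for dW2
y = W2 @ h

# Backward pass: traverse in reverse order, reusing cached values.
dy = np.ones_like(y)   # suppose loss = sum(y), so dL/dy = 1
dW2 = np.outer(dy, h)  # needs the cached activation h
dh = W2.T @ dy         # chain rule: reuse the upstream gradient dy
dz = dh * (z > 0)      # needs the cached pre-activation z
dW1 = np.outer(dz, x)  # needs the cached input x

# Sanity check one entry of dW1 against a finite-difference estimate.
eps = 1e-6
i, j = 1, 2
W1p = W1.copy()
W1p[i, j] += eps
numeric = (np.sum(W2 @ np.maximum(W1p @ x, 0)) - np.sum(y)) / eps
assert abs(numeric - dW1[i, j]) < 1e-4
```

Without the cached `z` and `h`, each backward step would have to rerun part of the forward pass, trading memory for extra computation.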

Updated 2026-05-06

Tags

D2L

Dive into Deep Learning @ D2L
