Concept

Interdependence of Forward and Backward Propagation

During neural network training, forward and backward propagation are fundamentally interdependent. Forward propagation depends on the current model parameters, which the optimization algorithm updates using gradients calculated during the most recent backpropagation step. Conversely, backpropagation traverses the computational graph in reverse and depends on the intermediate variables computed and stored during the forward pass. For example, computing the regularization term during forward propagation depends on the current values of the model parameters $\mathbf{W}^{(1)}$ and $\mathbf{W}^{(2)}$, which the optimization algorithm set based on backpropagation in the most recent iteration. On the other hand, the gradient calculation during backpropagation depends on the current value of the hidden layer output $\mathbf{h}$, which is available only through forward propagation.
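This cycle can be made concrete with a small example. Below is a minimal NumPy sketch (an illustration, not code from the D2L book) of a two-layer network with an L2 regularization term; the dimensions, ReLU activation, squared-error loss, and SGD learning rate are all hypothetical choices. The forward pass reads the current $\mathbf{W}^{(1)}$ and $\mathbf{W}^{(2)}$ and caches the hidden output $\mathbf{h}$; the backward pass reuses those cached values to form the gradients.

```python
# Minimal sketch: two-layer MLP with L2 regularization, showing how
# forward and backward propagation depend on each other's outputs.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 4 inputs, 5 hidden units, scalar output.
W1 = rng.normal(scale=0.1, size=(5, 4))   # rewritten by SGD each iteration
W2 = rng.normal(scale=0.1, size=(1, 5))
lam, lr = 1e-3, 0.1                        # L2 strength, learning rate

def forward(x, y):
    """Forward pass: reads the *current* W1, W2; caches z, h for backward."""
    z = W1 @ x                    # pre-activation, stored for the backward pass
    h = np.maximum(z, 0.0)        # hidden output h, stored for the backward pass
    o = W2 @ h
    # Regularization term uses the current parameter values,
    # i.e. those produced by the most recent optimization update.
    s = 0.5 * lam * (np.sum(W1 ** 2) + np.sum(W2 ** 2))
    loss = 0.5 * np.sum((o - y) ** 2) + s
    return loss, (x, z, h, o)

def backward(y, cache):
    """Backward pass: consumes the intermediates (x, z, h) cached by forward."""
    x, z, h, o = cache
    do = o - y                         # dLoss/do for squared-error loss
    dW2 = do @ h.T + lam * W2          # gradient reuses the stored h
    dh = W2.T @ do
    dz = dh * (z > 0)                  # ReLU gradient needs the stored z
    dW1 = dz @ x.T + lam * W1
    return dW1, dW2

x = rng.normal(size=(4, 1))
y = np.array([[1.0]])
for step in range(3):
    loss, cache = forward(x, y)        # uses parameters from the last update
    dW1, dW2 = backward(y, cache)      # uses h, z cached by this forward pass
    W1 -= lr * dW1                     # SGD update: the next forward pass
    W2 -= lr * dW2                     # will see these new parameter values
    print(f"step {step}: loss = {loss:.4f}")
```

Because backpropagation reuses these cached intermediates, frameworks keep them in memory until the backward pass completes, which is one reason training consumes noticeably more memory than inference.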

Updated 2026-05-06

Tags

D2L

Dive into Deep Learning @ D2L