
Backward Graph Traversal

To compute the gradients of a vector-valued function, its computational dependency graph is traversed in reverse, from the output back to the inputs. During this backward pass, the matrices of partial derivatives (Jacobians) along each path are multiplied together, and the products over all paths are accumulated, to determine how changes in the input variables affect the final output.
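This path-based traversal can be sketched with a tiny reverse-mode example. The `Node` class and `backward` function below are hypothetical illustrations (not the D2L library's implementation): each edge stores a local partial derivative, and the backward pass multiplies these partials along every path from the output to each input, summing the products.

```python
import math

class Node:
    """Hypothetical graph node: a value, incoming edges, and a gradient slot."""
    def __init__(self, value, parents=()):
        self.value = value
        # Each parent edge carries its local partial derivative d(self)/d(parent).
        self.parents = list(parents)
        self.grad = 0.0

def backward(node, upstream=1.0):
    """Traverse the graph backwards from `node`, multiplying the local
    partial derivatives along each path and accumulating the products."""
    for parent, local_partial in node.parents:
        parent.grad += upstream * local_partial
        backward(parent, upstream * local_partial)

# Example graph for f(x1, x2) = x1 * x2 + sin(x1)
x1 = Node(2.0)
x2 = Node(3.0)
prod = Node(x1.value * x2.value,
            parents=[(x1, x2.value), (x2, x1.value)])   # d(x1*x2)/dx1 = x2, /dx2 = x1
sine = Node(math.sin(x1.value),
            parents=[(x1, math.cos(x1.value))])          # d(sin x1)/dx1 = cos(x1)
f = Node(prod.value + sine.value,
         parents=[(prod, 1.0), (sine, 1.0)])             # addition has unit partials

backward(f)
print(x1.grad)  # two paths: x2 + cos(x1) = 3 + cos(2)
print(x2.grad)  # one path: x1 = 2
```

Note that `x1.grad` sums contributions from two distinct paths (through the product and through the sine), which is exactly the accumulation-over-paths rule described above; for vector-valued intermediates the scalar partials become Jacobian matrices and the products become matrix products.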

Updated 2026-05-02

Tags: D2L

Dive into Deep Learning @ D2L