Learn Before
Concept
Persistence of Detached Subgraphs
Detaching a variable from the computational graph does not destroy the graph that originally produced that variable; it only severs the link to subsequent downstream computations. For example, suppose y is computed from x, u = y.detach(), and a downstream variable z is computed from u. The detachment prevents gradients of z from flowing back through y into x, but the computational history leading from x to y remains fully intact. Therefore, it is still possible to compute the gradient of y with respect to x, since y's local dependency graph persists.
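A minimal PyTorch sketch of this behavior (the names x, y, u, z are illustrative): gradients of z treat the detached u as a constant, yet y can still be backpropagated through its own intact graph afterward.

```python
import torch

x = torch.arange(4.0, requires_grad=True)
y = x * x        # y's graph: x -> y
u = y.detach()   # u shares y's values but is cut out of the graph
z = u * x        # z's graph treats u as a constant leaf

# Backprop through z: u is a constant here, so dz/dx = u.
z.sum().backward()
print(x.grad)    # tensor([0., 1., 4., 9.])

# Detaching u did not destroy the graph from x to y:
x.grad.zero_()
y.sum().backward()   # still works; dy/dx = 2x
print(x.grad)        # tensor([0., 2., 4., 6.])
```

Note that `u = y.detach()` returns a new tensor and leaves `y` itself untouched, which is exactly why the second `backward()` call succeeds.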
Updated 2026-05-02
Tags
D2L
Dive into Deep Learning @ D2L