Example: Gradient of $y = 2\mathbf{x}^{\top}\mathbf{x}$ with Respect to $\mathbf{x}$

Consider the scalar-valued function $y = 2\mathbf{x}^{\top}\mathbf{x}$. When differentiated with respect to the column vector $\mathbf{x}$, the resulting gradient is $4\mathbf{x}$. In deep learning frameworks, after allocating gradient memory and computing $y$, calling the backward pass function calculates this gradient and stores it in the vector's gradient attribute.
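The steps above can be sketched with PyTorch's autograd (a minimal sketch; the framework choice is an assumption, and the same pattern works in other D2L-supported frameworks):

```python
import torch

# Create x and allocate memory for its gradient.
x = torch.arange(4.0, requires_grad=True)

# Compute the scalar y = 2 * x^T x.
y = 2 * torch.dot(x, x)

# Backward pass: populates x.grad with dy/dx.
y.backward()

# The gradient of 2 * x^T x is 4x.
print(x.grad)                      # tensor([ 0.,  4.,  8., 12.])
print(torch.equal(x.grad, 4 * x))  # True
```

Note that `x.grad` accumulates across backward passes, so in practice one calls `x.grad.zero_()` before computing the gradient of a new function of `x`.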

Updated 2026-05-02

Dive into Deep Learning @ D2L