Concept

Gradient Descent

Gradient descent is a fundamental optimization algorithm that uses gradients to minimize a model's loss function. Because the gradient of a function points in the direction of steepest ascent, repeatedly moving the model's parameters in the opposite direction lowers the loss. Each step of such a gradient-based optimization algorithm requires computing the exact gradient of the loss with respect to the parameters.
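The update rule described above can be sketched in a few lines of plain Python. This is a minimal illustration on a one-dimensional quadratic loss, not the D2L implementation; the learning rate, step count, and function names here are arbitrary choices for the example.

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Minimize a loss by repeatedly stepping against its gradient.

    grad  -- function returning the exact gradient of the loss at w
    w0    -- initial parameter value
    lr    -- learning rate (step size), a hypothetical choice here
    steps -- number of update iterations
    """
    w = w0
    for _ in range(steps):
        # The gradient points toward steepest ascent, so subtracting
        # it (scaled by the learning rate) moves toward lower loss.
        w -= lr * grad(w)
    return w

# Example loss f(w) = (w - 3)^2, whose gradient is 2 * (w - 3)
# and whose minimum lies at w = 3.
grad = lambda w: 2 * (w - 3)
w_star = gradient_descent(grad, w0=0.0)
```

With a small enough learning rate, each iteration shrinks the distance to the minimizer, so `w_star` converges toward 3.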

Updated 2026-05-06

Tags

Data Science

D2L

Dive into Deep Learning @ D2L