Learning Rate Decay

Learning rate decay is the gradual reduction of the learning rate over the course of training. Decaying the learning rate as gradient descent approaches a minimum shrinks the size of each parameter update, which reduces noise from oscillating steps and allows a tighter convergence to the target.
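As an illustration, here is a minimal sketch of one common decay schedule, `lr = lr0 / (1 + decay_rate * epoch)`, applied to gradient descent on the toy function `f(w) = w^2`. The function and parameter names (`decayed_lr`, `lr0`, `decay_rate`) are hypothetical, and other schedules (exponential decay, step decay) are equally common.

```python
def decayed_lr(lr0, decay_rate, epoch):
    """Learning rate at a given epoch under 1/t-style decay."""
    return lr0 / (1 + decay_rate * epoch)

def train(lr0=0.2, decay_rate=1.0, epochs=20):
    """Gradient descent on f(w) = w^2 with a decaying step size."""
    w = 5.0
    for epoch in range(epochs):
        lr = decayed_lr(lr0, decay_rate, epoch)
        grad = 2 * w       # gradient of w^2
        w -= lr * grad     # update shrinks as the epoch count grows
    return w

print(train())
```

Early epochs take large steps toward the minimum; as the learning rate decays, the updates become small, so the iterate settles near the target instead of bouncing around it.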

Updated 2021-05-24

Tags

Data Science