Concept

Update Weight Iteratively Until Convergence

The weights are adjusted toward their optimal values using the gradients computed by the backpropagation algorithm.

Since the weights are updated by a small delta step at a time, several iterations are required for the network to learn. With each iteration, gradient descent nudges the weights in the direction that reduces the global loss function. The number of iterations needed to converge depends on the learning rate, the network's hyperparameters, and the optimization method used.
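The update loop can be sketched as follows. This is a minimal illustrative example, not tied to any specific framework: a single weight is updated on a hypothetical one-dimensional loss L(w) = (w - 3)^2, whose gradient stands in for the gradients backpropagation would compute in a real network.

```python
def gradient(w):
    # Gradient of the toy loss L(w) = (w - 3)^2, i.e. dL/dw = 2 * (w - 3).
    # In a real network this value would come from backpropagation.
    return 2.0 * (w - 3.0)

def train(w=0.0, learning_rate=0.1, tolerance=1e-6, max_iters=10_000):
    """Update w by a small delta step each iteration until the
    gradient is close to zero (convergence) or max_iters is reached."""
    for i in range(max_iters):
        g = gradient(w)
        if abs(g) < tolerance:       # converged: gradient is ~0
            return w, i
        w -= learning_rate * g       # step opposite the gradient
    return w, max_iters

w_final, iterations = train()
```

Shrinking `learning_rate` makes each delta step smaller, so more iterations are needed to converge; increasing it too far can overshoot the minimum and diverge, which is why the iteration count depends on the learning rate and optimizer settings.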


Updated 2020-11-02

Tags

Data Science