Learn Before
Concept

Gradient Checking

Gradient checking is a technique for verifying that a gradient (backpropagation) implementation is correct. There are different formulas for gradient checking; one of them is the two-sided form: $\frac{J(\theta + \epsilon) - J(\theta - \epsilon)}{2\epsilon}$. A common choice for $\epsilon$ is $10^{-7}$. You shouldn't run gradient checking during full training, as it is slow; use it only to debug the implementation, then turn it off.
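A minimal sketch of the two-sided check in Python, assuming a cost function `J` and a hand-written analytic gradient `grad_J` (both names are illustrative, not from the original):

```python
import numpy as np

def gradient_check(J, grad_J, theta, epsilon=1e-7):
    """Compare an analytic gradient to the two-sided numerical estimate."""
    theta = theta.astype(float)
    numeric = np.zeros_like(theta)
    # Perturb each parameter one at a time by +/- epsilon
    for i in range(theta.size):
        plus = theta.copy()
        plus[i] += epsilon
        minus = theta.copy()
        minus[i] -= epsilon
        numeric[i] = (J(plus) - J(minus)) / (2 * epsilon)
    analytic = grad_J(theta)
    # Normalized difference; values around 1e-7 or smaller suggest the
    # analytic gradient is correct
    return np.linalg.norm(numeric - analytic) / (
        np.linalg.norm(numeric) + np.linalg.norm(analytic)
    )

# Toy example: J(theta) = sum(theta^2), whose gradient is 2*theta
J = lambda t: np.sum(t ** 2)
grad_J = lambda t: 2 * t
diff = gradient_check(J, grad_J, np.array([1.0, -2.0, 3.0]))
```

A small `diff` (well below $10^{-7}$ here) indicates the analytic gradient matches the numerical estimate; a large one points to a bug.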


Updated 2021-03-12

Tags

Data Science