Learn Before
Relation

Likelihood that Gradient Descent will fail

If failure is defined as a model becoming unable to escape a non-global local minimum through training, then with any non-trivial, finite training set this outcome is almost guaranteed. This can be attributed to inherent model identifiability issues, which can allow uncountably many equivalent local minima to exist.
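One concrete source of non-identifiability is weight-space symmetry: permuting the hidden units of a neural network changes the parameters but not the function the network computes, so every minimum comes with a whole family of equivalent copies. A minimal sketch of this (the tiny tanh network and its shapes are illustrative assumptions, not from the original):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer network: y = W2 @ tanh(W1 @ x + b1)
W1 = rng.normal(size=(4, 3))   # hidden x input
b1 = rng.normal(size=(4,))
W2 = rng.normal(size=(1, 4))   # output x hidden

def forward(x, W1, b1, W2):
    return W2 @ np.tanh(W1 @ x + b1)

# Permute the hidden units: reorder the rows of W1 and b1,
# and the columns of W2 in the same way.
perm = np.array([2, 0, 3, 1])
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]

x = rng.normal(size=(3,))

# Different parameter vectors, identical function values:
assert np.allclose(forward(x, W1, b1, W2), forward(x, W1p, b1p, W2p))
```

Since any of the 4! hidden-unit orderings (and, with continuous symmetries such as rescaling, uncountably many parameter settings) implement the same function, no training signal can distinguish between them.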

In practice, however, many researchers now believe that, given sufficiently large training sets, most of the local minima encountered during training have loss values close to that of the global minimum, although this remains an active area of research.
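The failure mode itself is easy to reproduce in one dimension. A minimal sketch, using a hypothetical non-convex loss chosen for illustration (the function, step size, and starting points are assumptions, not from the original):

```python
def loss(w):
    # Non-convex loss with a global minimum near w ≈ -1.04
    # and a strictly worse local minimum near w ≈ 0.96.
    return (w**2 - 1)**2 + 0.3 * w

def grad(w):
    # Analytic derivative of the loss above.
    return 4 * w * (w**2 - 1) + 0.3

def gradient_descent(w, lr=0.01, steps=2000):
    for _ in range(steps):
        w -= lr * grad(w)
    return w

w_stuck = gradient_descent(0.5)    # converges to the non-global minimum
w_global = gradient_descent(-0.5)  # converges to the global minimum

# Both runs reach a minimum, but one is strictly worse:
assert loss(w_stuck) > loss(w_global)
```

Starting from w = 0.5, plain gradient descent follows the slope into the nearby local minimum and can never climb back out, regardless of how many further steps are taken.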


Updated 2021-06-24

Tags

Data Science

Learn After