Learn Before
Concept
Manual Implementation of Learning Rate Decay
When training a single model over a long period, we can manually decrease the learning rate whenever progress slows down, for example when the validation error stops improving. This requires hand-tuning the learning rate by periodically checking the model throughout the entire training run, which can be strenuous. For that reason, this method is practical only when training a small number of models.
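As a minimal sketch of this idea, the plain-Python loop below runs gradient descent on a toy quadratic loss and mimics the manual intervention: when the loss has not improved for a few steps, the learning rate is cut, just as a practitioner would do by hand after inspecting training curves. The function name, the quadratic loss, and the specific patience/decay values are illustrative assumptions, not part of any particular library.

```python
def train_with_manual_decay(lr=1.0, steps=30, decay_factor=0.1, patience=3):
    """Gradient descent on f(w) = w**2, mimicking a manual learning-rate cut:
    when the loss has stalled for `patience` steps, the "practitioner"
    multiplies the learning rate by `decay_factor` and resumes training.
    (Toy example; all names and constants are illustrative.)"""
    w = 5.0                      # initial parameter
    best_loss = float("inf")
    stalled = 0
    history = []                 # (learning rate, loss) per step
    for _ in range(steps):
        grad = 2 * w             # derivative of w**2
        w -= lr * grad
        loss = w * w
        history.append((lr, loss))
        if loss < best_loss - 1e-12:
            best_loss = loss
            stalled = 0
        else:
            stalled += 1
            if stalled >= patience:
                # The point where one would manually lower the learning rate
                # after noticing the loss curve has flattened.
                lr *= decay_factor
                stalled = 0
    return w, history
```

With the initial learning rate of 1.0, the update oscillates without reducing the loss; only after the simulated "manual" cut does training converge, which is exactly the behavior that prompts this kind of intervention in practice.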
Updated 2020-11-16
Tags
Data Science