Concept

Setting Probability P for Dropout Regularization in Deep Learning

For dropout with keep probability P (the probability that a given unit is retained), increasing P weakens the regularization, so the training error will be lower. A common practice is to set P = 1 for layers that are not overfitting much, which keeps every unit in those layers.
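A minimal sketch of how this keep probability is typically applied, using the inverted-dropout convention: each unit is kept with probability P, and surviving activations are scaled by 1/P so their expected value is unchanged at test time. The function name `dropout_forward` and the NumPy implementation are illustrative assumptions, not from the original note.

```python
import numpy as np

def dropout_forward(a, keep_prob, rng):
    """Apply inverted dropout to a layer's activations `a` at training time."""
    if keep_prob == 1.0:
        # P = 1: keep every unit, i.e. no dropout regularization for this layer
        return a
    # Keep each unit independently with probability keep_prob
    mask = rng.random(a.shape) < keep_prob
    # Scale survivors by 1/keep_prob so the expected activation is unchanged
    return (a * mask) / keep_prob

rng = np.random.default_rng(0)
activations = np.ones((4, 3))
out = dropout_forward(activations, 0.8, rng)
```

With a smaller P, more units are zeroed out and the regularization effect is stronger; with P = 1 the layer is returned untouched.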


Updated 2021-12-02

Tags

Data Science