Pros and Cons of each Activation Function

Sigmoid Pros: useful for binary classification, especially in the output layer Cons: output is not zero-centered (values lie in (0, 1))

Tanh Pros: zero-centered output, almost always superior to sigmoid in hidden layers Cons: more expensive to compute than ReLU

ReLU Pros: simple and fast, the default choice for hidden layers Cons: derivative is 0 when z is negative ("dying ReLU") --> Leaky ReLU (see the sketch below)
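
To make the trade-offs concrete, here is a minimal NumPy sketch (my own illustration, not from the original note). It evaluates each activation on a small grid and prints the behavior behind each con above; the `alpha=0.01` leak slope is an assumed conventional default.

```python
# A minimal NumPy sketch (illustration only): demonstrates the
# con listed for each activation above.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # assumed small leak slope; keeps a nonzero gradient for z < 0
    return np.where(z > 0, z, alpha * z)

z = np.linspace(-5.0, 5.0, 11)

# Sigmoid con: outputs lie in (0, 1), so their mean sits near 0.5,
# i.e. not zero-centered.
print("sigmoid mean:", sigmoid(z).mean())

# Tanh pro: outputs lie in (-1, 1) and are centered around zero.
print("tanh mean:", np.tanh(z).mean())

# ReLU con: output and derivative are 0 for all z < 0,
# so those units stop learning ("dying ReLU")...
print("ReLU output (z<0):", relu(z)[z < 0])          # all zeros
relu_grad = np.where(z > 0, 1.0, 0.0)
# ...while leaky ReLU keeps a small nonzero gradient there.
leaky_grad = np.where(z > 0, 1.0, 0.01)
print("ReLU grad (z<0):", relu_grad[z < 0])
print("Leaky grad (z<0):", leaky_grad[z < 0])
```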


Updated 2021-02-28

Tags

Data Science