Relation
Sigmoid/Logistic vs. TanH/Hyperbolic Tangent functions
- tanh almost always works better than the Sigmoid function in hidden layers, but Sigmoid should still be used in the output layer of binary classification models because its (0, 1) range can be interpreted as a probability.
- tanh is in range (-1, 1), but Sigmoid is in range (0, 1).
- Using tanh activation functions in the hidden layers speeds up training because it centers the data around 0 rather than 0.5.
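A minimal sketch of the points above, comparing the two functions on a symmetric range of inputs (the input range and sample size are arbitrary choices for illustration):

```python
import numpy as np

def sigmoid(z):
    # Sigmoid squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Symmetric sample of pre-activation values
z = np.linspace(-5, 5, 101)

s = sigmoid(z)
t = np.tanh(z)  # tanh squashes inputs into (-1, 1)

# Output ranges
print(s.min(), s.max())  # stays inside (0, 1)
print(t.min(), t.max())  # stays inside (-1, 1)

# For symmetric inputs, sigmoid outputs center around 0.5,
# while tanh outputs center around 0 -- the zero-centering
# property that helps training in hidden layers
print(round(s.mean(), 2))  # 0.5
print(round(t.mean(), 2))  # 0.0
```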
Updated 2021-11-11
Tags
Data Science
Related
Pros and Cons of Sigmoid/Logistic Function
Derivative of Sigmoid/Logistic Function
Sigmoid/Logistic vs. TanH/Hyperbolic Tangent functions
Question about sigmoid node
Pros and Cons of Hyperbolic Tangent Function
Derivative of TanH/Hyperbolic Tangent Function
More on the Tanh function
You have built a network using the tanh activation for all the hidden units. You initialize the weights to relatively large values, using np.random.randn(..,..)*1000. What will happen?
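The question can be checked numerically: with very large weights, the pre-activations are huge, tanh saturates at ±1, and its local gradient 1 − tanh(z)² collapses toward 0, so learning stalls. A small sketch (the layer sizes and batch size are arbitrary assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer: 100 inputs, 50 tanh hidden units, 64 examples
X = rng.standard_normal((100, 64))
W = rng.standard_normal((50, 100)) * 1000  # the large init in question
b = np.zeros((50, 1))

Z = W @ X + b        # pre-activations are enormous in magnitude
A = np.tanh(Z)       # tanh saturates: outputs pile up at +/-1

# Fraction of activations in the saturated region
frac_saturated = np.mean(np.abs(A) > 0.999)

# Local gradient of tanh is 1 - tanh(z)^2, which is ~0 when saturated,
# so backpropagated gradients vanish and training is very slow
mean_grad = (1.0 - A**2).mean()

print(frac_saturated)  # close to 1.0
print(mean_grad)       # close to 0.0
```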