Learn Before
  • TanH/Hyperbolic Tangent Function

Pros and Cons of Hyperbolic Tangent Function

Pros:

  • Zero centered: makes it easier to model inputs that have strongly negative, neutral, and strongly positive values.
  • Smooth gradient, preventing “jumps” in output values.
  • Output values bound between -1 and 1, normalizing the output of each neuron.
  • Clear predictions: for X above 2 or below -2, the function pushes the Y value (the prediction) to the edge of the curve, very close to 1 or -1. This enables clear predictions.
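The zero-centered, bounded, and saturating behavior listed above can be checked directly. A minimal sketch using only the standard library's `math.tanh` (the function and variable names here are illustrative, not from the original article):

```python
import math

# tanh squashes any real input into the open interval (-1, 1).
def tanh_activation(x):
    return math.tanh(x)

# Zero centered: negative inputs map below 0, zero maps to exactly 0,
# positive inputs map above 0.
print(tanh_activation(-3.0))  # close to -1
print(tanh_activation(0.0))   # exactly 0.0
print(tanh_activation(3.0))   # close to +1

# Saturation: for |x| above about 2, outputs are already very near the
# +/-1 bounds, which is what gives the "clear predictions" behavior.
for x in (2.0, 4.0, 8.0):
    print(x, tanh_activation(x))
```

Note that the output never actually reaches -1 or 1; it only approaches the bounds asymptotically.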

Cons (like the Sigmoid function):

  • Vanishing gradient: for very high or very low values of X, there is almost no change to the prediction, causing a vanishing gradient problem. This can result in the network ceasing to learn, or being too slow to reach an accurate prediction.
  • Computationally expensive, since evaluating tanh involves exponentials.
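The vanishing-gradient con follows from tanh's derivative, which is 1 - tanh(x)^2. A minimal sketch of how the gradient collapses for large |x| (the helper name `tanh_grad` is illustrative):

```python
import math

# Derivative of tanh: d/dx tanh(x) = 1 - tanh(x)^2.
def tanh_grad(x):
    t = math.tanh(x)
    return 1.0 - t * t

# Near x = 0 the gradient is at its maximum (1.0), so learning is fastest there.
print(tanh_grad(0.0))  # 1.0

# For large |x| the gradient shrinks toward 0: the vanishing gradient problem.
for x in (2.0, 5.0, 10.0):
    print(x, tanh_grad(x))
```

At x = 5 the gradient is already below 0.001, so weight updates flowing through a saturated tanh unit are almost entirely suppressed.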


Tags

Data Science

Related
  • Pros and Cons of Hyperbolic Tangent Function

  • Derivative of TanH/Hyperbolic Tangent Function

  • More on the Tanh function

  • Sigmoid/Logistic vs. TanH/Hyperbolic Tangent functions

  • You have built a network using the tanh activation for all the hidden units. You initialize the weights to relatively large values, using np.random.randn(..,..)*1000. What will happen?