Learn Before
  • Non-Linear Activation Functions


TanH/Hyperbolic Tangent Function

S(x) = \tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}

[Figure: plot of tanh(x), an S-shaped curve through the origin with horizontal asymptotes at −1 and 1]
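
As a quick numerical check of the definition above, here is a minimal Python sketch (assuming NumPy is available; the function name is illustrative) that evaluates the exponential formula and compares it against np.tanh:

```python
import numpy as np

def tanh_from_def(x):
    """S(x) = (e^x - e^-x) / (e^x + e^-x), straight from the definition."""
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

x = np.linspace(-5.0, 5.0, 11)
print(tanh_from_def(x))                           # S-shaped values in (-1, 1)
print(np.allclose(tanh_from_def(x), np.tanh(x)))  # True: agrees with np.tanh
```

In practice np.tanh (or a framework's built-in) is preferable, since the explicit exponentials can overflow for large |x| even though the true output is bounded.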

Updated 2021-11-18

Contributors: Yue Kuang (University of Michigan - Ann Arbor)

References

  • Wikipedia

  • Neural Network Reference

Tags

Data Science

Related
  • Linear vs. Non-Linear Activation Functions

  • Sigmoid/Logistic Function

  • TanH/Hyperbolic Tangent Function

  • Swish Function

  • ReLU (Rectified Linear Unit)

  • ELU (Exponential Linear Unit)

  • Which activation function is represented by each of these plots?

  • Which of the following introduces nonlinearity into neural networks?

  • Softmax Function

Learn After
  • Pros and Cons of Hyperbolic Tangent Function

  • Derivative of TanH/Hyperbolic Tangent Function

  • More on the Tanh function

  • Sigmoid/Logistic vs. TanH/Hyperbolic Tangent functions

  • You have built a network using the tanh activation for all the hidden units. You initialize the weights to relatively large values, using np.random.randn(..,..)*1000. What will happen? (A sketch below illustrates the effect.)
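
The last question above concerns saturation. Here is a minimal sketch, assuming a single hidden layer and NumPy (the layer shape and seed are illustrative only), of what such a large initialization does to tanh activations and their gradients:

```python
import numpy as np

# Hypothetical single tanh layer with weights initialized ~1000x too large,
# mirroring the scenario in the question above.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3)) * 1000  # large init, as in the question
x = rng.standard_normal(3)              # one input example

z = W @ x            # pre-activations have huge magnitude
a = np.tanh(z)       # tanh saturates: outputs pinned near ±1
grad = 1 - a**2      # tanh'(z) = 1 - tanh(z)^2, nearly 0 when saturated

print(a)     # values ~ ±1.0 (saturated)
print(grad)  # values ~ 0.0 -> gradients vanish and learning stalls
```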
