Concept

Non-Linear Activation Functions

Modern neural network models use non-linear activation functions. They allow the model to learn complex mappings between the network’s inputs and outputs, which is essential for modeling complex data such as images, video, audio, and datasets that are non-linear or high-dimensional.

There are 8 common non-linear activation functions:

  • Sigmoid / Logistic
  • TanH / Hyperbolic Tangent
  • ELU (Exponential Linear Unit)
  • ReLU (Rectified Linear Unit)
  • Leaky ReLU
  • Parametric ReLU
  • Softmax
  • Swish
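As a rough illustration, the functions above can be sketched in NumPy. The formulas are the standard ones; the parameter names (`alpha`, `a`, `beta`) and defaults are illustrative choices, not fixed conventions.

```python
import numpy as np

def sigmoid(x):
    # squashes input to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # squashes input to (-1, 1), zero-centered
    return np.tanh(x)

def relu(x):
    # zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # small fixed slope for negative inputs instead of zero
    return np.where(x > 0, x, alpha * x)

def parametric_relu(x, a):
    # same form as Leaky ReLU, but the negative slope `a` is learned
    return np.where(x > 0, x, a * x)

def elu(x, alpha=1.0):
    # smooth exponential curve for negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softmax(x):
    # converts a vector of scores to a probability distribution;
    # subtracting the max improves numerical stability
    e = np.exp(x - np.max(x))
    return e / e.sum()

def swish(x, beta=1.0):
    # x scaled by a sigmoid gate
    return x * sigmoid(beta * x)
```

For example, `sigmoid(0.0)` returns `0.5`, `relu(-2.0)` returns `0.0`, and `softmax` of any vector sums to 1.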

Updated 2021-10-23

Tags

Data Science