Linear vs. Non-Linear Activation Functions
Non-linear functions address the problems of a linear activation function:
- They make backpropagation possible because their derivative is a function of the input; a linear activation has a constant derivative, so its gradient carries no information about the input.
- They allow “stacking” of multiple layers of neurons to create a deep neural network; without a non-linearity, stacked linear layers collapse into a single equivalent linear layer. Multiple hidden layers of neurons are needed to learn complex data sets with high levels of accuracy.
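The collapsing of stacked linear layers can be shown numerically. The sketch below (illustrative NumPy, with made-up random weights `W1`, `W2`) checks that two linear layers compose into one linear map, while inserting a ReLU breaks that equivalence:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with linear (identity) activations, random weights.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

# Stacking linear layers collapses into one linear map:
# W2 @ (W1 @ x) == (W2 @ W1) @ x, so depth adds no expressive power.
stacked = W2 @ (W1 @ x)
collapsed = (W2 @ W1) @ x
print(np.allclose(stacked, collapsed))  # True

# A non-linear activation (here ReLU) between the layers cannot be
# folded into a single weight matrix, so each layer can add capacity.
def relu(z):
    return np.maximum(z, 0.0)

nonlinear = W2 @ relu(W1 @ x)
```

The same reasoning applies to networks of any depth: any number of purely linear layers is equivalent to a single linear layer.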
Updated 2021-03-12
Tags
Data Science
Related
Linear vs. Non-Linear Activation Functions
Sigmoid/Logistic Function
TanH/Hyperbolic Tangent Function
Swish Function
ReLU (Rectified Linear Unit)
ELU (Exponential Linear Unit)
Which activation function is represented by each of these plots?
Which of the following introduces nonlinearity into neural networks?
Softmax Function
Problems with Linear Activation Function