Linear vs. Non-Linear Activation Functions

Non-linear activation functions address the shortcomings of a linear activation function:

  • They make backpropagation possible: their derivative is a function of the input, so the gradient carries information about how each neuron's weights should change.
  • They allow “stacking” of multiple layers of neurons to create a deep neural network. Without a non-linearity between them, any stack of linear layers collapses into a single equivalent linear layer; multiple hidden layers are needed to learn complex data sets with high levels of accuracy.
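The collapse of stacked linear layers can be demonstrated directly. The sketch below (a minimal illustration using NumPy, with arbitrary random weights) shows that two weight matrices with no activation between them behave exactly like one matrix, while inserting a ReLU breaks that equivalence:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "linear layers": weight matrices with no activation in between.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=(3,))

# Stacking linear layers collapses to one linear layer:
# W2 @ (W1 @ x) equals (W2 @ W1) @ x for every input x.
stacked = W2 @ (W1 @ x)
collapsed = (W2 @ W1) @ x
print(np.allclose(stacked, collapsed))

# A non-linear activation (ReLU here) breaks that collapse,
# so each extra layer adds real representational power.
relu = lambda z: np.maximum(z, 0.0)
nonlinear = W2 @ relu(W1 @ x)
print(np.allclose(nonlinear, collapsed))
```

The same argument holds for any depth: folding the weight matrices together removes the "depth" of a purely linear network, which is why a non-linearity after each layer is what makes deep networks more expressive than a single layer.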


Updated 2021-03-12

Tags

Data Science