When do we need Deep Learning?
The Universal Approximation Theorem (UAT) states that a single hidden layer, with a finite number of neurons, is enough to approximate any desired function. This is an impressive statement for two reasons: on the one hand, the theorem proves the immense capacity of neural networks. But, on the other hand… does it mean that we never need Deep Learning? No, breathe deeply, it doesn’t mean that…
The UAT doesn’t specify how many neurons that layer must contain. Although a single hidden layer could be enough to model a specific function, it may be more efficient to learn it with a network of multiple hidden layers. Furthermore, when training a network, we are looking for the function that best generalizes the relationship in the data. Even if a network with a single hidden layer is able to represent the function that best fits the training examples, that does not mean it will generalize best to data outside the training set.
This is very well explained in the book Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville:
In summary, a feedforward network with a single layer is sufficient to represent any function, but the layer may be infeasibly large and may fail to learn and generalize correctly. In many circumstances, using deeper models can reduce the number of units required to represent the desired function and can reduce the amount of generalization error.
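The tradeoff described above can be made concrete with a classic depth-vs-width construction (due to Telgarsky-style arguments): composing a simple "hat" function with itself doubles the number of linear pieces at every layer, so a depth-k composition produces about 2^k oscillations, while a single hidden ReLU layer would need on the order of 2^k units to match them. A minimal sketch in pure Python (the grid size is an illustrative assumption):

```python
def hat(x):
    """Triangle map on [0, 1]; exactly representable by a tiny ReLU layer."""
    return 2 * x if x < 0.5 else 2 * (1 - x)

def deep_hat(x, depth):
    """Compose the hat map `depth` times, mimicking a deep narrow network."""
    for _ in range(depth):
        x = hat(x)
    return x

def crossings(depth, samples=9999):
    """Count sign changes of deep_hat(x) - 0.5 on a fine grid of [0, 1]."""
    xs = [i / samples for i in range(samples + 1)]
    ys = [deep_hat(x, depth) for x in xs]
    return sum((ys[i] - 0.5) * (ys[i + 1] - 0.5) < 0 for i in range(samples))

for d in (1, 2, 3, 4):
    # The count doubles with each extra layer: 2, 4, 8, 16 for depths 1..4.
    print(d, crossings(d))
```

Each extra layer doubles the complexity "for free", while a shallow network must pay for every oscillation with additional units. This is exactly the sense in which, per the quote above, deeper models can reduce the number of units required to represent the desired function.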
Tags
Data Science
Related
Deep Learning Algorithms
Rules-Based Systems vs. Classic Machine Learning vs. Representation Learning vs. Deep Learning
Deep Learning vs. Reinforcement Learning
“Why is deep learning taking off?”
The learning circle of the neural network
Research Ideas for Deep Learning
Troubleshooting a deep learning model
Applications of neural networks in supervised learning
Formulating the dataset in a Deep Learning Problem
Deep vs. Shallow Neural Networks
Deep learning core concepts
When do we need Deep Learning?
Machine Learning vs Deep Learning
How to solve the overfitting problems in deep learning
Top 15 deep learning applications
Deep Learning History
Deep Learning (in Machine Learning) References
Challenges Motivating Deep Learning
DeepFake
Attention is all you Need (Presentation)
Explaining Complex Concepts with Simple Examples
A machine learning system is being designed to identify different species of birds in photographs. The model first learns to recognize basic elements like lines, curves, and color gradients. In subsequent stages, it combines these basic elements to identify more complex components like feathers, beaks, and eyes. Finally, it uses the arrangement of these components to classify the bird species. Which statement best analyzes the fundamental principle that gives this approach its power?
Choosing the Right Machine Learning Approach
A machine learning model is tasked with identifying a cat in an image. Arrange the following stages of representation in the order they would likely be learned by a system that builds complex concepts from simpler ones, starting from the most basic input.
End-to-End Training