Symmetry Breaking in Deep Learning
Symmetry breaking refers to a requirement on the initialization of machine learning models such as neural networks.
When all of a neural network's weights are initialized to the same value, it can be difficult or impossible for the weights to diverge as the model is trained: units in the same layer compute identical outputs and therefore receive identical gradient updates, so they remain identical after every step. This is known as the "symmetry" problem.
Initializing the weights to small random values breaks this symmetry and allows each weight to learn independently of the others.
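As a minimal sketch of the effect (a NumPy toy example; the two-layer network, the data point, and the learning rate are illustrative assumptions, not part of the original text), the snippet below trains the same tiny network twice: once with constant initialization, where the hidden units never differentiate, and once with small random initialization, where they do:

```python
import numpy as np

def train_step(W1, W2, x, y, lr=0.1):
    # Forward pass: one hidden layer with tanh activation, scalar output.
    h = np.tanh(W1 @ x)
    y_hat = W2 @ h
    err = y_hat - y  # gradient of 0.5 * (y_hat - y)**2 w.r.t. y_hat
    # Backward pass: manual gradients for this tiny network.
    grad_W2 = err * h
    grad_W1 = np.outer(err * W2 * (1 - h**2), x)
    return W1 - lr * grad_W1, W2 - lr * grad_W2

x, y = np.array([1.0, -2.0]), 1.0

# Constant initialization: both hidden units start identical.
W1 = np.full((2, 2), 0.5)
W2 = np.full(2, 0.5)
for _ in range(100):
    W1, W2 = train_step(W1, W2, x, y)
print(W1)  # both rows are still identical -- symmetry never breaks

# Small random initialization breaks the symmetry.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.01, size=(2, 2))
W2 = rng.normal(scale=0.01, size=2)
for _ in range(100):
    W1, W2 = train_step(W1, W2, x, y)
print(W1)  # rows diverge -- the units learn different features
```

With constant initialization, the two rows of W1 receive the same gradient at every step and stay equal forever; with random initialization, the units start out different and can specialize.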
Tags
Data Science
D2L
Dive into Deep Learning @ D2L
Related
Example of Weight Initialization
Vanishing/Exploding Gradient
How to Initialize Weights to Prevent Vanishing/Exploding Gradients
Transfer Learning in Deep Learning
Multi-task Learning in Deep Learning
Variance of Layer Output in Forward Propagation
Default Random Initialization
Xavier Initialization