Learn Before
Concept

Symmetry Breaking in Deep Learning

Symmetry breaking refers to a requirement when initializing machine learning models such as neural networks.

When all of a neural network's weights are initialized to the same value, every unit computes the same output and receives the same gradient, so it can be difficult or impossible for the weights to become different as the model is trained. This is known as the "symmetry" problem.

Initializing the weights to small random values breaks the symmetry and allows different weights to learn independently of one another.
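A minimal sketch of the problem, assuming a one-hidden-layer network with a squared-error loss and hand-written backpropagation (the network shape and values here are illustrative, not from the source): with symmetric initialization every row of the hidden-layer gradient is identical, so the hidden units update in lockstep; random initialization produces distinct gradients per unit.

```python
import numpy as np

def grads(x, y, W1, W2):
    """Gradients of the squared error of a tiny 1-hidden-layer tanh net."""
    h = np.tanh(W1 @ x)                    # hidden activations
    y_hat = W2 @ h                         # network output
    d_out = 2 * (y_hat - y)                # dL/dy_hat for L = (y_hat - y)^2
    dW2 = np.outer(d_out, h)
    dh = W2.T @ d_out                      # backprop into the hidden layer
    dW1 = np.outer(dh * (1 - h**2), x)     # tanh'(z) = 1 - tanh(z)^2
    return dW1, dW2

x = np.array([1.0, -2.0])
y = np.array([0.5])

# Symmetric initialization: every hidden unit has the same weights,
# so every row of dW1 is identical -- the units can never diverge.
W1 = np.full((3, 2), 0.1)
W2 = np.full((1, 3), 0.1)
dW1, _ = grads(x, y, W1, W2)
print(np.allclose(dW1, dW1[0]))    # True: all rows equal

# Small random initialization breaks the symmetry:
# each hidden unit gets a different gradient.
rng = np.random.default_rng(0)
W1r = rng.normal(0, 0.1, (3, 2))
W2r = rng.normal(0, 0.1, (1, 3))
dW1r, _ = grads(x, y, W1r, W2r)
print(np.allclose(dW1r, dW1r[0]))  # False: rows differ
```

After a gradient step, the symmetrically initialized units remain identical copies of each other, so the network effectively has only one hidden unit regardless of its width.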


Updated 2026-05-06

Tags

Data Science

D2L

Dive into Deep Learning @ D2L