
Batch Normalization in Deep Learning

Batch normalization is a method that speeds up neural network training. The basic idea is to shift and scale the inputs to each layer of the network. For example, for the 3-layer neural network shown below, we want to normalize $z^{[3]}$ to speed the training of $w^{[3]}$ and $b^{[3]}$ in the third layer.
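The shift-and-scale step can be sketched as follows. This is a minimal NumPy sketch of the per-layer computation at training time, not a full implementation: `gamma` and `beta` stand for the learnable scale and shift parameters, and the function name and example values are illustrative.

```python
import numpy as np

def batch_norm(z, gamma, beta, eps=1e-5):
    # Normalize the pre-activations z across the batch dimension (axis 0),
    # then scale by gamma and shift by beta (learnable parameters).
    mu = z.mean(axis=0)
    var = z.var(axis=0)
    z_hat = (z - mu) / np.sqrt(var + eps)
    return gamma * z_hat + beta

# Illustrative batch of 4 examples with 3 hidden units
z = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, 6.0, 9.0],
              [4.0, 8.0, 12.0]])
z_tilde = batch_norm(z, gamma=np.ones(3), beta=np.zeros(3))
```

With `gamma = 1` and `beta = 0`, each unit's output has approximately zero mean and unit variance over the batch; during training the network learns `gamma` and `beta` so it can recover other means and variances if they help.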

[Figure: 3-layer neural network]

Updated 2021-03-31

Tags

Data Science