Batch normalization

Batch normalization is a technique for improving the performance and stability of neural networks. The idea is to normalize the layer inputs so that they have a mean of zero and a variance of one. Batch normalization was introduced in Sergey Ioffe and Christian Szegedy's 2015 paper, Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. The idea is that instead of just normalizing the inputs to the network, we normalize the inputs to layers within the network. It's called batch normalization because, during training, we normalize each layer's inputs using the mean and variance of the values in the current mini-batch.
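To make the training-time computation concrete, here is a minimal NumPy sketch of normalizing a mini-batch to zero mean and unit variance. The function name batch_norm, the epsilon term, and the inclusion of the learnable scale and shift parameters (gamma and beta, as in the original paper) are illustrative assumptions rather than code taken from this book:

import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x: mini-batch of layer inputs, shape (batch_size, num_features)
    # Compute the mean and variance over the current mini-batch
    batch_mean = x.mean(axis=0)
    batch_var = x.var(axis=0)
    # Normalize each feature to zero mean and unit variance
    x_hat = (x - batch_mean) / np.sqrt(batch_var + eps)
    # Scale and shift with the learnable parameters gamma and beta
    return gamma * x_hat + beta

# Example: a mini-batch of 4 samples with 3 features each
x = np.random.randn(4, 3) * 10 + 5
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0))  # close to 0 for each feature
print(out.std(axis=0))   # close to 1 for each feature

At inference time, frameworks typically replace the mini-batch statistics with running averages accumulated during training, so the output does not depend on the other examples in the batch.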
