Dropout

Max-pooling reduces the output to a minimal amount of information. But even this may be too much information; the machine may still overfit. This raises a very interesting question: what if some of the activations were randomly zeroed?

This is the basis of dropout. It's a remarkably simple idea that improves a machine learning algorithm's ability to generalize, simply by deliberately destroying information: with every training iteration, a random subset of activations is zeroed. This forces the algorithm to learn only what is really important. Exactly why this works involves some structural algebra and is a story for another day.
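
To make the idea concrete, here is a minimal sketch of dropout in plain Go. It is not the book's or Gorgonia's code; it assumes the common "inverted dropout" formulation, in which each activation is zeroed with probability p and the survivors are rescaled by 1/(1-p) so the expected magnitude of the layer stays the same.

```go
package main

import (
	"fmt"
	"math/rand"
)

// dropout zeroes each activation with probability p and rescales the
// survivors by 1/(1-p). This is the inverted-dropout formulation: the
// rescaling keeps the expected sum of activations unchanged, so nothing
// special has to happen at inference time.
func dropout(activations []float64, p float64, rng *rand.Rand) []float64 {
	out := make([]float64, len(activations))
	for i, a := range activations {
		if rng.Float64() < p {
			out[i] = 0 // dropped for this iteration only
		} else {
			out[i] = a / (1 - p) // rescaled survivor
		}
	}
	return out
}

func main() {
	rng := rand.New(rand.NewSource(1))
	acts := []float64{0.5, 1.2, 0.3, 0.9, 1.1, 0.7}
	fmt.Println(dropout(acts, 0.5, rng)) // roughly half the entries become 0
}
```

Because a fresh mask is drawn on every iteration, no single activation can be relied upon, which is what pushes the network toward learning redundant, robust features.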

For the purposes of this project, Gorgonia actually handles dropout by means of element-wise multiplication by ...
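
In graph terms, this means inserting a random mask-and-multiply step into the computation. The snippet below sketches how that might look using Gorgonia's Dropout function; the 4×8 shape and the 0.2 rate are illustrative assumptions rather than values from this project, and the exact rescaling behavior (and whether the argument is the drop or keep probability) depends on the Gorgonia version in use.

```go
package main

import (
	"fmt"

	G "gorgonia.org/gorgonia"
)

func main() {
	g := G.NewGraph()

	// A small batch of activations (the 4x8 shape is an assumed example).
	x := G.NewMatrix(g, G.Float64,
		G.WithShape(4, 8),
		G.WithName("x"),
		G.WithInit(G.RangedFrom(1)), // deterministic inputs so zeroed entries stand out
	)

	// Dropout inserts the random mask and element-wise multiplication
	// into the graph; the 0.2 rate is an assumed example value.
	dropped := G.Must(G.Dropout(x, 0.2))

	m := G.NewTapeMachine(g)
	defer m.Close()
	if err := m.RunAll(); err != nil {
		panic(err)
	}
	// Some entries are zeroed; the rest are rescaled (exact scaling
	// depends on the Gorgonia version).
	fmt.Println(dropped.Value())
}
```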
