Solution

Dropout is a technique that addresses the practical cost of approaches such as averaging the predictions of many separately trained models. It prevents overfitting and provides a way of approximately combining exponentially many different neural network architectures efficiently. The term dropout refers to dropping out units (hidden and visible) in a neural network: dropping a unit out means temporarily removing it from the network, along with all of its incoming and outgoing connections, as shown in the following figure.
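As a minimal sketch using only NumPy (the layer size, activation values, and seed below are arbitrary illustrations, not from the book), dropping units can be pictured as multiplying a layer's activations by a random binary mask; a zero in the mask removes that unit, and with it its outgoing contribution, for the current forward pass:

```python
import numpy as np

# Hypothetical activations of a 4-unit hidden layer for one example.
activations = np.array([0.7, 1.2, 0.3, 0.9])

# Bernoulli mask: 1 keeps a unit, 0 drops it (severing its outgoing
# connections to the next layer for this forward pass).
rng = np.random.default_rng(seed=42)
keep_prob = 0.5
mask = rng.binomial(n=1, p=keep_prob, size=activations.shape)

dropped = activations * mask
print(mask)     # e.g. [1 0 1 0]
print(dropped)  # dropped units contribute nothing downstream
```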

The choice of which units to drop is random. In the simplest case, each unit is retained with a fixed probability p, independent of the other units. p can be chosen using a validation set or simply set to 0.5, a value that is close to optimal for a wide range of networks and tasks.
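A minimal sketch of how this looks with TensorFlow's Keras API (the layer sizes and input shape here are arbitrary placeholders): note that `tf.keras.layers.Dropout` takes the probability of dropping a unit, so retaining each unit with probability p = 0.5 corresponds to `rate = 1 - p = 0.5`.

```python
import tensorflow as tf

# A small fully connected network with dropout after the hidden layer.
# Dropout is applied only during training; at inference time all units
# are kept and Keras rescales activations accordingly.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(rate=0.5),  # drop probability = 1 - p
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```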
