Problem statement

Deep neural networks contain multiple nonlinear hidden layers, which makes them expressive models that can learn very complicated relationships between inputs and outputs. With limited training data, however, many of these complicated relationships will be the result of sampling noise: they will exist in the training set but not in real test data, leading to overfitting. Many methods have been developed to reduce such overfitting. These include stopping training as soon as performance on a validation set starts getting worse, introducing weight penalties such as L1 and L2 regularization, and soft weight sharing (Nowlan and Hinton, 1992).
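
To make two of these remedies concrete, the following is a minimal sketch in TensorFlow/Keras that combines an L2 weight penalty on a hidden layer with early stopping on a validation set. The layer sizes, penalty strength, and toy data here are illustrative assumptions, not values taken from the text.

import numpy as np
import tensorflow as tf

# Toy data standing in for a real dataset (shapes are illustrative only).
x_train = np.random.rand(1000, 20).astype("float32")
y_train = np.random.randint(0, 10, size=1000)

model = tf.keras.Sequential([
    # L2 weight penalty: adds the squared weights to the training loss,
    # discouraging large weights that fit sampling noise.
    tf.keras.layers.Dense(
        128, activation="relu", input_shape=(20,),
        kernel_regularizer=tf.keras.regularizers.l2(1e-4)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Early stopping: halt training once validation loss stops improving,
# and roll back to the best weights seen so far.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=3, restore_best_weights=True)

model.fit(x_train, y_train, validation_split=0.2,
          epochs=50, callbacks=[early_stop], verbose=0)

Both techniques act as regularizers: the L2 penalty constrains the weights directly, while early stopping limits how long the network can keep fitting idiosyncrasies of the training set.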
