Number of training iterations

This hyperparameter is useful for avoiding overfitting. Once the model has converged (the training loss plateaus and no longer changes with further epochs), continued training tends to overfit the training data and push the model into a non-generalized zone, where performance on test samples is noticeably worse than on the training data. Setting the number of training iterations carefully, around the point where the loss plateaus, amounts to early stopping and hence yields a robust model that generalizes well, as sketched below.
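
The idea can be illustrated with a short, framework-agnostic sketch. The data, model, learning rate, and patience threshold below are illustrative assumptions, not values from the book: training stops once the validation loss has not improved for a fixed number of epochs, which approximates setting the number of iterations around the plateau region.

```python
import numpy as np

# Synthetic regression data, split into training and validation sets (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
X_train, y_train = X[:150], y[:150]
X_val, y_val = X[150:], y[150:]

def mse(w, X, y):
    """Mean squared error of a linear model with weights w."""
    return np.mean((X @ w - y) ** 2)

w = np.zeros(3)
lr = 0.05
max_epochs = 1000   # generous upper bound on the number of training iterations
patience = 10       # epochs to wait after the validation loss plateaus
best_val, best_w, wait = np.inf, w.copy(), 0

for epoch in range(max_epochs):
    # One gradient-descent step on the training loss.
    grad = 2 * X_train.T @ (X_train @ w - y_train) / len(y_train)
    w -= lr * grad

    # Early stopping: track the best validation loss seen so far.
    val_loss = mse(w, X_val, y_val)
    if val_loss < best_val - 1e-6:
        best_val, best_w, wait = val_loss, w.copy(), 0
    else:
        wait += 1
        if wait >= patience:
            print(f"Stopping early at epoch {epoch}; validation loss plateaued.")
            break

w = best_w  # keep the weights from the best epoch, not the last one
```

The patience counter is what makes the stopping rule robust: a single noisy epoch does not end training, but a sustained plateau does.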

While the other hyperparameters are being tuned and their effect on the overall cost function is evaluated, early stopping can be disabled. However, once all the other hyperparameters are fully tuned, we can dynamically set the number of training iterations based on the point at which the validation loss plateaus.
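
If the model is trained with a deep learning framework such as Keras (an assumption here; the book does not prescribe a particular library), this dynamic behaviour is available as a built-in callback, so the epoch count becomes an upper bound rather than a fixed value. The data and architecture below are placeholders.

```python
import numpy as np
import tensorflow as tf

# Illustrative data and model; the sizes and architecture are arbitrary.
X = np.random.rand(500, 10).astype("float32")
y = np.random.rand(500, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Stop when the validation loss has not improved for 10 consecutive epochs,
# and roll back to the best weights observed during training.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=10, restore_best_weights=True
)

# epochs acts as an upper bound; early stopping determines the actual count.
model.fit(X, y, validation_split=0.2, epochs=500,
          callbacks=[early_stop], verbose=0)
```

Omitting the callback while sweeping the other hyperparameters, and adding it back for the final training run, mirrors the workflow described above.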
