Optimizer

This loss is automatically used as a feedback signal to adjust the way the algorithm works. This adjustment step is what we call learning.

This automatic adjustment of model weights is distinctive of deep learning. Each weight update is made in a direction that lowers the loss for the current (input, target) training pair.
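To make this concrete, here is a minimal sketch of such an update loop, using an assumed toy setup (a one-weight linear model y = w * x, a squared-error loss, and a single (input, target) pair) rather than anything from the book:

```python
# Toy setup (assumed for illustration): model y = w * x, squared-error loss.
w = 0.0                # initial weight
x, target = 2.0, 6.0   # one (input, target) training pair
lr = 0.1               # learning rate

for step in range(20):
    pred = w * x                     # forward pass
    loss = (pred - target) ** 2      # loss acts as the feedback signal
    grad = 2 * (pred - target) * x   # d(loss)/dw, from calculus
    w -= lr * grad                   # step in the direction that lowers the loss

print(round(w, 3))  # prints 3.0, the weight at which the loss is zero
```

Each pass through the loop nudges `w` toward the value that makes the prediction match the target; that repeated nudging is the "learning" in deep learning.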

This adjustment is the job of the optimizer, which relies on what's called the backpropagation algorithm: the central algorithm in deep learning. Backpropagation computes, for each weight, how the loss changes with that weight; the optimizer then uses these gradients to update the weights.

Optimizers and loss functions are common to all deep learning methods, even in cases where we don't have an (input, target) pair. All optimizers are based on differential calculus; examples include stochastic gradient descent (SGD), Adam, and so on. Hence, ...
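Since the paragraph above names Adam alongside SGD, here is a hedged sketch of Adam's per-weight update rule applied to a one-parameter toy problem. The code, variable names, and toy loss are our own illustration (using Adam's commonly cited default decay rates), not the book's:

```python
import math

def adam_step(w, grad, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update for a single weight w given its gradient."""
    m = b1 * m + (1 - b1) * grad        # running mean of gradients (momentum)
    v = b2 * v + (1 - b2) * grad ** 2   # running mean of squared gradients
    m_hat = m / (1 - b1 ** t)           # bias-correct both running estimates
    v_hat = v / (1 - b2 ** t)
    w -= lr * m_hat / (math.sqrt(v_hat) + eps)  # per-weight adaptive step
    return w, m, v

# Minimize the toy loss (w - 3)^2, whose gradient is 2 * (w - 3).
w, m, v = 0.0, 0.0, 0.0
for t in range(1, 501):
    grad = 2 * (w - 3)
    w, m, v = adam_step(w, grad, m, v, t)
```

Unlike plain SGD, which scales every update by the same learning rate, Adam adapts the step size per weight from running estimates of the gradient's mean and variance; both are still applications of differential calculus.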

