Types of optimizers

First, we look at the high-level categories of optimization algorithms and then dive deep into the individual optimizers.

First order optimization algorithms minimize or maximize a loss function using the values of its gradient with respect to the parameters. The most widely used first order optimization algorithm is gradient descent. The first order derivative tells us whether the function is decreasing or increasing at a particular point, and it gives us a line that is tangent to the error surface at that point.
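To make this concrete, here is a minimal sketch of gradient descent using TensorFlow's `tf.GradientTape` on a toy quadratic loss; the loss function, learning rate, and step count are illustrative choices, not prescriptions:

```python
import tensorflow as tf

# A toy quadratic loss with a known minimum at w = 3.0.
w = tf.Variable(0.0)
learning_rate = 0.1

for step in range(50):
    with tf.GradientTape() as tape:
        loss = (w - 3.0) ** 2
    # First order derivative of the loss with respect to w.
    grad = tape.gradient(loss, w)
    # Gradient descent update: move against the gradient.
    w.assign_sub(learning_rate * grad)

print(w.numpy())  # approaches 3.0
```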

A derivative is defined for a function of a single variable, whereas a gradient is defined for a function of multiple variables; the gradient collects the partial derivatives with respect to each variable into a vector.
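The following small sketch illustrates the distinction with `tf.GradientTape`: differentiating a single-variable function yields a scalar, while differentiating a multi-variable function yields a vector of partial derivatives (the example functions are assumptions chosen for clarity):

```python
import tensorflow as tf

# Single variable: the derivative is a scalar.
x = tf.Variable(2.0)
with tf.GradientTape() as tape:
    y = x ** 2
print(tape.gradient(y, x).numpy())  # dy/dx = 2x = 4.0

# Multiple variables: the gradient is a vector of partial derivatives.
v = tf.Variable([1.0, 2.0, 3.0])
with tf.GradientTape() as tape:
    z = tf.reduce_sum(v ** 2)
print(tape.gradient(z, v).numpy())  # dz/dv = 2v = [2. 4. 6.]
```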

Second order optimization algorithms use the second order derivative, also known as the Hessian, to minimize or maximize the loss function. The Hessian is a matrix of second order partial derivatives. The second order derivative tells us whether the first derivative is increasing or decreasing, which hints at the function's curvature, and it gives us a quadratic surface that touches the curvature of the error surface. Because the second derivative is costly to compute, second order methods are used less often in practice.
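As a sketch of what curvature information buys us, the snippet below computes a Hessian with nested `tf.GradientTape`s and takes one Newton step. The convex quadratic loss is an illustrative assumption; on a quadratic, a single Newton step lands exactly on the minimum, which is precisely the advantage second order methods trade computation for:

```python
import tensorflow as tf

v = tf.Variable([1.0, 2.0])

# Nested tapes: the inner tape gives the gradient, the outer tape
# differentiates that gradient again to obtain the Hessian.
with tf.GradientTape() as outer:
    with tf.GradientTape() as inner:
        # Simple convex loss: f(v) = v0^2 + 3*v1^2
        loss = tf.reduce_sum(tf.constant([1.0, 3.0]) * v ** 2)
    grad = inner.gradient(loss, v)
hessian = outer.jacobian(grad, v)
print(hessian.numpy())   # [[2. 0.], [0. 6.]]

# One Newton step uses the curvature: v <- v - H^(-1) g.
newton_step = tf.linalg.solve(hessian, tf.reshape(grad, [-1, 1]))
v.assign_sub(tf.reshape(newton_step, [-1]))
print(v.numpy())         # [0. 0.] -- the exact minimum in one step
```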
