Gradient descent

Gradient descent is an algorithm for minimizing functions. Given a function defined by a set of parameters, gradient descent starts from an initial set of parameter values and iteratively moves toward a set of parameter values that minimizes the function.
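
Concretely, if J(\theta) is the function being minimized over parameters \theta, each iteration applies the following update (the learning-rate symbol \eta is illustrative, not from the original text):

\theta_{t+1} = \theta_t - \eta \, \nabla_\theta J(\theta_t)

where \eta > 0 is the learning rate that controls the step size.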

This iterative minimization is achieved using calculus: the algorithm takes repeated steps in the direction of the negative gradient of the function.
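
As a minimal sketch of this procedure, the following plain Python applies gradient descent to a one-dimensional quadratic; the example function, learning rate, and iteration count are illustrative choices, not from the original text.

def gradient_descent(grad, start, learning_rate=0.1, n_iters=100):
    """Repeatedly step against the gradient to minimize a function."""
    x = start
    for _ in range(n_iters):
        x -= learning_rate * grad(x)  # step in the negative gradient direction
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
# Starting from x = 0, the iterates converge toward the minimizer x = 3.
print(gradient_descent(grad=lambda x: 2 * (x - 3), start=0.0))

With learning_rate=0.1 each step shrinks the distance to the minimizer by a factor of 0.8, so after 100 iterations the result is very close to 3.0; too large a learning rate would instead overshoot and diverge.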

Gradient descent is one of the most widely used optimization algorithms in machine learning. As mentioned earlier, it is used to update the weights of a neural network so that we minimize the loss function. Let's now talk about an important ...
