Coding a quadratic cost function optimization using Gradient Descent (GD) from scratch

In this recipe, we will code an iterative numerical optimization technique called gradient descent (GD) to find the minimum of the quadratic function f(x) = 2x² - 8x + 9.

The focus here shifts from solving for the minimum analytically (setting the first derivative to zero) to an iterative numerical method, GD, which starts with a guess and moves closer to the solution on each iteration, using a cost/error function as its guide. For this function, the derivative is f'(x) = 4x - 8, so the analytic minimum sits at x = 2 with f(2) = 1, a useful checkpoint for the numerical result.
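As a minimal sketch of the idea (not the book's full recipe), the following standalone Scala program applies the GD update x ← x - η·f'(x) to this function; the learning rate, tolerance, starting guess, and iteration cap are illustrative assumptions:

// Minimal gradient descent sketch for f(x) = 2x² - 8x + 9, whose derivative is f'(x) = 4x - 8
object GradientDescentSketch {
  def main(args: Array[String]): Unit = {
    val learningRate = 0.1           // step size η (assumed; tune as needed)
    val tolerance    = 1e-8          // stop when the update becomes negligible
    var x            = 13.0          // arbitrary initial guess
    var step         = Double.MaxValue
    var iteration    = 0

    while (math.abs(step) > tolerance && iteration < 10000) {
      val gradient = 4 * x - 8       // f'(x)
      step = learningRate * gradient
      x -= step                      // move against the gradient, toward the minimum
      iteration += 1
    }

    val fx = 2 * x * x - 8 * x + 9
    println(f"Converged to x = $x%.6f, f(x) = $fx%.6f after $iteration iterations")
    // Analytic check: f'(x) = 0 at x = 2, where f(2) = 1
  }
}

Run from any Scala REPL or as a script; the printed x should approach 2.0, matching the analytic answer above.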
