Gradient Descent and Linear Regression

Gradient Descent (GD) is an iterative approach for minimizing a given function; in other words, it is a way to find a local minimum of a function. The algorithm starts with an initial estimate of the solution, which can be chosen in several ways: one approach is to sample random values for the parameters. From that point, we evaluate the slope of the function, take a step in the negative direction of the gradient, and repeat the process. The algorithm eventually converges where the gradient is zero, which corresponds to a local minimum.
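
As an illustration, here is a minimal sketch in R (the language used throughout this book) of gradient descent applied to simple linear regression with a mean squared error cost. The names (gradient_descent, alpha, n_iter) and the example data are illustrative assumptions, not taken from the text:

# Gradient descent for simple linear regression: y = b0 + b1 * x
# Cost: mean squared error J(b0, b1) = mean((y - (b0 + b1 * x))^2)
gradient_descent <- function(x, y, alpha = 0.01, n_iter = 1000) {
  b0 <- 0; b1 <- 0                      # initial estimate (could also be random)
  for (i in seq_len(n_iter)) {
    y_hat <- b0 + b1 * x
    error <- y_hat - y
    grad_b0 <- 2 * mean(error)          # partial derivative of MSE w.r.t. b0
    grad_b1 <- 2 * mean(error * x)      # partial derivative of MSE w.r.t. b1
    b0 <- b0 - alpha * grad_b0          # step in the negative gradient direction
    b1 <- b1 - alpha * grad_b1
  }
  c(intercept = b0, slope = b1)
}

# Example: noisy line y = 2 + 3x; the estimates should approach lm(y ~ x)
set.seed(1)
x <- runif(100)
y <- 2 + 3 * x + rnorm(100, sd = 0.1)
gradient_descent(x, y, alpha = 0.1, n_iter = 5000)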

In the steepest descent variant, the step size at each iteration is chosen based on the size of the step taken in the previous iteration. The gradient is basically defined as the slope of the curve, as shown ...
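
To make the idea of the gradient as a slope concrete, a finite-difference approximation can stand in for the analytic derivative. This short sketch (the names f and slope_at, the step width h, and the learning rate alpha are all illustrative assumptions) takes one descent step on f(x) = x^2:

# Numerical slope (central finite difference) of f at a point
f <- function(x) x^2
slope_at <- function(f, x, h = 1e-6) (f(x + h) - f(x - h)) / (2 * h)

x0 <- 3
alpha <- 0.1                        # step size (learning rate)
x1 <- x0 - alpha * slope_at(f, x0)  # one descent step: 3 - 0.1 * 6 = 2.4
x1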
