Gradient boosting

While a random forest works in a framework where multiple trees are built in parallel, gradient boosting takes a different approach: trees are built sequentially, with each new tree correcting the errors of the trees that came before it.

The gradient in gradient boosting refers to the error, that is, the difference between the actual and predicted values, and boosting refers to improvement: reducing that error over successive iterations.
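As a minimal numeric illustration (with made-up values), the "gradient" for a single observation is simply the residual that the next tree will be trained to predict:

```python
# Hypothetical single observation, values chosen for illustration only.
actual = 10.0
predicted = 7.0                  # prediction from the current model
residual = actual - predicted    # the error the next tree will learn
print(residual)                  # 3.0
```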

Gradient boosting builds on decision trees as follows:

  • Build a decision tree to estimate the dependent variable
  • Calculate the error, that is, the difference between the actual and predicted values
  • Build another decision tree that predicts this error
  • Update the prediction by adding the error tree's prediction to the previous prediction
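The steps above can be sketched in a few lines of Python. This is a simplified illustration, not a production implementation: the toy data, tree depth, number of rounds, and learning rate are all assumptions made for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy regression data (assumed for illustration): y = x^2.
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = X[:, 0] ** 2

n_rounds = 5         # number of boosting iterations (assumed)
learning_rate = 0.5  # shrinkage applied to each error-correcting tree

# Step 1: build a decision tree to estimate the dependent variable.
tree = DecisionTreeRegressor(max_depth=2).fit(X, y)
prediction = tree.predict(X)
mse_initial = np.mean((y - prediction) ** 2)

for _ in range(n_rounds):
    # Step 2: calculate the error (residual) between actual and predicted values.
    residual = y - prediction
    # Step 3: build another decision tree that predicts this error.
    error_tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    # Step 4: update the prediction using the error tree's output.
    prediction += learning_rate * error_tree.predict(X)

mse_final = np.mean((y - prediction) ** 2)
print(mse_initial, mse_final)
```

Each round fits a new tree to whatever error remains, so the training error shrinks with every iteration; the learning rate controls how aggressively each correction is applied.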
