AdaBoost

In the previous section, we saw that sampling with replacement leads to datasets where the samples are implicitly reweighted at random. However, if M is very large, most samples will appear only once and, moreover, all the choices are completely random. AdaBoost is an algorithm proposed by Schapire and Freund that tries to maximize the efficiency of each weak learner by employing adaptive boosting (hence the name). In particular, the ensemble is grown sequentially and the data distribution is recomputed at each step so as to increase the weight of the samples that were misclassified and decrease the weight of those that were correctly classified. In this way, every new learner is forced to focus on those regions that ...
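The reweighting step can be sketched for a single boosting round. This is a minimal NumPy illustration, not the full algorithm: the labels and predictions are hypothetical, and the weak learner itself is assumed to be trained elsewhere. With labels in {-1, +1}, the update multiplies each sample weight by exp(-alpha * y * y_hat), which grows the weight of misclassified samples and shrinks the weight of correct ones before renormalizing:

```python
import numpy as np

# Hypothetical ground truth and weak-learner predictions (labels in {-1, +1});
# the learner misclassifies samples 1 and 4
y_true = np.array([1, 1, -1, -1, 1])
y_pred = np.array([1, -1, -1, -1, -1])

# Start from a uniform distribution over the M samples
M = len(y_true)
w = np.full(M, 1.0 / M)

# Weighted error of the weak learner on the current distribution
eps = np.sum(w[y_pred != y_true])

# Learner weight (its contribution to the final ensemble vote)
alpha = 0.5 * np.log((1.0 - eps) / eps)

# Increase the weight of misclassified samples, decrease the correct ones,
# then renormalize so the weights remain a valid distribution
w *= np.exp(-alpha * y_true * y_pred)
w /= w.sum()
```

After this update the two misclassified samples together carry half of the total weight, so the next weak learner is pushed toward the regions where the current one fails.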
