Summary

In this chapter, we introduced the essentials of machine learning. We started with some simple but still quite effective models (linear and logistic regression, Naive Bayes, and K-Nearest Neighbors), then moved on to a more advanced one (SVM). We explained how to combine weak learners into stronger ones (ensembles, Random Forests, and Gradient Tree Boosting). Finally, we had a peek at the algorithms used for big data and clustering.
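As a quick refresher, the following sketch (not taken from the chapter's own code) shows how the scikit-learn estimators mentioned above all share the same fit/score interface; the toy dataset and hyperparameters are illustrative assumptions, not the book's settings.

```python
# A minimal sketch recapping the chapter's workflow: fit several of the
# classifiers discussed above on a toy dataset and compare test accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# Illustrative dataset choice, not the one used in the chapter
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

models = {
    'Logistic Regression': LogisticRegression(max_iter=5000),
    'Naive Bayes': GaussianNB(),
    'K-Nearest Neighbors': KNeighborsClassifier(n_neighbors=5),
    'SVM': SVC(kernel='rbf', gamma='scale'),
    'Random Forest': RandomForestClassifier(n_estimators=100, random_state=0),
    'Gradient Tree Boosting': GradientBoostingClassifier(random_state=0),
}

# Every estimator exposes the same fit/score API
for name, model in models.items():
    model.fit(X_train, y_train)
    print('%s: %.3f' % (name, model.score(X_test, y_test)))
```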

In the next chapter, you'll be introduced to graphs, an interesting departure from the flat predictors/target matrices we have worked with so far, and quite a hot topic in data science right now. Expect to delve into very complex and intricate networks!
