Choosing a classification algorithm

Naive Bayes is one of the simplest, most efficient, and most effective inductive algorithms in machine learning. When features are independent, which is rarely true in the real world, it is theoretically optimal, and even with dependent features its performance is surprisingly competitive (Zhang, 2004). Its main disadvantage is that it cannot learn how features interact with each other; for example, you might like your tea with lemon or with milk, but hate a tea that has both of them at the same time.
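To see why feature interactions are invisible to the model, consider a minimal sketch of the Naive Bayes scoring rule, P(c | f1, f2) ∝ P(c) · P(f1 | c) · P(f2 | c). The tea example, class names, and probabilities below are all made up for illustration; the point is only that the per-feature likelihoods are multiplied independently, so "lemon and milk together" can never score worse than the individual preferences imply:

```java
// Hedged sketch: a hypothetical two-feature Naive Bayes scorer for a toy
// "do I like this tea?" problem. All probabilities are illustrative.
public class NaiveBayesSketch {

    // Score class c given two binary features (lemon, milk), assuming
    // conditional independence: P(c) * P(lemon|c) * P(milk|c).
    static double score(double prior, double pLemonGivenC, double pMilkGivenC,
                        boolean lemon, boolean milk) {
        double s = prior;
        s *= lemon ? pLemonGivenC : 1 - pLemonGivenC;
        s *= milk ? pMilkGivenC : 1 - pMilkGivenC;
        return s;
    }

    public static void main(String[] args) {
        // Suppose you like lemon tea and milk tea equally often (P = 0.5 each).
        // The model scores "lemon + milk" exactly the same as "lemon only":
        double both      = score(0.5, 0.5, 0.5, true, true);
        double lemonOnly = score(0.5, 0.5, 0.5, true, false);
        System.out.println(both == lemonOnly); // prints "true"
    }
}
```

Because the likelihoods factorize, no choice of per-feature probabilities can express "each additive alone is good, but the combination is bad"; that would require a joint term P(lemon, milk | c) that the model, by construction, does not have.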

The main advantage of a decision tree is that the model is easy to interpret and explain, as we saw in our example. It can handle both nominal and numeric features, and you don't have to worry ...
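The interpretability claim is easy to appreciate once you notice that a learned tree is essentially a nested set of if/else rules. Here is a hand-written sketch (the tea scenario, feature names, and thresholds are invented for illustration, not taken from any trained model) mixing one nominal and one numeric split:

```java
// Hedged sketch: a tiny hand-written decision tree over one nominal and one
// numeric feature, showing why tree models read like plain rules.
public class TeaTreeSketch {

    // Hypothetical rules for a toy "will I drink this tea?" decision.
    static String classify(String additive, double temperatureC) {
        if (additive.equals("lemon+milk")) {
            return "no";              // nominal split on the additive
        }
        if (temperatureC < 40.0) {
            return "no";              // numeric split on serving temperature
        }
        return "yes";
    }

    public static void main(String[] args) {
        System.out.println(classify("lemon", 70.0));      // prints "yes"
        System.out.println(classify("lemon+milk", 70.0)); // prints "no"
    }
}
```

Note that, unlike Naive Bayes, the first split lets the tree express the interaction directly: the lemon-plus-milk combination is rejected even though each additive alone is acceptable.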
