An ensemble combines several different classifiers into a single predictive model. Ensembling is a popular, easy-to-understand machine learning technique, and it appears in almost every winning Kaggle solution.
Despite initial concerns that ensembles might be too slow, some teams building commercial software have begun using ensemble methods in production as well. Ensembling adds little overhead, is easy to parallelize, and offers a built-in fallback: if the ensemble proves too costly, you can drop back to a single member model.
We will start with one of the simplest ensembling techniques, a simple majority vote over the member classifiers' predictions, also known as a voting ensemble, and then build on it.
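As a minimal sketch of the idea, the snippet below builds a hard (majority-vote) ensemble with scikit-learn's `VotingClassifier`. The three member models and the synthetic dataset are illustrative choices, not the specific models used later in this section:

```python
# Illustrative majority-vote ensemble; dataset and member models are
# placeholder choices for demonstration purposes.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# A synthetic binary classification dataset standing in for real NLP features.
X, y = make_classification(n_samples=500, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Three diverse base classifiers; with voting="hard", the ensemble
# predicts whichever class label receives the most votes.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("nb", GaussianNB()),
        ("dt", DecisionTreeClassifier(random_state=42)),
    ],
    voting="hard",
)
ensemble.fit(X_train, y_train)
print(f"Ensemble accuracy: {ensemble.score(X_test, y_test):.3f}")
```

Because the members make different kinds of errors, the majority vote often corrects mistakes that any single model would make on its own.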
In summary, this machine learning for NLP section covers ...