Underfitting

Underfitting means that the model was poorly trained. Either the training dataset did not contain enough information to support strong predictions, or the algorithm that trained the model was not adequate for the context: it was poorly parameterized or simply ill-suited to the data.

If we measure the prediction error not only on the validation set but also on the training set, an underfitting model will show a large error on both. This makes sense: if the model cannot predict the outcomes in the training set it was fitted on, it will not be able to predict the outcomes in the validation set it has never seen. Underfitting basically means your model is not working.
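
As a concrete illustration of this check, here is a minimal sketch using scikit-learn on a synthetic dataset (both the library and the data are assumptions for illustration, not part of Amazon Machine Learning itself): a plain linear model fitted to a nonlinear target shows a large error on the training set as well as on the validation set.

```python
# Minimal sketch: compare the prediction error on the training set with the
# error on the validation set. An underfitting model does poorly on both.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=500)  # nonlinear target

X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)

train_rmse = mean_squared_error(y_train, model.predict(X_train)) ** 0.5
val_rmse = mean_squared_error(y_val, model.predict(X_val)) ** 0.5

# Both numbers come out large relative to the noise level: the straight line
# cannot even reproduce the data it was trained on, the signature of underfitting.
print(f"training RMSE:   {train_rmse:.3f}")
print(f"validation RMSE: {val_rmse:.3f}")
```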

Common strategies to mitigate this problem include:

- Adding more informative features, or engineering new ones from the existing data (for example, with nonlinear transformations such as quantile binning); a sketch of this idea follows the list
- Reducing the amount of regularization applied during training
- Running more training passes over the data
- Switching to a more expressive model or algorithm

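The first strategy can be sketched in the same synthetic setting as before. The polynomial features used below are an illustrative stand-in for the kind of nonlinear feature engineering (such as quantile binning) that lets a linear learner fit a nonlinear target; the data and model choices are assumptions, not an example from the service itself.

```python
# A sketch of one mitigation strategy: giving a linear learner more
# expressive features. Polynomial features stand in here for nonlinear
# feature engineering such as quantile binning.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=500)  # nonlinear target
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

models = {
    "plain linear model": LinearRegression(),
    "linear model + degree-5 features": make_pipeline(
        PolynomialFeatures(degree=5), LinearRegression()
    ),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    train_rmse = mean_squared_error(y_train, model.predict(X_train)) ** 0.5
    val_rmse = mean_squared_error(y_val, model.predict(X_val)) ** 0.5
    # The richer feature set lowers the error on both sets, curing the underfit.
    print(f"{name}: train RMSE {train_rmse:.3f}, validation RMSE {val_rmse:.3f}")
```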