Bag of Words models

Bag of Words (BoW) models are NLP models that disregard sentence structure and word order. In a Bag of Words model, we treat each document as a bag of words, and it's easy to imagine exactly that: each document is a container that holds a big set of words. We ignore sentences, structure, and which words come first or last. We care that the document contains the words very, good, and bad, but we don't care that very comes before good rather than before bad.
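
To make this concrete, here is a minimal sketch in plain Python (using only the standard library, not any particular NLP toolkit) that turns two short documents into unordered bags of word counts:

```python
from collections import Counter

documents = [
    "the movie was very good",
    "the movie was very bad",
]

# Each document becomes an unordered bag (multiset) of word counts;
# word order is discarded entirely.
bags = [Counter(doc.lower().split()) for doc in documents]

for doc, bag in zip(documents, bags):
    print(doc, "->", dict(bag))
```

Both documents end up with nearly identical bags; the only information that separates them is the presence of good versus bad, not where those words appeared in the sentence.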

Bag of Words models are simple, require relatively little data, and work amazingly well considering their naivety.

Note that the use of model here means representation; I'm not referring to a deep learning model or machine learning model.
