Performing tokenization

Tokenization is the process of splitting text into tokens. Depending on the tokenizer, these tokens can be paragraphs, sentences, or individual words, with word-level tokenization being the most common. NLTK ships with a number of tokenizers, several of which are demonstrated in this recipe.
