Data loaders with torchtext

Writing good data loaders is the most tedious part of most deep learning applications. This step often combines the preprocessing, text cleaning, and vectorization tasks that we saw earlier.

Additionally, it wraps our static data objects into iterators or generators. This is incredibly helpful for processing datasets much larger than GPU memory—which is quite often the case. It works by splitting the data into batches of batch-size samples, each of which fits in GPU memory.
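The core idea—wrapping static data in a generator that yields one batch at a time—can be sketched without torchtext. The helper below is illustrative only (the names `batch_iter` and `examples` are not torchtext's API); it shows why the full dataset never needs to fit in GPU memory at once:

```python
def batch_iter(examples, batch_size):
    """Yield successive slices of `examples`, one batch at a time.

    Only a single batch needs to be materialized (and moved to the
    GPU) at any moment, so the full dataset can stay on disk or in
    host RAM.
    """
    for start in range(0, len(examples), batch_size):
        yield examples[start:start + batch_size]


# Stand-in for 100 vectorized samples; in practice each element would
# be a tensor of token IDs.
data = list(range(100))

batches = list(batch_iter(data, batch_size=32))
print(len(batches))        # 4 batches: 32 + 32 + 32 + 4
print(len(batches[-1]))    # the last batch holds the remainder
```

torchtext's iterators add conveniences on top of this pattern, such as sorting examples by length and padding each batch, but the underlying mechanism is the same lazy slicing shown here.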

Batch sizes are often powers of 2, such as 32, 64, 512, and so on. This convention exists because it helps with vector operations at the instruction-set level. Anecdotally, using a batch size that's different from a power of 2 has not ...
