Loading pretrained word vectors

As I have just mentioned, I'm going to use a Keras embedding layer. For the second version of the model, we will initialize the weights of the embedding layer with the GloVe word vectors we covered earlier in the chapter. To do so, we will need to load those weights from disk and arrange them in a 2D matrix that the layer can use as its weights. We will cover that operation here.
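The sketch below shows roughly what that operation looks like, assuming the GloVe vectors have already been parsed into a dictionary mapping each word to a NumPy array (parsing is sketched after the next paragraph) and that a Keras `Tokenizer` has been fit on the training text to give us a `word_index`. The function and variable names here are illustrative, and the import path may differ depending on whether you use standalone Keras or `tf.keras`:

```python
import numpy as np
from tensorflow.keras.layers import Embedding

def build_embedding_matrix(word_index, glove_vectors, embedding_dim=100):
    # Row 0 is reserved for the padding index; any word that is missing
    # from GloVe simply keeps an all-zero row.
    embedding_matrix = np.zeros((len(word_index) + 1, embedding_dim))
    for word, i in word_index.items():
        vector = glove_vectors.get(word)
        if vector is not None:
            embedding_matrix[i] = vector
    return embedding_matrix

# embedding_matrix = build_embedding_matrix(tokenizer.word_index, glove_vectors)
# embedding_layer = Embedding(input_dim=embedding_matrix.shape[0],
#                             output_dim=embedding_matrix.shape[1],
#                             weights=[embedding_matrix],
#                             trainable=False)
```

Setting `trainable=False` keeps the pretrained vectors frozen; you can leave it `True` if you want to fine-tune them along with the rest of the network.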

When you download the GloVe vectors, you'll see several text files in the directory where you unzipped the download. Each file corresponds to a different number of dimensions; in all cases, however, the vectors were trained on the same common corpus of 6 billion tokens (hence the 6B in the file names, such as glove.6B.100d.txt). Each file simply encodes that vocabulary at a different dimensionality.
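Each line in those files is a word followed by its vector components, separated by spaces, so parsing one into a dictionary is straightforward. Here is a minimal sketch, assuming the 100-dimensional file glove.6B.100d.txt sits in a local glove directory (adjust the path and dimension to whichever file you choose; the function name is just illustrative):

```python
import numpy as np

def load_glove_vectors(path="glove/glove.6B.100d.txt"):
    glove_vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            # word, then the vector components, all space-separated
            parts = line.rstrip().split(" ")
            word = parts[0]
            glove_vectors[word] = np.asarray(parts[1:], dtype="float32")
    return glove_vectors

# glove_vectors = load_glove_vectors()
```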
