A quick example

If you're new to word embeddings, you might be feeling a little lost right now. Hang in there; it will become clearer in just a moment. Let's try a concrete example.

Using word2vec, a popular word-embedding model, we can start with the word cat and look up its 384-element vector, as shown in the following output:

array([ 5.81600726e-01,  3.07168198e+00,  3.73339128e+00,  2.83814788e-01,
        2.79787600e-01,  2.29124355e+00, -2.14855480e+00, -1.22236431e+00,
        2.20581269e+00,  1.81546474e+00,  2.06929898e+00, -2.71712840e-01,
        ...
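
If you want to reproduce something like this, here is a minimal sketch using the gensim library's Word2Vec implementation. The toy corpus and the vector_size=384 setting are assumptions made for illustration; the vector shown above came from a model whose training details aren't given here:

from gensim.models import Word2Vec

# Toy corpus, purely for illustration -- a real model needs far more text.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["dogs", "and", "cats", "are", "popular", "pets"],
]

# vector_size=384 matches the 384-element vectors described above.
model = Word2Vec(sentences, vector_size=384, min_count=1)

# Look up the embedding for "cat": a 384-element numpy array.
cat_vector = model.wv["cat"]
print(cat_vector.shape)   # (384,)
print(cat_vector[:6])     # first few elements only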

I've cut the output short, but you get the idea. Every word in this model is converted into a 384-element vector. These vectors can be compared to evaluate the semantic similarity of words in a dataset.
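
The standard way to compare two embeddings is cosine similarity, the cosine of the angle between the vectors: values near 1 mean the words appear in similar contexts. Continuing from the sketch above (the word pair is illustrative, and scores from a toy corpus won't be meaningful):

import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: near 1.0 for vectors
    # pointing the same way, near 0.0 for unrelated directions.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine_similarity(model.wv["cat"], model.wv["dogs"]))

# gensim exposes the same measure directly:
print(model.wv.similarity("cat", "dogs"))

With a model trained on a large, real corpus, a pair like cat and dog scores far higher than an unrelated pair like cat and car.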

