
Deep Learning Essentials by Jianing Wei, Anurag Bhardwaj, Wei Di


Training a Word2Vec model using TensorFlow

In this section, we will explain step by step how to build and train a Skip-Gram model using TensorFlow. For a detailed tutorial and source code, please refer to https://www.tensorflow.org/tutorials/word2vec:

  1. We can download the dataset from http://mattmahoney.net/dc/text8.zip.
  2. We read in the content of the file as a list of words.
  3. We set up the TensorFlow graph. We create placeholders for the input words and the context words, which are represented as integer indices to the vocabulary:
train_inputs = tf.placeholder(tf.int32, shape=[batch_size])
train_labels = tf.placeholder(tf.int32, shape=[batch_size, 1])

Note that we train in batches, so batch_size is the number of training examples processed per step. We also create a ...
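The steps above can be sketched in plain Python, independent of the TensorFlow graph. This is a minimal, simplified version of the data-preparation and batching logic; the function names build_dataset and generate_batch are illustrative (the official tutorial uses a similar but more elaborate scheme with random sampling of context words), and the output shapes match the train_inputs and train_labels placeholders:

```python
import collections

def build_dataset(words, vocab_size):
    """Map the most common words to integer indices; rare words map to 0 (UNK)."""
    counts = [("UNK", 0)] + collections.Counter(words).most_common(vocab_size - 1)
    dictionary = {word: i for i, (word, _) in enumerate(counts)}
    data = [dictionary.get(w, 0) for w in words]
    return data, dictionary

def generate_batch(data, batch_size, skip_window):
    """Yield (inputs, labels) batches for the Skip-Gram model.

    Each input is a center-word index; each label is one context-word index
    drawn from within skip_window positions of the center. Labels are wrapped
    in single-element lists to match the [batch_size, 1] placeholder shape.
    """
    inputs, labels = [], []
    for center in range(skip_window, len(data) - skip_window):
        contexts = (list(range(center - skip_window, center)) +
                    list(range(center + 1, center + skip_window + 1)))
        for ctx in contexts:
            inputs.append(data[center])
            labels.append([data[ctx]])
            if len(inputs) == batch_size:
                yield inputs, labels
                inputs, labels = [], []
```

For example, with the toy corpus "the quick brown fox" and skip_window=1, each interior word is paired with its immediate left and right neighbors, which is exactly the (input word, context word) pair the Skip-Gram objective trains on.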
