Additive Gaussian Autoencoder with the MNIST dataset

First, we load the train and test datasets, X_train and X_test:

from tensorflow.examples.tutorials.mnist import input_data
import numpy as np

mnist = input_data.read_data_sets('MNIST_data', one_hot=True)

def get_random_block_from_data(data, batch_size):
    # draw a contiguous mini-batch starting at a random offset
    start_index = np.random.randint(0, len(data) - batch_size)
    return data[start_index:(start_index + batch_size)]

X_train = mnist.train.images
X_test = mnist.test.images
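As a quick illustration (not part of the excerpt), the helper returns a contiguous block of rows starting at a random offset, so a mini-batch of 128 flattened 28x28 digit images could be drawn with:

batch_X = get_random_block_from_data(X_train, 128)   # array of shape (128, 784)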

Define the variables for the number of samples, n_samples, the number of training epochs, training_epochs, the batch_size used in each training iteration, and display_step:

n_samples = int(mnist.train.num_examples)
training_epochs = 2
batch_size = 128
display_step = 1
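With these settings, one pass over the training set is split into roughly n_samples / batch_size mini-batches; the usual way to compute that count (not shown in the excerpt) is:

total_batch = int(n_samples / batch_size)   # about 429 mini-batches per epoch for the 55,000 MNIST training images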

Instantiate the autoencoder and the optimizer. The autoencoder has 200 hidden units and uses sigmoid as the transfer_function ...
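The excerpt ends here. As a rough sketch of what the instantiation and training loop might look like with these settings, assuming an AdditiveGaussianNoiseAutoencoder class that exposes partial_fit and calc_total_cost methods (the class name, its constructor arguments, and those method names are assumptions, not taken from the excerpt):

import tensorflow as tf

# assumed constructor: 784 inputs (28x28 MNIST), 200 hidden units,
# sigmoid transfer function, Adam optimizer, and a small Gaussian noise scale
autoencoder = AdditiveGaussianNoiseAutoencoder(
    n_input=784,
    n_hidden=200,
    transfer_function=tf.nn.sigmoid,
    optimizer=tf.train.AdamOptimizer(learning_rate=0.01),
    scale=0.01)

for epoch in range(training_epochs):
    avg_cost = 0.
    total_batch = int(n_samples / batch_size)
    for i in range(total_batch):
        batch_xs = get_random_block_from_data(X_train, batch_size)
        cost = autoencoder.partial_fit(batch_xs)   # one optimizer step, returns the batch cost
        avg_cost += cost / n_samples * batch_size
    if epoch % display_step == 0:
        print("Epoch:", '%04d' % (epoch + 1), "cost =", "{:.9f}".format(avg_cost))

print("Total cost on the test set:", autoencoder.calc_total_cost(X_test))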
