GAN – code example

In the following example, we build and train a GAN on the MNIST dataset using TensorFlow. Here, we will use a variant of the ReLU activation function known as Leaky ReLU. The output is a set of newly generated handwritten digits:

Leaky ReLU is a variation of the ReLU activation function given by the formula f(x) = max(αx, x). So, for a small 0 < α < 1, the output for a negative input x is αx, and the output for a positive x is x itself.
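Before wiring Leaky ReLU into the network, here is a minimal NumPy sketch of that formula (the function name and the choice of alpha below are illustrative, not taken from the book's code):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # f(x) = max(alpha * x, x): positives pass through unchanged,
    # negatives are scaled down by alpha instead of being zeroed out
    return np.maximum(alpha * x, x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))
```

Unlike plain ReLU, the negative inputs keep a small non-zero gradient (alpha), which helps avoid "dead" units during GAN training.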
# import all necessary libraries and load the dataset
%matplotlib inline
import pickle as pkl
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets('MNIST_data')

In order to build this network, we ...
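The excerpt is cut off at this point. As a rough, framework-free sketch of the objective such a network trains toward (the helper name `bce` and the sample probabilities below are invented for illustration, not the book's code): the discriminator minimizes binary cross-entropy, labeling real digits 1 and generated digits 0, while the generator tries to make the discriminator label its fakes as real.

```python
import numpy as np

def bce(y_true, p):
    # binary cross-entropy, averaged over the batch
    eps = 1e-12  # guard against log(0)
    return -np.mean(y_true * np.log(p + eps) + (1 - y_true) * np.log(1 - p + eps))

# Hypothetical discriminator outputs: probability that each input is real
d_real = np.array([0.9, 0.8])   # scores on real MNIST digits
d_fake = np.array([0.2, 0.1])   # scores on generated digits

# Discriminator: real samples should score 1, fakes should score 0
d_loss = bce(np.ones(2), d_real) + bce(np.zeros(2), d_fake)
# Generator: wants its fakes to be scored as real (label 1)
g_loss = bce(np.ones(2), d_fake)
```

With these illustrative numbers the generator's loss comes out much larger than the discriminator's, reflecting a discriminator that currently separates real from fake well; training alternates updates to the two networks to keep this contest balanced.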
