DBNs with two RBM layers

In this section, we will create a DBN with two RBM layers and run it on the MNIST dataset. We will modify the input parameters of the DeepBeliefNetwork(..) class:

name = 'dbn'
rbm_layers = [256, 256]
finetune_act_func = 'relu'
do_pretrain = True
rbm_learning_rate = [0.001, 0.001]
rbm_num_epochs = [5, 5]
rbm_gibbs_k = [1, 1]
rbm_stddev = 0.1
rbm_gauss_visible = False
momentum = 0.5
rbm_batch_size = [32, 32]
finetune_learning_rate = 0.01
finetune_num_epochs = 1
finetune_batch_size = 32
finetune_opt = 'momentum'
finetune_loss_func = 'softmax_cross_entropy'
finetune_dropout = 1
finetune_act_func = tf.nn.sigmoid
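The rbm_learning_rate and rbm_gibbs_k parameters govern the contrastive-divergence (CD-k) pre-training of each RBM. As a rough illustration of what one such update does, here is a minimal NumPy sketch of a single CD-k step on a toy batch; the names and shapes are illustrative assumptions, not the internals of the DeepBeliefNetwork class:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd_step(v0, W, b_h, b_v, lr=0.001, k=1):
    """One contrastive-divergence update; k plays the role of rbm_gibbs_k (k >= 1)."""
    # Positive phase: hidden probabilities given the data batch.
    h_prob = sigmoid(v0 @ W + b_h)
    pos_grad = v0.T @ h_prob
    # Negative phase: run the Gibbs chain for k steps from a hidden sample.
    h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
    for _ in range(k):
        v_prob = sigmoid(h_sample @ W.T + b_v)
        h_prob_neg = sigmoid(v_prob @ W + b_h)
        h_sample = (rng.random(h_prob_neg.shape) < h_prob_neg).astype(float)
    neg_grad = v_prob.T @ h_prob_neg
    # Gradient ascent on the (approximate) log-likelihood, averaged over the batch.
    n = v0.shape[0]
    W += lr * (pos_grad - neg_grad) / n
    b_h += lr * (h_prob - h_prob_neg).mean(axis=0)
    b_v += lr * (v0 - v_prob).mean(axis=0)
    return W, b_h, b_v

# Toy binary data standing in for an MNIST batch: 32 examples, 784 -> 256 units.
v = (rng.random((32, 784)) < 0.5).astype(float)
W = rng.normal(0.0, 0.1, size=(784, 256))  # initial weight scale, as in rbm_stddev = 0.1
b_h = np.zeros(256)
b_v = np.zeros(784)
W, b_h, b_v = cd_step(v, W, b_h, b_v, lr=0.001, k=1)
```

In a two-layer DBN, this update would run for the first RBM on the raw inputs, after which the first layer's hidden activations become the visible data for the second RBM.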

Notice that some of the parameters take a list with two elements, so that a separate value can be specified for each of the two RBM layers:

  • rbm_layers
  • rbm_learning_rate
  • rbm_num_epochs
  • rbm_gibbs_k
  • rbm_batch_size
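These per-layer lists line up element-by-element: index 0 configures the first RBM and index 1 the second. A small sketch of how such lists pair up per layer (the variable names follow the parameters above; the loop itself is illustrative, not the library's code):

```python
# Per-layer parameter lists, as set for the two-RBM DBN above.
rbm_layers = [256, 256]
rbm_learning_rate = [0.001, 0.001]
rbm_num_epochs = [5, 5]
rbm_gibbs_k = [1, 1]
rbm_batch_size = [32, 32]

# Each RBM layer gets the i-th entry of every list.
for i, (units, lr, epochs, k, batch) in enumerate(
        zip(rbm_layers, rbm_learning_rate, rbm_num_epochs,
            rbm_gibbs_k, rbm_batch_size)):
    print(f"RBM layer {i}: {units} hidden units, lr={lr}, "
          f"{epochs} epochs, CD-{k}, batch size {batch}")
```

If the lists had unequal lengths, layers beyond the shortest list would silently get no configuration, so all five lists must have one entry per RBM layer.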
