The following steps perform the training. The code is largely self-explanatory and similar to what we have already used in our previous examples. We apply softmax to the logits to obtain predicted class probabilities, then compare the predictions with the true classes:
y_pred = tf.nn.softmax(layer_fc2)
y_pred_cls = tf.argmax(y_pred, axis=1)
cross_entropy = tf.nn.softmax_cross_entropy_with_logits_v2(logits=layer_fc2, labels=y_true)
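To make the preceding ops concrete, here is a small NumPy sketch of what softmax, argmax, and the cross-entropy loss compute for a toy batch. The logits and one-hot labels below are made-up illustration values, not from the model:

```python
import numpy as np

def softmax(logits):
    # Subtract the per-row max for numerical stability.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical batch of 2 examples with 3 classes.
logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
y_true = np.array([[1, 0, 0],
                   [0, 1, 0]])  # one-hot true labels

probs = softmax(logits)
pred_cls = probs.argmax(axis=1)                  # like tf.argmax(y_pred, axis=1)
cross_entropy = -(y_true * np.log(probs)).sum(axis=1)

print(pred_cls)  # [0 1]
```

In TensorFlow, `softmax_cross_entropy_with_logits_v2` fuses the softmax and the log-loss into one numerically stable op, which is why it takes the raw logits (`layer_fc2`) rather than `y_pred`.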
We define the cost function and the optimizer (the Adam optimizer in this case), and then compute the accuracy:
cost_op = tf.reduce_mean(cross_entropy)
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost_op)
correct_prediction = tf.equal(y_pred_cls, y_true_cls)
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
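The accuracy op simply averages a vector of booleans. A minimal NumPy sketch of the same computation, using made-up predicted and true class indices:

```python
import numpy as np

# Hypothetical predicted and true class indices for a batch of 4.
pred_cls = np.array([0, 1, 2, 1])
true_cls = np.array([0, 1, 1, 1])

correct = (pred_cls == true_cls)              # like tf.equal(y_pred_cls, y_true_cls)
accuracy = correct.astype(np.float32).mean()  # like tf.reduce_mean(tf.cast(..., tf.float32))

print(accuracy)  # 0.75
```

The cast from boolean to float is what turns the element-wise comparison into a fraction of correct predictions.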