Putting it all together

Now that we've covered the individual pieces, let's take a look at our overall network. This looks similar to the models we've previously covered in the book. However, we're using the loss function categorical_crossentropy, which we covered in the Cost function section of this chapter.
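Since the output layer is a 10-class softmax, categorical cross-entropy averages -Σ y_i log(ŷ_i) over the samples in a batch. A minimal NumPy sketch of the computation (the function name and epsilon clipping here are illustrative, not Keras's actual implementation):

```python
import numpy as np

def categorical_crossentropy(y_true, y_pred, eps=1e-12):
    # Clip predictions away from 0 so the log is finite,
    # then average -sum(y_true * log(y_pred)) over the batch.
    y_pred = np.clip(y_pred, eps, 1.0)
    return float(-np.sum(y_true * np.log(y_pred), axis=-1).mean())

y_true = np.array([[0., 1., 0.]])   # one-hot label: class 1
y_pred = np.array([[0.1, 0.8, 0.1]])
loss = categorical_crossentropy(y_true, y_pred)  # -log(0.8) ≈ 0.223
```

Because the labels are one-hot, only the log-probability assigned to the true class contributes to the loss.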

We will define our network using the following code:

from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

def build_network(input_features=None):
    # First, we specify an input layer with a shape equal to the number of features
    inputs = Input(shape=(input_features,), name="input")
    x = Dense(512, activation='relu', name="hidden1")(inputs)
    x = Dense(256, activation='relu', name="hidden2")(x)
    x = Dense(128, activation='relu', name="hidden3")(x)
    # A 10-unit softmax output, one unit per class
    prediction = Dense(10, activation='softmax', name="output")(x)
    model = Model(inputs=inputs, outputs=prediction)
    # Compile with the categorical_crossentropy loss discussed above;
    # adam is one reasonable optimizer choice here
    model.compile(optimizer='adam', loss='categorical_crossentropy',
                  metrics=['accuracy'])
    return model
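As a quick sanity check on this architecture, we can tally its trainable parameters by hand: each Dense layer contributes in_units * out_units weights plus out_units biases. The 784-feature input used below is an assumption (an MNIST-style dataset); the layer widths come from the code:

```python
# (fan_in, fan_out) for each Dense layer in the stack,
# assuming 784 input features (e.g. 28x28 images flattened)
layers = [(784, 512), (512, 256), (256, 128), (128, 10)]

# weights (n_in * n_out) plus biases (n_out) per layer
params = sum(n_in * n_out + n_out for n_in, n_out in layers)
print(params)  # 567434 trainable parameters for this stack
```

Most of those parameters sit in the first hidden layer, which is typical for fully connected networks on image-sized inputs.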
