Tuning the model hyperparameters

Now that we've trained an MLP and a six-layer deep neural network on the problem, we're ready to tune and optimize model hyperparameters.

We will discuss model tuning in depth in Chapter 6, Hyperparameter Optimization. There are a variety of strategies that you can use to choose the best parameters for your model. As you've probably noticed, there are many possible parameters and hyperparameters that we could still optimize.

If you wanted to fully tune this model, you would do the following:

  • Experiment with the number of hidden layers. It appears that five might be too many, and one might not be enough.
  • Experiment with the number of neurons in each hidden layer, relative to the number of layers (a parameterized sketch follows this list).
  • Experiment ...
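
To make the first two bullets concrete, the following is a minimal sketch of how the depth and width of the network can be exposed as hyperparameters and searched with a simple grid. It assumes a Keras regression model with a single linear output and MAE loss, similar to the networks built earlier in this chapter; the function name build_network, the data arrays (train_X, train_y, val_X, val_y), and the candidate values for hidden_layers and neurons are placeholders for illustration, not part of the original example.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Placeholder data; substitute the training and validation sets
# prepared earlier in the chapter.
train_X = np.random.rand(500, 10)
train_y = np.random.rand(500)
val_X = np.random.rand(100, 10)
val_y = np.random.rand(100)

def build_network(input_dim, hidden_layers=3, neurons=32):
    """Build an MLP whose depth and width are passed in as hyperparameters."""
    model = Sequential()
    # The first hidden layer needs to know the input dimension
    model.add(Dense(neurons, activation='relu', input_dim=input_dim))
    # Remaining hidden layers
    for _ in range(hidden_layers - 1):
        model.add(Dense(neurons, activation='relu'))
    # Single linear output for a regression problem
    model.add(Dense(1, activation='linear'))
    model.compile(optimizer='adam', loss='mean_absolute_error')
    return model

# Simple grid over depth and width, scored by the best validation loss seen.
results = {}
for hidden_layers in [1, 2, 3, 4, 5]:
    for neurons in [16, 32, 64, 128]:
        model = build_network(train_X.shape[1], hidden_layers, neurons)
        history = model.fit(train_X, train_y,
                            validation_data=(val_X, val_y),
                            epochs=20, batch_size=32, verbose=0)
        results[(hidden_layers, neurons)] = min(history.history['val_loss'])

best = min(results, key=results.get)
print("Best (hidden_layers, neurons):", best, "val loss:", results[best])
```

An exhaustive grid like this gets expensive quickly as the number of hyperparameters grows, which is part of why Chapter 6, Hyperparameter Optimization, covers more efficient search strategies.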
