Hyperparameter tuning and feature selection

You can improve accuracy by tuning hyperparameters such as the number of hidden layers, the number of neurons in each hidden layer, the number of epochs, and the activation function. The current implementation of the H2O-based deep learning model supports the following activation functions:

  • ExpRectifier
  • ExpRectifierWithDropout
  • Maxout
  • MaxoutWithDropout
  • Rectifier
  • RectifierWithDropout
  • Tanh
  • TanhWithDropout

For this project, I have tried only the Tanh activation function. However, you should definitely experiment with the others.
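As a minimal sketch of how such tuning might be organized, the snippet below enumerates a small grid of candidate hyperparameters (activation function, hidden-layer sizes, and epochs) in plain Scala. The candidate values and the `GridSketch` object are illustrative assumptions, not part of the project's code; in practice each tuple would be fed into H2O's deep learning parameters to train and score one model per configuration.

```scala
object GridSketch {
  // Candidate hyperparameter values (illustrative choices, not from the book)
  val activations: Seq[String] = Seq(
    "ExpRectifier", "ExpRectifierWithDropout",
    "Maxout", "MaxoutWithDropout",
    "Rectifier", "RectifierWithDropout",
    "Tanh", "TanhWithDropout")
  val hiddenLayers: Seq[Seq[Int]] = Seq(Seq(64, 64), Seq(128, 128, 128))
  val epochs: Seq[Int] = Seq(10, 50)

  // Cartesian product: each tuple is one model configuration to train and score
  val grid: Seq[(String, Seq[Int], Int)] =
    for {
      a <- activations
      h <- hiddenLayers
      e <- epochs
    } yield (a, h, e)

  def main(args: Array[String]): Unit = {
    println(s"${grid.size} configurations") // 8 * 2 * 2 = 32
    grid.take(3).foreach(println)
  }
}
```

Exhaustively training all combinations is expensive; a common compromise is to sample a random subset of the grid and keep the configuration with the best validation score.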

One of the biggest advantages of using H2O-based deep learning algorithms is that we can obtain the relative variable/feature importance. In previous chapters, we have ...
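Once the relative importances have been extracted from a trained model, ranking and inspecting them is straightforward. The sketch below assumes the scores have already been pulled into a plain Scala map; the feature names and values are entirely hypothetical, shown only to illustrate the ranking step.

```scala
object VarImpSketch {
  // Hypothetical relative importances (H2O scales the top feature to 1.0)
  val importances: Map[String, Double] = Map(
    "featureA" -> 1.00,
    "featureB" -> 0.72,
    "featureC" -> 0.55,
    "featureD" -> 0.31)

  // Rank features from most to least important
  val ranked: Seq[(String, Double)] = importances.toSeq.sortBy(-_._2)

  def main(args: Array[String]): Unit =
    ranked.foreach { case (name, imp) => println(f"$name%-10s $imp%.2f") }
}
```

A ranking like this can guide feature selection: low-importance features are candidates for removal, after which the model is retrained to check that accuracy does not degrade.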
