Hyperparameter tuning and feature selection

The flexibility of neural networks is also one of their main drawbacks: there are many hyperparameters to tweak. Even in a simple MLP, you can change the number of layers, the number of neurons per layer, the type of activation function to use in each layer, the number of training epochs, the learning rate, the weight initialization logic, the dropout keep probability, and so on. How do you know which combination of hyperparameters is best for your task?
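To make this concrete, here is a minimal sketch of how those knobs might be wired up in Scala with Deeplearning4j (DL4J); the input/output sizes, learning rate, and keep probability are placeholder values chosen for illustration, not recommendations:

    import org.deeplearning4j.nn.conf.NeuralNetConfiguration
    import org.deeplearning4j.nn.conf.layers.{DenseLayer, OutputLayer}
    import org.deeplearning4j.nn.multilayer.MultiLayerNetwork
    import org.deeplearning4j.nn.weights.WeightInit
    import org.nd4j.linalg.activations.Activation
    import org.nd4j.linalg.learning.config.Adam
    import org.nd4j.linalg.lossfunctions.LossFunctions

    // Hypothetical sizes: 10 input features, 3 output classes.
    val numInputs = 10
    val numClasses = 3

    val conf = new NeuralNetConfiguration.Builder()
      .seed(12345)                       // reproducible weight initialization
      .weightInit(WeightInit.XAVIER)     // weight initialization logic
      .updater(new Adam(0.001))          // learning rate, via the Adam updater
      .list()
      .layer(0, new DenseLayer.Builder()
        .nIn(numInputs).nOut(64)         // number of neurons in this layer
        .activation(Activation.RELU)     // per-layer activation function
        .dropOut(0.8)                    // keep probability (DL4J convention)
        .build())
      .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
        .nIn(64).nOut(numClasses)
        .activation(Activation.SOFTMAX)
        .build())
      .build()

    val model = new MultiLayerNetwork(conf)
    model.init()

Every one of those builder calls is a hyperparameter you have to choose, and many of them interact with one another, which is what makes the search space so large.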

Of course, you can use grid search with cross-validation to find the right hyperparameters, just as you would for simpler machine learning models, but a deep learning model has far more hyperparameters to tune. And since training a neural network on a large dataset takes a lot of time, you will only be able to explore a tiny fraction of the hyperparameter space in a reasonable amount of time.
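As a point of reference, here is what a small grid search with cross-validation can look like in Scala, sketched with Spark ML's MultilayerPerceptronClassifier; the training DataFrame, column names, feature count (10), class count (3), and grid values are all hypothetical:

    import org.apache.spark.ml.classification.MultilayerPerceptronClassifier
    import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator
    import org.apache.spark.ml.tuning.{CrossValidator, ParamGridBuilder}
    import org.apache.spark.sql.DataFrame

    // "training" is a hypothetical DataFrame with "features" and "label" columns.
    def tuneMlp(training: DataFrame): Double = {
      val mlp = new MultilayerPerceptronClassifier()
        .setFeaturesCol("features")
        .setLabelCol("label")
        .setSeed(1234L)

      // First layer size must match the feature count, last the class count.
      // 2 architectures x 2 epoch budgets = 4 candidate models.
      val paramGrid = new ParamGridBuilder()
        .addGrid(mlp.layers, Array(Array(10, 32, 3), Array(10, 64, 32, 3)))
        .addGrid(mlp.maxIter, Array(50, 100))
        .build()

      val evaluator = new MulticlassClassificationEvaluator()
        .setMetricName("accuracy")

      // 3-fold cross-validation: every candidate is trained three times.
      val cv = new CrossValidator()
        .setEstimator(mlp)
        .setEvaluator(evaluator)
        .setEstimatorParamMaps(paramGrid)
        .setNumFolds(3)

      cv.fit(training).avgMetrics.max  // best average cross-validated accuracy
    }

Even this toy grid trains 4 x 3 = 12 models, and every additional grid dimension multiplies the number of candidates. That multiplicative blow-up is exactly why exhaustive grid search becomes impractical for deep networks, and why randomized search over the same ranges is often preferred in practice.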
