Parameter tuning using RandomizedSearch

An alternative approach was proposed by Bergstra and Bengio (http://www.jmlr.org/papers/volume13/bergstra12a/bergstra12a.pdf) in 2012. They demonstrated that a random search over a large hyperparameter space is more effective than a manual search, such as the one we did for Multinomial Naive Bayes, and is often as effective as, or better than, GridSearch.

How do we use it here?

Here, we will build on results such as those of Bergstra and Bengio. We will break down our parameter search into the following two steps:

  1. Use RandomizedSearch to explore a wide space of parameter combinations in a limited number of iterations
  2. Use the results from step 1 to run GridSearch over a narrower space (see the sketch after this list)
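
To make this concrete, here is a minimal sketch of the two-step search using scikit-learn's RandomizedSearchCV and GridSearchCV. The 20 newsgroups data, the TfidfVectorizer + MultinomialNB pipeline, and the parameter ranges below are illustrative assumptions, not the exact setup used in this chapter.

```python
# A minimal sketch of the two-step search, assuming a scikit-learn text
# classification pipeline. The dataset, pipeline, and parameter ranges are
# illustrative placeholders.
import numpy as np
from scipy.stats import loguniform
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline

data = fetch_20newsgroups(subset="train", categories=["sci.space", "rec.autos"])

pipeline = Pipeline([
    ("tfidf", TfidfVectorizer()),
    ("clf", MultinomialNB()),
])

# Step 1: RandomizedSearch across a wide space, capped at a small number of iterations
wide_space = {
    "tfidf__ngram_range": [(1, 1), (1, 2), (1, 3)],
    "tfidf__min_df": [1, 2, 5, 10],
    "clf__alpha": loguniform(1e-3, 1e0),  # sampled from a distribution, not enumerated
}
random_search = RandomizedSearchCV(
    pipeline, wide_space, n_iter=20, cv=3, n_jobs=-1, random_state=42
)
random_search.fit(data.data, data.target)
print("Best (random):", random_search.best_score_, random_search.best_params_)

# Step 2: GridSearch over a narrower space centred on the best values from step 1
best = random_search.best_params_
narrow_space = {
    "tfidf__ngram_range": [best["tfidf__ngram_range"]],
    "tfidf__min_df": [best["tfidf__min_df"]],
    "clf__alpha": np.linspace(best["clf__alpha"] / 3, best["clf__alpha"] * 3, 5),
}
grid_search = GridSearchCV(pipeline, narrow_space, cv=3, n_jobs=-1)
grid_search.fit(data.data, data.target)
print("Best (grid):", grid_search.best_score_, grid_search.best_params_)
```

The design choice here is one of budget: step 1 spends a fixed number of fits (n_iter) sampling broadly, while step 2 only enumerates a small grid around the best values found, so the total number of model fits stays manageable.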

We can repeat these two steps, narrowing the search space each time.
