
R Machine Learning Essentials by Michele Usuelli


Tuning features and parameters together

In the previous two sections, we identified the best k using all the features (n=37). Then, using the optimal k, we identified the best n. What if the algorithm performs better with k=30 and n=25? We haven't explored that combination, nor many other options, so there might be a combination performing better than k=27 and n=15.

In order to identify the best option, the simplest approach is to test all the alternatives. However, if there are too many possible combinations of parameter values, we don't have enough computational power to test them all. In that case, we can identify the optimal parameters using optimization algorithms such as gradient descent.
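When the grid of candidate values is small, testing every combination amounts to a simple double loop over k and n. The following is a minimal sketch of such a grid search, assuming a training and a validation data frame (dfTrain and dfValid), a vector of feature names ranked by importance (arrayFeatures), and an outcome column called output; these names and the candidate sequences are hypothetical, and the example uses knn from the class package rather than the exact setup of the previous sections.

library(class)

# accuracy of a KNN model using the top-n features and k neighbours
evaluateKnn <- function(k, n, dfTrain, dfValid, features, target) {
  cols <- features[1:n]
  pred <- knn(
    train = dfTrain[, cols, drop = FALSE],
    test  = dfValid[, cols, drop = FALSE],
    cl    = dfTrain[[target]],
    k     = k
  )
  mean(pred == dfValid[[target]])
}

# all the combinations of k and n to test (placeholder ranges)
paramGrid <- expand.grid(k = seq(5, 50, by = 5), n = seq(5, 37, by = 4))

# evaluate each combination and keep the best one
paramGrid$accuracy <- mapply(
  function(k, n) evaluateKnn(k, n, dfTrain, dfValid, arrayFeatures, "output"),
  paramGrid$k, paramGrid$n
)
paramGrid[which.max(paramGrid$accuracy), ]

The row of paramGrid with the highest accuracy gives the (k, n) pair to use; for a larger search space, only the candidate sequences in expand.grid need to change.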

Fortunately, in our case, we are ...
