How it works...

We skip the data ingestion and parsing steps, since they are similar to previous recipes. What is different is how we set up the parameters, especially passing "Classification" as the algorithm argument to BoostingStrategy.defaultParams():

val algo = "Classification"
val numIterations = 3
val numClasses = 2
val maxDepth = 5
val maxBins = 32
val categoricalFeatureInfo = Map[Int,Int]()

val boostingStrategy = BoostingStrategy.defaultParams(algo)
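The training step itself is not reproduced here, but for reference, a minimal sketch of how values like these are typically wired into the strategy and used to train a model with Spark MLlib's RDD-based GradientBoostedTrees API is shown below. It assumes trainingData is the RDD[LabeledPoint] produced by the parsing step we skipped; the exact wiring in the recipe's own code may differ.

import org.apache.spark.mllib.tree.GradientBoostedTrees

// Apply the values defined above to the default "Classification" strategy
boostingStrategy.numIterations = numIterations
boostingStrategy.treeStrategy.numClasses = numClasses
boostingStrategy.treeStrategy.maxDepth = maxDepth
boostingStrategy.treeStrategy.maxBins = maxBins
boostingStrategy.treeStrategy.categoricalFeaturesInfo = categoricalFeatureInfo

// Train the gradient-boosted trees model on the training split
val model = GradientBoostedTrees.train(trainingData, boostingStrategy)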

We also use the evaluate() function to assess the model produced by these parameters, looking at the impurity and the confusion matrix:

evaluate(trainingData, testData, boostingStrategy)
Confusion Matrix :
124.0  2.0
2.0    64.0
Model Accuracy: 0.9791666666666666
Model Error: 0.02083333333333337
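The reported accuracy follows directly from the confusion matrix: (124 + 64) correct predictions out of 192 test points ≈ 0.9792, and the error is simply 1 minus the accuracy. The evaluate() helper is part of the recipe's code rather than the Spark API; a minimal sketch of a helper with this signature, assuming Spark MLlib's MulticlassMetrics for the confusion matrix and accuracy (the recipe's own version may additionally vary the impurity setting), might look like this:

import org.apache.spark.mllib.evaluation.MulticlassMetrics
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.tree.GradientBoostedTrees
import org.apache.spark.mllib.tree.configuration.BoostingStrategy
import org.apache.spark.rdd.RDD

// Hypothetical sketch: train a GBT model with the given strategy and report
// the confusion matrix, accuracy, and error on the held-out test set.
def evaluate(trainingData: RDD[LabeledPoint],
             testData: RDD[LabeledPoint],
             boostingStrategy: BoostingStrategy): Unit = {

  val model = GradientBoostedTrees.train(trainingData, boostingStrategy)

  // Pair each prediction with its true label
  val predictionsAndLabels =
    testData.map(lp => (model.predict(lp.features), lp.label))

  val metrics = new MulticlassMetrics(predictionsAndLabels)
  println("Confusion Matrix :" + metrics.confusionMatrix)
  println("Model Accuracy: " + metrics.accuracy)
  println("Model Error: " + (1.0 - metrics.accuracy))
}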
