- Start a new project in IntelliJ or in an IDE of your choice. Make sure that the necessary JAR files are included.
- Set up the package location where the program will reside:
package spark.ml.cookbook.chapter4
- Import the necessary packages so that the SparkSession can access the cluster and we can work with LabeledPoint, vectors, and LogisticRegression:
import org.apache.spark.ml.feature.LabeledPoint
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.sql._
- Create Spark's configuration and a SparkSession so we can have access to the cluster:
val spark = SparkSession.builder
  .master("local[*]")
  .appName("myLabeledPoint")
  .config("spark.sql.warehouse.dir", ".")
  .getOrCreate()
- We create the LabeledPoint, using the SparseVector ...
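The remainder of this step is not shown above; as a rough sketch of what it typically looks like with the Spark 2.0 ML API, the following creates a handful of LabeledPoint rows from sparse (and one dense) vectors and fits a LogisticRegression model on them. The specific labels, vector values, and the maxIter setting below are illustrative assumptions, not the book's exact data:

// A minimal sketch (illustrative values): LabeledPoints built from
// Vectors.sparse(size, indices, values) and Vectors.dense(...)
val myLabeledPoints = spark.createDataFrame(Seq(
  LabeledPoint(1.0, Vectors.dense(0.0, 1.1, 0.1)),
  LabeledPoint(0.0, Vectors.sparse(3, Array(0, 2), Array(1.0, 3.0))),
  LabeledPoint(1.0, Vectors.sparse(3, Array(0, 1), Array(2.0, 1.2))),
  LabeledPoint(0.0, Vectors.sparse(3, Array(1, 2), Array(0.5, 1.5)))
))

// Fit a logistic regression model on the "label"/"features" columns
// produced by the LabeledPoint case class (maxIter value is assumed)
val lr = new LogisticRegression().setMaxIter(5)
val model = lr.fit(myLabeledPoints)
println("Model was fit using parameters: " + model.parent.extractParamMap())

Because LabeledPoint is a case class with label and features fields, createDataFrame produces exactly the column names that LogisticRegression expects by default, so no explicit column mapping is needed.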