There's more...

Prior to Spark 2.0, the SparkContext and SQLContext had to be initialized separately. Refer to the following code snippet if you plan to run the code on an earlier version of Spark.

Set up the application parameters so Spark can run (using Spark 1.5.2 or Spark 1.6.1):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Configure the application, start the SparkContext, and wrap it in a SQLContext
val conf = new SparkConf().setMaster("local[*]").setAppName("myVectorMatrix").setSparkHome("C:\\spark-1.5.2-bin-hadoop2.6")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
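
For comparison, in Spark 2.x this setup collapses into a single SparkSession, which the recipes in this book use. A minimal sketch (the master and application name here are only illustrative) looks like the following:

import org.apache.spark.sql.SparkSession

// Spark 2.x: one SparkSession replaces the separate SparkContext/SQLContext
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("myVectorMatrix")
  .getOrCreate()

// The underlying SparkContext is still available when needed
val sc = spark.sparkContext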
