Prior to Spark 2.0, the SparkContext and SQLContext had to be initialized separately. Refer to the following code snippet if you plan to run the code in previous versions of Spark.
Set up the application parameters so Spark can run (using Spark 1.5.2 or Spark 1.6.1):
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Configure the application; setSparkHome should point at your local Spark installation
val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("myVectorMatrix")
  .setSparkHome("C:\\spark-1.5.2-bin-hadoop2.6")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)