- Start a new project in IntelliJ or in an IDE of your choice. Make sure that the necessary JAR files are included.
- Set up the package location where the program will reside:
package spark.ml.cookbook.chapter4
- Import the necessary packages for SparkContext to get access to the cluster:
import org.apache.spark.{SparkConf, SparkContext}
- Create the Spark configuration and a SparkContext so that we can access the cluster:
val conf = new SparkConf()
  .setAppName("MyAccessSparkClusterPre20")
  .setMaster("local[4]") // for a cluster: setMaster("spark://MasterHostIP:7077")
  .set("spark.sql.warehouse.dir", ".")
val sc = new SparkContext(conf)
The preceding code uses the setMaster() function to set the cluster master location. As you ...
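The steps above can be assembled into a single runnable sketch. The object name `AccessClusterSketch` and the summation job are illustrative additions for a quick sanity check; `local[4]` is assumed for local testing, as in the recipe:

```scala
package spark.ml.cookbook.chapter4

import org.apache.spark.{SparkConf, SparkContext}

object AccessClusterSketch {
  def main(args: Array[String]): Unit = {
    // Build the configuration: app name, master URL, and warehouse directory
    val conf = new SparkConf()
      .setAppName("MyAccessSparkClusterPre20")
      .setMaster("local[4]") // for a cluster: setMaster("spark://MasterHostIP:7077")
      .set("spark.sql.warehouse.dir", ".")
    val sc = new SparkContext(conf)

    // Sanity check: distribute a range across the 4 local cores and sum it
    val sum = sc.parallelize(1 to 100).reduce(_ + _)
    println(s"Sum of 1..100 = $sum") // prints 5050

    sc.stop() // release cluster resources when done
  }
}
```

Running this with the Spark JARs on the classpath confirms that the SparkContext can reach the (local) cluster before you move on to the ML recipes.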