Cluster mode

In cluster mode, the client submits the application to the Spark master and exits. The Spark master then schedules the driver on one of the nodes in the cluster, and the process continues as described in the previous section. In this example, we will run the SparkPi application on a Spark standalone cluster in cluster mode.

To run a Spark application in cluster mode on the Spark standalone cluster, log in to any of the nodes in the cluster and run the following commands:

cd $SPARK_HOME
./bin/spark-submit --master spark://spark-master:7077 --deploy-mode cluster --class org.apache.spark.examples.SparkPi examples/jars/spark-examples_2.11-2.1.1.jar

Note that the Spark shell is not supported in cluster mode.
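Because the client exits immediately after submission, the driver's lifecycle has to be managed through the master. On a standalone cluster, spark-submit prints a driver ID when the driver is launched, and that ID can be passed back to spark-submit with the --status and --kill flags to inspect or stop the driver. The driver ID below is a placeholder; substitute the one printed for your submission:

./bin/spark-submit --master spark://spark-master:7077 --status driver-20170101123456-0000
./bin/spark-submit --master spark://spark-master:7077 --kill driver-20170101123456-0000

Alternatively, the state of running and completed drivers can be checked from the Spark master's web UI, which by default listens on port 8080.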
