
Apache Spark for Data Science Cookbook by Padma Priya Chitturi


Creating a SparkR standalone application from RStudio

In this recipe, we'll look at the process of writing and executing a standalone application in SparkR.

Getting ready

To step through this recipe, you will need a running Spark cluster, either in pseudo-distributed mode or in one of the distributed modes (standalone, YARN, or Mesos). Also, install RStudio. Please refer to the Installing R recipe for details on installing R.

How to do it…

In this recipe, we'll create a standalone application using Spark-1.6.0 and Spark-2.0.2:

  1. Before working with SparkR, make sure that SPARK_HOME is set in the environment, as follows (a sketch of how the application typically continues is shown after this step):

       if (nchar(Sys.getenv("SPARK_HOME")) < 1) {
         Sys.setenv(SPARK_HOME = "/home/padmac/bigdata/spark-1.6.0-bin-hadoop2.6")
       }
       ...
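
As a minimal sketch of how such a standalone application typically continues with Spark 1.6.0, assuming the stock SparkR API bundled with that release (the book's exact code may differ, and the application name used here is illustrative), the SparkR package is loaded from SPARK_HOME, a context is initialized, a DataFrame is created, and the context is stopped:

       # Load the SparkR package bundled under SPARK_HOME
       library(SparkR, lib.loc = c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib")))

       # Spark 1.6.0: create a SparkContext and an SQLContext
       sc <- sparkR.init(master = "local[*]", appName = "SparkRStandaloneApp")
       sqlContext <- sparkRSQL.init(sc)

       # Build a SparkR DataFrame from R's built-in faithful dataset and inspect it
       df <- createDataFrame(sqlContext, faithful)
       head(df)

       # Shut down the context when the application finishes
       sparkR.stop()

With Spark 2.0.2, the separate contexts are replaced by a single session: sparkR.session(appName = "SparkRStandaloneApp") starts it, createDataFrame(faithful) no longer takes an sqlContext argument, and sparkR.session.stop() ends it.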
