How to do it...

  1. Start a new project in IntelliJ or in an IDE of your choice. Make sure that the necessary JAR files are included.
  2. Set up the package location where the program will reside:
package spark.ml.cookbook.chapter13
  3. Import the necessary packages:
import java.util.concurrent.TimeUnit
import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.ProcessingTime
  4. Create a SparkSession as an entry point to the Spark cluster:
val spark = SparkSession.builder
  .master("local[*]")
  .appName("DataFrame Stream")
  .config("spark.sql.warehouse.dir", ".")
  .getOrCreate()
  5. The interleaving of log messages leads to hard-to-read output; therefore, set the logging level to warning:
Logger.getLogger("org").setLevel(Level.WARN)
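
The excerpt ends here, but the `ProcessingTime` and `TimeUnit` imports above suggest the recipe goes on to run a streaming query with a processing-time trigger. As a minimal sketch of how those imports are typically used (not the book's actual continuation), the following assumes Spark's built-in `rate` test source, available in Spark 2.2+:

```scala
// Hypothetical continuation, not from the original recipe: a streaming
// DataFrame backed by the built-in "rate" source, which emits
// (timestamp, value) rows at a fixed pace.
val df = spark.readStream
  .format("rate")
  .option("rowsPerSecond", 10)
  .load()

// Print each micro-batch to the console, firing a batch every
// 2 seconds via ProcessingTime -- this is where TimeUnit comes in.
val query = df.writeStream
  .outputMode("append")
  .format("console")
  .trigger(ProcessingTime.create(2, TimeUnit.SECONDS))
  .start()

query.awaitTermination(10000) // run for ~10 seconds, then return
```

`ProcessingTime` was deprecated in Spark 2.2 in favor of `Trigger.ProcessingTime`, but remains usable throughout the 2.x line that this cookbook targets.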
