Spark SQL application

When writing Spark applications, developers can run SQL directly against structured data to get the desired results. An example makes this easier to understand:

[hive@node-3 ~]$ cat SQLApp.py
from pyspark.sql import SparkSession

# Path to the file in HDFS
csvFile = "employees.csv"

# Create a session for this application
spark = SparkSession.builder.appName("SQLApp").getOrCreate()

# Read the CSV file (tab-delimited, with a header row)
csvTable = spark.read.format("csv").option("header", "true").option("delimiter", "\t").load(csvFile)
csvTable.show(3)

# Create a temporary view (registers the DataFrame under the name "employees")
csvTable.createOrReplaceTempView("employees")

# Find the total salary of employees and print the highest salary makers
highPay = spark.sql("SELECT ...
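The query itself is elided above, but the pattern is standard SQL aggregation run against the registered view. As a self-contained illustration of the kind of query `spark.sql()` would accept, the sketch below runs an assumed salary-totaling query against an in-memory SQLite table with made-up sample rows (the column names `name`, `dept`, and `salary` are assumptions, not taken from employees.csv):

```python
import sqlite3

# Hypothetical sample rows standing in for employees.csv: (name, dept, salary)
rows = [
    ("alice", "eng", 95000),
    ("bob", "eng", 88000),
    ("carol", "sales", 72000),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary INTEGER)")
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)", rows)

# Total salary per department, highest-paying first -- the same shape of
# query that could be passed to spark.sql() against the "employees" view.
query = """
    SELECT dept, SUM(salary) AS total_salary
    FROM employees
    GROUP BY dept
    ORDER BY total_salary DESC
"""
for dept, total in conn.execute(query):
    print(dept, total)
```

In the Spark version, `spark.sql(query)` would return a DataFrame rather than a cursor, and the result could be displayed with `.show()` or collected with `.collect()`.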
