Our first Spark script

Our first script reads in a text file and adds up the lengths of all its lines:

import pyspark

# Create the SparkContext only if one does not already exist
if 'sc' not in globals():
    sc = pyspark.SparkContext()

# Read this notebook's own source file as a collection of text lines
lines = sc.textFile("Spark File Words.ipynb")

# Map each line to its length, then sum all the lengths
lineLengths = lines.map(lambda s: len(s))
totalLength = lineLengths.reduce(lambda a, b: a + b)
print(totalLength)

In the script, we first initialize Spark, but only if we have not done so already. Spark will complain if you try to initialize it more than once, so every Spark script should include this if guard.

The script reads in a text file (the source of this script), takes every line, and computes its length; then it adds all the lengths together.
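The same map-then-reduce pattern can be illustrated without Spark at all, using plain Python on an in-memory list (the sample lines below are made up for this example):

```python
from functools import reduce

# Sample data standing in for the lines of a text file
lines = ["first line", "second", "a third line"]

# map step: transform each line into its length
line_lengths = [len(s) for s in lines]

# reduce step: combine the lengths pairwise into a single total
total_length = reduce(lambda a, b: a + b, line_lengths)
print(total_length)  # 28
```

Spark performs these same two steps, but distributes the work across the cluster and only computes the result when `reduce` is called.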

A lambda function is an anonymous (unnamed) function that takes arguments and returns the value of a single expression. In our script, `lambda s: len(s)` returns the length of each line, and `lambda a, b: a + b` adds two partial sums together.
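A lambda is interchangeable with an ordinary named function; the names below are invented for illustration:

```python
# A lambda and an equivalent def-style function behave identically
line_length = lambda s: len(s)

def line_length_named(s):
    return len(s)

print(line_length("hello"))        # 5
print(line_length_named("hello"))  # 5
```

Lambdas are convenient in Spark scripts because the transformation is usually short enough to write inline at the point where `map` or `reduce` is called.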
