Once you know the core of Scala and the Spark API, the tedium of coding Big Data infrastructure drifts away, letting you focus on the real issues.
Matei Zaharia, Spark's creator, chose Scala as its implementation language. You'll discover why in this course covering the core of Spark and the Scala API. You'll learn to define classes, variables, and functions, and how to use them to define Spark jobs. You'll explore how pattern matching, tuples, and case classes make code highly concise and easy to understand. You'll also learn how the type system prevents bugs while providing useful feedback as you explore your data. By course end, you'll have the productivity-enhancing ability to read and write Spark code using the Scala API.
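As a taste of that conciseness, here is a minimal sketch in plain Scala (no Spark dependency; the `LogLine` class and sample data are illustrative, not from the course) showing how a case class and pattern matching filter and destructure records in a single step:

```scala
// Hypothetical record type; case classes get pattern-matching support for free.
case class LogLine(level: String, message: String)

object Demo extends App {
  val lines = Seq(
    LogLine("ERROR", "disk full"),
    LogLine("INFO", "job started"),
    LogLine("ERROR", "timeout")
  )

  // collect takes a partial function: the pattern both filters for ERROR
  // lines and extracts the message field in one expression.
  val errors = lines.collect { case LogLine("ERROR", msg) => msg }

  println(errors.mkString(", ")) // disk full, timeout
}
```

The same `collect`-with-pattern style carries over directly to Spark's RDD and Dataset APIs, which is one reason Scala code for Spark jobs tends to stay short.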
In this tutorial you will:
- cover the basic Scala concepts that give you complete access to all of Spark’s features
- learn the significant benefits of using Scala for your Spark work
- boost your productivity by using the Scala API rather than the Java API