RDDs - what started it all...

The RDD API is a critical toolkit for Spark developers because it favors low-level control over the data within a functional programming paradigm. The same power that makes RDDs effective also makes them harder for new programmers to work with. While the RDD API and manual optimization techniques (for example, applying filter() before a groupBy() operation to reduce the data shuffled across the network) may be easy to understand, writing advanced code requires consistent practice and fluency. A sketch of such a manual optimization follows.
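The following sketch (with hypothetical sample data) illustrates this optimization: filtering a pair RDD before grouping it, so that only qualifying records are shuffled across the network.

```scala
import org.apache.spark.sql.SparkSession

object FilterBeforeGroupBy {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("FilterBeforeGroupBy")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Hypothetical (category, amount) pairs for illustration only
    val sales = sc.parallelize(Seq(
      ("books", 12.0), ("toys", 40.0), ("books", 7.5),
      ("toys", 3.0), ("games", 25.0)
    ))

    // Filter first, so only qualifying records are shuffled by groupByKey()
    val grouped = sales
      .filter { case (_, amount) => amount >= 5.0 }
      .groupByKey()

    grouped.collect().foreach(println)
    spark.stop()
  }
}
```

Reversing the order (grouping first, then filtering) would produce the same result but shuffle every record, which is exactly the kind of cost the RDD API leaves in the developer's hands.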

When data files, blocks, or data structures are converted to RDDs, the data is broken down into smaller units called partitions (similar to splits in Hadoop) and distributed among the nodes so they can be operated on in parallel. Spark provides this functionality right out of the box.
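A minimal sketch of this partitioning behavior, assuming a SparkContext named sc (as in the previous example): we request an explicit number of partitions and compute one value per partition, with each partition processed in parallel.

```scala
// Request 8 partitions explicitly; Spark distributes them across the nodes
val numbers = sc.parallelize(1 to 1000, 8)
println(numbers.getNumPartitions) // 8

// mapPartitions runs once per partition, so each sum is computed in parallel
val perPartitionSums = numbers.mapPartitions(it => Iterator(it.sum))
perPartitionSums.collect().foreach(println)
```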
