Summary

This was a whirlwind tour of the basics. We started with a look at Big O notation, which is used to reason about how an algorithm's running time grows with the size of its input. Next came the notion of the space-time trade-off.

We saw how trading away some space to cache known results saves time by avoiding needless recomputation. We looked at pure functions and referential transparency, and saw why pure functions are amenable to memoization.
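As a refresher, here is a minimal sketch of that idea, using a naive Fibonacci as a stand-in for an expensive pure function; the names and the cache are illustrative, not code from the chapter.

```scala
import scala.collection.mutable

// fib is pure: the same argument always produces the same result, with no
// side effects, so its results are safe to cache.
def fib(n: Int): BigInt =
  if (n < 2) BigInt(n) else fib(n - 1) + fib(n - 2) // exponential time

// Memoized variant: trades a little space (the cache) for time by never
// recomputing a result it has already produced.
val cache = mutable.Map.empty[Int, BigInt]

def fibMemo(n: Int): BigInt =
  cache.get(n) match {
    case Some(v) => v
    case None =>
      val v = if (n < 2) BigInt(n) else fibMemo(n - 1) + fibMemo(n - 2)
      cache(n) = v
      v
  }
```

Here fibMemo(90) returns immediately, while the plain fib would recompute the same subproblems exponentially many times; memoization is safe precisely because fib is referentially transparent.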

We looked at the idea of effectively constant time operations.
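Scala's Vector is the usual illustration of this idea (an assumption that this is the structure the chapter has in mind): it is a wide, 32-way branching trie, so its depth grows only logarithmically and stays tiny for any realistic size, which is why its indexed operations are called effectively constant time rather than truly O(1).

```scala
// Vector is a wide (32-way branching) trie, so even a million elements fit in
// a tree only a few levels deep; an indexed read walks that short path.
val v = Vector.tabulate(1000000)(i => i * 2)

val x = v(999999)             // indexed read: effectively constant time
val w = v.updated(500000, -1) // new Vector; only the path to that slot is copied
```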

FP programs use collections heavily. We looked at some common collection operations and their complexities.
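As a quick reminder of what those costs look like, here is a sketch using the workhorse immutable List, with the usual complexities noted in the comments (illustrative, not code from the chapter).

```scala
val xs = List(1, 2, 3)

val ys = 0 :: xs   // prepend: O(1), the new list shares xs as its tail
val h  = xs.head   // head: O(1)
val zs = xs :+ 4   // append: O(n), the whole spine is copied
val e  = xs(2)     // indexed access: O(n), walks the list node by node
```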

We noted how complexity changes in the face of immutability; a small comparison appears after this paragraph. All this groundwork gives us enough of a foothold to look at our next data structure: lists.
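As a small illustration of that point, compare an in-place update on a mutable buffer with the equivalent operation on an immutable List; the variable names here are hypothetical.

```scala
import scala.collection.mutable

// In-place update on a mutable buffer: O(1).
val buf = mutable.ArrayBuffer(1, 2, 3, 4, 5)
buf(2) = 99

// The equivalent on an immutable List rebuilds every node before index 2,
// so it is O(n); the suffix after the change is shared, and the original
// list is left untouched.
val nums    = List(1, 2, 3, 4, 5)
val patched = nums.updated(2, 99)
```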
