
Real-Time Big Data Analytics: Emerging Architecture


Chapter 2. How Fast Is Fast?

The capability to store data quickly isn’t new. What’s new is the capability to do something meaningful with that data, quickly and cost-effectively. Businesses and governments have been storing huge amounts of data for decades. What we are witnessing now, however, is an explosion of new techniques for analyzing those large data sets. In addition to new capabilities for handling large amounts of data, we’re also seeing a proliferation of new technologies designed to handle complex, non-traditional data — precisely the kinds of unstructured or semi-structured data generated by social media, mobile communications, customer service records, warranties, census reports, sensors, and web logs. In the past, data had to be arranged neatly in tables. In today’s world of data analytics, anything goes. Heterogeneity is the new normal, and modern data scientists are accustomed to hacking their way through tangled clumps of messy data culled from multiple sources.

Software frameworks such as Hadoop and MapReduce, which support distributed processing applications across relatively inexpensive commodity hardware, now make it possible to mix and match data from many disparate sources. Today’s data sets aren’t merely larger than the older data sets — they’re significantly more complex.
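Barlow's report contains no code, but a minimal sketch of the MapReduce programming model helps make the point concrete: work is expressed as a map step that emits key/value pairs and a reduce step that aggregates them, and the framework fans both steps out across a cluster of commodity machines. The canonical word-count job below is illustrative only; the class name, input/output paths, and job configuration are assumptions, and it presumes the standard Hadoop client libraries (org.apache.hadoop.*) are on the classpath.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Illustrative word-count job: counts how often each token appears
// across an arbitrarily large collection of text files in HDFS.
public class WordCount {

  // Map step: runs in parallel on each input split and emits (word, 1)
  // for every token it sees.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer tokens = new StringTokenizer(value.toString());
      while (tokens.hasMoreTokens()) {
        word.set(tokens.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce step: receives all counts for a given word (grouped by the
  // framework's shuffle phase) and sums them.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // e.g. an HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // must not already exist
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

In this sketch the framework, not the programmer, handles the hard distributed-systems work: splitting the input, scheduling map and reduce tasks across nodes, shuffling intermediate pairs, and retrying failed tasks. That is what makes it practical to throw heterogeneous data at clusters of inexpensive hardware rather than a single expensive machine.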

“Big data has three dimensions — volume, variety, and velocity,” says Minelli. “And within each of those three dimensions is a wide range of variables.”

The ability to manage large and complex ...
