Sagar Mainkar thinks this is interesting:

Flume is designed for high-volume ingestion into Hadoop of event-based data. The canonical example is using Flume to collect logfiles from a bank of web servers, then moving the log events from those files into new aggregated files in HDFS for processing. The usual destination (or sink in Flume parlance) is HDFS. However, Flume is flexible enough to write to other systems, like HBase or Solr.
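To make the web-server example concrete, a Flume agent of the kind described would be wired together in a properties file: a source that picks up the logfiles, a channel that buffers the events, and an HDFS sink that writes the aggregated output. The sketch below is illustrative rather than taken from the book; the agent name (agent1), the spooling directory, and the HDFS path are hypothetical placeholders.

    # Name this agent's source, channel, and sink (names are illustrative)
    agent1.sources  = weblogs
    agent1.channels = ch1
    agent1.sinks    = sink1

    # Source: watch a directory that completed web-server logfiles are copied into
    agent1.sources.weblogs.type     = spooldir
    agent1.sources.weblogs.spoolDir = /var/log/flume/weblogs
    agent1.sources.weblogs.channels = ch1

    # Channel: buffer events on local disk between the source and the sink
    agent1.channels.ch1.type = file

    # Sink: write the events into HDFS, partitioned by date
    agent1.sinks.sink1.type                   = hdfs
    agent1.sinks.sink1.channel                = ch1
    agent1.sinks.sink1.hdfs.path              = /flume/weblogs/%Y/%m/%d
    agent1.sinks.sink1.hdfs.fileType          = DataStream
    agent1.sinks.sink1.hdfs.useLocalTimeStamp = true

Such an agent is started with the flume-ng launcher, along the lines of flume-ng agent --name agent1 --conf-file agent1.properties. Pointing the same pipeline at a different system, such as HBase or Solr, is largely a matter of swapping the sink definition for the corresponding sink type.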

From Hadoop: The Definitive Guide, 4th Edition

Note

Purpose of Flume