Importing web log data into HDFS using Flume

One of the most important use cases of Flume is importing log data into HDFS as it is produced. In this recipe, we will run a Flume agent that listens to a web server log file and streams new entries into HDFS.

Getting ready

To perform this recipe, you should have a running Hadoop cluster with the latest version of Flume installed on it.

How to do it...

  1. To import data into HDFS from web servers, we have to install a Flume agent on each web server instance. The following is the configuration to use for the Flume agent (a fuller sketch follows the excerpt):
    flume1.sources = weblogs-source-1
    flume1.channels = hdfs-channel-1
    flume1.sinks = hdfs-sink-1

    # For each source, channel, and sink, set
    # standard properties.
    flume1.sources.weblogs-source-1.type ...
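    The configuration excerpt above is truncated, so here is a minimal sketch of what a complete agent configuration might look like. It assumes an Apache access log at /var/log/httpd/access_log and an HDFS target path of hdfs:///flume/weblogs; both paths are assumptions and should be adjusted for your environment. The source, channel, and sink names match those in the excerpt:

    # Agent "flume1": names of its source, channel, and sink
    flume1.sources = weblogs-source-1
    flume1.channels = hdfs-channel-1
    flume1.sinks = hdfs-sink-1

    # Source: tail the web server access log with the exec source.
    # The log path is an assumption; adjust for your web server.
    flume1.sources.weblogs-source-1.type = exec
    flume1.sources.weblogs-source-1.command = tail -F /var/log/httpd/access_log
    flume1.sources.weblogs-source-1.channels = hdfs-channel-1

    # Channel: buffer events in memory between source and sink.
    flume1.channels.hdfs-channel-1.type = memory
    flume1.channels.hdfs-channel-1.capacity = 10000
    flume1.channels.hdfs-channel-1.transactionCapacity = 1000

    # Sink: write events to HDFS as plain text files,
    # rolling to a new file every 300 seconds.
    flume1.sinks.hdfs-sink-1.type = hdfs
    flume1.sinks.hdfs-sink-1.channel = hdfs-channel-1
    flume1.sinks.hdfs-sink-1.hdfs.path = hdfs:///flume/weblogs
    flume1.sinks.hdfs-sink-1.hdfs.fileType = DataStream
    flume1.sinks.hdfs-sink-1.hdfs.writeFormat = Text
    flume1.sinks.hdfs-sink-1.hdfs.rollInterval = 300

    With the configuration saved to a file (here called weblogs.conf, a hypothetical name), the agent can be started with the flume-ng command, passing the agent name used in the configuration:

    flume-ng agent --conf ./conf --conf-file weblogs.conf --name flume1 -Dflume.root.logger=INFO,console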
