What is Apache Flume?

Apache Flume is a key part of the Hadoop stack: a distributed service for collecting, aggregating, and moving large volumes of streaming data (such as log data) into the Hadoop Distributed File System (HDFS). Here is a general overview of Flume and its purpose and role in architecting Big Data applications: https://hortonworks.com/apache/flume/
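
In Flume, an agent moves data as a flow of events from a source, through a channel that buffers them, to a sink such as HDFS. As a minimal sketch (not from the overview linked above), the properties-style configuration below defines an agent named a1 with a netcat source and an HDFS sink; the agent name, port, and HDFS path are placeholder assumptions:

# Name the agent's components
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Source: listen for lines of text on localhost:44444 (placeholder port)
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Channel: buffer events in memory between source and sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# Sink: write events into HDFS (placeholder namenode and path)
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/events
a1.sinks.k1.hdfs.fileType = DataStream

# Wire the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

Assuming this is saved as example.conf, an agent like this can be started with the standard flume-ng command:

flume-ng agent --conf conf --conf-file example.conf --name a1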
