Apache Hadoop Requirement


Why Apache Hadoop, or why is Apache Hadoop required?

As experienced programmers often say, before learning any technology we should understand why that technology is needed. Apache Hadoop will be much easier to learn if we are clear about the requirement for it before we start using it.

Before moving ahead to the technical aspects, let's understand why Apache Hadoop is needed with a real-life example.

Real-life Big Data example using Hadoop: Suppose you have a 50 GB log file from your application and you need the following information from that log file.

  1. How many times an error occurred in a day.
  2. How many times a warning occurred in a day, along with any other important details from the log file.

Using Hadoop, you can write a program, as sketched below, and get the answers to the above questions within minutes.
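For instance, a minimal MapReduce job in Java for this task might look like the sketch below. It assumes the log lines contain the literal keywords ERROR and WARN, and the class names (LogLevelCount, LogLevelMapper, LogLevelReducer) are invented for this illustration, not part of any standard Hadoop example.

  import java.io.IOException;

  import org.apache.hadoop.io.IntWritable;
  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Mapper;
  import org.apache.hadoop.mapreduce.Reducer;

  public class LogLevelCount {

      // Mapper: emits ("ERROR", 1) or ("WARN", 1) for each matching log line.
      public static class LogLevelMapper
              extends Mapper<LongWritable, Text, Text, IntWritable> {

          private static final IntWritable ONE = new IntWritable(1);
          private final Text level = new Text();

          @Override
          protected void map(LongWritable key, Text value, Context context)
                  throws IOException, InterruptedException {
              String line = value.toString();
              if (line.contains("ERROR")) {
                  level.set("ERROR");
                  context.write(level, ONE);
              } else if (line.contains("WARN")) {
                  level.set("WARN");
                  context.write(level, ONE);
              }
          }
      }

      // Reducer: sums all the 1s emitted for each log level.
      public static class LogLevelReducer
              extends Reducer<Text, IntWritable, Text, IntWritable> {

          @Override
          protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                  throws IOException, InterruptedException {
              int sum = 0;
              for (IntWritable v : values) {
                  sum += v.get();
              }
              context.write(key, new IntWritable(sum));
          }
      }
  }

The mapper emits a count of 1 for every matching line and the reducer adds the counts up, so the final output is the total number of errors and warnings. Including the date from each line in the map key would give per-day counts.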

Can you open a 50 GB log file in Notepad or any other text editor? It is not easy to open a file of that size. With Apache Hadoop, however, it can be read and processed easily.

Why is Apache Hadoop required?

The format of a log file is unstructured, and unstructured data cannot be stored and queried effectively in Oracle or any other RDBMS.

Processing gigabytes of files is easy with Apache Hadoop, and it can be completed within minutes.

  1. Hadoop can store structured or unstructured data, whereas an RDBMS cannot store unstructured data.
  2. Hadoop can process very large amounts of data, in gigabytes or terabytes, within minutes (see the driver sketch after this list).
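As mentioned in point 2, the same job can be pointed at input of any size stored in HDFS, and Hadoop splits and processes it across the cluster in parallel. Below is a rough driver sketch that reuses the hypothetical LogLevelCount classes from the earlier example; the input and output paths are taken from the command line.

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.Path;
  import org.apache.hadoop.io.IntWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Job;
  import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
  import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

  public class LogLevelDriver {

      public static void main(String[] args) throws Exception {
          // args[0] = HDFS input path of the log file(s), args[1] = output directory.
          Configuration conf = new Configuration();
          Job job = Job.getInstance(conf, "log level count");

          job.setJarByClass(LogLevelDriver.class);
          job.setMapperClass(LogLevelCount.LogLevelMapper.class);
          // The reducer also works as a combiner, because the counts can be
          // partially summed on each node before being sent over the network.
          job.setCombinerClass(LogLevelCount.LogLevelReducer.class);
          job.setReducerClass(LogLevelCount.LogLevelReducer.class);

          job.setOutputKeyClass(Text.class);
          job.setOutputValueClass(IntWritable.class);

          FileInputFormat.addInputPath(job, new Path(args[0]));
          FileOutputFormat.setOutputPath(job, new Path(args[1]));

          System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
  }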

That is why Apache Hadoop is required: to process large amounts of data, including unstructured data (log files, sensor data, etc.), within minutes.

Big Data Opportunities with an example

The diagram below summarizes the opportunities in the field of Hadoop, along with real-life examples.

[Diagram: Big Data opportunities]

That's all about the Apache Hadoop requirement.