How did Hadoop evolve? Hadoop's basic components

Hadoop was inspired by Google's MapReduce, a programming model that splits a job into small fractions that run on different nodes. It was released in 2006, having grown out of the Apache Nutch search-engine project, and has been continuously revised since then; a revised version was also launched on the 20th of April 2014. Basically, Hadoop is a framework used for the storage and processing of Big Data, and it is focused on improving the performance of data processing on clusters of commodity hardware.

In simple words, Hadoop is an open-source framework for storing data and running applications on clusters of commodity hardware. It provides storage for any kind of data and the ability to handle many tasks or jobs at once. Its software framework is written in Java and is used in many companies. Hadoop training in Chennai definitely gives complete training for this.

Overall, Hadoop has three core components:

HDFS: It is used for the storage of data across the cluster

MapReduce: It is used for processing the data (see the sketch after this list)

YARN: It is used for resource management and job scheduling
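To make the MapReduce idea concrete, here is a minimal sketch of the classic word-count job written against the Hadoop MapReduce Java API: the mapper emits (word, 1) pairs and the reducer sums the counts for each word. The class names and command-line paths are only illustrative, not something prescribed by Hadoop.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: splits each input line into words and emits (word, 1)
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sums the counts emitted for each word
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

A job like this is usually packaged into a jar and launched with the hadoop jar command, passing the input and output directories as arguments; YARN then schedules the map and reduce tasks across the cluster.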

Hadoop runs on a multi-node cluster; if you want to increase the processing speed, you can add more nodes to the cluster.

Hadoop basically deals with all types of data: structured, semi-structured, and unstructured.
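As a small illustration of how any kind of file can be stored, the sketch below uses the HDFS Java FileSystem API to copy a local file into the cluster. The NameNode address and the file paths are placeholders I have assumed for the example, not values from the article.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUpload {
  public static void main(String[] args) throws Exception {
    // Normally fs.defaultFS is read from core-site.xml; this host is hypothetical.
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://namenode:9000");

    FileSystem fs = FileSystem.get(conf);

    // Copy any local file (CSV, JSON, logs, images, ...) into HDFS.
    Path local = new Path("/tmp/sample-data.json");        // hypothetical local file
    Path remote = new Path("/user/demo/sample-data.json"); // hypothetical HDFS path
    fs.copyFromLocalFile(local, remote);

    System.out.println("Stored in HDFS: "
        + fs.getFileStatus(remote).getLen() + " bytes");
    fs.close();
  }
}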

For transferring data into and out of Hadoop, the ecosystem has:

Sqoop: It is used for transferring bulk data between Hadoop and relational databases

Flume: It is used for collecting and moving large volumes of streaming and log data into HDFS

Skills required to learn Hadoop 

Basic Java Skills

Linux 

Analytical Skills

Knowledge of SQL


