
Technogeeks is a group of IT professionals based in Pune, Maharashtra, India. Technogeeks trainers work on real-time projects across multiple technologies and believe in sharing knowledge and best practices to help candidates build careers on multiple skill sets.

Big Data Hadoop Training in Pune

Hadoop is an open-source framework from Apache that is used to store, process, and analyze data in very large volumes. Hadoop is a leading Big Data platform used by top IT organizations such as Facebook, Yahoo, and many more.

Modules of Hadoop:

  1. HDFS (Hadoop Distributed File System): HDFS is a distributed file system that handles the storage layer of Hadoop applications. It stores multiple replicas of data blocks across the compute nodes in the cluster, so processing can run close to the data.
  2. YARN (Yet Another Resource Negotiator): the Hadoop framework used for job scheduling and cluster resource management.
  3. MapReduce: a YARN-based system for processing large amounts of data in parallel. It takes input data and converts it into intermediate key-value pairs, which are then aggregated into the final result.
  4. Hadoop Common: the shared Java libraries and utilities used to start Hadoop; the other Hadoop modules depend on these libraries to process data.
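To make the MapReduce key-value idea concrete, here is a minimal plain-Python sketch of the map, shuffle, and reduce phases for a word count. This is only an illustration of the data flow, not actual Hadoop code; the function names (`map_phase`, `reduce_phase`, `word_count`) are hypothetical, and a real job would instead subclass Hadoop's Java `Mapper` and `Reducer` classes.

```python
from collections import defaultdict

def map_phase(line):
    # Mapper: emit a (word, 1) key-value pair for every word in the line
    for word in line.split():
        yield (word.lower(), 1)

def reduce_phase(word, counts):
    # Reducer: aggregate all counts that share the same key
    return (word, sum(counts))

def word_count(lines):
    # Shuffle: group the intermediate pairs by key before reducing
    grouped = defaultdict(list)
    for line in lines:
        for word, count in map_phase(line):
            grouped[word].append(count)
    # Reduce each key group into the final result
    return dict(reduce_phase(w, c) for w, c in grouped.items())

print(word_count(["big data big cluster", "data data"]))
# → {'big': 2, 'data': 3, 'cluster': 1}
```

In a real cluster, the map and reduce phases run on different nodes and HDFS holds the input and output; the key-value flow, however, is exactly the one shown above.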

For help, visit this link: Hadoop Training in Pune
