Following are the key areas of focus for learning and training, each of which is described later in this article:
- Java Essentials
- Hadoop Essentials
As Hadoop is written in the Java programming language, at least intermediate-level Java expertise is needed to do well with Hadoop development. Following are some of the key concepts one would want to learn or get trained on:
- JVM concepts
- Data types
- Basic Java constructs
- Looping and switch constructs
- Classes, objects and methods
- Collections concepts such as HashMap, ArrayList, LinkedList
- Exception handling
- Building Java projects
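Most of the Java building blocks listed above can be seen together in a few lines. The sketch below is illustrative only (the `WordTally` class name is made up): it uses a class with a static method, a `HashMap` collection, a basic loop, and a try/catch block for exception handling.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// A small illustrative class combining classes, methods, collections,
// looping, and exception handling.
public class WordTally {

    // Count occurrences of each word using a HashMap.
    public static Map<String, Integer> count(List<String> words) {
        Map<String, Integer> counts = new HashMap<>();
        for (String word : words) {              // basic looping construct
            counts.merge(word, 1, Integer::sum); // insert or increment
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> words = new ArrayList<>();
        words.add("hadoop");
        words.add("java");
        words.add("hadoop");
        try {
            Map<String, Integer> counts = count(words);
            System.out.println(counts.get("hadoop")); // prints 2
        } catch (RuntimeException e) {           // exception handling
            System.err.println("Counting failed: " + e.getMessage());
        }
    }
}
```

Counting word occurrences is also the classic first MapReduce exercise, so this small pattern reappears once Hadoop enters the picture.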
It is important to learn both introductory and advanced concepts related to the following topics:
- Basic introduction to Big Data and Hadoop
- Hadoop Architecture: One should try to learn the following concepts:
  - Hadoop core components, including HDFS & MapReduce
  - HDFS architecture (NameNode, DataNode)
  - MapReduce architecture (JobTracker, TaskTracker)
  - Coordination service (ZooKeeper)
- Hadoop Installation & Deployment
- Advanced concepts in HDFS
- Advanced concepts in MapReduce
- HBase concepts, including HBase architecture, its distributed data storage model, and the differences between an RDBMS and HBase
- Hive & Pig concepts, including installation, architecture, the query compiler & optimizer, and client & server components
- Zookeeper concepts, including installation, the challenges faced in coordinating distributed applications, use cases such as leader election and distributed locking, the data model, etc.
- Introduction to tools such as Apache Flume, Sqoop
- Introduction to commercial distributions of Hadoop such as Cloudera, Hortonworks, MapR, IBM, etc.
- Introduction to Hadoop ecosystem including concepts on tools such as Oozie, Mahout, Spark etc.
- Introduction to Hadoop administration, troubleshooting, etc.
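The MapReduce data flow mentioned under Hadoop Architecture above can be understood before touching any Hadoop code. The plain-Java sketch below simulates the three phases for word counting: a map phase emitting (key, value) pairs, a shuffle grouping values by key, and a reduce phase aggregating each group. No Hadoop classes are used; all names here are illustrative.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Plain-Java simulation of the MapReduce flow: map -> shuffle -> reduce.
public class MiniMapReduce {

    // Map phase: each input line is split into (word, 1) pairs.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.toLowerCase().split("\\s+")) {
            if (!word.isEmpty()) {
                pairs.add(Map.entry(word, 1));
            }
        }
        return pairs;
    }

    // Shuffle: group emitted values by key, as the framework would do
    // between the map tasks and the reduce tasks.
    static Map<String, List<Integer>> shuffle(List<Map.Entry<String, Integer>> pairs) {
        Map<String, List<Integer>> groups = new HashMap<>();
        for (Map.Entry<String, Integer> pair : pairs) {
            groups.computeIfAbsent(pair.getKey(), k -> new ArrayList<>()).add(pair.getValue());
        }
        return groups;
    }

    // Reduce phase: sum the grouped values for each key.
    static Map<String, Integer> reduce(Map<String, List<Integer>> groups) {
        Map<String, Integer> result = new HashMap<>();
        groups.forEach((word, ones) ->
            result.put(word, ones.stream().mapToInt(Integer::intValue).sum()));
        return result;
    }

    public static Map<String, Integer> wordCount(List<String> lines) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : lines) {
            pairs.addAll(map(line));
        }
        return reduce(shuffle(pairs));
    }

    public static void main(String[] args) {
        System.out.println(wordCount(List.of("hadoop stores data", "hadoop processes data")));
    }
}
```

In real Hadoop, the map and reduce methods run as distributed tasks over HDFS blocks and the shuffle happens over the network, but the logical flow is exactly this.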
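The HBase data model mentioned above differs from an RDBMS in that each row can hold a different, sparse set of columns, addressed as row key, column family, and column qualifier. The toy in-memory sketch below (this is not the HBase client API, and the `SparseTable` name is made up) illustrates that nested sorted-map model.

```java
import java.util.NavigableMap;
import java.util.TreeMap;

// Toy illustration of HBase's logical data model:
// row key -> column family -> column qualifier -> value,
// with rows kept in sorted order by key.
public class SparseTable {
    private final NavigableMap<String, NavigableMap<String, NavigableMap<String, String>>> rows =
        new TreeMap<>();

    public void put(String rowKey, String family, String qualifier, String value) {
        rows.computeIfAbsent(rowKey, r -> new TreeMap<>())
            .computeIfAbsent(family, f -> new TreeMap<>())
            .put(qualifier, value);
    }

    public String get(String rowKey, String family, String qualifier) {
        NavigableMap<String, NavigableMap<String, String>> row = rows.get(rowKey);
        if (row == null) return null;
        NavigableMap<String, String> fam = row.get(family);
        return fam == null ? null : fam.get(qualifier);
    }

    public static void main(String[] args) {
        SparseTable users = new SparseTable();
        // Two rows with entirely different columns -- natural in HBase,
        // awkward in a fixed-schema RDBMS table.
        users.put("user1", "info", "name", "Alice");
        users.put("user2", "info", "email", "bob@example.com");
        System.out.println(users.get("user1", "info", "name")); // Alice
    }
}
```

Absent columns simply take no storage, which is why HBase suits wide, sparse datasets where most RDBMS cells would be NULL.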
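The leader-election use case mentioned under Zookeeper above follows a simple idea: each participant registers under an increasing sequence number (in ZooKeeper, an ephemeral sequential znode), and whoever holds the smallest live number is the leader. The in-memory sketch below illustrates only that idea; it is not the ZooKeeper API, and the class name is made up.

```java
import java.util.NavigableMap;
import java.util.TreeMap;
import java.util.concurrent.atomic.AtomicLong;

// In-memory illustration of sequence-number-based leader election.
public class LeaderElection {
    private final AtomicLong nextSeq = new AtomicLong();
    private final NavigableMap<Long, String> participants = new TreeMap<>();

    // Join the election; returns the assigned sequence number
    // (the analogue of a sequential znode name).
    public synchronized long join(String nodeId) {
        long seq = nextSeq.getAndIncrement();
        participants.put(seq, nodeId);
        return seq;
    }

    // A participant departs (in ZooKeeper, its ephemeral node is removed
    // automatically when its session expires).
    public synchronized void leave(long seq) {
        participants.remove(seq);
    }

    // The smallest surviving sequence number identifies the leader.
    public synchronized String leader() {
        return participants.isEmpty() ? null : participants.firstEntry().getValue();
    }
}
```

If the first joiner departs, leadership passes automatically to the next-smallest sequence number, with no separate election round; that failover-for-free property is what makes the recipe attractive for coordinating distributed applications.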