
Hadoop Map-Reduce Explained with an Example

This article presents the key steps of a Hadoop MapReduce job using a word count example. Please feel free to comment or suggest if I have missed any important points.

The following are the key steps of how Hadoop MapReduce works for a word count problem:

  • Input: each input split is fed to a RecordReader, which reads the data line by line (record by record) and passes each record to the mapper.
  • Mapping: the mapper breaks each record into words and emits each word with a count of 1, i.e. key-value pairs such as (word, 1).
  • Combining (optional): locally aggregates the counts for the same word on each mapper node before any data is sent over the network.
  • Partitioning: assigns each word (key) to a reducer partition.
  • Shuffling: moves the (word, count) pairs to the reducer responsible for their partition.
  • Sorting: sorts the pairs within each partition by word, so that all counts for a given word arrive together.
  • Reducing: the last step sums the counts for each word and produces the final word count (see the Java sketch after this list).
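
To make these steps concrete, below is a minimal word-count sketch using the classic Hadoop MapReduce Java API (org.apache.hadoop.mapreduce). The class names TokenizerMapper and IntSumReducer are illustrative, not part of the article.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    public class WordCount {

        // Map step: the RecordReader hands each line to map() as a
        // (byte offset, line text) pair; we emit (word, 1) for every word.
        public static class TokenizerMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {

            private final static IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);   // e.g. ("hadoop", 1)
                }
            }
        }

        // Reduce step: after partitioning, shuffling and sorting, each reducer
        // receives a word together with all of its 1s and sums them up.
        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {

            private final IntWritable result = new IntWritable();

            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);     // e.g. ("hadoop", 42)
            }
        }
    }

Because counting is associative and commutative, the same IntSumReducer can also be registered as a combiner to perform the optional local aggregation described above.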

The following diagram represents the above steps.

The following diagram depicts another view of how MapReduce works:
In the above diagram, one can see that there are, primarily, three key phases of a MapReduce job:
  • Map: this phase processes the input data and emits key-value pairs.
  • Partitioning/Shuffling/Sorting: these phases group the values for the same key together and sort them by key.
  • Reduce: this phase produces the final result for each key. A driver sketch showing how these phases are wired together follows this list.
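
For completeness, here is a minimal driver sketch showing how the three phases are wired together; it assumes the TokenizerMapper and IntSumReducer classes from the earlier sketch, and the class name WordCountDriver as well as the command-line input/output paths are illustrative.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCountDriver {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = Job.getInstance(conf, "word count");
            job.setJarByClass(WordCountDriver.class);

            // Map phase
            job.setMapperClass(WordCount.TokenizerMapper.class);
            // Optional local aggregation before the shuffle
            job.setCombinerClass(WordCount.IntSumReducer.class);
            // Reduce phase; partitioning, shuffling and sorting happen in
            // between and are handled by the framework (hash partitioner by default)
            job.setReducerClass(WordCount.IntSumReducer.class);

            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);

            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));

            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The job would then typically be packaged into a jar and submitted with something like hadoop jar wordcount.jar WordCountDriver /input /output, where the paths are placeholders.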
