
When to use LabelEncoder – Python Example

In this post, you will learn about when to use LabelEncoder. As a data scientist, you must have a clear understanding of when to use LabelEncoder and when to use other encoders such as the One-Hot Encoder. Using the appropriate type of encoder is a key part of data preprocessing in the machine learning model building lifecycle.

Here are some of the scenarios in which you could use LabelEncoder without adversely impacting the model.

  • Use LabelEncoder when there are only two possible values of a categorical feature. For example, a feature with values such as yes or no, or a gender feature with only two possible values, male or female.
  • Use LabelEncoder for the label (target) column in supervised learning when it is a binary classification problem.
  • Don't use LabelEncoder when a categorical feature has more than two values. A nominal categorical feature with more than two values may get treated as ordinal by the machine learning model, which can introduce a spurious ordering. For such features, it is recommended to use a One-Hot encoder, also called dummy encoding. One can also use the get_dummies method from the Pandas package (see the sketch after this list).
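For illustration, here is a minimal sketch of one-hot (dummy) encoding a nominal feature with more than two values using Pandas get_dummies. The feature name ('color') and its values are hypothetical and used only to show the idea.

import pandas as pd

# Hypothetical nominal feature with more than two categories
df_demo = pd.DataFrame({'color': ['red', 'green', 'blue', 'green']})

# One-hot (dummy) encoding: one binary column per category
df_encoded = pd.get_dummies(df_demo, columns=['color'])
print(df_encoded)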

LabelEncoder Python Example

Here is the Python code which transforms the binary classes into the encodings 0 and 1 using LabelEncoder. The Breast Cancer Wisconsin dataset is used for illustration purposes. Information about this dataset can be found at https://archive.ics.uci.edu/ml/datasets/Breast+Cancer+Wisconsin+(Diagnostic). Note that LabelEncoder is a class in the sklearn.preprocessing module.

import pandas as pd
from sklearn.preprocessing import LabelEncoder

df = pd.read_csv(
     'https://archive.ics.uci.edu/ml/'
     'machine-learning-databases'
     '/breast-cancer-wisconsin/wdbc.data',
     header=None)
#
# Load the features (X: columns 2 onwards) and labels (y: column 1, the diagnosis 'M'/'B')
#
X = df.loc[:, 2:].values
y = df.loc[:, 1].values
#
# Instantiate LabelEncoder
#
le = LabelEncoder()
y = le.fit_transform(y)

Here is how the data looks:

Fig 1. LabelEncoder used for column 1 for binary encoding
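One simple way to get a quick look at the data loaded above is to print the first few rows of the dataframe:

df.head()   # column 0: ID, column 1: diagnosis ('M'/'B'), columns 2 onwards: features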

In case you want to look at which classes got transformed, you can use the classes_ attribute on the LabelEncoder instance.

le.classes_
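The array returned above lists the original class labels in sorted order; LabelEncoder assigns integer codes accordingly, so for this dataset 'B' (benign) maps to 0 and 'M' (malignant) maps to 1. A small sketch to verify the mapping and reverse it:

le.transform(['M', 'B'])        # expected: array([1, 0])
le.inverse_transform([0, 1])    # expected: array(['B', 'M'], dtype=object)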

Read a related post on LabelEncoder: LabelEncoder example – single & multiple columns

Ajitesh Kumar

I have recently been working in the area of data analytics, including Data Science and Machine Learning / Deep Learning. I am also passionate about different technologies, including programming languages such as Java/JEE, Javascript, Python, R, Julia, etc., and technologies such as Blockchain, mobile computing, cloud-native technologies, application security, cloud computing platforms, big data, etc. I would love to connect with you on Linkedin. Check out my latest book, First Principles Thinking: Building winning products using first principles thinking.
