In this post, you will learn about the confusion matrix, with examples, and how it can be used to derive performance metrics for classification models in machine learning.

Let’s take the example of a classification model used to predict whether a person will default on a bank loan. Suppose a historical data set of 10,000 records was chosen to build the model. Each record represents a person and is labeled “Yes” or “No” based on whether that person defaulted (Yes) or did not default (No).

Out of the 10,000 labeled records, 7,550 are labeled “No” (not a defaulter); these are the “negative” cases. The remaining 2,450 are labeled “Yes” (a defaulter); these are the “positive” cases.

The model was trained and made predictions for all 10,000 cases. Before looking at the confusion matrix, let’s define what counts as a true positive, false positive, true negative, and false negative in relation to the model’s predictions.

• True Positive (TP): the number of records predicted as positive that were actually labeled positive. In the current example, these are the records predicted to be defaulters that were also labeled as defaulters.

Let’s say the number of true positives came out to be 1,800.

• False Negative (FN): the number of records predicted as negative that were actually labeled positive. In the current example, these are the records predicted to be non-defaulters that were labeled as defaulters.

The number of false negatives is therefore 2,450 - 1,800 = 650 (total labeled positives minus true positives).

• True Negative (TN): the number of records predicted as negative that were actually labeled negative. In the current example, these are the records predicted to be non-defaulters that were also labeled as non-defaulters.

Let’s say the number of true negatives came out to be 5,800.

• False Positive (FP): the number of records predicted as positive that were actually labeled negative. In the current example, these are the records predicted to be defaulters that were labeled as non-defaulters.

The number of false positives is therefore 7,550 - 5,800 = 1,750 (total labeled negatives minus true negatives).

Laying the above out in matrix form, this is how it looks:

| Labeled \ Predicted | Predicted as Yes (Positive) | Predicted as No (Negative) |
| --- | --- | --- |
| Labeled as Yes (Positive) | 1,800 (true positive) | 650 (false negative) |
| Labeled as No (Negative) | 1,750 (false positive) | 5,800 (true negative) |
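As a quick sketch (assuming scikit-learn and NumPy are installed), the same matrix can be reproduced with `sklearn.metrics.confusion_matrix`. The label arrays below are synthetic stand-ins for the model’s actual outputs, built to match the counts in the example; note that with 7,550 labeled negatives, the false-positive count works out to 7,550 - 5,800 = 1,750:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Synthetic arrays reproducing the counts in the example
# (1 = "Yes"/defaulter/positive, 0 = "No"/non-defaulter/negative)
y_true = np.array([1] * 2450 + [0] * 7550)
y_pred = np.array([1] * 1800 + [0] * 650     # true positives, false negatives
                + [1] * 1750 + [0] * 5800)   # false positives, true negatives

# labels=[1, 0] orders rows and columns Yes-then-No, matching the table above
cm = confusion_matrix(y_true, y_pred, labels=[1, 0])
print(cm)  # [[1800  650]
           #  [1750 5800]]
```

By scikit-learn’s convention, rows are the true labels and columns are the predictions; without the `labels` argument, the negative class would come first.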

The above is the confusion matrix for the predictions made by the classification model. Let’s quickly go through some popular performance metrics derived from it:

• Accuracy = (TP + TN) / (TP + FN + TN + FP)
• Precision = TP / (TP + FP)
• Recall or sensitivity = TP / (TP + FN)
• Specificity = TN / (TN + FP)
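Plugging the counts from the worked example into these formulas gives the following (a small illustrative sketch, using the 1,750 false positives implied by the 7,550 labeled negatives):

```python
# Counts from the worked loan-default example
TP, FN, FP, TN = 1800, 650, 1750, 5800

accuracy    = (TP + TN) / (TP + FN + TN + FP)  # 7600 / 10000 = 0.76
precision   = TP / (TP + FP)                   # 1800 / 3550  ≈ 0.507
recall      = TP / (TP + FN)                   # 1800 / 2450  ≈ 0.735 (sensitivity)
specificity = TN / (TN + FP)                   # 5800 / 7550  ≈ 0.768

print(f"accuracy={accuracy:.3f}  precision={precision:.3f}  "
      f"recall={recall:.3f}  specificity={specificity:.3f}")
```

Note how the four metrics tell different stories: the model catches about 74% of actual defaulters (recall), but only about half of its “defaulter” predictions are correct (precision).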

## Summary

In this post, you learned about the confusion matrix and how it can be used to derive performance metrics for a classification model. Hope you liked the post; please feel free to suggest improvements.

### Ajitesh Kumar

Ajitesh has recently been working in the area of AI and machine learning; his current research area is Safe & Quality AI. He is also passionate about a range of technologies, including programming languages such as Java/JEE and JavaScript, as well as Blockchain, mobile computing, cloud-native technologies, application security, cloud computing platforms, and big data.

He has also authored the book, Building Web Apps with Spring 5 and Angular.