What are the Confusion Matrix, Accuracy, Sensitivity, Specificity, Precision, and Recall?

Pooja Pawani
5 min read · Dec 12, 2020


I have seen many people getting confused by these metrics. Trust me, once you understand how all these terms relate to each other, it becomes very easy to apply them. Let's dive in.

Confusion Matrix

A confusion matrix is a summary of the prediction results on a classification problem. Let's say we have to identify, based on certain features (independent variables), whether a person is diseased or not.

As we know, the output of logistic regression is the probability of a certain class, and one chooses a cutoff to assign the result to one of the classes. When classifying, there can be errors, such as Diseased classified as Not Diseased and/or Not Diseased classified as Diseased.

This is where the confusion matrix comes into play: it captures the errors made by the model and helps you evaluate how well the model performs. Here we take the "Diseased" label as "Yes" and the "Not Diseased" label as "No".

A typical confusion matrix would look like this:

                 Predicted: No   Predicted: Yes   Total
Actual: No            50               10            60
Actual: Yes            5              100           105
Total                 55              110           165

The above table/matrix shows the actual and predicted labels. From the matrix we can see that there were 105 actual Yeses but the model predicted 110 Yeses; similarly, there are 60 actual Nos but the model predicted only 55 Nos.

The first row, second column (10) holds cases that are actually No but that the model predicted as Yes. These are termed "False Positives" (FP).

The second row, first column (5) holds cases that are actually Yes but that the model predicted as No. These are termed "False Negatives" (FN).

Thus, these two cells hold the incorrectly predicted labels.

The correctly predicted labels are in the first row, first column (50), termed "True Negatives" (TN), and the second row, second column (100), termed "True Positives" (TP).

In the context of TN, TP, FN and FP, just remember: Positive/Negative describes the predicted label, while True/False tells you whether that prediction matched the actual label.
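
If you want these counts in code, here is a minimal Python sketch using scikit-learn's confusion_matrix (the toy y_true/y_pred arrays below are made up purely for illustration):

```python
from sklearn.metrics import confusion_matrix

# Toy labels for illustration: 1 = Diseased ("Yes"), 0 = Not Diseased ("No")
y_true = [0, 0, 1, 1, 1, 0, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0, 1, 0]

# With labels=[0, 1], scikit-learn lays the matrix out as
# [[TN, FP],
#  [FN, TP]], the same layout as the table above
tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
print(tn, fp, fn, tp)  # 3 1 1 3
```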

Now let's see how one can calculate the accuracy, sensitivity, and specificity of the model based on the confusion matrix.

1. Accuracy

Accuracy is the ratio of correctly predicted labels to the total number of predictions, which can be expressed as:

Accuracy = (TP + TN) / (TP + TN + FP + FN)
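
On the example matrix above, this works out as follows (a quick check in Python):

```python
tp, tn, fp, fn = 100, 50, 10, 5  # counts from the example matrix

accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)  # 150 / 165 ≈ 0.909
```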

2. Sensitivity (True Positive Rate)

Sensitivity is the ratio of the number of actual Yeses correctly predicted to the total number of actual Yeses:

Sensitivity = TP / (TP + FN)

Sensitivity is also referred to as the True Positive Rate.
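
Again using the counts from the example matrix:

```python
tp, fn = 100, 5  # from the example matrix

sensitivity = tp / (tp + fn)
print(sensitivity)  # 100 / 105 ≈ 0.952
```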

3. Specificity (True Negative Rate)

Specificity is the ratio of the number of actual Nos correctly predicted to the total number of actual Nos:

Specificity = TN / (TN + FP)

Specificity is nothing but the True Negative Rate.
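
And on the example matrix:

```python
tn, fp = 50, 10  # from the example matrix

specificity = tn / (tn + fp)
print(specificity)  # 50 / 60 ≈ 0.833
```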

Additionally, I have added the two metrics below so that one has a clear idea of what is going on.

4. False Positive Rate

The False Positive Rate is the ratio of the number of actual Nos incorrectly predicted to the total number of actual Nos:

FPR = FP / (FP + TN)

The False Positive Rate can also be expressed as (1 - Specificity), since FPR and Specificity share the same denominator.
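
The complement relationship is easy to verify on the example matrix:

```python
tn, fp = 50, 10  # from the example matrix

fpr = fp / (fp + tn)
print(fpr)                 # 10 / 60 ≈ 0.167
print(1 - tn / (tn + fp))  # same value: 1 - specificity
```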

5. False Negative Rate

The False Negative Rate is the ratio of the number of actual Yeses incorrectly predicted to the total number of actual Yeses:

FNR = FN / (FN + TP)

Similarly, the False Negative Rate can be expressed as (1 - Sensitivity), since FNR and Sensitivity share the same denominator.
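
Likewise for the False Negative Rate:

```python
tp, fn = 100, 5  # from the example matrix

fnr = fn / (fn + tp)
print(fnr)                 # 5 / 105 ≈ 0.048
print(1 - tp / (tp + fn))  # same value: 1 - sensitivity
```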

Note: for the above four metrics, the denominator is either the total number of actual Yeses or the total number of actual Nos.

The denominator is the same for Sensitivity (True Positive Rate) and the False Negative Rate, and likewise the same for Specificity (True Negative Rate) and the False Positive Rate.

Now, apart from sensitivity and specificity, there are two more metrics widely used in the industry that you should know about: 'Precision' and 'Recall'. These metrics are very similar to sensitivity and specificity; it's just that knowing the exact terminology is helpful, as both pairs of metrics are often used in the industry.

1. Precision (Positive Predictive Value) -

Precision is the ratio of the number of actual Yeses correctly predicted to the total number of predicted Yeses:

Precision = TP / (TP + FP)

This is nothing but the Positive Predictive Value.
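
On the example matrix:

```python
tp, fp = 100, 10  # from the example matrix

precision = tp / (tp + fp)
print(precision)  # 100 / 110 ≈ 0.909
```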

2. Recall (Sensitivity/True Positive Rate) -

Recall is the same as Sensitivity (True Positive Rate):

Recall = TP / (TP + FN)
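
If you would rather not compute these by hand, scikit-learn also ships ready-made helpers; here is a minimal sketch on the same made-up labels as before:

```python
from sklearn.metrics import precision_score, recall_score

# Toy labels for illustration: 1 = Diseased ("Yes"), 0 = Not Diseased ("No")
y_true = [0, 0, 1, 1, 1, 0, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0, 1, 0]

print(precision_score(y_true, y_pred))  # TP / (TP + FP) = 3 / 4 = 0.75
print(recall_score(y_true, y_pred))     # TP / (TP + FN) = 3 / 4 = 0.75
```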

You might be wondering: if these are almost the same, why do we look at them separately? The main reason is that in the industry, some businesses follow the 'Sensitivity-Specificity' view while others follow the 'Precision-Recall' view, so it helps to know both of these standard pairs of metrics.

Hope this helps you evaluate your models with ease, now that the confusing terminology is cleared up 😊

Happy reading and happy learning!
