
What is a confusion matrix?

A confusion matrix is an N x N matrix used for evaluating the performance of
a classification model, where N is the number of target classes. The matrix
compares the actual target values with those predicted by the machine learning
model.

a. Understanding the Confusion Matrix:

The following four basic terms will help us determine the metrics we are
looking for:
• True Positive (TP): the actual value is Positive and the predicted value
is also Positive.
• True Negative (TN): the actual value is Negative and the predicted value
is also Negative.
• False Positive (FP): the actual value is Negative but the predicted value
is Positive. Also known as a Type 1 error.
• False Negative (FN): the actual value is Positive but the predicted value
is Negative. Also known as a Type 2 error.
For a binary classification problem, we would have a 2 x 2 matrix with these
four values, as shown below:
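
                      Predicted Positive    Predicted Negative
    Actual Positive           TP                    FN
    Actual Negative           FP                    TN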
b. Understanding the Confusion Matrix in an easier way:
Let's take an example:
We have a total of 20 cats and dogs, and our model predicts whether each
animal is a cat or not.
Actual values = [‘dog’, ‘cat’, ‘dog’, ‘cat’, ‘dog’, ‘dog’, ‘cat’, ‘dog’, ‘cat’,
‘dog’, ‘dog’, ‘dog’, ‘dog’, ‘cat’, ‘dog’, ‘dog’, ‘cat’, ‘dog’, ‘dog’, ‘cat’]
Predicted values = [‘dog’, ‘dog’, ‘dog’, ‘cat’, ‘dog’, ‘dog’, ‘cat’, ‘cat’, ‘cat’,
‘cat’, ‘dog’, ‘dog’, ‘dog’, ‘cat’, ‘dog’, ‘dog’, ‘cat’, ‘dog’, ‘dog’, ‘cat’]

True Positive (TP) = 6
You predicted positive and it's true: you predicted that an animal is a cat,
and it actually is.
True Negative (TN) = 11
You predicted negative and it's true: you predicted that an animal is not a
cat, and it actually is not (it's a dog).
False Positive (Type 1 Error) (FP) = 2
You predicted positive and it's false: you predicted that an animal is a cat,
but it actually is not (it's a dog).
False Negative (Type 2 Error) (FN) = 1
You predicted negative and it's false: you predicted that an animal is not a
cat, but it actually is.
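
These four counts can be verified with a few lines of Python (a minimal
sketch; the variable names are my own, and 'cat' is treated as the positive
class):

    actual = ['dog', 'cat', 'dog', 'cat', 'dog', 'dog', 'cat', 'dog', 'cat',
              'dog', 'dog', 'dog', 'dog', 'cat', 'dog', 'dog', 'cat', 'dog',
              'dog', 'cat']
    predicted = ['dog', 'dog', 'dog', 'cat', 'dog', 'dog', 'cat', 'cat', 'cat',
                 'cat', 'dog', 'dog', 'dog', 'cat', 'dog', 'dog', 'cat', 'dog',
                 'dog', 'cat']

    positive = 'cat'  # the positive class in this example

    # Compare actual and predicted labels pairwise and count each outcome.
    tp = sum(1 for a, p in zip(actual, predicted) if a == positive and p == positive)
    tn = sum(1 for a, p in zip(actual, predicted) if a != positive and p != positive)
    fp = sum(1 for a, p in zip(actual, predicted) if a != positive and p == positive)
    fn = sum(1 for a, p in zip(actual, predicted) if a == positive and p != positive)

    print(tp, tn, fp, fn)  # prints: 6 11 2 1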

a. Accuracy:
Accuracy is the fraction of all predictions the model got right:
Accuracy = (TP + TN) / (TP + TN + FP + FN)
For our example: (6 + 11) / 20 = 0.85

b. Precision:
Precision is the fraction of predicted positives that are actually positive:
Precision = TP / (TP + FP)
For our example: 6 / (6 + 2) = 0.75

c. Recall:
Recall is the fraction of actual positives the model correctly identified:
Recall = TP / (TP + FN)
For our example: 6 / (6 + 1) ≈ 0.857

d. F-measure / F1-Score:
The F1-score is the harmonic mean of precision and recall:
F1 = 2 * (Precision * Recall) / (Precision + Recall)
For our example: 2 * (0.75 * 0.857) / (0.75 + 0.857) = 0.8

e. Sensitivity & Specificity:
Sensitivity is another name for recall: Sensitivity = TP / (TP + FN) ≈ 0.857
Specificity is the fraction of actual negatives correctly identified:
Specificity = TN / (TN + FP) = 11 / (11 + 2) ≈ 0.846
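
The same metrics can be reproduced with scikit-learn, assuming it is
installed and using the actual/predicted lists from above (pos_label='cat'
tells each scorer which class to treat as positive):

    from sklearn.metrics import (accuracy_score, precision_score,
                                 recall_score, f1_score, confusion_matrix)

    # Rows are actual classes, columns are predicted classes.
    print(confusion_matrix(actual, predicted, labels=['cat', 'dog']))
    # [[ 6  1]
    #  [ 2 11]]

    print(accuracy_score(actual, predicted))                    # 0.85
    print(precision_score(actual, predicted, pos_label='cat'))  # 0.75
    print(recall_score(actual, predicted, pos_label='cat'))     # 0.857...
    print(f1_score(actual, predicted, pos_label='cat'))         # 0.8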
