Ch 07 Evaluation
1) True Positive: When an event happens both in the prediction and in reality.
2) True Negative: When an event does not happen in either the prediction or in reality.
3) False Positive: When an event happens in the prediction but not in reality.
4) False Negative: When an event does not happen in the prediction but does happen in reality.
Case 1: Is there a forest fire? True Positive
Case 2: Is there a forest fire? True Negative
Case 3: Is there a forest fire? False Positive
Case 4: Is there a forest fire? False Negative
Confusion Matrix

                    Reality: Yes      Reality: No
Prediction: Yes     True Positive     False Positive
Prediction: No      False Negative    True Negative
EVALUATION METHODS
Accuracy
Precision
Recall
F1 Score
Accuracy
Accuracy % = (Correct Predictions / Total Cases) x 100

Accuracy % = ((TP + TN) / (TP + TN + FP + FN)) x 100
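The formula above can be sketched as a small function (a minimal illustration, not from the slides; the function name and example counts are assumptions):

```python
def accuracy(tp, tn, fp, fn):
    """Percentage of all cases the model predicted correctly."""
    return (tp + tn) / (tp + tn + fp + fn) * 100

# Hypothetical counts: 40 TP, 50 TN, 5 FP, 5 FN
print(accuracy(40, 50, 5, 5))  # 90.0
```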
Precision
Precision is defined as the ratio of true positive cases to all the cases
where the prediction is positive.
That is, it tells us how many of the predicted positive instances are actually
positive.
Precision = True Positives / All Predicted Positives

Precision = TP / (TP + FP)
Precision value ranges from 0 to 1.
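As a minimal sketch of the precision formula (the function name and sample counts below are illustrative assumptions):

```python
def precision(tp, fp):
    """Fraction of predicted positives that are actually positive."""
    return tp / (tp + fp)

# Hypothetical counts: 40 TP, 10 FP
print(precision(40, 10))  # 0.8
```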
Recall
Recall = True Positives / (True Positives + False Negatives)

Recall = TP / (TP + FN)
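Likewise, recall can be sketched as (function name and counts are illustrative assumptions):

```python
def recall(tp, fn):
    """Fraction of actual positives that the model caught."""
    return tp / (tp + fn)

# Hypothetical counts: 40 TP, 10 FN
print(recall(40, 10))  # 0.8
```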
Choosing between Precision and Recall
For reference:
FN: Prediction No, Reality Yes
FP: Prediction Yes, Reality No
F1 Score = 2 x (Precision x Recall) / (Precision + Recall)
A model has good performance if the F1 Score for that model is high.
An Example: Confusion Matrix of Disease Prediction by an AI model.

Prediction - No: 5 FN, 50 TN