
ML Test Questions 3 Confusion Matrix

The document contains multiple-choice questions (MCQs), multiple-select questions (MSQs), and numerical answer type (NAT) questions focused on confusion matrix concepts in machine learning. Key topics include True Positive Rate, False Negative Rate, Precision, Recall, F1 Score, and various metrics for evaluating binary classifiers. It also includes calculations for accuracy, precision, recall, specificity, and other related metrics.

Uploaded by Dr. R. Gowri CIT
Copyright
© All Rights Reserved

MACHINE LEARNING

TEST QUESTIONS – III


Topics: Confusion Matrix

MCQ
1. Which of the following represents the True Positive (TP) in a confusion matrix?
A) Correctly identified negative instances
B) Incorrectly identified positive instances
C) Correctly identified positive instances
D) Incorrectly identified negative instances
Answer: C) Correctly identified positive instances

2. What does the term 'False Negative Rate (FNR)' refer to in a confusion matrix?
A) TP / (TP + FN)
B) FN / (TP + FN)
C) FP / (FP + TN)
D) TN / (TN + FP)
Answer: B) FN / (TP + FN)

3. Precision is defined as:


A) TP / (TP + FP)
B) TP / (TP + FN)
C) TP / (TP + TN)
D) TN / (TN + FP)
Answer: A) TP / (TP + FP)

4. Which metric measures the proportion of actual positives that are correctly identified by the
model?
A) Precision
B) Specificity
C) Recall
D) F1 Score
Answer: C) Recall

5. What is the formula for the F1 Score?


A) 2 * (Precision + Recall) / (Precision * Recall)
B) Precision * Recall
C) 2 * (Precision * Recall) / (Precision + Recall)
D) (Precision + Recall) / 2
Answer: C) 2 * (Precision * Recall) / (Precision + Recall)
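
As a sanity check on the formulas in questions 3–5, here is a minimal Python sketch (function names and the sample counts are my own, not part of the test):

```python
def precision(tp, fp):
    # Fraction of predicted positives that are truly positive: TP / (TP + FP)
    return tp / (tp + fp)

def recall(tp, fn):
    # Fraction of actual positives that are found: TP / (TP + FN)
    return tp / (tp + fn)

def f1_score(p, r):
    # Harmonic mean of precision and recall: 2PR / (P + R)
    return 2 * p * r / (p + r)

p, r = precision(40, 10), recall(40, 20)
print(round(f1_score(p, r), 3))  # → 0.727
```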

6. If a model has TP=40, FP=10, FN=20, and TN=30, what is its accuracy?
A) 0.70
B) 0.75
C) 0.60
D) 0.65
Answer: A) 0.70
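
The arithmetic for question 6 can be checked directly (a quick verification, not part of the original test):

```python
tp, fp, fn, tn = 40, 10, 20, 30
# Accuracy = correct predictions / all predictions = (40 + 30) / 100
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)  # → 0.7
```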

7. What does the term 'True Negative Rate (TNR)' refer to in a confusion matrix?
A) TN / (TN + FP)
B) TP / (TP + FN)
C) FN / (FN + TP)
D) FP / (FP + TN)
Answer: A) TN / (TN + FP)

8. A model with high recall but low precision would:


A) Correctly identify most of the positive cases but also include many false positives.
B) Correctly identify most of the negative cases but also include many false negatives.
C) Correctly identify most of the negative cases with few false positives.
D) Correctly identify most of the positive cases with few false positives.
Answer: A) Correctly identify most of the positive cases but also include many false
positives.

9. In a binary classification problem, if the number of true negatives is 50, false positives is 10,
true positives is 40, and false negatives is 20, what is the False Positive Rate (FPR)?
A) 0.11
B) 0.20
C) 0.17
D) 0.13
Answer: C) 0.17
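
Checking question 9: the FPR depends only on the actual negatives (FP and TN), so TP and FN drop out of the calculation:

```python
tn, fp = 50, 10        # TP and FN do not enter the FPR
fpr = fp / (fp + tn)   # 10 / 60
print(round(fpr, 2))   # → 0.17
```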

10. Which metric would you use to evaluate the balance between precision and recall?
A) Accuracy
B) Specificity
C) F1 Score
D) TNR
Answer: C) F1 Score

MSQ
11. Which of the following are synonyms for the True Positive Rate (TPR)? (Select all that apply)
A) Recall
B) Sensitivity
C) Specificity
D) False Positive Rate
Answer: A) Recall, B) Sensitivity
12. For a binary classification model, which metrics would be affected if the threshold for
classifying a positive instance is increased? (Select all that apply)
A) Precision
B) Recall
C) F1 Score
D) True Negative Rate
Answer: A) Precision, B) Recall, C) F1 Score, D) True Negative Rate (raising the threshold moves predictions from positive to negative, shifting FP to TN and TP to FN, so all four metrics can change)

13. Which of the following formulas are correct representations of accuracy? (Select all that
apply)
A) (TP + TN) / (TP + TN + FP + FN)
B) TP / (TP + FP)
C) (TP + TN) / Total Instances
D) TN / (TN + FP)
Answer: A) (TP + TN) / (TP + TN + FP + FN), C) (TP + TN) / Total Instances

14. Given the following values for a confusion matrix: TP=30, FP=15, FN=10, TN=45, which
metrics can be calculated directly? (Select all that apply)
A) Precision
B) Recall
C) F1 Score
D) Specificity
Answer: A) Precision, B) Recall, C) F1 Score, D) Specificity
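
All four metrics in question 14 follow directly from the given counts; a short sketch of the calculations:

```python
tp, fp, fn, tn = 30, 15, 10, 45
precision = tp / (tp + fp)                          # 30/45 ≈ 0.667
recall = tp / (tp + fn)                             # 30/40 = 0.75
specificity = tn / (tn + fp)                        # 45/60 = 0.75
f1 = 2 * precision * recall / (precision + recall)  # 12/17 ≈ 0.706
print(round(precision, 3), recall, specificity, round(f1, 3))
```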

15. Which of the following statements are true about the relationship between precision and
recall? (Select all that apply)
A) Increasing precision always increases recall.
B) Increasing recall typically decreases precision.
C) Both precision and recall are unaffected by the number of negative instances.
D) F1 Score is the harmonic mean of precision and recall.
Answer: B) Increasing recall typically decreases precision, D) F1 Score is the harmonic mean
of precision and recall.

16. In a highly imbalanced dataset, which metrics would provide more insight than accuracy
alone? (Select all that apply)
A) Precision
B) Recall
C) F1 Score
D) Specificity
Answer: A) Precision, B) Recall, C) F1 Score, D) Specificity

17. Which metrics are typically used to evaluate the performance of a binary classifier? (Select all
that apply)
A) Confusion Matrix
B) ROC Curve
C) Precision-Recall Curve
D) Histogram
Answer: A) Confusion Matrix, B) ROC Curve, C) Precision-Recall Curve

18. Which of the following actions could potentially improve the F1 Score of a model? (Select all
that apply)
A) Decreasing the false negative rate
B) Increasing the true positive rate
C) Decreasing the false positive rate
D) Increasing the true negative rate
Answer: A) Decreasing the false negative rate, B) Increasing the true positive rate, C)
Decreasing the false positive rate

19. Which of the following are true for the Receiver Operating Characteristic (ROC) curve?
(Select all that apply)
A) It plots the true positive rate against the false positive rate.
B) The Area Under the Curve (AUC) represents the model's ability to distinguish
between classes.
C) A random classifier will have an AUC of 0.5.
D) A perfect classifier will have an AUC of 1.0.
Answer: A) It plots the true positive rate against the false positive rate, B) The Area Under
the Curve (AUC) represents the model's ability to distinguish between classes, C) A random
classifier will have an AUC of 0.5, D) A perfect classifier will have an AUC of 1.0.
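
The AUC statements in question 19 can be illustrated via the rank interpretation of AUC: the probability that a randomly chosen positive receives a higher score than a randomly chosen negative, with ties counted as 0.5. This brute-force sketch (names and scores are my own) is for illustration, not an efficient implementation:

```python
def auc(scores_pos, scores_neg):
    # Count positive-vs-negative score comparisons won by the positive
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos for n in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))

# A perfect classifier ranks every positive above every negative
print(auc([0.9, 0.8, 0.7], [0.3, 0.2, 0.1]))  # → 1.0
# A classifier that scores everything identically behaves like chance
print(auc([0.5, 0.5], [0.5, 0.5]))            # → 0.5
```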

20. Which of the following statements about the confusion matrix are correct? (Select all that
apply)
A) It provides a summary of prediction results on a classification problem.
B) It can be used to calculate precision, recall, and F1 score.
C) It is useful for both binary and multi-class classification problems.
D) It cannot be used to evaluate a regression model.
Answer: A) It provides a summary of prediction results on a classification problem, B) It can
be used to calculate precision, recall, and F1 score, C) It is useful for both binary and multi-
class classification problems, D) It cannot be used to evaluate a regression model.
NAT
21. Calculate the accuracy of a model if TP=50, TN=70, FP=20, FN=10. The accuracy is _______.
Answer: 0.80

22. Calculate the Precision for a model with TP=45 and FP=15. The Precision is _______.
Answer: 0.75

23. Calculate the Recall for a model with TP=30 and FN=10. The Recall is _______.
Answer: 0.75

24. Calculate the False Positive Rate (FPR) for a model with FP=25 and TN=75. The FPR is _______.
Answer: 0.25

25. Calculate the Specificity for a model with TN=50 and FP=10. The Specificity is _______.
Answer: 0.833

26. Calculate the F1 Score for a model with Precision=0.60 and Recall=0.75. The F1 Score is _______.
Answer: 0.667

27. Calculate the False Negative Rate (FNR) for a model with FN=20 and TP=80. The FNR is _______.
Answer: 0.20

28. Calculate the True Negative Rate (TNR) for a model with TN=85 and FP=15. The TNR is _______.
Answer: 0.85

29. Calculate the balanced accuracy for a model with TPR=0.70 and TNR=0.80. The balanced accuracy is _______.
Answer: 0.75

30. Calculate the Matthews correlation coefficient (MCC) for a model with TP=50, TN=60, FP=10, and FN=20. The MCC is _______.
Answer: 0.577
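
For question 30, the standard MCC formula gives (50·60 − 10·20) / √(60·70·70·80) = 2800 / √23520000 ≈ 0.577; a quick check:

```python
import math

tp, tn, fp, fn = 50, 60, 10, 20
# MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN))
mcc = (tp * tn - fp * fn) / math.sqrt(
    (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
)
print(round(mcc, 3))  # → 0.577
```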
