Confusion matrix
In the field of machine learning, and specifically the problem of statistical classification, a confusion matrix, also known as an error matrix, is a specific table layout that allows visualization of the performance of an algorithm, typically a supervised learning one (in unsupervised learning it is usually called a matching matrix). Each column of the matrix represents the instances in a predicted class while each row represents the instances in an actual class (or vice versa). The name stems from the fact that it makes it easy to see if the system is confusing two classes (i.e. commonly mislabeling one as another).

It is a special kind of contingency table, with two dimensions ("actual" and "predicted"), and identical sets of "classes" in both dimensions (each combination of dimension and class is a variable in the contingency table).

Contents
1 Example
2 Table of confusion
3 References
4 External links

Example

If a classification system has been trained to distinguish between cats, dogs and rabbits, a confusion matrix will summarize the results of testing the algorithm for further inspection. Assuming a sample of 27 animals (8 cats, 6 dogs, and 13 rabbits), the resulting confusion matrix could look like the table below:

                         Predicted
                  Cat     Dog     Rabbit
Actual  Cat        5       3        0
        Dog        2       3        1
        Rabbit     0       2       11

In this confusion matrix, of the 8 actual cats, the system predicted that three were dogs, and of the six dogs, it predicted that one was a rabbit and two were cats. We can see from the matrix that the system in question has trouble distinguishing between cats and dogs, but can make the distinction between rabbits and other types of animals pretty well. All correct guesses are located in the diagonal of the table, so it is easy to visually inspect the table for errors, as they will be represented by values outside the diagonal.

Table of confusion

In predictive analytics, a table of confusion (sometimes also called a confusion matrix) is a table with two rows and two columns that reports the number of false positives, false negatives, true positives, and true negatives.
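The cats/dogs/rabbits example above can be reproduced with a short Python sketch. The `confusion_matrix` helper and the label lists are illustrative, not part of the original article; they simply count (actual, predicted) pairs into rows and columns:

```python
from collections import Counter

def confusion_matrix(actual, predicted, classes):
    """Nested dict: outer keys are actual classes (rows),
    inner keys are predicted classes (columns)."""
    counts = Counter(zip(actual, predicted))
    return {a: {p: counts[(a, p)] for p in classes} for a in classes}

# Labels chosen to reproduce the 27-animal example in the text
# (8 cats, 6 dogs, 13 rabbits).
actual    = ["cat"] * 8 + ["dog"] * 6 + ["rabbit"] * 13
predicted = (["cat"] * 5 + ["dog"] * 3                    # cats: 3 mislabeled as dogs
             + ["cat"] * 2 + ["dog"] * 3 + ["rabbit"] * 1 # dogs: 2 as cats, 1 as rabbit
             + ["dog"] * 2 + ["rabbit"] * 11)             # rabbits: 2 as dogs

m = confusion_matrix(actual, predicted, ["cat", "dog", "rabbit"])
print(m["cat"])  # {'cat': 5, 'dog': 3, 'rabbit': 0}
```

The correct guesses sit on the diagonal (`m[c][c]`), so summing them and dividing by the total gives the overall accuracy, 22/27.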
This allows more detailed analysis than the mere proportion of correct guesses (accuracy). Accuracy is not a reliable metric for the real performance of a classifier, because it will yield misleading results if the data set is unbalanced (that is, when the number of samples in different classes varies greatly). For example, if there were 95 cats and only 5 dogs in the data, the classifier could easily be biased into classifying all the samples as cats. The overall accuracy would be 95%, but in practice the classifier would have a 100% recognition rate for the cat class and a 0% recognition rate for the dog class. Assuming the confusion matrix above, its corresponding table of confusion, for the cat class, would be:

5 true positives (actual cats that were correctly classified as cats)
3 false negatives (cats that were incorrectly marked as dogs)
2 false positives (dogs that were incorrectly labeled as cats)
17 true negatives (all the remaining animals, correctly classified as non-cats)

The final table of confusion would contain the average values for all classes combined.

Terminology and derivations from a confusion matrix

condition positive (P): the number of real positive cases in the data
condition negative (N): the number of real negative cases in the data
true positive (TP): equivalent with hit
true negative (TN): equivalent with correct rejection
false positive (FP): equivalent with false alarm, Type I error
false negative (FN): equivalent with miss, Type II error

sensitivity, recall, hit rate, or true positive rate (TPR):
    TPR = TP / P = TP / (TP + FN)
specificity or true negative rate (TNR):
    TNR = TN / N = TN / (TN + FP)
precision or positive predictive value (PPV):
    PPV = TP / (TP + FP)
negative predictive value (NPV):
    NPV = TN / (TN + FN)
miss rate or false negative rate (FNR):
    FNR = FN / P = FN / (FN + TP) = 1 - TPR
fall-out or false positive rate (FPR):
    FPR = FP / N = FP / (FP + TN) = 1 - TNR
false discovery rate (FDR):
    FDR = FP / (FP + TP) = 1 - PPV
false omission rate (FOR):
    FOR = FN / (FN + TN) = 1 - NPV
accuracy (ACC):
    ACC = (TP + TN) / (P + N) = (TP + TN) / (TP + TN + FP + FN)
F1 score (the harmonic mean of precision and sensitivity):
    F1 = 2 * (PPV * TPR) / (PPV + TPR) = 2TP / (2TP + FP + FN)
Matthews correlation coefficient (MCC):
    MCC = (TP * TN - FP * FN) / sqrt((TP + FP)(TP + FN)(TN + FP)(TN + FN))
Informedness or Bookmaker Informedness (BM):
    BM = TPR + TNR - 1
Markedness (MK):
    MK = PPV + NPV - 1

Sources: Fawcett (2006), Powers (2011), and Ting (2011).

Let us define an experiment from P positive instances and N negative instances for some condition. The four outcomes can be formulated in a 2x2 confusion matrix, as follows:

                                 True condition
                         positive                 negative
Predicted positive    true positive            false positive (Type I error)
Predicted negative    false negative           true negative
                      (Type II error)

(The original page also tabulates, alongside this 2x2 layout, the derived rates defined above, TPR, FPR, FNR, TNR, PPV, FDR, FOR, NPV, together with the positive likelihood ratio LR+ = TPR / FPR, the negative likelihood ratio LR- = FNR / TNR, and the diagnostic odds ratio DOR = LR+ / LR-.)

References

1. Fawcett, Tom (2006). "An Introduction to ROC Analysis" (PDF). Pattern Recognition Letters. 27 (8): 861-874. doi:10.1016/j.patrec.2005.10.010.
2. Powers, David M. W. (2011). "Evaluation: From Precision, Recall and F-Measure to ROC, Informedness, Markedness & Correlation" (PDF). Journal of Machine Learning Technologies. 2 (1): 37-63.
3. Ting, Kai Ming (2011). Encyclopedia of Machine Learning. Springer. ISBN 978-0-387-30164-8.
4. Stehman, Stephen V. (1997). "Selecting and interpreting measures of thematic classification accuracy". Remote Sensing of Environment. 62 (1): 77-89. doi:10.1016/S0034-4257(97)00083-7.

External links

- Theory about the confusion matrix (http://www2.cs.uregina.ca/~dbd/cs831/notes/confusion_matrix/confusion_matrix.html)
- GM-RKB Confusion Matrix concept page (http://www.gabormelli.com/RKB/Confusion_Matrix)

Retrieved from "https://en.wikipedia.org/w/index.php?title=Confusion_matrix&oldid=792604088". This page was last edited on 27 July 2017, at 14:08. Text is available under the Creative Commons Attribution-ShareAlike License; additional terms may apply. By using this site, you agree to the Terms of Use and Privacy Policy. Wikipedia is a registered trademark of the Wikimedia Foundation, Inc., a non-profit organization.
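The derivations above can be checked numerically against the cat-class table of confusion from the example (TP = 5, FN = 3, FP = 2, TN = 17). A minimal Python sketch, with variable names chosen here for illustration:

```python
from math import sqrt

# Cat-class counts from the 27-animal example: 5 cats correctly
# classified, 3 cats mislabeled, 2 non-cats labeled cat, 17 true non-cats.
TP, FN, FP, TN = 5, 3, 2, 17

TPR = TP / (TP + FN)                   # sensitivity / recall: 5/8
TNR = TN / (TN + FP)                   # specificity: 17/19
PPV = TP / (TP + FP)                   # precision: 5/7
ACC = (TP + TN) / (TP + TN + FP + FN)  # accuracy: 22/27
F1  = 2 * TP / (2 * TP + FP + FN)      # harmonic mean of PPV and TPR
MCC = (TP * TN - FP * FN) / sqrt(
    (TP + FP) * (TP + FN) * (TN + FP) * (TN + FN))

print(round(TPR, 3), round(PPV, 3), round(F1, 3), round(MCC, 3))
# 0.625 0.714 0.667 0.542
```

Note that F1 computed from the shortcut 2TP / (2TP + FP + FN) agrees with the harmonic-mean form 2 * (PPV * TPR) / (PPV + TPR), as the Terminology section states.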
