Classification Example
Bayes Rule
We now pose the following question: given that the event A has occurred, what is the probability that any particular one of the events Bi occurred? Bayes rule provides the answer:

P(Bi | A) = P(A | Bi) P(Bi) / P(A)

where the evidence P(A) = Σj P(A | Bj) P(Bj) by the total probability rule.
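To make this concrete, here is a minimal sketch in Python; the priors P(Bi) and conditionals P(A | Bi) are hypothetical numbers chosen for illustration.

```python
# Bayes rule for events: P(Bi | A) = P(A | Bi) P(Bi) / P(A).
# All probabilities below are assumed values for illustration.
priors = [0.5, 0.3, 0.2]   # P(B1), P(B2), P(B3)
cond   = [0.2, 0.6, 0.4]   # P(A | B1), P(A | B2), P(A | B3)

# Total probability rule: P(A) = sum_j P(A | Bj) P(Bj)
p_a = sum(c * p for c, p in zip(cond, priors))

# Posterior probability of each event Bi given that A occurred
posteriors = [c * p / p_a for c, p in zip(cond, priors)]
print(posteriors)   # [0.278, 0.5, 0.222] -- sums to 1
```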
Based on this information, how would you guess the type of the next fish to be caught?
• Decide w1 if P(w1) > P(w2); otherwise decide w2
A reasonable decision rule?
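A minimal sketch of this prior-only rule in Python, with assumed priors; note that it never looks at the fish itself.

```python
# Decide w1 if P(w1) > P(w2), otherwise decide w2.
# The priors are hypothetical values for illustration.
priors = {"w1": 0.6, "w2": 0.4}

def decide_from_priors(priors):
    # Pick the class with the larger prior probability.
    return max(priors, key=priors.get)

print(decide_from_priors(priors))   # -> w1, for every fish
```

Because the rule ignores the observation, it assigns the same label to every fish, which is only reasonable when no measurements are available.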
[Figure: scatter of fish samples in the lightness and width feature space]
Posterior Probabilities
Bayes rule allows us to compute the posterior probability (difficult to determine directly) from the prior probabilities, the likelihood, and the evidence (easier to determine):

P(wj | x) = p(x | wj) P(wj) / p(x)
That is, choose the class that has the larger posterior probability!
If there are multiple features, x = {x1, x2, …, xd}, and multiple classes:
Choose wi if P(wi | x) > P(wj | x) for all j ≠ i, j = 1, 2, …, c
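As a sketch of this multi-class rule, the snippet below applies Bayes rule at a single observed x and picks the class with the largest posterior; the priors and likelihood values are assumptions for illustration.

```python
import numpy as np

# Hypothetical priors P(wi) and class-conditional likelihoods p(x | wi)
# evaluated at one observed feature vector x.
priors      = np.array([0.5, 0.3, 0.2])
likelihoods = np.array([0.10, 0.40, 0.25])

# Bayes rule: P(wi | x) = p(x | wi) P(wi) / p(x),
# with the evidence p(x) = sum_j p(x | wj) P(wj).
evidence   = np.sum(likelihoods * priors)
posteriors = likelihoods * priors / evidence

# Decision rule: choose the wi with the largest posterior.
i = int(np.argmax(posteriors))
print(posteriors, f"-> decide w{i + 1}")   # [0.227, 0.545, 0.227] -> decide w2
```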
Error:
Whenever we observe a particular x, the probability of error is:
P(error | x) = P(w1 | x) if we decide w2
P(error | x) = P(w2 | x) if we decide w1
Therefore:
P(error | x) = min [P(w1 | x), P(w2 | x)]
(Bayes decision)
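A small check of this in the two-class case, using hypothetical posteriors at one observation: whichever class the Bayes rule picks, the error incurred at that x is the smaller posterior.

```python
# Two hypothetical posteriors at a particular observation x.
p_w1_given_x, p_w2_given_x = 0.7, 0.3

# Bayes decision: take the larger posterior; the probability of error
# at this x is then the smaller one.
decision = "w1" if p_w1_given_x > p_w2_given_x else "w2"
p_error  = min(p_w1_given_x, p_w2_given_x)
print(decision, p_error)   # -> w1 0.3
```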
Adjustment of decision boundary
Intuitively, a classifier is designed to minimize the classification error: the total number of instances (fish) classified incorrectly.
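To illustrate, here is a small sketch that adjusts a 1-D decision boundary on the lightness feature so as to minimize the number of misclassified fish; the sample readings are made-up data for illustration.

```python
import numpy as np

# Hypothetical lightness readings for fish of known type.
sea_bass = np.array([5.1, 6.0, 6.4, 7.2, 5.8])
salmon   = np.array([3.9, 4.5, 5.3, 4.8, 5.6])

def misclassified(threshold):
    # Decide "sea bass" when lightness > threshold, "salmon" otherwise,
    # and count how many fish this boundary gets wrong.
    return int(np.sum(sea_bass <= threshold) + np.sum(salmon > threshold))

# Sweep candidate boundaries and keep the one with the fewest errors.
candidates = np.linspace(3.5, 7.5, 41)
best = min(candidates, key=misclassified)
print(best, misclassified(best))   # best boundary and its error count
```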