Lec Note 3 2025
In the previous lecture note we discussed the σ-field and the Borel σ-field.
As I mentioned, that material was included mainly for completeness. Please
keep in mind that we need these concepts in order to define probability on a
suitable class of subsets, because it is NOT always possible to define
probability on the class of all subsets. Hence, from now on we will not return
to the σ-field; it is assumed that there exists a σ-field on which the
probability has been defined. Now let us discuss some important properties and
concepts of probability.
First we define what an event is. An event is an element of a σ-field.
Therefore, if A is an event, then P(A) is properly defined. Similarly, if A and
B are events, then P(A ∪ B), P(A ∩ B), P(A′), etc. are all defined. Suppose
A and B are two events with P(B) > 0; then the conditional probability of
A given B is defined as
P(A|B) = P(A ∩ B)/P(B).                                                  (1)
It is clear that (1) is properly defined since P(B) > 0. Intuitively, it
describes how P(A) changes once you know that the event B has taken place.
Let us look at the following example.
Example: Suppose you toss a coin three times. Then all the possible outcomes
are
Ω = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}.
Suppose the tosses are independent and the probability of a head in each toss
is 2/3, so that the eight outcomes are not equally likely. If A denotes the
event that at most two heads have appeared, then A′ = {HHH},
and P(A) = 1 − 8/27 = 19/27. Similarly, if B denotes the event that the first
toss is a head, then B = {HHH, HHT, HTH, HTT} and P(B) = 18/27 = 2/3. Now,
suppose we want to compute the following probability: given that the first
toss is a head, what is the probability that at most two heads have appeared?
That is, we want to compute P(A|B). Observe that A ∩ B = {HHT, HTH, HTT},
hence P(A ∩ B) = 10/27. Therefore,
P(A|B) = P(A ∩ B)/P(B) = 10/18 = 5/9.
You can see that P(A) changes if you know that the event B has taken place.
Similarly, if we know that at most two heads have appeared, what is the
probability that the first toss is a head? In this case we want P(B|A), and
it becomes
P(B|A) = P(A ∩ B)/P(A) = 10/19.
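These calculations can be checked by brute-force enumeration. The following Python sketch is my own illustration, not part of the notes; it assumes, consistently with the denominators of 27 above, independent tosses with probability of a head equal to 2/3, and the helper names are arbitrary.

```python
# Enumeration check of the three-toss example (a sketch; assumes P(head) = 2/3
# on each independent toss, which the denominators of 27 indicate).
from fractions import Fraction
from itertools import product

P_HEAD = Fraction(2, 3)

def outcome_prob(outcome):
    """Probability of one outcome such as ('H', 'T', 'H')."""
    p = Fraction(1)
    for toss in outcome:
        p *= P_HEAD if toss == 'H' else 1 - P_HEAD
    return p

omega = list(product('HT', repeat=3))            # all eight outcomes
prob = lambda event: sum(outcome_prob(w) for w in event)

A = [w for w in omega if w.count('H') <= 2]      # at most two heads
B = [w for w in omega if w[0] == 'H']            # first toss is a head
A_and_B = [w for w in A if w in B]

print(prob(A), prob(B), prob(A_and_B))           # 19/27, 2/3, 10/27
print(prob(A_and_B) / prob(B))                   # P(A|B) = 5/9
print(prob(A_and_B) / prob(A))                   # P(B|A) = 10/19
```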
Recall that two events A and B are called independent if P(A ∩ B) = P(A)P(B),
and three events are called independent if they are pairwise independent and,
in addition, the probability of the intersection of all three equals the
product of their three probabilities. If three events are independent then
clearly they are pairwise independent, but if three events are pairwise
independent, then they may not be independent. Let us look at the following
example.
Example: Let a ball be drawn from an urn containing four balls, numbered
1,2,3,4. Let E = {1, 2}, F = {1, 3} and G = {1, 4}. If all four outcomes are
assumed to be equally likely, then
P(E ∩ F) = P(E)P(F) = 1/4,   P(E ∩ G) = 1/4,   P(F ∩ G) = 1/4,
but
P(E ∩ F ∩ G) = 1/4 ≠ 1/8 = P(E)P(F)P(G).
Hence, E, F and G are pairwise independent but they are not independent.
Now we can define independent events in the general case. Suppose
A1, A2, . . . , An are n events. They are called independent if, for every
subset {i1, i2, . . . , im} of {1, 2, . . . , n} with 2 ≤ m ≤ n,
P(Ai1 ∩ Ai2 ∩ · · · ∩ Aim) = P(Ai1)P(Ai2) · · · P(Aim).
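As an illustration of this definition, here is a small Python sketch of my own (not part of the notes; the function name and the representation of events as sets of outcomes are assumptions). It checks the product condition over every subset of at least two events and applies it to the urn example above.

```python
# Check independence of a collection of events over a finite sample space
# (my own sketch; events are sets of outcomes, probabilities form a dict).
from fractions import Fraction
from itertools import combinations

def are_independent(events, prob):
    """True if P(intersection) = product of P's for EVERY subset of size >= 2."""
    P = lambda A: sum(prob[w] for w in A)
    for m in range(2, len(events) + 1):
        for subset in combinations(events, m):
            lhs = P(set.intersection(*subset))
            rhs = Fraction(1)
            for A in subset:
                rhs *= P(A)
            if lhs != rhs:
                return False
    return True

# The urn example: four equally likely balls, E = {1,2}, F = {1,3}, G = {1,4}.
prob = {ball: Fraction(1, 4) for ball in (1, 2, 3, 4)}
E, F, G = {1, 2}, {1, 3}, {1, 4}
print(are_independent([E, F], prob))      # True  -- pairwise independent
print(are_independent([E, G], prob))      # True
print(are_independent([F, G], prob))      # True
print(are_independent([E, F, G], prob))   # False -- not independent
```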
Now we will introduce another important concept, known as Bayes' theorem. It
is one of the most frequently used theorems in probability. It is simple, but
it is extremely useful. Suppose A1, A2, . . . , An are n disjoint (mutually
exclusive) and exhaustive events, i.e. A1 ∪ A2 ∪ · · · ∪ An = Ω. Let B be any
other event. Then observe that
P(B) = P(B ∩ Ω) = P(B ∩ (A1 ∪ · · · ∪ An)) = P((B ∩ A1) ∪ · · · ∪ (B ∩ An))
     = P(B ∩ A1) + · · · + P(B ∩ An).
Note that the last equality follows because B ∩ Ai, for i = 1, . . . , n, are
mutually exclusive events. Hence,
P(Ai|B) = P(Ai ∩ B)/P(B) = P(B|Ai)P(Ai) / [P(B|A1)P(A1) + · · · + P(B|An)P(An)].        (2)
The above equality (2) is important because if we know P(B|Ai) and P(Ai) for
i = 1, . . . , n, then we can compute P(Ai|B).
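Equality (2) translates directly into a few lines of code. The sketch below is my own illustration (the function name and argument layout are assumptions, not from the notes); as a usage example it plugs in the numbers of the urn problem that follows.

```python
# A minimal implementation of (2): given priors P(Ai) and likelihoods P(B|Ai)
# for disjoint, exhaustive A1, ..., An, return the posteriors P(Ai|B).
from fractions import Fraction

def bayes(priors, likelihoods):
    # Denominator of (2): P(B) = P(B|A1)P(A1) + ... + P(B|An)P(An)
    p_b = sum(l * p for l, p in zip(likelihoods, priors))
    return [l * p / p_b for l, p in zip(likelihoods, priors)]

# Usage with the urn problem below: A1 = heads, A2 = tails, B = white ball drawn.
posteriors = bayes(priors=[Fraction(1, 2), Fraction(1, 2)],
                   likelihoods=[Fraction(2, 9), Fraction(5, 11)])
print(posteriors)   # [Fraction(22, 67), Fraction(45, 67)]
```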
Problem: Consider two urns. The first contains two white and seven black
balls, and the second contains five white and six black balls. We flip a fair
coin and then draw a ball from the first urn or the second urn depending on
whether the outcome was heads or tails. What is the conditional probability
that the outcome of the toss was heads given that a white ball was selected?
Solution: Let W be the event that a white ball is drawn, and let H be the
event that the coin comes up heads. The desired probability P(H|W) may be
calculated as follows:
P(H|W) = P(H ∩ W)/P(W) = P(W|H)P(H) / [P(W|H)P(H) + P(W|T)P(T)]
       = (2/9 × 1/2) / (2/9 × 1/2 + 5/11 × 1/2) = 22/67.
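A quick Monte Carlo simulation can confirm this answer numerically. The sketch below is my own addition, not part of the notes; the number of trials and the random seed are arbitrary choices.

```python
# Simulate the experiment: flip a fair coin, draw from urn 1 (2 white, 7 black)
# on heads or from urn 2 (5 white, 6 black) on tails; estimate P(heads | white).
import random

random.seed(0)
trials = 200_000
white_draws = heads_and_white = 0
for _ in range(trials):
    heads = random.random() < 0.5
    p_white = 2 / 9 if heads else 5 / 11
    if random.random() < p_white:
        white_draws += 1
        heads_and_white += heads
print(heads_and_white / white_draws, 22 / 67)   # both roughly 0.328
```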
Problem: In answering a question on a multiple-choice test, a student either
knows the answer or guesses. Let p be the probability that the student knows
the answer, so that 1 − p is the probability that the student guesses. Assume
that a student who guesses answers correctly with probability 1/m, where m is
the number of multiple-choice alternatives. What is the conditional
probability that a student knew the answer to a question given that she
answered it correctly?
Solution: Let C and K denote respectively the event that the student an-
swers the question correctly and the event that she actually knows the answer.
Now
P(K|C) = P(K ∩ C)/P(C) = P(C|K)P(K) / [P(C|K)P(K) + P(C|K′)P(K′)]
       = p / (p + (1 − p)(1/m)) = mp / (1 + (m − 1)p).
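The last simplification can be verified with exact rational arithmetic. The sketch below is my own illustration; the sample values of p and m are arbitrary choices.

```python
# Verify that p / (p + (1 - p)/m) equals m*p / (1 + (m - 1)*p) for sample values.
from fractions import Fraction

def posterior_direct(p, m):
    return p / (p + (1 - p) * Fraction(1, m))

def posterior_simplified(p, m):
    return m * p / (1 + (m - 1) * p)

for p in (Fraction(1, 4), Fraction(1, 2), Fraction(9, 10)):
    for m in (2, 4, 5):
        assert posterior_direct(p, m) == posterior_simplified(p, m)
        print(f"p = {p}, m = {m}: P(K|C) = {posterior_direct(p, m)}")
```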