ST120 Introduction to Probability Exam

The document outlines the details for the ST120 Introduction to Probability examination at the University of Warwick, scheduled for Summer 2024. It includes exam instructions, question types, and specific topics covered, such as probability spaces, random variables, and expectations. The document also contains example questions and their solutions related to probability theory.

ST1200

UNIVERSITY OF WARWICK

Paper Details

Paper code: ST120

Paper Title: INTRODUCTION TO PROBABILITY

Exam Period: Summer 2024

Exam Rubric

Time allowed: 2 hours plus 15 minutes reading time


During the 15 minutes reading time, you are not permitted to begin answering questions
in the answer booklet. You may read and make notes on the question paper.

Exam Type: Standard Examination

Calculators may NOT be used in this examination.

Additional Stationery

None.

Instructions

Full marks may be obtained by correctly answering all questions.


There are a total of 100 marks available.
A guideline to the number of marks available is shown for each question section.

Be careful when crossing out work. Crossed-out work will NOT be marked.

Page 1 of 11

Question 1.

(a) Suppose that (Ω, F, P) is a probability space describing a random experiment.


(i) State the definition of the event space F.
Bookwork

F is the event space. It is a collection of subsets of Ω satisfying the following properties:

• Ω ∈ F.
• If A ∈ F, then Aᶜ ∈ F (closed under complements).
• If {An, n > 0} is such that An ∈ F for all n > 0, then ∪_{n=1}^∞ An ∈ F (closed under countable unions).


[4 marks]

(ii) Suppose that (Ω, F, P) is a uniform probability space, i.e. |Ω| < +∞, F = P(Ω) and P(ω) = P(ω′) for any ω, ω′ ∈ Ω. Using the properties of probability, show that

P(A) = |A| / |Ω|,

where |A|, |Ω| are the cardinalities of A and Ω respectively.
Seen

Since A ⊆ Ω and Ω is a finite set, A is also finite. Thus, they can both be written as Ω = {ω1, . . . , ωn} and A = {ω1, . . . , ωk} for some k ⩽ n < +∞ with ωi ̸= ωj for all i ̸= j. Using the properties of uniform probabilities, we can write

1 = P(Ω) = P({ω1, . . . , ωn}) = P(∪_{i=1}^n {ωi}) = Σ_{i=1}^n P({ωi}) = Σ_{i=1}^n P({ω1}) = n P({ω1}),

and thus P({ω1}) = 1/n. Also,

P(A) = P({ω1, . . . , ωk}) = P(∪_{i=1}^k {ωi}) = Σ_{i=1}^k P({ωi}) = Σ_{i=1}^k P({ω1}) = k P({ω1}) = k/n = |A| / |Ω|.

[6 marks]
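The identity P(A) = |A|/|Ω| is easy to sanity-check on a concrete uniform space. A minimal Python sketch (a hypothetical fair-die example, not part of the exam):

```python
from fractions import Fraction

# Hypothetical uniform space: a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}
p_singleton = Fraction(1, len(omega))  # P({omega_i}) = 1/n, as derived above

# P(A) as a sum of singleton probabilities, for the event "the roll is even".
A = {2, 4, 6}
p_A = sum(p_singleton for _ in A)

assert p_A == Fraction(len(A), len(omega))  # P(A) = |A| / |Omega|
print(p_A)  # 1/2
```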


[10 marks]

(b) Two players draw a card from a stack of four cards labelled 1, 2, 3 and 4. In each round,
Player 1 goes first drawing a card uniformly at random from the stack and keeps it.
Player 2 draws a card uniformly at random from the remaining three cards. All cards
are returned to the stack at the end of each round.

(i) Describe the probability space corresponding to each round.


Seen similar

Ω = {(i, j)|i, j ∈ {1, 2, 3, 4}, i ̸= j}, F = P(Ω) and P is the uniform probability on F.
[3 marks]

(ii) Suppose that the player who gets the card with the number 4 wins the round. Let
A1 be the event ‘player 1 wins’ (i.e. player 1 picks card number 4), A2 be the event
‘player 2 wins’ (i.e. player 2 picks card number 4) and D be the event ‘the round
is a draw’ (i.e. neither player picks card number 4). Describe A1 , A2 and D and
compute their probability.
Seen similar

A1 = {(4, j) | j ∈ {1, 2, 3}}, A2 = {(i, 4) | i ∈ {1, 2, 3}} and D = {(i, j) | i, j ∈ {1, 2, 3}, i ̸= j}. Thus |A1| = 3, |A2| = 3 and |D| = 3 × 2 = 6, while |Ω| = 4 × 3 = 12, and consequently P(A1) = P(A2) = 1/4 and P(D) = 1/2. [3 marks]

(iii) Suppose that the two players play exactly four rounds. Compute the probability
player 1 wins two rounds, player 2 wins one round and the remaining round is a
draw.
Unseen

First, we count in how many ways player 1 can win exactly twice and player 2 exactly once: choose the two rounds player 1 wins (C(4,2) ways, with 3 outcomes each), then the round player 2 wins out of the remaining two (C(2,1) ways, 3 outcomes), leaving one draw round (6 outcomes). This gives C(4,2) × 3² × C(2,1) × 3 × 6 favourable outcomes. The total number of outcomes is 12⁴. Thus, the probability is

(6 × 3² × 2 × 3 × 6) / 12⁴ = 1944 / 20736 = 3/32.
[4 marks]
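Brute-force enumeration of all 12⁴ four-round sequences confirms the count; a sketch in Python:

```python
from fractions import Fraction
from itertools import permutations, product

round_outcomes = list(permutations([1, 2, 3, 4], 2))  # 12 outcomes per round

def winner(i, j):
    # 'P1' / 'P2' if that player drew card 4, 'D' for a draw.
    return 'P1' if i == 4 else 'P2' if j == 4 else 'D'

# Count 4-round sequences with exactly two P1 wins, one P2 win, one draw.
favourable = sum(
    1 for rounds in product(round_outcomes, repeat=4)
    if sorted(winner(i, j) for (i, j) in rounds) == ['D', 'P1', 'P1', 'P2']
)
assert Fraction(favourable, 12 ** 4) == Fraction(3, 32)
```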

[10 marks]


Question 2.

(a) Let X be a random variable on the probability space (Ω, F, P) and let PX be its
distribution.

(i) State what it means for a random variable X to be discrete.


Bookwork

We say that X is discrete if there is a finite or countably infinite set S ⊆ R such that
P(X ∈ R \ S) = 0. [3 marks]

(ii) Suppose that X is a discrete random variable. State the definition of its probability
mass function.
Bookwork

We define the probability mass function of X as the function pX : R → [0, 1] given by pX(x) = PX({x}) = P(X = x). [3 marks]

(iii) State the definition of the expectation E[X] and the variance Var(X) of discrete
random variable X with support D and probability mass function pX .
Bookwork

E[X] = Σ_{x∈D} x pX(x),    Var(X) = E[(X − E[X])²].

[4 marks]

[10 marks]

(b) Let X and Y be discrete random variables with joint support

DX,Y = {(1, 0), (3, 0), (1, 1), (3, 1), (1, 2), (3, 2)},

such that

P(X = 1) = 1/3,   P(X = 3) = 2/3

and

P(Y = 0 | X = 1) = 1/3,   P(Y = 0 | X = 3) = 1/4,
P(Y = 1 | X = 1) = 1/6,   P(Y = 1 | X = 3) = 1/2,
P(Y = 2 | X = 1) = 1/2,   P(Y = 2 | X = 3) = 1/4.


(i) Compute the probability mass function pY of Y .


Unseen

From the joint support DX,Y , we extract that DY ⊆ {0, 1, 2}. Using the law of total probability, we compute

P(Y = 0) = P(Y = 0|X = 1)P(X = 1) + P(Y = 0|X = 3)P(X = 3) = (1/3)(1/3) + (1/4)(2/3) = 5/18,
P(Y = 1) = P(Y = 1|X = 1)P(X = 1) + P(Y = 1|X = 3)P(X = 3) = (1/6)(1/3) + (1/2)(2/3) = 7/18,
P(Y = 2) = P(Y = 2|X = 1)P(X = 1) + P(Y = 2|X = 3)P(X = 3) = (1/2)(1/3) + (1/4)(2/3) = 1/3.

[4 marks]
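The law-of-total-probability computation can be reproduced exactly with `Fraction` arithmetic; an illustrative sketch:

```python
from fractions import Fraction as F

p_X = {1: F(1, 3), 3: F(2, 3)}
p_Y_given_X = {  # p_Y_given_X[x][y] = P(Y = y | X = x)
    1: {0: F(1, 3), 1: F(1, 6), 2: F(1, 2)},
    3: {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)},
}

# Law of total probability: P(Y = y) = sum_x P(Y = y | X = x) P(X = x).
p_Y = {y: sum(p_Y_given_X[x][y] * p_X[x] for x in p_X) for y in (0, 1, 2)}
assert p_Y == {0: F(5, 18), 1: F(7, 18), 2: F(1, 3)}
assert sum(p_Y.values()) == 1  # a valid pmf
```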

(ii) Compute the expectation E[X], variance Var(X) and covariance Cov(X, Y ).
Seen similar

E[X] = 1 × 1/3 + 3 × 2/3 = 7/3
E[Y] = 0 × 5/18 + 1 × 7/18 + 2 × 1/3 = 19/18
E[X²] = 1² × 1/3 + 3² × 2/3 = 19/3
Var(X) = E[X²] − E[X]² = 57/9 − 49/9 = 8/9
E[XY] = (1 × 0 × 1/3 × 1/3) + (3 × 0 × 2/3 × 1/4) + (1 × 1 × 1/3 × 1/6)
      + (3 × 1 × 2/3 × 1/2) + (1 × 2 × 1/3 × 1/2) + (3 × 2 × 2/3 × 1/4)
      = 1/18 + 1 + 1/3 + 1 = 43/18
Cov(X, Y) = E[XY] − E[X]E[Y] = 43/18 − (7/3)(19/18) = −2/27

[6 marks]
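These moments follow from the joint pmf P(X = x, Y = y) = P(X = x)P(Y = y|X = x); a quick exact check in Python:

```python
from fractions import Fraction as F

p_X = {1: F(1, 3), 3: F(2, 3)}
p_Y_given_X = {
    1: {0: F(1, 3), 1: F(1, 6), 2: F(1, 2)},
    3: {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)},
}
# Joint pmf: P(X = x, Y = y) = P(X = x) * P(Y = y | X = x).
joint = {(x, y): p_X[x] * p for x, row in p_Y_given_X.items() for y, p in row.items()}

E_X = sum(x * p_X[x] for x in p_X)
E_X2 = sum(x * x * p_X[x] for x in p_X)
E_Y = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())

assert E_X == F(7, 3) and E_X2 - E_X ** 2 == F(8, 9)  # E[X], Var(X)
assert E_XY == F(43, 18)
assert E_XY - E_X * E_Y == F(-2, 27)  # Cov(X, Y)
```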

[10 marks]


Question 3.

(a) Let X, Y be integrable random variables defined on the same probability space (Ω, F, P).

(i) State the monotonicity and linearity properties of the expectation.


Bookwork

• Monotonicity: If 0 ⩽ X ⩽ Y , then 0 ⩽ E[X] ⩽ E[Y ].


• Linearity: For any a, b ∈ R, E[aX + bY ] = aE[X] + bE[Y ].
[4 marks]

(ii) State the conditions required of Markov’s inequality


P(X ⩾ x) ⩽ E[X] / x
to hold and prove it.
Bookwork

For Markov’s inequality to hold, we need X to be integrable and non-negative, and x > 0. We set Y = x·1_{X⩾x}. Then we show that X(ω) ⩾ Y(ω) for every ω by considering the following two cases: if X(ω) ⩾ x, then Y(ω) = x ⩽ X(ω) (by definition of Y); if X(ω) < x, then Y(ω) = 0 ⩽ X(ω) (since X is non-negative). Thus, by the monotonicity of expectations, E[X] ⩾ E[Y] = x P(X ⩾ x). The result follows. [6 marks]
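The bound can be exercised on a small hypothetical pmf (any non-negative integrable X works); an illustrative check, not part of the exam:

```python
from fractions import Fraction as F

# A hypothetical non-negative discrete X with the given pmf.
pmf = {0: F(1, 2), 1: F(1, 4), 4: F(1, 4)}
E_X = sum(k * p for k, p in pmf.items())  # = 5/4

for x in (1, 2, 3, 4):
    tail = sum(p for k, p in pmf.items() if k >= x)  # P(X >= x)
    assert tail <= E_X / F(x)  # Markov's bound holds
```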

[10 marks]

(b) The random variables X1, . . . , Xn are identically distributed (but not independent) with common mean 0 and variance 1. For i ̸= j, Cov(Xi, Xj) = ρ. Let Sn = (1/n) Σ_{i=1}^n Xi.

(i) Compute E[Sn] and show that Var(Sn) = 1/n + ((n − 1)/n) ρ.
Seen similar

E[Sn] = E[(1/n) Σ_{i=1}^n Xi] = (1/n) Σ_{i=1}^n E[Xi] = 0.

Var(Sn) = E[Sn²] − E[Sn]² = E[((1/n) Σ_{i=1}^n Xi)²] = (1/n²) E[Σ_{i,j=1}^n Xi Xj]
        = (1/n²) E[Σ_{i=1}^n Xi² + 2 Σ_{i<j} Xi Xj]
        = (1/n²) Σ_{i=1}^n E[Xi²] + (2/n²) Σ_{j=2}^n Σ_{i=1}^{j−1} E[Xi Xj]
        = 1/n + (2ρ/n²) · (n − 1)n/2 = 1/n + ((n − 1)/n) ρ.
[5 marks]
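Since Var(Σᵢ wᵢXᵢ) = Σᵢ,ⱼ wᵢwⱼ Cov(Xᵢ, Xⱼ), the formula can also be verified by summing covariance-matrix entries with weights wᵢ = 1/n; an illustrative sketch in exact arithmetic:

```python
from fractions import Fraction as F

def var_Sn(n, rho):
    # Var(sum w_i X_i) = sum_{i,j} w_i w_j Cov(X_i, X_j), with w_i = 1/n,
    # Cov(X_i, X_i) = 1 and Cov(X_i, X_j) = rho for i != j.
    w = F(1, n)
    return sum(w * w * (1 if i == j else rho)
               for i in range(n) for j in range(n))

for n in (2, 5, 10):
    for rho in (F(0), F(1, 4), F(-1, 10)):
        assert var_Sn(n, rho) == F(1, n) + F(n - 1, n) * rho
```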

(ii) Let δ > 0. Explain why

P(|Sn| ⩾ δ) = P((Sn − E[Sn])² ⩾ δ²).

Seen similar

|Sn(ω)| ⩾ δ ⇐⇒ Sn(ω)² ⩾ δ², for all ω ∈ Ω,

and thus the event {|Sn| ⩾ δ} is the same as the event {Sn² ⩾ δ²}. Moreover, E[Sn] = 0, so the equality holds because the events are identical as subsets of Ω. [2 marks]

(iii) Show that

P((Sn − E[Sn])² ⩾ δ²) ⩽ (1 + (n − 1)ρ) / (nδ²).

Seen similar

The random variable (Sn − E[Sn])² is non-negative (since it is a square) and integrable (its expectation is computed in (i) and thus exists), and δ² > 0. Thus, we can apply Markov’s inequality:

P((Sn − E[Sn])² ⩾ δ²) ⩽ E[(Sn − E[Sn])²] / δ² = Var(Sn) / δ².

The result follows by using the formula for the variance given in part (i). [3 marks]

[10 marks]


Question 4.

(a) Let X, Y be random variables defined on the same probability space (Ω, F, P).

(i) State what it means for X, Y to be jointly continuous with probability density
function fX,Y .
Bookwork

X, Y are jointly continuous with probability density function fX,Y : R² → R₊ if

P(a1 ⩽ X ⩽ b1, a2 ⩽ Y ⩽ b2) = ∫_{a2}^{b2} ∫_{a1}^{b1} fX,Y(x, y) dx dy

for every a1 < b1, a2 < b2 ∈ R. [3 marks]

(ii) Let X, Y be two jointly continuous random variables with probability density function
fX,Y . Describe how to obtain the marginal probability density functions of X and Y from fX,Y .
Bookwork
fX(x) = ∫_{−∞}^{∞} fX,Y(x, y) dy and fY(y) = ∫_{−∞}^{∞} fX,Y(x, y) dx. [3 marks]

(iii) Suppose that X, Y are two jointly continuous random variables with probability
density function fX,Y and corresponding marginals fX and fY . Let fX and fY be
such that X and Y are integrable and fX,Y (x, y) = fX (x)fY (y) for all x, y ∈ R.
Show that
E[XY ] = E[X]E[Y ].
Bookwork

E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy fX,Y(x, y) dx dy
      = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy fX(x) fY(y) dx dy
      = (∫_{−∞}^{∞} x fX(x) dx) (∫_{−∞}^{∞} y fY(y) dy) = E[X] E[Y].

[4 marks]

[10 marks]

(b) Suppose X and Y are jointly continuous random variables with density fX,Y (x, y) =
xe−x(1+y) , for x, y > 0 and 0 otherwise.


(i) Compute the marginal density functions fX (x) and fY (y).


Seen similar

fX(x) = ∫_0^∞ x e^{−x(1+y)} dy = x e^{−x} ∫_0^∞ e^{−xy} dy = x e^{−x} [e^{−xy}/(−x)]_{y=0}^{∞} = e^{−x}

for x > 0 and 0 otherwise, and

fY(y) = ∫_0^∞ x e^{−x(1+y)} dx = [−x e^{−x(1+y)}/(1 + y)]_{x=0}^{∞} + ∫_0^∞ e^{−x(1+y)}/(1 + y) dx
      = [−e^{−x(1+y)}/(1 + y)²]_{x=0}^{∞} = 1/(1 + y)²

for y > 0 and 0 otherwise. [7 marks]
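The two marginals can be cross-checked by numerical integration; a crude midpoint rule suffices, since the integrand is negligible beyond y ≈ 60 at the values tested. Illustrative Python, not part of the solution:

```python
import math

def f_XY(x, y):
    # Joint density x * exp(-x(1+y)) on the positive quadrant, 0 elsewhere.
    return x * math.exp(-x * (1 + y)) if x > 0 and y > 0 else 0.0

def integrate(g, a, b, steps=100_000):
    # Simple midpoint Riemann sum; accurate enough for a sanity check.
    h = (b - a) / steps
    return sum(g(a + (k + 0.5) * h) for k in range(steps)) * h

x, y = 1.0, 2.0
# f_X(x) = e^{-x}: integrate out y (the tail beyond y = 60 is negligible here).
assert abs(integrate(lambda t: f_XY(x, t), 0, 60) - math.exp(-x)) < 1e-4
# f_Y(y) = 1 / (1 + y)^2: integrate out x.
assert abs(integrate(lambda t: f_XY(t, y), 0, 60) - 1 / (1 + y) ** 2) < 1e-4
```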

(ii) Are X, Y independent? Justify your answer.


Seen similar

They are not independent, because fX,Y(x, y) ̸= fX(x) fY(y): indeed fX(x) fY(y) = e^{−x}/(1 + y)², which differs from x e^{−x(1+y)} (for example at (x, y) = (2, 1)). [3 marks]

[10 marks]


Question 5.

(a) Let X1, X2, X3, . . . be mutually independent square-integrable discrete random variables with the same distribution. Denote their mean by µ and variance by σ².

(i) State the Law of Averages.


Bookwork

For any a > 0 and n ∈ N,

P(µ − a ⩽ (X1 + · · · + Xn)/n ⩽ µ + a) ⩾ 1 − σ²/(a²n).

[5 marks]

(ii) State the Central Limit Theorem.


Bookwork

For every a < b,

P(a ⩽ (X1 + · · · + Xn − nµ)/(σ√n) ⩽ b) ≈ (1/√(2π)) ∫_a^b e^{−x²/2} dx.

[5 marks]

[10 marks]

(b) Let X1, X2, X3, . . . be independent random variables with Bernoulli(p) distribution, i.e. P(Xi = 1) = p and P(Xi = 0) = 1 − p, for p ∈ (0, 1). Let Sn = (1/n) Σ_{i=1}^n Xi.

(i) Compute ϕ1 (λ) = E[eλX1 ].


Seen

ϕ1 (λ) = E[eλX1 ] = eλ·0 (1 − p) + eλ·1 p = (1 − p) + eλ p.


[2 marks]
(ii) Show that ϕn(λ) = E[e^{λSn}] = (1 + (e^{λ/n} − 1)p)^n for λ ∈ R.

Unseen

ϕn(λ) = E[e^{(λ/n) Σ_{i=1}^n Xi}] = Π_{i=1}^n E[e^{(λ/n)Xi}] (by independence) = ϕ1(λ/n)^n = ((1 − p) + e^{λ/n} p)^n = (1 + (e^{λ/n} − 1)p)^n.

[4 marks]
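For small n, E[e^{λSn}] can be computed by brute force over all 2ⁿ Bernoulli outcomes and compared with the closed form; an illustrative check:

```python
import math
from itertools import product

def phi_n_formula(lam, n, p):
    # Closed form (1 + (e^{lam/n} - 1) p)^n.
    return (1 + (math.exp(lam / n) - 1) * p) ** n

def phi_n_direct(lam, n, p):
    # E[exp(lam * S_n)] by summing over all 2^n Bernoulli outcomes.
    total = 0.0
    for xs in product((0, 1), repeat=n):
        prob = math.prod(p if x == 1 else 1 - p for x in xs)
        total += prob * math.exp(lam * sum(xs) / n)
    return total

for n in (1, 2, 5):
    for lam in (-1.0, 0.5, 2.0):
        assert abs(phi_n_direct(lam, n, 0.3) - phi_n_formula(lam, n, 0.3)) < 1e-12
```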


(iii) Compute Var(Sn) using its moment generating function ϕn(λ) and show that

lim_{n→∞} Var(Sn) = 0.

Unseen

Var(Sn) = ϕn″(0) − ϕn′(0)²

and

ϕn′(λ) = p e^{λ/n} (1 + (e^{λ/n} − 1)p)^{n−1},
ϕn″(λ) = (p/n) e^{λ/n} (1 + (e^{λ/n} − 1)p)^{n−1} + ((n − 1)/n) p² e^{2λ/n} (1 + (e^{λ/n} − 1)p)^{n−2}.

It follows that ϕn′(0) = p and ϕn″(0) = p/n + ((n − 1)/n) p², and thus

Var(Sn) = p/n + ((n − 1)/n) p² − p² = p/n − p²/n = p(1 − p)/n → 0.

[4 marks]
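As a numerical cross-check, ϕn′(0) and ϕn″(0) can be approximated by central finite differences, recovering Var(Sn) = p(1 − p)/n; an illustrative sketch:

```python
import math

def phi_n(lam, n, p):
    # The mgf of S_n derived in (ii).
    return (1 + (math.exp(lam / n) - 1) * p) ** n

def var_from_mgf(n, p, h=1e-4):
    # Central finite differences for phi'(0) and phi''(0).
    d1 = (phi_n(h, n, p) - phi_n(-h, n, p)) / (2 * h)
    d2 = (phi_n(h, n, p) - 2 * phi_n(0, n, p) + phi_n(-h, n, p)) / h ** 2
    return d2 - d1 ** 2

for n in (1, 10, 100):
    p = 0.3
    assert abs(var_from_mgf(n, p) - p * (1 - p) / n) < 1e-5
```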

[10 marks]

End of questions
