
Central Limit Theorem - Solved Problems

The document contains a series of solved problems related to probability and statistics, focusing on concepts such as the Central Limit Theorem (CLT), moment generating functions (MGFs), and the law of large numbers. It includes calculations for probabilities involving weights, sandwich requirements for guests, and convergence of distributions. Each problem is followed by a detailed solution, illustrating the application of statistical principles and formulas.


7.1.3 Solved Problems
Problem 1
There are 100 men on a plane. Let $X_i$ be the weight (in pounds) of the $i$th man on the plane. Suppose that the $X_i$'s are i.i.d., with $EX_i = \mu = 170$ and $\sigma_{X_i} = \sigma = 30$. Find the probability that the total weight of the men on the plane exceeds 18,000 pounds.

Solution
If $W$ is the total weight, then $W = X_1 + X_2 + \cdots + X_n$, where $n = 100$. We have

\begin{align}
EW &= n\mu = (100)(170) = 17000,\\
\mathrm{Var}(W) &= 100\,\mathrm{Var}(X_i) = (100)(30)^2 = 90000.
\end{align}

Thus, $\sigma_W = 300$. We have

\begin{align}
P(W > 18000) &= P\left(\frac{W - 17000}{300} > \frac{18000 - 17000}{300}\right)\\
&= P\left(\frac{W - 17000}{300} > \frac{10}{3}\right)\\
&\approx 1 - \Phi\left(\frac{10}{3}\right) \quad \textrm{(by the CLT)}\\
&\approx 4.3 \times 10^{-4}.
\end{align}
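As a quick numerical check of this arithmetic (an aside, not part of the original solution), the tail probability can be evaluated with Python's standard-library normal distribution:

```python
from statistics import NormalDist

mu, sigma, n = 170, 30, 100
ew = n * mu                        # E[W] = 17000
sd_w = sigma * n ** 0.5            # sigma_W = 300

# P(W > 18000) ~ 1 - Phi((18000 - 17000)/300) by the CLT
p = 1 - NormalDist().cdf((18000 - ew) / sd_w)
print(p)  # ≈ 4.3e-4
```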

Problem 2

Let $X_1$, $X_2$, $\cdots$, $X_{25}$ be i.i.d. with the following PMF:

$$P_X(k) = \begin{cases} 0.6 & k = 1\\ 0.4 & k = -1\\ 0 & \textrm{otherwise} \end{cases}$$

And let

$$Y = X_1 + X_2 + \cdots + X_{25}.$$

Using the CLT and continuity correction, estimate $P(4 \leq Y \leq 6)$.

Solution
We have

\begin{align}
EX_i &= (0.6)(1) + (0.4)(-1) = \frac{1}{5},\\
EX_i^2 &= 0.6 + 0.4 = 1.
\end{align}

Therefore,

$$\mathrm{Var}(X_i) = 1 - \frac{1}{25} = \frac{24}{25};$$

thus, $\sigma_{X_i} = \frac{2\sqrt{6}}{5}$.

Therefore,

\begin{align}
EY &= 25 \times \frac{1}{5} = 5,\\
\mathrm{Var}(Y) &= 25 \times \frac{24}{25} = 24;
\end{align}

thus, $\sigma_Y = 2\sqrt{6}$.

\begin{align}
P(4 \leq Y \leq 6) &= P(3.5 \leq Y \leq 6.5) \quad \textrm{(continuity correction)}\\
&= P\left(\frac{3.5 - 5}{2\sqrt{6}} \leq \frac{Y - 5}{2\sqrt{6}} \leq \frac{6.5 - 5}{2\sqrt{6}}\right)\\
&= P\left(-0.3062 \leq \frac{Y - 5}{2\sqrt{6}} \leq 0.3062\right)\\
&\approx \Phi(0.3062) - \Phi(-0.3062) \quad \textrm{(by the CLT)}\\
&= 2\Phi(0.3062) - 1\\
&\approx 0.2405.
\end{align}
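A quick numerical check of this estimate (an aside, not part of the original solution), again using the standard-library `NormalDist`:

```python
from math import sqrt
from statistics import NormalDist

n = 25
ex = 0.6 * 1 + 0.4 * (-1)          # E[X_i] = 1/5
var_x = 1 - ex ** 2                # Var(X_i) = 24/25

ey = n * ex                        # E[Y] = 5
sd_y = sqrt(n * var_x)             # sigma_Y = 2*sqrt(6)

# continuity correction: P(4 <= Y <= 6) ~ P(3.5 <= Y <= 6.5)
phi = NormalDist().cdf
p = phi((6.5 - ey) / sd_y) - phi((3.5 - ey) / sd_y)
print(round(p, 4))  # ≈ 0.2405
```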

Problem 3

You have invited 64 guests to a party. You need to make sandwiches for the guests. You believe that a guest might need 0, 1, or 2 sandwiches with probabilities $\frac{1}{4}$, $\frac{1}{2}$, and $\frac{1}{4}$, respectively. You assume that the number of sandwiches each guest needs is independent from other guests. How many sandwiches should you make so that you are 95% sure that there is no shortage?

Solution
Let $X_i$ be the number of sandwiches that the $i$th person needs, and let

$$Y = X_1 + X_2 + \cdots + X_{64}.$$

The goal is to find $y$ such that

$$P(Y \leq y) \geq 0.95.$$

First note that

\begin{align}
EX_i &= \frac{1}{4}(0) + \frac{1}{2}(1) + \frac{1}{4}(2) = 1,\\
EX_i^2 &= \frac{1}{4}(0^2) + \frac{1}{2}(1^2) + \frac{1}{4}(2^2) = \frac{3}{2}.
\end{align}

Thus,

$$\mathrm{Var}(X_i) = EX_i^2 - (EX_i)^2 = \frac{3}{2} - 1 = \frac{1}{2} \rightarrow \sigma_{X_i} = \frac{1}{\sqrt{2}}.$$

Thus,

\begin{align}
EY &= 64 \times 1 = 64,\\
\mathrm{Var}(Y) &= 64 \times \frac{1}{2} = 32 \rightarrow \sigma_Y = 4\sqrt{2}.
\end{align}

Now, we can use the CLT to find $y$:

\begin{align}
P(Y \leq y) &= P\left(\frac{Y - 64}{4\sqrt{2}} \leq \frac{y - 64}{4\sqrt{2}}\right)\\
&\approx \Phi\left(\frac{y - 64}{4\sqrt{2}}\right) \quad \textrm{(by the CLT)}.
\end{align}

We can write

$$\Phi\left(\frac{y - 64}{4\sqrt{2}}\right) = 0.95.$$

Therefore,

$$\frac{y - 64}{4\sqrt{2}} = \Phi^{-1}(0.95) \approx 1.6449.$$

Thus, $y \approx 73.3$.

Therefore, if you make 74 sandwiches, you are 95% sure that there is no shortage. Note that you can find the numerical value of $\Phi^{-1}(0.95)$ by running the norminv(0.95) command in MATLAB.
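In Python, the standard library's `statistics.NormalDist().inv_cdf` plays the same role as MATLAB's norminv; a minimal sketch of the same computation:

```python
from math import ceil, sqrt
from statistics import NormalDist

n = 64
ex, var_x = 1.0, 0.5               # E[X_i] and Var(X_i) per guest

ey = n * ex                        # E[Y] = 64
sd_y = sqrt(n * var_x)             # sigma_Y = 4*sqrt(2)

# solve Phi((y - 64)/(4*sqrt(2))) = 0.95 for y
y = ey + NormalDist().inv_cdf(0.95) * sd_y
print(round(y, 1), ceil(y))  # ≈ 73.3, so make 74 sandwiches
```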

Problem 4

Let $X_1$, $X_2$, $\cdots$, $X_n$ be i.i.d. $Exponential(\lambda)$ random variables with $\lambda = 1$. Let

$$\overline{X} = \frac{X_1 + X_2 + \cdots + X_n}{n}.$$

How large should $n$ be such that

$$P(0.9 \leq \overline{X} \leq 1.1) \geq 0.95?$$

Solution
Let $Y = X_1 + X_2 + \cdots + X_n$, so $\overline{X} = \frac{Y}{n}$. Since $X_i \sim Exponential(1)$, we have

$$E(X_i) = \frac{1}{\lambda} = 1, \quad \mathrm{Var}(X_i) = \frac{1}{\lambda^2} = 1.$$

Therefore,

$$E(Y) = nEX_i = n, \quad \mathrm{Var}(Y) = n\,\mathrm{Var}(X_i) = n,$$

\begin{align}
P(0.9 \leq \overline{X} \leq 1.1) &= P\left(0.9 \leq \frac{Y}{n} \leq 1.1\right)\\
&= P(0.9n \leq Y \leq 1.1n)\\
&= P\left(\frac{0.9n - n}{\sqrt{n}} \leq \frac{Y - n}{\sqrt{n}} \leq \frac{1.1n - n}{\sqrt{n}}\right)\\
&= P\left(-0.1\sqrt{n} \leq \frac{Y - n}{\sqrt{n}} \leq 0.1\sqrt{n}\right).
\end{align}

By the CLT, $\frac{Y - n}{\sqrt{n}}$ is approximately $N(0, 1)$, so

\begin{align}
P(0.9 \leq \overline{X} \leq 1.1) &\approx \Phi\left(0.1\sqrt{n}\right) - \Phi\left(-0.1\sqrt{n}\right)\\
&= 2\Phi\left(0.1\sqrt{n}\right) - 1 \quad (\textrm{since } \Phi(-x) = 1 - \Phi(x)).
\end{align}

We need to have

$$2\Phi\left(0.1\sqrt{n}\right) - 1 \geq 0.95, \quad \textrm{so} \quad \Phi\left(0.1\sqrt{n}\right) \geq 0.975.$$

Thus,

\begin{align}
0.1\sqrt{n} &\geq \Phi^{-1}(0.975) = 1.96\\
\sqrt{n} &\geq 19.6\\
n &\geq 384.16.
\end{align}

Since $n$ is an integer, we conclude $n \geq 385$.
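The same bound can be solved numerically; a minimal Python sketch (using the exact value of $\Phi^{-1}(0.975)$ rather than the rounded 1.96, which gives the same answer here):

```python
from math import ceil
from statistics import NormalDist

# need 2*Phi(0.1*sqrt(n)) - 1 >= 0.95, i.e. 0.1*sqrt(n) >= Phi^{-1}(0.975)
z = NormalDist().inv_cdf(0.975)    # ≈ 1.96
n_min = ceil((z / 0.1) ** 2)       # smallest integer n meeting the bound
print(n_min)  # 385
```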

Problem 5

For this problem and the next, you will need to be familiar with moment generating functions (Section 6.1.3). The goal here is to prove the (weak) law of large numbers using MGFs. In particular, let $X_1$, $X_2$, $\ldots$, $X_n$ be i.i.d. random variables with expected value $EX_i = \mu < \infty$ and MGF $M_X(s)$ that is finite on some interval $[-c, c]$, where $c > 0$ is a constant. As usual, let

$$\overline{X} = \frac{X_1 + X_2 + \cdots + X_n}{n}.$$

Prove

$$\lim_{n \to \infty} M_{\overline{X}}(s) = e^{s\mu}, \quad \textrm{for all } s \in [-c, c].$$

Since this is the MGF of the constant random variable $\mu$, we conclude that the distribution of $\overline{X}$ converges to $\mu$.
Hint: Use the result of Problem 8 in Section 6.1.6: for a random variable $X$ with a well-defined MGF, $M_X(s)$, we have

$$\lim_{n \to \infty} \left[M_X\left(\frac{s}{n}\right)\right]^n = e^{sEX}.$$

Solution
We have

\begin{align}
M_{\overline{X}}(s) &= E\left[e^{s\overline{X}}\right]\\
&= E\left[e^{s\frac{X_1 + X_2 + \cdots + X_n}{n}}\right]\\
&= E\left[e^{\frac{sX_1}{n}} e^{\frac{sX_2}{n}} \cdots e^{\frac{sX_n}{n}}\right]\\
&= E\left[e^{\frac{sX_1}{n}}\right] \cdot E\left[e^{\frac{sX_2}{n}}\right] \cdots E\left[e^{\frac{sX_n}{n}}\right] \quad \textrm{(since the } X_i\textrm{'s are independent)}\\
&= \left[M_X\left(\frac{s}{n}\right)\right]^n \quad \textrm{(since the } X_i\textrm{'s are identically distributed)}.
\end{align}

Therefore,

\begin{align}
\lim_{n \to \infty} M_{\overline{X}}(s) &= \lim_{n \to \infty} \left[M_X\left(\frac{s}{n}\right)\right]^n\\
&= e^{sEX} \quad \textrm{(by the hint)}\\
&= e^{s\mu}.
\end{align}

Note that $e^{s\mu}$ is the MGF of a constant random variable $Y$, with value $Y = \mu$. This means that the random variable $\overline{X}$ converges to $\mu$ (in distribution).
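The limit can be illustrated numerically for a concrete choice of distribution (an illustrative assumption, not part of the proof): for $X_i \sim Exponential(1)$ we have $M_X(s) = \frac{1}{1-s}$ and $\mu = 1$, so $\left[M_X\left(\frac{s}{n}\right)\right]^n$ should approach $e^s$:

```python
from math import exp

def mgf_exponential1(s):
    # MGF of Exponential(1): M_X(s) = 1/(1-s), valid for s < 1
    return 1.0 / (1.0 - s)

s, mu = 0.5, 1.0
for n in [10, 1000, 100000]:
    approx = mgf_exponential1(s / n) ** n   # [M_X(s/n)]^n
    print(n, approx)

print(exp(s * mu))  # limit e^{s*mu} ≈ 1.6487
```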

Problem 6
The goal in this problem is to prove the central limit theorem using MGFs. In particular, let $X_1$, $X_2$, $\ldots$, $X_n$ be i.i.d. random variables with expected value $EX_i = \mu < \infty$, $\mathrm{Var}(X_i) = \sigma^2 < \infty$, and MGF $M_X(s)$ that is finite on some interval $[-c, c]$, where $c > 0$ is a constant. As usual, let

$$Z_n = \frac{\overline{X} - \mu}{\sigma / \sqrt{n}} = \frac{X_1 + X_2 + \cdots + X_n - n\mu}{\sqrt{n}\,\sigma}.$$

Prove

$$\lim_{n \to \infty} M_{Z_n}(s) = e^{\frac{s^2}{2}}, \quad \textrm{for all } s \in [-c, c].$$

Since this is the MGF of a standard normal random variable, we conclude that the distribution of $Z_n$ converges to the standard normal distribution.
Hint: Use the result of Problem 9 in Section 6.1.6: for a random variable $Y$ with a well-defined MGF, $M_Y(s)$, and $EY = 0$, $\mathrm{Var}(Y) = 1$, we have

$$\lim_{n \to \infty} \left[M_Y\left(\frac{s}{\sqrt{n}}\right)\right]^n = e^{\frac{s^2}{2}}.$$

Solution
Let the $Y_i$'s be the normalized versions of the $X_i$'s, i.e.,

$$Y_i = \frac{X_i - \mu}{\sigma}.$$

Then, the $Y_i$'s are i.i.d. with

$$EY_i = 0, \quad \mathrm{Var}(Y_i) = 1.$$

We also have

$$Z_n = \frac{\overline{X} - \mu}{\sigma / \sqrt{n}} = \frac{Y_1 + Y_2 + \cdots + Y_n}{\sqrt{n}}.$$

Thus, we have

\begin{align}
M_{Z_n}(s) &= E\left[e^{s\frac{Y_1 + Y_2 + \cdots + Y_n}{\sqrt{n}}}\right]\\
&= E\left[e^{\frac{sY_1}{\sqrt{n}}}\right] \cdot E\left[e^{\frac{sY_2}{\sqrt{n}}}\right] \cdots E\left[e^{\frac{sY_n}{\sqrt{n}}}\right] \quad \textrm{(since the } Y_i\textrm{'s are independent)}\\
&= \left[M_{Y_1}\left(\frac{s}{\sqrt{n}}\right)\right]^n \quad \textrm{(since the } Y_i\textrm{'s are identically distributed)}.
\end{align}

Thus, we conclude

\begin{align}
\lim_{n \to \infty} M_{Z_n}(s) &= \lim_{n \to \infty} \left[M_{Y_1}\left(\frac{s}{\sqrt{n}}\right)\right]^n\\
&= e^{\frac{s^2}{2}} \quad \textrm{(by the hint)}.
\end{align}

Since this is the MGF of a standard normal random variable, we conclude that the CDF of $Z_n$ converges to the standard normal CDF.
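This limit, too, can be illustrated numerically for a concrete normalized distribution (an illustrative assumption, not part of the proof): for a Rademacher variable with $P(Y = 1) = P(Y = -1) = \frac{1}{2}$ we have $EY = 0$, $\mathrm{Var}(Y) = 1$, and $M_Y(s) = \cosh(s)$, so $\left[M_Y\left(\frac{s}{\sqrt{n}}\right)\right]^n$ should approach $e^{\frac{s^2}{2}}$:

```python
from math import cosh, exp

def mgf_rademacher(s):
    # MGF of Y with P(Y=1) = P(Y=-1) = 1/2: M_Y(s) = cosh(s)
    return cosh(s)

s = 1.0
for n in [10, 1000, 100000]:
    approx = mgf_rademacher(s / n ** 0.5) ** n   # [M_Y(s/sqrt(n))]^n
    print(n, approx)

print(exp(s ** 2 / 2))  # limit e^{s^2/2} ≈ 1.6487
```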


The print version of the book is available through Amazon.
