
EE132B-HW Set #1 - Sol.

UCLA 2015 Fall

Prof. Izhak Rubin

Problem 1
Let X denote a geometric random variable with parameter $1 - p \in (0, 1)$ such that
$$P(X = n) = p(1-p)^n, \quad \text{for } n = 0, 1, \ldots$$
(1) Calculate the mean directly.
(2) Calculate the variance directly.
(3) Calculate the moment generating function (Z-transform).
(4) Using the moment generating function, derive the mean and the variance.
Ans:
Let $q = 1 - p$.
(1)
$$E[X] = \sum_{n=0}^{\infty} n(1-q)q^n = (1-q)q \sum_{n=0}^{\infty} n q^{n-1} = (1-q)q \sum_{n=0}^{\infty} \frac{d}{dq} q^n = (1-q)q \frac{d}{dq} \sum_{n=0}^{\infty} q^n = (1-q)q \frac{1}{(1-q)^2} = \frac{q}{1-q} = \frac{1-p}{p}.$$

(2)
$$E[X(X-1)] = \sum_{n=0}^{\infty} n(n-1)(1-q)q^n = (1-q)q^2 \sum_{n=0}^{\infty} n(n-1)q^{n-2} = (1-q)q^2 \frac{d^2}{dq^2} \sum_{n=0}^{\infty} q^n = (1-q)q^2 \frac{2}{(1-q)^3} = \frac{2q^2}{(1-q)^2}.$$

Therefore, we obtain
$$\mathrm{Var}[X] = E[X^2] - E[X]^2 = E[X(X-1)] + E[X] - E[X]^2 = \frac{2q^2}{(1-q)^2} + \frac{q}{1-q} - \frac{q^2}{(1-q)^2} = \frac{q}{(1-q)^2} = \frac{1-p}{p^2}.$$
(3) For $|z| < 1/q$, we have
$$\Phi(z) = E\left[z^X\right] = \sum_{n=0}^{\infty} z^n (1-q) q^n = \frac{1-q}{1-qz}.$$

(4) With the moment generating function, we obtain
$$\frac{d}{dz}\Phi(z)\Big|_{z=1} = E[X] = \frac{q}{1-q} = \frac{1-p}{p},$$
$$\frac{d^2}{dz^2}\Phi(z)\Big|_{z=1} = E[X(X-1)] = E[X^2] - E[X],$$
where $E[X^2] = \frac{q+q^2}{(1-q)^2}$.
Therefore, we obtain
$$\mathrm{Var}[X] = E[X^2] - E[X]^2 = \frac{q}{(1-q)^2} = \frac{1-p}{p^2}.$$
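Remark (numerical check, not part of the original solution): the short Python sketch below compares the closed-form mean, variance, and PGF of the geometric PMF against a truncated direct summation; the values $p = 0.3$, the cutoff $N = 1000$, and the evaluation point $z = 0.5$ are arbitrary illustrative choices.

```python
# Sanity check for Problem 1 (illustrative only): compare the closed-form
# mean, variance, and PGF of the geometric PMF P(X = n) = p (1 - p)^n
# against a truncated direct summation. p, N, and z are arbitrary choices.
import numpy as np

p, q = 0.3, 0.7          # q = 1 - p
N = 1000                 # truncation point; the tail beyond N is negligible
n = np.arange(N + 1)
pmf = p * q**n

mean_direct = np.sum(n * pmf)
var_direct = np.sum(n**2 * pmf) - mean_direct**2
z = 0.5
pgf_direct = np.sum(z**n * pmf)

print(mean_direct, q / (1 - q))          # both ~ (1 - p) / p
print(var_direct, q / (1 - q)**2)        # both ~ (1 - p) / p^2
print(pgf_direct, (1 - q) / (1 - q*z))   # both ~ (1 - q) / (1 - q z)
```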


Problem 2
Let X denote an exponential random variable with parameter $\lambda \in (0, \infty)$. The
probability density function of X is given by $f_X(x) = \lambda e^{-\lambda x}$, for $x > 0$.
(1) Calculate the mean directly.
(2) Calculate the variance directly.
(3) Calculate the moment generating function (Laplace transform).
(4) Using the moment generating function, derive the mean and the variance.
Ans:
1. $E[X] = \int_0^{\infty} x \, dF_X(x) = \int_0^{\infty} \lambda x e^{-\lambda x} \, dx = \frac{1}{\lambda}$.
2. Since $E[X^2] = \int_0^{\infty} x^2 \, dF_X(x) = \int_0^{\infty} \lambda x^2 e^{-\lambda x} \, dx = \frac{2}{\lambda^2}$, we have $\mathrm{Var}[X] = E[X^2] - E[X]^2 = \frac{1}{\lambda^2}$.
3. $\Phi(s) = E[e^{-sX}] = \int_0^{\infty} e^{-sx} \lambda e^{-\lambda x} \, dx = \frac{\lambda}{\lambda + s}$.
4. $E[X] = \lim_{s \to 0} \left(-\frac{d}{ds} \frac{\lambda}{\lambda + s}\right) = \frac{1}{\lambda}$ and $E[X^2] = \lim_{s \to 0} \frac{d^2}{ds^2} \frac{\lambda}{\lambda + s} = \frac{2}{\lambda^2}$. Thus, $\mathrm{Var}[X] = E[X^2] - E[X]^2 = \frac{1}{\lambda^2}$.
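Remark (illustrative only): a short symbolic sketch of step 4, assuming the sympy package is available, recovers $E[X]$ and $E[X^2]$ by differentiating $\Phi(s) = \lambda/(\lambda + s)$ at $s = 0$.

```python
# Illustrative symbolic check for Problem 2(4): recover E[X] and E[X^2]
# from the Laplace transform Phi(s) = lambda / (lambda + s) by
# differentiating at s = 0 (sympy is assumed to be available).
import sympy as sp

s, lam = sp.symbols('s lam', positive=True)
phi = lam / (lam + s)                       # E[e^{-sX}] for X ~ Exp(lam)

EX = sp.limit(-sp.diff(phi, s), s, 0)       # first moment: 1/lam
EX2 = sp.limit(sp.diff(phi, s, 2), s, 0)    # second moment: 2/lam^2
var = sp.simplify(EX2 - EX**2)              # variance: 1/lam^2

print(EX, EX2, var)   # -> 1/lam, 2/lam**2, lam**(-2)
```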

Problem 3
Let X and Y denote two independent Poisson random variables with parameters
$\lambda_X$ and $\lambda_Y$, respectively. Answer the following True/False problems. You need to
justify your answers.
1. Z = X + Y is a Poisson random variable. If true, determine its parameter.
2. Z = X + 1 is a Poisson random variable. If true, determine its parameter.
3. Z = 10X + 5Y is a Poisson random variable. If true, determine its parameter.
Ans:
1. The statement is true. Let $g_X(z)$ and $g_Y(z)$ denote the generating functions of
the probability mass functions for X and Y, and let $g_Z(z)$ denote the generating
function of the probability mass function for $X + Y$. If X and Y are statistically
independent, $g_{X+Y}(z) = g_X(z)\,g_Y(z)$. In this problem, for X and Y, we have
$$g_X(z) = \sum_{n=0}^{\infty} z^n \frac{e^{-\lambda_X} (\lambda_X)^n}{n!} = e^{-\lambda_X} \sum_{n=0}^{\infty} \frac{(z\lambda_X)^n}{n!} = e^{-\lambda_X} e^{z\lambda_X} = e^{-\lambda_X(1-z)},$$
$$g_Y(z) = e^{-\lambda_Y(1-z)},$$
$$g_{X+Y}(z) = g_X(z)\,g_Y(z) = e^{-(\lambda_X + \lambda_Y)(1-z)}.$$



Thus, from the uniqueness of the generating function, we conclude that the
random variable $X + Y$ has the Poisson distribution with parameter $\lambda_X + \lambda_Y$.
2. The statement is false, since $\Pr\{Z = 0\} = \Pr\{X + 1 = 0\} = 0$, whereas a Poisson
random variable takes the value 0 with positive probability.
3. The statement is false. The mean and variance of a Poisson random variable
must be equal. The mean of Z is equal to $10\lambda_X + 5\lambda_Y$, but the variance of Z is
equal to $100\lambda_X + 25\lambda_Y$. In general, these two statistics are not equal.
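Remark (illustrative simulation, not required for the argument): with arbitrary rates $\lambda_X = 4$ and $\lambda_Y = 3$, the sketch below shows that $X + Y$ has matching empirical mean and variance (consistent with a Poisson law), while $10X + 5Y$ has a variance far larger than its mean and therefore cannot be Poisson.

```python
# Illustrative simulation for Problem 3: with arbitrary rates lam_x, lam_y,
# X + Y behaves like Poisson(lam_x + lam_y) (mean ~= variance), whereas
# 10X + 5Y has variance >> mean and so cannot be Poisson.
import numpy as np

rng = np.random.default_rng(0)
lam_x, lam_y, n = 4.0, 3.0, 200_000
X = rng.poisson(lam_x, n)
Y = rng.poisson(lam_y, n)

Z1 = X + Y
Z3 = 10 * X + 5 * Y

print(Z1.mean(), Z1.var())   # both close to lam_x + lam_y = 7
print(Z3.mean(), Z3.var())   # ~ 10*4 + 5*3 = 55 vs ~ 100*4 + 25*3 = 475
```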

Problem 4
Suppose that the number of customers entering a department store in a day is a
random variable with a mean of 30 customers/day. Suppose that the amounts of money
spent by each one of these customers are statistically independent random variables
with mean $10 (per customer). Also assume that the amount of money spent by each
customer is independent of the number of customers to enter the store. Calculate
the expected amount of money spent by the customers that enter the store during a
single day.
Ans:
Let $X_i$ denote the amount of money spent by the $i$-th customer. Then the total
amount of money spent by customers per day, denoted by Y, has a mean that is
computed as follows:
$$E[Y] = E\left[E\left[\sum_{i=1}^{N} X_i \,\Big|\, N\right]\right] = E\left[\sum_{i=1}^{N} E[X_i]\right] = E\left[\sum_{i=1}^{N} 10\right] = 10\,E[N] = \$300.$$
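Remark (illustrative check): the problem specifies only the means, so the simulation below assumes, purely for illustration, that N is Poisson with mean 30 and that each customer's spending is exponentially distributed with mean 10 dollars; the estimated daily total comes out close to 300 dollars, and the identity $E[Y] = E[N]\,E[X_i]$ does not depend on these arbitrary distributional choices.

```python
# Illustrative check of E[Y] = E[N] * E[X_i] = 30 * 10 = 300. The problem
# fixes only the means, so the Poisson(30) customer count and Exp(mean 10)
# spending amounts below are arbitrary modelling choices for the simulation.
import numpy as np

rng = np.random.default_rng(1)
days = 100_000
N = rng.poisson(30, size=days)                       # customers per day
total = np.array([rng.exponential(10.0, size=k).sum() for k in N])

print(total.mean())   # close to 300 regardless of the chosen distributions
```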

Problem 5
Define X and Y to be two discrete random variables whose joint probability mass
function is given as follows:
$$P(X = m, Y = n) = \frac{e^{-7}\, 4^m\, 3^{n-m}}{m!\,(n-m)!},$$
for $m = 0, 1, \ldots, n$ and $n = 0, 1, 2, \ldots$, while $P(X = m, Y = n) = 0$ for other values


of m, n. Calculate the marginal probability mass functions for the random variables
X and Y . Check whether X and Y are statistically independent random variables.
Ans:
Discrete random variables X and Y are independent if and only if
$P(X = m, Y = n) = P(X = m)\,P(Y = n)$ for all $m, n$. The marginal probability mass function for the random variable X is
calculated to be:

$$P(X = m) = \sum_{n=m}^{\infty} P(X = m, Y = n) = \sum_{n=m}^{\infty} \frac{e^{-7}\, 4^m\, 3^{n-m}}{m!\,(n-m)!} = \frac{e^{-4}\, 4^m}{m!} \sum_{n=m}^{\infty} \frac{e^{-3}\, 3^{n-m}}{(n-m)!} = \frac{e^{-4}\, 4^m}{m!}.$$


The marginal probability mass function for the random variable Y is computed
to be:
$$P(Y = n) = \sum_{m=0}^{n} P(X = m, Y = n) = \sum_{m=0}^{n} \frac{e^{-7}\, 4^m\, 3^{n-m}}{m!\,(n-m)!} = \frac{e^{-7}\, 7^n}{n!} \sum_{m=0}^{n} \binom{n}{m} \left(\frac{4}{7}\right)^m \left(1 - \frac{4}{7}\right)^{n-m} = \frac{e^{-7}\, 7^n}{n!}.$$

Since $P(X = m, Y = n)$ is not equal to $P(X = m)\,P(Y = n)$ (for example, the joint PMF is zero
whenever $m > n$, while the product of the marginals is positive), X and Y are not
statistically independent random variables.
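Remark (numerical check, not part of the original solution): the sketch below tabulates the joint PMF on a truncated grid, confirms that the marginals match Poisson(4) and Poisson(7), and exhibits a pair $(m, n)$ with $m > n$ at which the joint PMF is zero while the product of the marginals is positive, so independence fails.

```python
# Illustrative numerical check for Problem 5 over a truncated grid:
# the marginals come out Poisson(4) and Poisson(7), but the joint PMF
# does not factor as P(X=m) * P(Y=n), so X and Y are dependent.
import numpy as np
from math import factorial, exp

M = 40  # truncation; probability mass beyond this range is negligible
joint = np.zeros((M, M))
for n in range(M):
    for m in range(n + 1):
        joint[m, n] = exp(-7) * 4**m * 3**(n - m) / (factorial(m) * factorial(n - m))

px = joint.sum(axis=1)        # ~ e^{-4} 4^m / m!
py = joint.sum(axis=0)        # ~ e^{-7} 7^n / n!

print(np.allclose(px[:10], [exp(-4) * 4**m / factorial(m) for m in range(10)]))
print(np.allclose(py[:10], [exp(-7) * 7**n / factorial(n) for n in range(10)]))
print(np.isclose(joint[2, 1], px[2] * py[1]))   # False: joint is 0 for m > n, product is not
```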
