Lecture 4
Probability and Random Processes
By Tafari Lemma
Lecture Outline
• Joint PMF
• Joint CDF
• Joint PDF
• Marginal Statistics
• Independence
• Conditional Distributions
• Correlation and Covariance
Multiple Random Variables
Sometimes we must deal with multiple random variables simultaneously.
Example: Let X and Y denote the blood pressure and heart rate
of a randomly chosen ASTU student (during an exam).
These two quantities are likely to be related, and describing the
probability distribution of X and Y separately will not capture the
relation between the two.
Therefore we need a joint description of the distribution. Let’s
focus first on discrete random variables.
Joint Probability Mass Functions
Definition: Joint Probability Mass Function
P_XY(x_i, y_j) = P(X = x_i, Y = y_j)
This is quite similar to what we had before, but now we are describing the two
random variables jointly.
Example
Suppose you want to study the relation between the number of
bars in your mobile, and the quality of the call. You collect data
over time and come up with the following stochastic model:
           x = 1   x = 2   x = 3   x = 4
   y = 0     0       0       0       0
   y = 1   0.10    0.05      0       0
   y = 2   0.05    0.10    0.04    0.01
   y = 3     0     0.05    0.15    0.15
   y = 4     0       0     0.05    0.25
(x = number of bars, y = call quality)
It's easy to check that this table describes a valid joint probability mass
function: the entries are nonnegative and they sum to one.
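As a sanity check, here is a minimal Python sketch (not part of the original slides) that stores this joint pmf as a dictionary, with x = number of bars and y = call quality, and verifies that it is a valid pmf:

    # Joint pmf P_XY(x, y) for x = bars (1-4), y = call quality (0-4),
    # stored as {(x, y): probability}; omitted pairs have probability 0.
    joint_pmf = {
        (1, 1): 0.10, (2, 1): 0.05,
        (1, 2): 0.05, (2, 2): 0.10, (3, 2): 0.04, (4, 2): 0.01,
        (2, 3): 0.05, (3, 3): 0.15, (4, 3): 0.15,
        (3, 4): 0.05, (4, 4): 0.25,
    }

    # Valid joint pmf: nonnegative entries that sum to 1 (up to rounding).
    assert all(p >= 0 for p in joint_pmf.values())
    assert abs(sum(joint_pmf.values()) - 1.0) < 1e-9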
Marginal Probability Distributions
Obviously we should also be able to say something about X and
Y separately.
Definition: Marginal Probability Mass Function
P_X(x_i) = Σ_{y_j} P_XY(x_i, y_j),   P_Y(y_j) = Σ_{x_i} P_XY(x_i, y_j)
These are valid probability mass functions on their own, and are called the
marginal p.m.f.'s of X and Y.
Example
Suppose you want to study the relation between the number of
bars in your mobile, and the quality of the call. You collect data
over time and come up with the following stochastic model:
           x = 1   x = 2   x = 3   x = 4   P_Y(y)
   y = 0     0       0       0       0       0
   y = 1   0.10    0.05      0       0     0.15
   y = 2   0.05    0.10    0.04    0.01    0.20
   y = 3     0     0.05    0.15    0.15    0.35
   y = 4     0       0     0.05    0.25    0.30
   P_X(x)  0.15    0.20    0.24    0.41
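The marginal row and column sums above can be recomputed mechanically; a short sketch, reusing the joint_pmf dictionary from the earlier snippet:

    from collections import defaultdict

    # Marginal pmfs: sum the joint pmf over the other variable.
    p_x, p_y = defaultdict(float), defaultdict(float)
    for (x, y), p in joint_pmf.items():
        p_x[x] += p   # P_X(x) = sum over y of P_XY(x, y)
        p_y[y] += p   # P_Y(y) = sum over x of P_XY(x, y)

    print(dict(p_x))  # ≈ {1: 0.15, 2: 0.20, 3: 0.24, 4: 0.41}
    print(dict(p_y))  # ≈ {1: 0.15, 2: 0.20, 3: 0.35, 4: 0.30}; y = 0 has probability 0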
Conditional Probability Distribution
What information does one variable carry about the other?
In our example we might be interested to know about the call
quality when we have 4 bars.
Definition: Conditional Probability Mass Function
P_{Y|X}(y_j | x_i) = P_XY(x_i, y_j) / P_X(x_i)   (for P_X(x_i) > 0),
and similarly P_{X|Y}(x_i | y_j) = P_XY(x_i, y_j) / P_Y(y_j).
Example
           x = 1   x = 2   x = 3   x = 4   P_Y(y)
   y = 0     0       0       0       0       0
   y = 1   0.10    0.05      0       0     0.15
   y = 2   0.05    0.10    0.04    0.01    0.20
   y = 3     0     0.05    0.15    0.15    0.35
   y = 4     0       0     0.05    0.25    0.30
   P_X(x)  0.15    0.20    0.24    0.41
For instance, given X = 4 bars, P_{Y|X}(y | 4) = P_XY(4, y) / P_X(4):
P_{Y|X}(2 | 4) = 0.01/0.41 ≈ 0.02,  P_{Y|X}(3 | 4) = 0.15/0.41 ≈ 0.37,
P_{Y|X}(4 | 4) = 0.25/0.41 ≈ 0.61.
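The conditional pmf of call quality given X = 4 bars follows directly from the definition; a sketch, again reusing the joint_pmf dictionary from the earlier snippet:

    # P_{Y|X}(y | 4) = P_XY(4, y) / P_X(4).
    p_x4 = sum(p for (x, y), p in joint_pmf.items() if x == 4)       # P_X(4) = 0.41
    cond = {y: p / p_x4 for (x, y), p in joint_pmf.items() if x == 4}
    print(cond)  # ≈ {2: 0.024, 3: 0.366, 4: 0.610}; quality is likely high with 4 bars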
Properties
Note that the conditional probability mass function is itself a proper mass
function; therefore
Σ_{y_j} P_{Y|X}(y_j | x_i) = 1   for every x_i with P_X(x_i) > 0.
Independence of Random V.’s
There are situations where knowing the value of X doesn't tell us anything about
Y and vice-versa. This brings up the notion of independence of random variables.
Definition: Independence of Random Variables
X and Y are independent if and only if P_XY(x_i, y_j) = P_X(x_i) P_Y(y_j) for all
x_i and y_j.
Example
           x = 1   x = 2   x = 3   x = 4   P_Y(y)
   y = 0     0       0       0       0       0
   y = 1   0.10    0.05      0       0     0.15
   y = 2   0.05    0.10    0.04    0.01    0.20
   y = 3     0     0.05    0.15    0.15    0.35
   y = 4     0       0     0.05    0.25    0.30
   P_X(x)  0.15    0.20    0.24    0.41
Here, for instance, P_XY(1, 1) = 0.10 while P_X(1) P_Y(1) = (0.15)(0.15) = 0.0225,
so X and Y are not independent.
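Independence requires the factorization to hold for every pair (x, y), so a single failing pair settles the question. A sketch, reusing joint_pmf, p_x, and p_y from the earlier snippets:

    # X, Y independent  <=>  P_XY(x, y) == P_X(x) * P_Y(y) for ALL pairs.
    independent = all(
        abs(joint_pmf.get((x, y), 0.0) - p_x[x] * p_y[y]) < 1e-9
        for x in p_x for y in p_y
    )
    print(independent)  # False: e.g. P_XY(1, 1) = 0.10 but P_X(1) * P_Y(1) = 0.0225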
Example
The joint pmf of two discrete random variables X and
Y is given by:
P_XY(x_i, y_j) = { k(2x_i + y_j),   x_i = 1, 2;  y_j = 1, 2
                 { 0,               otherwise
where k is a constant.
a. Find the value of k.
b. Find the marginal pmfs of X and Y.
c. Are X and Y independent?
Example
a. Σ_{x_i} Σ_{y_j} P_XY(x_i, y_j) = 1
   Σ_{x_i=1}^{2} Σ_{y_j=1}^{2} k(2x_i + y_j) = 1
   k[(2 + 1) + (2 + 2) + (4 + 1) + (4 + 2)] = 1
   18k = 1
   k = 1/18
Example
b. Marginal pmf of X and Y
i. Marginal pmf of X
   P_X(x_i) = Σ_{y_j} P_XY(x_i, y_j) = Σ_{y_j=1}^{2} (1/18)(2x_i + y_j)
   P_X(x_i) = (1/18)(2x_i + 1) + (1/18)(2x_i + 2)
   P_X(x_i) = { (1/18)(4x_i + 3),  x_i = 1, 2
              { 0,                 otherwise
Example
b. Marginal pmf of X and Y
ii. Marginal pmf of Y
   P_Y(y_j) = Σ_{x_i} P_XY(x_i, y_j) = Σ_{x_i=1}^{2} (1/18)(2x_i + y_j)
   P_Y(y_j) = (1/18)(2 + y_j) + (1/18)(4 + y_j)
   P_Y(y_j) = { (1/18)(2y_j + 6),  y_j = 1, 2
              { 0,                 otherwise
Example
c. P_XY(x_i, y_j) ≠ P_X(x_i) P_Y(y_j): for example, P_XY(1, 1) = 3/18 ≈ 0.167
   while P_X(1) P_Y(1) = (7/18)(8/18) ≈ 0.173.
   Therefore X and Y are not independent.
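Parts (a)-(c) can be double-checked with exact rational arithmetic; a sketch using Python's fractions module:

    from fractions import Fraction

    k = Fraction(1, 18)
    pmf = {(x, y): k * (2 * x + y) for x in (1, 2) for y in (1, 2)}

    assert sum(pmf.values()) == 1                      # (a): k = 1/18 normalizes the pmf
    p_x = {x: sum(pmf[(x, y)] for y in (1, 2)) for x in (1, 2)}
    p_y = {y: sum(pmf[(x, y)] for x in (1, 2)) for y in (1, 2)}
    assert p_x[1] == Fraction(7, 18)                   # (b): matches (4x + 3)/18 at x = 1
    assert p_y[2] == Fraction(10, 18)                  # (b): matches (2y + 6)/18 at y = 2
    assert pmf[(1, 1)] != p_x[1] * p_y[1]              # (c): not independent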
The Joint Cumulative Distribution Function
▪ The joint cdf of two random variables X and Y, denoted by F_XY(x, y), is a
  function defined by:
  F_XY(x, y) = P[X ≤ x and Y ≤ y] = P(X ≤ x, Y ≤ y)
  where x and y are arbitrary real numbers.
Properties of the Joint cdf, F_XY(x, y):
i.   0 ≤ F_XY(x, y) ≤ 1
ii.  lim_{x→∞, y→∞} F_XY(x, y) = F_XY(∞, ∞) = 1
iii. lim_{x→−∞, y→−∞} F_XY(x, y) = F_XY(−∞, −∞) = 0
Continuous Random Variables
All the concepts we introduced in this lecture can also be defined
for continuous random variables. However, this requires the use
of multiple integrals (essentially replacing the double summations
we’ve seen before).
Definition: Joint Probability Density Function
The joint pdf f_XY(x, y) is the nonnegative function satisfying
P[(X, Y) ∈ A] = ∫∫_A f_XY(x, y) dx dy;
it is defined formally on the next slide in terms of the joint cdf.
The Joint Probability Density Function
▪ The joint probability density function (pdf) of two continuous random
  variables X and Y is defined as:
  f_XY(x, y) = ∂²F_XY(x, y) / ∂x∂y
▪ Thus, the joint cumulative distribution function (cdf) is given by:
  F_XY(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_XY(u, v) du dv
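This pdf/cdf relationship can be verified symbolically. A sketch using sympy with an assumed example (not from the slides): F_XY(x, y) = xy on the unit square, the joint cdf of two independent Uniform(0, 1) random variables:

    import sympy as sp

    x, y, u, v = sp.symbols('x y u v')
    F = x * y                       # joint cdf for 0 <= x, y <= 1 (independent uniforms)
    f = sp.diff(F, x, y)            # joint pdf: second mixed partial of the cdf
    print(f)                        # 1 (uniform density on the unit square)

    # Integrating the pdf recovers the cdf on the unit square:
    F_back = sp.integrate(f.subs({x: u, y: v}), (u, 0, x), (v, 0, y))
    print(sp.simplify(F_back - F))  # 0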
Marginal Probability Distributions
Obviously we should also be able to say something about X and
Y separately.
Definition: Marginal Density Function
f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy,   f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx
These are valid probability density functions on their own, and describe X and Y
individually (but disregard how these two random variables are related).
Conditional Probability Distributions
What information does one variable carry about the other?
Definition: Conditional Probability Density Function
f_{X|Y}(x | y) = f_XY(x, y) / f_Y(y)   (for f_Y(y) > 0),
and similarly f_{Y|X}(y | x) = f_XY(x, y) / f_X(x).
Properties
Note that the conditional probability density function is itself a proper density
function; therefore
∫_{−∞}^{∞} f_{X|Y}(x | y) dx = 1   for every y with f_Y(y) > 0.
Example
Example: Let X be the input to a communication channel and Y the output. The
input to the channel is +1 volt or −1 volt with equal probability. The output of
the channel is the input plus a noise voltage N that is uniformly distributed in
the interval [−2, +2] volts. Find P[X = +1, Y ≤ 0].
Solution:
P[X = +1, Y ≤ y] = P[Y ≤ y | X = +1] P[X = +1],
where P[X = +1] = 1/2. When the input X = +1, the output Y is uniformly
distributed in the interval [−1, 3]. Therefore,
P[Y ≤ y | X = +1] = (y + 1)/4   for −1 ≤ y ≤ 3.
Setting y = 0 gives P[X = +1, Y ≤ 0] = (1/4)(1/2) = 1/8.
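The answer 1/8 is easy to sanity-check by simulating the channel; a Monte Carlo sketch (not from the slides):

    import random

    random.seed(0)
    trials = 1_000_000
    hits = 0
    for _ in range(trials):
        x = random.choice([+1, -1])      # input: +/- 1 volt with equal probability
        n = random.uniform(-2.0, 2.0)    # noise: uniform on [-2, +2] volts
        if x == +1 and x + n <= 0:       # event {X = +1, Y <= 0}, with Y = X + N
            hits += 1
    print(hits / trials)                 # ≈ 0.125 = 1/8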
Example
Example: Let X be the input to a communication channel and let Y be the output.
The input to the channel is +1 volt or −1 volt with equal probability. The output
of the channel is the input plus a noise voltage N that is uniformly distributed
in the interval [−2, +2] volts. Find the probability that Y is negative given
that X is +1.
Solution:
If X = +1, then Y is uniformly distributed in the interval [−1, 3] and
f_Y(y | 1) = 1/4,   −1 ≤ y ≤ 3.
Thus
P[Y < 0 | X = +1] = ∫_{−1}^{0} (1/4) dy = 1/4.
Independence of Random Variables
As with discrete random variables, there are situations where knowing the value
of X doesn't tell us anything about Y and vice-versa. This brings up again the
notion of independence of random variables.
Definition: Independence of Random Variables
X and Y are independent if and only if f_XY(x, y) = f_X(x) f_Y(y) for all x and y.
Examples on Two Random Variables
Example-1:
The joint pdf of two continuous random variables X and Y is
given by:
f_XY(x, y) = { kxy,  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
             { 0,    otherwise
where k is a constant.
a. Find the value of k.
b. Find the marginal pdfs of X and Y.
c. Are X and Y independent?
d. Find P(X + Y < 1).
e. Find the conditional pdfs of X and Y.
Examples on Two Random Variables Cont’d……
a. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
   ∫_0^1 ∫_0^1 kxy dx dy = 1
   k ∫_0^1 y [x²/2]_0^1 dy = (k/2) ∫_0^1 y dy = (k/2)[y²/2]_0^1 = k/4 = 1
   k = 4
Examples on Two Random Variables Cont’d……
b. Marginal pdf of X and Y
i. Marginal pdf of X
   f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy = ∫_0^1 4xy dy = 4x [y²/2]_0^1 = 2x
   f_X(x) = { 2x,  0 ≤ x ≤ 1
            { 0,   otherwise
Examples on Two Random Variables Cont’d……
b. Marginal pdf of X and Y
ii. Marginal pdf of Y
   f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx = ∫_0^1 4xy dx = 4y [x²/2]_0^1 = 2y
   f_Y(y) = { 2y,  0 ≤ y ≤ 1
            { 0,   otherwise
Examples on Two Random Variables Cont’d……
c. f_XY(x, y) = f_X(x) f_Y(y), since (2x)(2y) = 4xy.
   X and Y are independent.
d. P(X + Y < 1) = ∫_0^1 ∫_0^{1−y} 4xy dx dy = ∫_0^1 4y [x²/2]_0^{1−y} dy
   = ∫_0^1 2y(1 − y)² dy = ∫_0^1 2(y − 2y² + y³) dy
   = 2[y²/2 − 2y³/3 + y⁴/4]_0^1 = 1/6
   P(X + Y < 1) = 1/6
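Parts (a) and (d) can be verified numerically; a sketch using sympy for the normalization constant and a Monte Carlo estimate for P(X + Y < 1), sampling by inverse cdf (F_X(x) = x², so X = √U for U uniform):

    import random
    import sympy as sp

    # (a) Solve  ∫∫ k x y dx dy = 1  over the unit square for k.
    x, y, k = sp.symbols('x y k', positive=True)
    total = sp.integrate(k * x * y, (x, 0, 1), (y, 0, 1))
    print(sp.solve(sp.Eq(total, 1), k))   # [4]

    # (d) X and Y are independent with cdf x^2 (part c), so sample X = sqrt(U).
    random.seed(0)
    trials = 1_000_000
    hits = sum(random.random() ** 0.5 + random.random() ** 0.5 < 1
               for _ in range(trials))
    print(hits / trials)                  # ≈ 0.1667 = 1/6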
Examples on Two Random Variables Cont’d……
e. Conditional pdf of X and Y
i. Conditional pdf of X
   f_{X|Y}(x | y) = f_XY(x, y) / f_Y(y) = 4xy / (2y) = 2x
   f_{X|Y}(x | y) = { 2x,  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
                   { 0,   otherwise
Examples on Two Random Variables Cont’d……
e. Conditional pdf of X and Y
ii. Conditional pdf of Y
   f_{Y|X}(y | x) = f_XY(x, y) / f_X(x) = 4xy / (2x) = 2y
   f_{Y|X}(y | x) = { 2y,  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
                   { 0,   otherwise
Examples on Two Random Variables Cont’d……
Example-2:
The joint pdf of two continuous random variables X and Y is
given by:
f_XY(x, y) = { k,  0 ≤ y ≤ x ≤ 1
             { 0,  otherwise
where k is a constant.
a. Determine the value of k.
b. Find the marginal pdfs of X and Y.
c. Are X and Y independent?
d. Find P(0 < X ≤ 1/2).
e. Find the conditional pdfs of X and Y.
Examples on Two Random Variables Cont’d……
a. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
   ∫_0^1 ∫_y^1 k dx dy = 1
   k ∫_0^1 [x]_y^1 dy = k ∫_0^1 (1 − y) dy = k [y − y²/2]_0^1 = k/2 = 1
   k = 2
Examples on Two Random Variables Cont’d……
b. Marginal pdf of X and Y
i. Marginal pdf of X
   f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy = ∫_0^x 2 dy = [2y]_0^x = 2x
   f_X(x) = { 2x,  0 ≤ x ≤ 1
            { 0,   otherwise
Examples on Two Random Variables Cont’d……
b. Marginal pdf of X and Y
ii. Marginal pdf of Y
   f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx = ∫_y^1 2 dx = [2x]_y^1 = 2(1 − y)
   f_Y(y) = { 2(1 − y),  0 ≤ y ≤ 1
            { 0,          otherwise
Examples on Two Random Variables Cont’d……
c. f_XY(x, y) ≠ f_X(x) f_Y(y), since 2 ≠ (2x) · 2(1 − y) in general.
   X and Y are not independent.
d. P(0 < X ≤ 1/2) = ∫_0^{1/2} ∫_0^x f_XY(x, y) dy dx = ∫_0^{1/2} ∫_0^x 2 dy dx
   = ∫_0^{1/2} [2y]_0^x dx = ∫_0^{1/2} 2x dx = [x²]_0^{1/2} = 1/4
   P(0 < X ≤ 1/2) = 1/4
Examples on Two Random Variables Cont’d……
e. Conditional pdf of X and Y
i. Conditional pdf of X
   f_{X|Y}(x | y) = f_XY(x, y) / f_Y(y) = 2 / (2(1 − y)) = 1/(1 − y)
   f_{X|Y}(x | y) = { 1/(1 − y),  0 ≤ y ≤ x ≤ 1
                   { 0,           otherwise
Examples on Two Random Variables Cont’d
e. Conditional pdf of X and Y
ii. Conditional pdf of Y
   f_{Y|X}(y | x) = f_XY(x, y) / f_X(x) = 2 / (2x) = 1/x
   f_{Y|X}(y | x) = { 1/x,  0 ≤ y ≤ x ≤ 1
                   { 0,    otherwise
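A similar numerical sanity check for Example-2 (a sketch, not from the slides): since (X, Y) is uniform on the triangle 0 ≤ y ≤ x ≤ 1, we can sample the unit square, reject points with y > x, and estimate P(0 < X ≤ 1/2):

    import random

    random.seed(0)
    hits = kept = 0
    while kept < 1_000_000:
        x, y = random.random(), random.random()
        if y > x:
            continue                 # rejection step: keep only 0 <= y <= x <= 1
        kept += 1
        if x <= 0.5:                 # event {0 < X <= 1/2} (x > 0 almost surely)
            hits += 1
    print(hits / kept)               # ≈ 0.25 = 1/4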
Covariance
Definition: Covariance
Cov(X, Y) = E[(X − E[X])(Y − E[Y])]
Covariance
The covariance is a measure of the linear relationship between two random
variables. If one of the variables is easy to predict as a linear function of
the other, then the covariance is going to be non-zero.
Definition:
Expanding the product gives the equivalent form
Cov(X, Y) = E[XY] − E[X] E[Y]
Correlation Coefficient
It is useful to normalize the covariance, and define the correlation coefficient.
Definition: Correlation Coefficient
ρ_XY = Cov(X, Y) / (σ_X σ_Y),   where σ_X and σ_Y are the standard deviations of
X and Y. It always satisfies −1 ≤ ρ_XY ≤ 1.
Example
Recall our previous example
           x = 1   x = 2   x = 3   x = 4   P_Y(y)
   y = 0     0       0       0       0       0
   y = 1   0.10    0.05      0       0     0.15
   y = 2   0.05    0.10    0.04    0.01    0.20
   y = 3     0     0.05    0.15    0.15    0.35
   y = 4     0       0     0.05    0.25    0.30
   P_X(x)  0.15    0.20    0.24    0.41
Cont'd
From the marginals:
E[X] = 1(0.15) + 2(0.20) + 3(0.24) + 4(0.41) = 2.91
E[Y] = 0(0) + 1(0.15) + 2(0.20) + 3(0.35) + 4(0.30) = 2.80
From the joint pmf:
E[XY] = Σ_x Σ_y x y P_XY(x, y) = 9.07
Cov(X, Y) = E[XY] − E[X] E[Y] = 9.07 − (2.91)(2.80) ≈ 0.92
With E[X²] = 9.67 and E[Y²] = 8.90:
Var(X) = 9.67 − (2.91)² ≈ 1.20,   Var(Y) = 8.90 − (2.80)² = 1.06
ρ_XY = Cov(X, Y) / √(Var(X) Var(Y)) ≈ 0.92 / √(1.20 × 1.06) ≈ 0.82
Since the correlation coefficient (≈ 0.82) is significantly different from zero,
Y can be predicted reasonably well from X using a linear relationship.
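These numbers can be reproduced from the joint_pmf dictionary introduced after the first example; a sketch:

    from math import sqrt

    def E(g):
        # Expectation of g(X, Y) under the joint pmf.
        return sum(g(x, y) * p for (x, y), p in joint_pmf.items())

    ex, ey = E(lambda x, y: x), E(lambda x, y: y)          # 2.91, 2.80
    cov = E(lambda x, y: x * y) - ex * ey                  # ≈ 0.92
    var_x = E(lambda x, y: x * x) - ex ** 2                # ≈ 1.20
    var_y = E(lambda x, y: y * y) - ey ** 2                # ≈ 1.06
    rho = cov / sqrt(var_x * var_y)
    print(round(cov, 2), round(rho, 2))                    # 0.92 0.82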
Correlation of Independent R.V’s
Proposition:
If X and Y are independent, then E[XY] = E[X] E[Y], and hence Cov(X, Y) = 0:
independent random variables are always uncorrelated.
Correlation
Uncorrelatedness IS NOT EQUIVALENT to independence.
It's important to note that the implication goes only in one direction:
independence implies Cov(X, Y) = 0, but Cov(X, Y) = 0 does not imply
independence. For example, if X is uniform on {−1, 0, 1} and Y = X², then
Cov(X, Y) = E[X³] − E[X] E[X²] = 0, yet Y is completely determined by X.
Thank You !!!