BA (H) Economics, Semester I, DSC-3 (Sugandh Kr Choudhary, School of Open Learning, University of Delhi)


Introductory Statistics for Economics

LESSON 10
JOINT PROBABILITY DISTRIBUTION AND MATHEMATICAL EXPECTATIONS

STRUCTURE
10.1 Learning Objectives
10.2 Introduction
10.3 Joint Probability Mass Function
10.3.1 Conditional Probability Distributions
10.3.2 Independence of Random Variables
10.3.3 Marginal Probability Mass Functions
10.3.4 Expectations of Probability Mass Functions
10.4 Continuous Random Variables
10.4.1 Marginal Probability Density Functions
10.4.2 Expected Value of a Probability Density Function
10.4.3 Conditional Probability Distributions
10.5 Summary
10.6 Glossary
10.7 Answers to In-Text Questions
10.8 Self-Assessment Questions
10.9 References
10.10 Suggested Reading

10.1 LEARNING OBJECTIVES

After reading this lesson, students will be able to:
1. Identify the probability distribution of two or more events occurring together
2. Calculate marginal distributions of more than one variable for discrete and continuous distributions


3. Calculate conditional probability and verify independence of probability distributions, and
4. Calculate mathematical expectations of joint probability mass functions and joint probability density functions.
10.2 INTRODUCTION
This chapter deals with the probability distribution of two or more random variables, called the joint probability distribution. A joint distribution is described in one of two ways: by a probability mass function (PMF) when the variables are discrete, and by a probability density function (PDF) when they are continuous. The first half of the chapter deals with the joint probability mass function and the second half with the joint probability density function.
10.3 JOINT PROBABILITY MASS FUNCTION
The joint probability mass function describes the probability distribution of two discrete random variables. It is characterized by the following features.
Let X and Y be two discrete random variables on the sample space S. The joint probability mass function (PMF) is given by

P(x, y) = P(X = x and Y = y)

where (x, y) is a pair of possible values for the pair of random variables (X, Y), and P(x, y) must satisfy the following conditions:

(a) 0 ≤ P(x, y) ≤ 1

(b) ∑_x ∑_y P(x, y) = 1

(c) The probability P[(X, Y) ∈ A] is obtained by summing the joint PMF over A:
    P[(X, Y) ∈ A] = ∑_{(x, y) ∈ A} P(x, y)

It must be noted that conditions (a) and (b) are required for P(x, y) to be a valid joint PMF.
Example-1
Consider two random variables X and Y with joint PMF as shown in the table below:

         Y=0     Y=1     Y=2
X=0      1/6     1/4     1/8
X=1      1/8     1/6     1/6

Find the following

(i) P(X = 0, Y ≤ 1)
(ii) P(Y = 2, X ≤ 1)
Solution:
(i) P(X = 0, Y ≤ 1) = P_XY(0, 0) + P_XY(0, 1) = 1/6 + 1/4 = 5/12
(ii) P(Y = 2, X ≤ 1) = P_XY(0, 2) + P_XY(1, 2) = 1/8 + 1/6 = 7/24
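As a quick check, the table can be encoded in Python with exact fractions; a minimal sketch:

from fractions import Fraction as F

# Joint PMF of Example-1, keyed by (x, y)
p = {(0, 0): F(1, 6), (0, 1): F(1, 4), (0, 2): F(1, 8),
     (1, 0): F(1, 8), (1, 1): F(1, 6), (1, 2): F(1, 6)}

# Condition (b): all probabilities must sum to 1
assert sum(p.values()) == 1

# (i) P(X = 0, Y <= 1)
p_i = sum(v for (x, y), v in p.items() if x == 0 and y <= 1)
# (ii) P(Y = 2, X <= 1)
p_ii = sum(v for (x, y), v in p.items() if y == 2 and x <= 1)

print(p_i, p_ii)   # 5/12 7/24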

Example-2
A function f is given by f(x, y) = cxy for x = 1, 2, 3; y = 1, 2, 3.
Determine the value of c for which the function f(x, y) is a valid p.m.f.
Solution:
From the question given,
f(x, y) = cxy, where x = 1, 2, 3 and y = 1, 2, 3.
There are 9 possible pairs of X and Y, namely (1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2) and (3,3). The probabilities associated with each of the pairs are:
f(1,1) = c(1)(1) = c
f(1,2) = 2c, f(1,3) = 3c, f(2,1) = 2c
f(2,2) = 4c, f(2,3) = 6c, f(3,1) = 3c
f(3,2) = 6c, f(3,3) = 9c
For f(x, y) to be a valid joint p.m.f.,

∑_x ∑_y f(x, y) = 1

Hence,


∑_{x=1}^{3} ∑_{y=1}^{3} f(x, y) = c + 2c + 3c + 2c + 4c + 6c + 3c + 6c + 9c = 1

36c = 1

c = 1/36

Thus, for c = 1/36 the given function is a valid probability mass function.
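The normalisation can also be confirmed numerically; a minimal Python sketch:

from fractions import Fraction as F

# f(x, y) = c*x*y must sum to 1 over x, y in {1, 2, 3}
total = sum(x * y for x in (1, 2, 3) for y in (1, 2, 3))
c = F(1, total)
print(total, c)   # 36 1/36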

10.3.1 Conditional Probability


Conditional probability has already been discussed earlier. It is once again reiterated that conditional probability is a measure of the probability of an event occurring given that another event has already occurred.
Conditional probability is denoted by P(A | B), where

P(A | B) = P(A ∩ B) / P(B),  P(B) ≠ 0

In this chapter we deal with the joint probability of two random variables X and Y. The corresponding conditional probability is given by

P[X ∈ C | Y ∈ D] = P(X ∈ C, Y ∈ D) / P(Y ∈ D)

where C, D ⊂ R.
For discrete random variables X and Y, the conditional PMFs of X given Y and of Y given X are, respectively,

P_{X|Y}(x_i | y_j) = P_XY(x_i, y_j) / P_Y(y_j)

P_{Y|X}(y_j | x_i) = P_XY(x_i, y_j) / P_X(x_i)

for any x_i ∈ R_X and y_j ∈ R_Y.

Example-3
Consider two random variables X and Y with joint PMF as shown in the table below:

         Y=2     Y=4     Y=5
X=1      1/12    1/24    1/24
X=2      1/6     1/12    1/8
X=3      1/4     1/8     1/12

Find the following:

(a) P(X ≤ 2, Y ≤ 4)
(b) P(Y = 2 | X = 1)
Solution:
(a) P(X ≤ 2, Y ≤ 4) = P_XY(1, 2) + P_XY(1, 4) + P_XY(2, 2) + P_XY(2, 4)
    = 1/12 + 1/24 + 1/6 + 1/12 = 3/8
(b) P(Y = 2 | X = 1) = P(X = 1, Y = 2) / P(X = 1) = P_XY(1, 2) / P_X(1) = (1/12) / (1/6) = 1/2
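A minimal Python sketch that checks both parts against the Example-3 table:

from fractions import Fraction as F

# Joint PMF of Example-3, keyed by (x, y)
p = {(1, 2): F(1, 12), (1, 4): F(1, 24), (1, 5): F(1, 24),
     (2, 2): F(1, 6),  (2, 4): F(1, 12), (2, 5): F(1, 8),
     (3, 2): F(1, 4),  (3, 4): F(1, 8),  (3, 5): F(1, 12)}
assert sum(p.values()) == 1            # valid joint PMF

# (a) P(X <= 2, Y <= 4)
p_a = sum(v for (x, y), v in p.items() if x <= 2 and y <= 4)

# (b) P(Y = 2 | X = 1) = P(X = 1, Y = 2) / P(X = 1)
p_x1 = sum(v for (x, y), v in p.items() if x == 1)
p_b = p[(1, 2)] / p_x1

print(p_a, p_b)   # 3/8 1/2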

10.3.2 Independence of Random Variables


In the case of a joint PMF, the criterion for the independence of two discrete random variables X and Y is

P_XY(x, y) = P_X(x) × P_Y(y)  for all x, y.

This condition must hold for every pair (x, y) for X and Y to be independent.
Example-4
From the question in example 3, check if X and Y are independent.
Solution:
For X and Y to be independent,

P(X = x_i, Y = y_j) = P(X = x_i) × P(Y = y_j) for all x_i ∈ R_X and all y_j ∈ R_Y.

Consider the pair (X = 2, Y = 2):
P(X = 2, Y = 2) = 1/6
P(X = 2) × P(Y = 2) = (3/8) × (1/2) = 3/16
Since 1/6 ≠ 3/16, X and Y are not independent.
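The same conclusion follows from a direct check; a minimal Python sketch reusing the Example-3 table:

from fractions import Fraction as F

p = {(1, 2): F(1, 12), (1, 4): F(1, 24), (1, 5): F(1, 24),
     (2, 2): F(1, 6),  (2, 4): F(1, 12), (2, 5): F(1, 8),
     (3, 2): F(1, 4),  (3, 4): F(1, 8),  (3, 5): F(1, 12)}

p_x2 = sum(v for (x, y), v in p.items() if x == 2)   # P(X = 2) = 3/8
p_y2 = sum(v for (x, y), v in p.items() if y == 2)   # P(Y = 2) = 1/2

# Independence would require P(X = 2, Y = 2) = P(X = 2) * P(Y = 2)
print(p[(2, 2)], p_x2 * p_y2, p[(2, 2)] == p_x2 * p_y2)   # 1/6 3/16 False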

10.3.3 Marginal Probability Mass Functions

If (X, Y) are discrete variables, then a marginal probability is the probability of a single variable taking a given value irrespective of the value of the other variable.

The marginal probability mass function of X_i is obtained from the joint PMF as shown below:

P_{X_i}(x) = ∑ P_X(x_1, x_2, ..., x_k)

where the sum runs over all vectors (x_1, ..., x_k) in R_X whose i-th component equals x. In words, the marginal PMF of X_i at the point x is obtained by summing the joint PMF P_X over all the vectors in R_X whose i-th component is equal to x. For two variables this reduces to

P_X(x) = ∑_y P_XY(x, y)  and  P_Y(y) = ∑_x P_XY(x, y).

Example-5
Carrying forward from example 3, find the marginal PMFs of X and Y.
Solution
RX = {1,2,3}, RY = {2,4,5}
Marginal PMFs are given by
1
6 , for X = 1

 3 for X = 2
PX ( x) =  8
 11
 for X = 3
 24
0 Otherwise

172 | P a g e
© Department of Distance & Continuing Education, Campus of Open Learning,
School of Open Learning, University of Delhi

Page 10 of 22 - Integrity Submission Submission ID trn:oid:::3618:89623614


Page 11 of 22 - Integrity Submission Submission ID trn:oid:::3618:89623614

Introductory Statistics for Economics

1
2 , for Y = 2

 1 for Y = 4
PY ( y ) =  4
1
 for Y = 5
4
0 Otherwise
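Both marginal PMFs can be generated in one pass over the table; a minimal Python sketch:

from fractions import Fraction as F

p = {(1, 2): F(1, 12), (1, 4): F(1, 24), (1, 5): F(1, 24),
     (2, 2): F(1, 6),  (2, 4): F(1, 12), (2, 5): F(1, 8),
     (3, 2): F(1, 4),  (3, 4): F(1, 8),  (3, 5): F(1, 12)}

# Marginal of X: sum the joint PMF over y; marginal of Y: sum over x
px = {x: sum(v for (a, _), v in p.items() if a == x) for x in (1, 2, 3)}
py = {y: sum(v for (_, b), v in p.items() if b == y) for y in (2, 4, 5)}

print(px)   # marginal of X: 1/6, 3/8, 11/24
print(py)   # marginal of Y: 1/2, 1/4, 1/4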

10.3.4 Expectation of a PMF

Let X and Y be jointly distributed discrete random variables with probability mass function P(x, y). Then the expected value of a function g(X, Y) is given by

E[g(X, Y)] = ∑_x ∑_y g(x, y) · P(x, y)

Example-6
Find E(XY) for the data given in Example 3.
Solution:

E(XY) = (1 × 2 × 1/12) + (1 × 4 × 1/24) + (1 × 5 × 1/24) + (2 × 2 × 1/6) + (2 × 4 × 1/12)
      + (2 × 5 × 1/8) + (3 × 2 × 1/4) + (3 × 4 × 1/8) + (3 × 5 × 1/12)
      = 177/24 ≈ 7.38

IN–TEXT QUESTIONS
Answer the following MCQs
1. Let U ∈ {0, 1} and V ∈ {0, 1} be two independent binary variables. If P(U = 0) = p and P(V = 0) = q, then P(U + V ≥ 1) is

(a) pq + (1 − p)(1 − q)
(b) pq
(c) p(1 − q)


(d) 1 − pq

2. If a variable can take certain integer values between two given points, then it is called–
(a) Continuous random variable
(b) Discrete random variable
(c) Irregular random variable
(d) Uncertain random variable
3. If E(U) = 2 and E(V) = 4, then E(V − U) = ?
(a) 2
(b) 6
(c) 0
(d) Insufficient data
4. Height is a discrete variable (T / F)
5. If X and Y are two events associated with the same sample space of a random experiment, then P(X | Y) is given by

(a) P(X ∩ Y) / P(Y), provided P(Y) ≠ 0
(b) P(X ∩ Y) / P(Y), provided P(Y) = 0
(c) P(X ∩ Y) / P(Y)
(d) P(X ∩ Y) / P(X)

6. Let X and Y be events of a sample space S of an experiment. If P(S | Y) = P(Y | Y), then the value of P(Y | Y) is
(a) 0
(b) −1
(c) 1
(d) 2
7. What are independent events? Choose the correct option:
(a) If the outcome of one event does not affect the outcome of another.
(b) If the outcome of one event affects the outcome of another.
(c) Any one of the outcomes of one event does not affect the outcome of another.
(d) Any one of the outcomes of one event does affect the outcome of the other.

10.4 CONTINUOUS RANDOM VARIABLES


The probability that the observed value of a continuous random variable X lies in a one-dimensional set A is obtained by integrating the probability density function (PDF) f(x) over the set A.
Similarly, the probability that the pair (X, Y) of continuous random variables falls in a two-dimensional set B is obtained by integrating the joint PDF. The joint density function is a piecewise continuous function of two variables f(x, y) such that, for any "reasonable" two-dimensional set B,

P[(X, Y) ∈ B] = ∬_B f(x, y) dy dx.

Definition: Let X and Y be continuous random variables. A joint density function f(x, y) for these two variables is a function satisfying

(a) f(x, y) ≥ 0, and
(b) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1;

then for any two-dimensional set A,

P[(X, Y) ∈ A] = ∬_A f(x, y) dy dx.

Example-7
The joint PDF of (X, Y) is given by

f(x, y) = (6/5)(x + y²) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.

Answer the following:

(a) Verify that f(x, y) is a legitimate PDF.
(b) Find P(0 ≤ X ≤ 1/4, 0 ≤ Y ≤ 1/4).
Solution: (a) Two conditions must be satisfied for f ( x, y ) to be a legitimate PDF


(i) f(x, y) ≥ 0, and
(ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.
The first condition is fulfilled since f(x, y) ≥ 0. For the verification of the second condition:

∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = ∫₀¹ ∫₀¹ (6/5)(x + y²) dx dy
= (6/5) ∫₀¹ ∫₀¹ x dx dy + (6/5) ∫₀¹ ∫₀¹ y² dx dy
= (6/5)(1/2) + (6/5)(1/3)
= 3/5 + 2/5 = 1.

Thus the second condition is also verified.

(b) P(0 ≤ X ≤ 1/4, 0 ≤ Y ≤ 1/4)
= ∫₀^{1/4} ∫₀^{1/4} (6/5)(x + y²) dx dy
= (6/5) ∫₀^{1/4} ∫₀^{1/4} x dx dy + (6/5) ∫₀^{1/4} ∫₀^{1/4} y² dx dy
= (6/20)[x²/2]₀^{1/4} + (6/20)[y³/3]₀^{1/4}
= 6/640 + 6/3840
= 7/640
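Both parts can be reproduced by symbolic integration; a minimal sketch, assuming the SymPy library is available:

import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(6, 5) * (x + y**2)          # joint PDF on the unit square

# (a) the PDF must integrate to 1 over 0 <= x <= 1, 0 <= y <= 1
total = sp.integrate(f, (x, 0, 1), (y, 0, 1))

# (b) P(0 <= X <= 1/4, 0 <= Y <= 1/4)
prob = sp.integrate(f, (x, 0, sp.Rational(1, 4)), (y, 0, sp.Rational(1, 4)))

print(total, prob)   # 1 7/640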

Example-8
Consider two continuous random variables X and Y with joint p.d.f.


f(x, y) = (2/81) x² y for 0 < x < K, 0 < y < K, and 0 otherwise.

a) Find the value of K so that f(x, y) is a valid p.d.f.
b) Find P(X > 3Y).
Solution:
a) For f(x, y) to be a valid p.d.f., the conditions of a continuous p.d.f. must be satisfied. Therefore

∫₀^K ∫₀^K (2/81) x² y dx dy = K⁵/243 = 1  ⇒  K = 3.

b) P(X > 3Y) = ∫₀³ ( ∫₀^{x/3} (2/81) x² y dy ) dx
= ∫₀³ (x⁴/729) dx
= 1/15.
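The value of K and the probability P(X > 3Y) can be checked in the same way; a minimal sketch, assuming SymPy:

import sympy as sp

x, y, K = sp.symbols('x y K', positive=True)
f = sp.Rational(2, 81) * x**2 * y

# (a) choose K so that the PDF integrates to 1 over (0, K) x (0, K)
total = sp.integrate(f, (x, 0, K), (y, 0, K))      # K**5/243
K_val = sp.solve(sp.Eq(total, 1), K)               # [3]

# (b) P(X > 3Y): integrate y from 0 to x/3, then x from 0 to 3
prob = sp.integrate(f, (y, 0, x / 3), (x, 0, 3))

print(total, K_val, prob)   # K**5/243 [3] 1/15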

10.4.1 Marginal Probability Density Function


The marginal PDFs of continuous random variables are obtained in a manner similar to the discrete case: a marginal PDF is obtained by integrating the joint PDF over the other variable.
The marginal PDFs of X and Y, denoted by f_X(x) and f_Y(y) respectively, are given by

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy for −∞ < x < ∞

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx for −∞ < y < ∞
Example 9

Find the marginal PDFs f_X(x) and f_Y(y) for the joint PDF of Example 7.


Solution



f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫₀¹ (6/5)(x + y²) dy = (6/5)x + 2/5

f_X(x) = (6/5)x + 2/5 for 0 ≤ x ≤ 1, and 0 otherwise.

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀¹ (6/5)(x + y²) dx = (6/5)y² + 3/5

f_Y(y) = (6/5)y² + 3/5 for 0 ≤ y ≤ 1, and 0 otherwise.
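The marginals follow by integrating the joint PDF symbolically; a minimal sketch, assuming SymPy:

import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(6, 5) * (x + y**2)

# Marginal PDFs on [0, 1]: integrate the joint PDF over the other variable
f_x = sp.integrate(f, (y, 0, 1))    # 6*x/5 + 2/5
f_y = sp.integrate(f, (x, 0, 1))    # 6*y**2/5 + 3/5

print(f_x, f_y)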

10.4.2 Expected value of a PDF
Let X and Y be continuous random variables with joint PDF f(x, y), and let g be some function of X and Y. Then

E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy

Example-10
The length of a thread is 1 mm, and two points are chosen uniformly and independently along the thread. Find the expected distance between these two points.
Solution
Let U and V be the two points that are chosen. The joint PDF of U and V is


1 0  U ,V  1
f (U ,V ) = 
0 otherwise
1 1
E[U − V ] =   | U − V | dUdV
01 0

E[U − V ] =  (U − V ) dUdV +  (U − V )dUDV


U V V U

1 1 1 1
=  (U − V )dUdV +   (V − U )dUdV
0 0 0 0

1
E[U − V ] =
3
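The expected distance can be verified by splitting the unit square along the line u = v; a minimal sketch, assuming SymPy:

import sympy as sp

u, v = sp.symbols('u v')

# By symmetry the regions u > v and v > u contribute equally,
# so integrate (u - v) over the triangle v < u and double it.
half = sp.integrate(u - v, (v, 0, u), (u, 0, 1))   # 1/6
print(2 * half)                                    # 1/3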
Example 11
The joint PDF of X and Y is given by

f(x, y) = (3/7)(x + y) for 0 ≤ x ≤ 1, and 0 otherwise.

Find the expected value of X / Y².
Solution
E[X / Y²] = ∫₁² ∫₀¹ [3x(x + y) / (7y²)] dx dy
= (3/7) ∫₁² [1/(3y²) + 1/(2y)] dy

E[X / Y²] = 3/28

10.4.3 Conditional Distributions

The conditional PDF of X, given that Y = y, is denoted by

f_{X|Y}(x | y) = f(x, y) / f_Y(y)

and the conditional expected value of X given Y = y is given by

E[X | Y = y] = ∫ x f_{X|Y}(x | y) dx

Similarly, one can define the conditional PDF and the expected value of Y given X = x by interchanging the roles of X and Y.
Properties of Conditional PDFs
The conditional PDF of X given Y = y is a valid PDF if two conditions are satisfied:
(1) (a) 0 ≤ f_{X|Y}(x | y)
    (b) ∫ f_{X|Y}(x | y) dx = 1
(2) The conditional distribution of X given Y does not, in general, equal the conditional distribution of Y given X, i.e.,
f_{X|Y}(x | y) ≠ f_{Y|X}(y | x)
Example 12
If the joint PDF of U and V is given by

f(U, V) = (2/3)(U + 2V) for 0 ≤ U ≤ 1, 0 ≤ V ≤ 1, and 0 otherwise,

find the conditional mean of U given V = 1/2.

Solution:
The conditional PDF of U given V is

f(U | V) = (2U + 4V) / (1 + 4V) for 0 ≤ U ≤ 1, and 0 elsewhere,

so that

f(U | V = 1/2) = (2/3)(U + 1) for 0 ≤ U ≤ 1, and 0 otherwise.


Then

E[U | V = 1/2] = ∫₀¹ U · (2/3)(U + 1) dU = 5/9

IN TEXT QUESTIONS

8. A random variable that can assume an infinite number of values is called


(a) Continuous random variable
(b) Discrete random variable
(c) Irregular random variable
(d) Uncertain random variable
9. For the function f(x) = a + bx, 0 ≤ x ≤ 1 to be a valid PDF, which of the following statements is correct?
(a) a = 0.5, b = 1

(b) a = 1, b = 4

(c) a = 1, b = −1

(d) a = 0, b = 0
10. Two random variables X and Y are distributed according to

f_{X,Y}(x, y) = (x + y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.

The probability P(X + Y ≤ 1) is
(a) 0.66
(b) 0.33
(c) 0.5
(d) 0.1
11. What are the two important conditions that must be satisfied for f(x, y) to be a legitimate PDF?
12. When does the conditional density function reduce to the marginal density function?
(a) Only if the random variables exhibit statistical dependence.
(b) Only if the random variables exhibit statistical independence.

(c) Only if the random variables exhibit deviation from their mean values.
(d) None of the above.
13. Let U and V be jointly distributed continuous random variables with joint PDF

f_{U,V}(U, V) = 6e^{−(2U + 3V)} for U > 0, V > 0, and 0 otherwise.
Answer the following
(a) Are U and V independent?
(b) Verify if E[V|U > 2] = 1/3
(c) Verify if P(U > V) = 3/5
10.5 SUMMARY
The joint probability distribution function refers to the combined probability distribution of more than one random variable. These variables may be discrete or continuous. A marginal probability distribution is obtained by summing (or integrating) the joint distribution over the other variable. In the case of discrete variables, P(x, y) must satisfy the following conditions to be a valid joint probability mass function:

(a) 0 ≤ P(x, y) ≤ 1
(b) ∑_x ∑_y P(x, y) = 1

The respective counterparts, with integrals in place of sums, apply in the case of continuous random variables. Conditional probability is the probability of one event occurring given that the other event has already occurred. X and Y are called independent if the joint p.d.f. is the product of the individual p.d.f.'s, i.e., if f(x, y) = f_X(x) · f_Y(y) for all x, y.

10.6 GLOSSARY
Conditional Probability: a measure of the probability of an event occurring given that another event has already occurred.

Independence of Random Variables: X and Y are independent if P_XY(x, y) = P_X(x) × P_Y(y) for all x, y.

Marginal Probability Density Function: obtained by integrating the joint PDF over the other variable.

Expected Value of a PDF: E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy

10.7 ANSWERS TO IN – TEXT QUESTIONS

1. (d)      8. (a)
2. (b)      9. (a)
3. (a)      10. (b)
4. False    11. Refer to Example 7
5. (a)      12. (b)
6. (c)      13. (a) Yes  (b) Yes  (c) Yes

10.8 SELF – ASSESSMENT QUESTIONS


1. A fair coin is tossed 4 times. Let the random variable X denote the number of heads in the first 3 tosses and let the random variable Y denote the number of heads in the last 3 tosses. Answer the following:
(a) What is the joint PMF of X and Y?
(b) What is the probability that 2 or 3 heads appear in the first three tosses and 1 or 2 heads appear in the last three tosses?
2. Let X and Y be random variables with joint PDF

f_XY(x, y) = 1/4 for −1 ≤ x, y ≤ 1, and 0 otherwise.

Find
(a) P(X² + Y² ≤ 1)
(b) P(2X − Y ≥ 0)
3. Let X and Y be two jointly distributed continuous random variables with joint PDF


f_{X,Y}(x, y) = 6xy for 0 ≤ x ≤ 1, 0 ≤ y ≤ x, and 0 otherwise.

(a) Find f_X(x) and f_Y(y)
(b) Are X and Y independent?
(c) Find the conditional PDF of X given Y
(d) Find E[X | Y = y] for 0 ≤ y ≤ 1
4. The joint pdf of two random variables U and V is given by

f(u, v) = 24uv for 0 < u < 1, 0 < v < 1, u + v < 1, and 0 otherwise.

Find P(U + V < 1/2).

10.9 REFERENCES

• Devore, J. L. (2012). Probability and Statistics for Engineering and the Sciences (8th ed.; first Indian reprint 2012). Brooks/Cole Cengage Learning.
• Rice, J. A. (2007). Mathematical Statistics and Data Analysis (3rd ed.). Thomson/Brooks/Cole.
• Johnson, R. A. (2017). Miller & Freund's Probability and Statistics for Engineers (9th ed., Global Edition). Pearson Education.
• Miller, I., & Miller, M. (2017). John E. Freund's Mathematical Statistics with Applications (8th ed.). Pearson.
• Hogg, R. V., Tanis, E. A., & Zimmerman, D. L. (2021). Probability and Statistical Inference (10th ed.). Pearson.
• McClave, J., Benson, P. G., & Sincich, T. (2017). Statistics for Business and Economics. Pearson.
10.10 SUGGESTED READING

• Webster, A. L. (1998). Applied Statistics for Business and Economics: An Essentials Version (3rd ed.). Irwin/McGraw-Hill.
