BA (H) Eco - Sem-I - DSC-3 - Introductory Statistics For Eco - L-10 - Joint Probability Distribution - Sugandh KR Choudhary
LESSON 10
JOINT PROBABILITY DISTRIBUTION AND MATHEMATICAL EXPECTATIONS
STRUCTURE
10.1 Learning Objectives
10.2 Introduction
10.3 Joint Probability Mass Function
10.3.1 Conditional Probability Distributions
10.3.2 Independence of Random Variables
10.3.3 Marginal Probability Mass Functions
10.3.4 Expectations of Probability Mass Functions
10.4 Continuous Random Variables
10.4.1 Marginal Probability Density Functions
10.4.2 Expected Value of a Probability Density Function
10.4.3 Conditional Probability Distributions
10.5 Summary
10.6 Glossary
10.7 Answers to In-Text Questions
10.8 Self-Assessment Questions
10.9 References
10.10 Suggested Reading
B.A.(Hons.) Economics
∑_x ∑_y P(x, y) = 1
(b) The probability P[(X, Y) ∈ A] is obtained by summing the joint PMF over A:
P[(X, Y) ∈ A] = ∑_{(x, y) ∈ A} P(x, y)
(c) It must be noted that conditions (a) and (b) are required for P(x, y) to be a valid joint PMF.
Example-1
Consider two random variables X and Y with joint PMF as shown in the table below:
        Y = 0    Y = 1    Y = 2
X = 0    1/6      1/4      1/8
© Department of Distance & Continuing Education, Campus of Open Learning,
School of Open Learning, University of Delhi
Find:
(i) P(X = 0, Y ≤ 1)
(ii) P(Y = 2, X ≤ 1)
Solution:
(i) P(X = 0, Y ≤ 1) = P_XY(0, 0) + P_XY(0, 1) = 1/6 + 1/4 = 5/12. Answer.
(ii) P(Y = 2, X ≤ 1) = P_XY(0, 2) + P_XY(1, 2) = 1/8 + 1/6 = 7/24. Answer.
Example-2
A function f is given by f(x, y) = cxy for x = 1, 2, 3; y = 1, 2, 3.
Determine the value of c for which the above function f(x, y) is a valid p.m.f.
Solution:
From the question given,
f(x, y) = cxy, where x = 1, 2, 3 and y = 1, 2, 3.
There are 9 possible pairs of X and Y, namely (1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2) and (3,3). The probabilities associated with each of the pairs are:
f(1,1) = c(1)(1) = c
f(1,2) = 2c, f(1,3) = 3c, f(2,1) = 2c
f(2,2) = 4c, f(2,3) = 6c, f(3,1) = 3c
f(3,2) = 6c, f(3,3) = 9c
For f(x, y) to be a valid joint p.m.f.,
∑_x ∑_y f(x, y) = 1
Hence,
∑_{x=1}^{3} ∑_{y=1}^{3} f(x, y) = c + 2c + 3c + 2c + 4c + 6c + 3c + 6c + 9c = 1
36c = 1
c = 1/36
Thus, for c = 1/36 the given function is a valid probability mass function.
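The normalisation above can be verified mechanically. The sketch below (Python is our choice of tool here, not part of the lesson itself) uses exact fractions so that no rounding enters:

```python
from fractions import Fraction

# f(x, y) = c*x*y for x, y in {1, 2, 3}; the pmf must sum to 1
total_weight = sum(x * y for x in (1, 2, 3) for y in (1, 2, 3))
assert total_weight == 36

c = Fraction(1, total_weight)          # c = 1/36
# the resulting pmf sums to exactly 1
assert sum(c * x * y for x in (1, 2, 3) for y in (1, 2, 3)) == 1
```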
10.3.1 Conditional Probability Distributions
For discrete random variables X and Y, the conditional PMFs of X given Y and of Y given X, respectively, are given by
P_{X|Y}(x_i | y_j) = P_XY(x_i, y_j) / P_Y(y_j)
P_{Y|X}(y_j | x_i) = P_XY(x_i, y_j) / P_X(x_i)
for any x_i ∈ R_X and y_j ∈ R_Y.
Example-3
Consider two random variables X and Y with joint PMF as shown in the table below:
         Y = 2    Y = 4    Y = 5
X = 1    1/12     1/24     1/24
X = 2    1/6      1/12     1/8
X = 3    1/4      1/8      1/12
Find:
(a) P(X ≤ 2, Y ≤ 4)
(b) P(Y = 2 | X = 1)
Solution:
(a) P(X ≤ 2, Y ≤ 4) = P_XY(1, 2) + P_XY(1, 4) + P_XY(2, 2) + P_XY(2, 4) = 1/12 + 1/24 + 1/6 + 1/12 = 3/8
(b) P(Y = 2 | X = 1) = P_XY(1, 2) / P_X(1) = (1/12) / (1/12 + 1/24 + 1/24) = (1/12) / (1/6) = 1/2
10.3.2 Independence of Random Variables
Two discrete random variables X and Y are independent if
P_XY(x_i, y_j) = P_X(x_i) P_Y(y_j) for all x_i ∈ R_X and for all y_j ∈ R_Y
Example-4
For the variables of Example-3, check whether X and Y are independent.
Solution:
P(X = 2, Y = 2) = 1/6
whereas the product of the marginal probabilities is
P(X = 2) P(Y = 2) = 3/8 × 1/2 = 3/16
Since 1/6 ≠ 3/16, X and Y are not independent.
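The failure of independence can be reproduced with exact arithmetic; a minimal Python sketch using the values above:

```python
from fractions import Fraction as F

# values taken from the example above
p_joint = F(1, 6)                  # P(X = 2, Y = 2)
p_x2, p_y2 = F(3, 8), F(1, 2)      # marginals P(X = 2), P(Y = 2)

# independence would require the joint probability to factorise
assert p_x2 * p_y2 == F(3, 16)
assert p_joint != p_x2 * p_y2      # 1/6 != 3/16: X and Y are dependent
```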
10.3.3 Marginal Probability Mass Functions
If (X, Y) are discrete random variables, the marginal probability mass function gives the probability of one variable taking a value irrespective of the value taken by the other.
The marginal PMF of X_i is obtained from the joint PMF as shown below:
P_{X_i}(x) = ∑ P_X(x_1, x_2, ..., x_k), the sum running over all vectors in R_X whose i-th component equals x.
In words, the marginal PMF of X_i at the point x is obtained by summing the joint PMF P_X over all the vectors that belong to R_X whose i-th component is equal to x.
Example-5
Carrying forward from Example-3, find the marginal PMFs of X and Y.
Solution:
R_X = {1, 2, 3}, R_Y = {2, 4, 5}
The marginal PMFs are given by
P_X(x) = 1/6 for x = 1; 3/8 for x = 2; 11/24 for x = 3; and 0 otherwise.
P_Y(y) = 1/2 for y = 2; 1/4 for y = 4; 1/4 for y = 5; and 0 otherwise.
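These marginals follow by summing the joint table row-wise and column-wise. A short Python sketch (the joint values are the nine probabilities that also appear in Example-6's computation):

```python
from fractions import Fraction as F

# joint PMF of Example-3 (the nine entries used in Example-6)
joint = {(1, 2): F(1, 12), (1, 4): F(1, 24), (1, 5): F(1, 24),
         (2, 2): F(1, 6),  (2, 4): F(1, 12), (2, 5): F(1, 8),
         (3, 2): F(1, 4),  (3, 4): F(1, 8),  (3, 5): F(1, 12)}

# marginals: sum the joint PMF over the other variable
p_x = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (1, 2, 3)}
p_y = {y: sum(p for (_, yj), p in joint.items() if yj == y) for y in (2, 4, 5)}

assert p_x == {1: F(1, 6), 2: F(3, 8), 3: F(11, 24)}
assert p_y == {2: F(1, 2), 4: F(1, 4), 5: F(1, 4)}
```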
10.3.4 Expectations of Probability Mass Functions
Let X and Y be jointly distributed discrete random variables with probability mass function P(x, y). Then the expected value of a function g(X, Y) is given by
E[g(X, Y)] = ∑_x ∑_y g(x, y) P(x, y)
Example-6
Find E(XY) for the data given in Example-3.
Solution:
E(XY) = (1 × 2 × 1/12) + (1 × 4 × 1/24) + (1 × 5 × 1/24) + (2 × 2 × 1/6) + (2 × 4 × 1/12) + (2 × 5 × 1/8) + (3 × 2 × 1/4) + (3 × 4 × 1/8) + (3 × 5 × 1/12)
= 177/24 ≈ 7.38
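The sum can be checked mechanically with exact fractions; a short Python sketch:

```python
from fractions import Fraction as F

# joint PMF of Example-3
joint = {(1, 2): F(1, 12), (1, 4): F(1, 24), (1, 5): F(1, 24),
         (2, 2): F(1, 6),  (2, 4): F(1, 12), (2, 5): F(1, 8),
         (3, 2): F(1, 4),  (3, 4): F(1, 8),  (3, 5): F(1, 12)}

# E[XY] = sum of x*y*P(x, y) over all nine cells
e_xy = sum(x * y * p for (x, y), p in joint.items())
assert e_xy == F(177, 24)          # = 59/8 = 7.375
```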
IN-TEXT QUESTIONS
Answer the following MCQs:
1. Let U ∈ {0, 1} and V ∈ {0, 1} be two independent binary random variables. If P(U = 0) = p and P(V = 0) = q, then P(U + V ≥ 1) is
(a) pq + (1 − p)(1 − q)
(b) pq
(c) p(1 − q)
(d) 1 − pq
2. If a variable can take certain integer values between two given points, then it is called–
(a) Continuous random variable
(b) Discrete random variable
(c) Irregular random variable
(d) Uncertain random variable
3. If E(U) = 2 and E(V) = 4, then E(V − U) = ?
(a) 2
(b) 6
(c) 0
(d) Insufficient data
4. Height is a discrete variable (T / F)
5. If X and Y are two events associated with the same sample space of a random experiment, then P(X | Y) is given by
(a) P(X ∩ Y) / P(Y), provided P(Y) ≠ 0
(c) P(X ∩ Y) / P(Y)
(d) P(X ∩ Y) / P(X)
10.4 CONTINUOUS RANDOM VARIABLES
A function f(x, y) is a valid joint PDF of two continuous random variables if
(a) f(x, y) ≥ 0 for all (x, y), and
(b) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
Example-7
The joint PDF of (X, Y) is given by
f(x, y) = (6/5)(x + y²) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 otherwise.
(a) Verify that f(x, y) is a valid joint PDF.
(b) Find P(0 ≤ X ≤ 1/4, 0 ≤ Y ≤ 1/4).
Solution:
(a) f(x, y) must satisfy
(i) f(x, y) ≥ 0, and
(ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.
The first condition is fulfilled, as f(x, y) ≥ 0. For the verification of the second condition:
∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = (6/5) ∫₀¹ ∫₀¹ (x + y²) dx dy
= (6/5) ∫₀¹ ∫₀¹ x dx dy + (6/5) ∫₀¹ ∫₀¹ y² dx dy
= (6/5)(1/2) + (6/5)(1/3) = 3/5 + 2/5 = 1
(b) P(0 ≤ X ≤ 1/4, 0 ≤ Y ≤ 1/4)
= (6/5) ∫₀^{1/4} ∫₀^{1/4} (x + y²) dx dy
= (6/5) ∫₀^{1/4} ∫₀^{1/4} x dx dy + (6/5) ∫₀^{1/4} ∫₀^{1/4} y² dx dy
= (6/20) [x²/2]₀^{1/4} + (6/20) [y³/3]₀^{1/4}
= (6/40)(1/16) + (6/60)(1/64) = 7/640. Answer.
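Both integrals can be cross-checked numerically. The sketch below uses a plain-Python midpoint Riemann sum (no external libraries assumed):

```python
def riemann2d(f, x_hi, y_hi, n=400):
    """Midpoint Riemann sum of f over [0, x_hi] x [0, y_hi]."""
    hx, hy = x_hi / n, y_hi / n
    return sum(f((i + 0.5) * hx, (j + 0.5) * hy)
               for i in range(n) for j in range(n)) * hx * hy

f = lambda x, y: 1.2 * (x + y * y)        # (6/5)(x + y^2)

assert abs(riemann2d(f, 1.0, 1.0) - 1.0) < 1e-4        # total probability is 1
assert abs(riemann2d(f, 0.25, 0.25) - 7 / 640) < 1e-6  # part (b): 7/640
```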
Example-8
Consider two continuous random variables X and Y with joint p.d.f.
f(x, y) = (2/81) x² y for 0 < x < K, 0 < y < K; 0 otherwise.
a) Find the value of K for which f(x, y) is a valid p.d.f.
b) Find P(X > 3Y).
Solution:
a) ∫₀ᴷ ∫₀ᴷ (2/81) x² y dx dy = (2/81)(K³/3)(K²/2) = K⁵/243 = 1, so K⁵ = 243 and K = 3.
b) P(X > 3Y) = ∫₀³ (∫₀^{x/3} (2/81) x² y dy) dx
= ∫₀³ (1/729) x⁴ dx
= 1/15
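A numerical cross-check of both parts (plain-Python midpoint sums over a grid; the exact answers are K = 3 and 1/15):

```python
def mass(K, n=400):
    """Integral of (2/81) x^2 y over [0, K] x [0, K] (midpoint rule)."""
    h = K / n
    pts = [(i + 0.5) * h for i in range(n)]
    return sum((2 / 81) * x * x * y for x in pts for y in pts) * h * h

assert abs(mass(3.0) - 1.0) < 1e-4            # K = 3 normalises the density

def p_x_gt_3y(n=600):
    """P(X > 3Y): sum the density over grid midpoints with x > 3y."""
    h = 3.0 / n
    pts = [(i + 0.5) * h for i in range(n)]
    return sum((2 / 81) * x * x * y * h * h
               for x in pts for y in pts if x > 3 * y)

assert abs(p_x_gt_3y() - 1 / 15) < 2e-3       # exact answer is 1/15
```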
B.A.(Hons.) Economics
10.4.1 Marginal Probability Density Functions
The marginal PDF of one variable is obtained by integrating the joint PDF over the other variable.
Example-9
For the joint PDF of Example-7, find the marginal PDFs of X and Y.
Solution:
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫₀¹ (6/5)(x + y²) dy = (6/5)x + 2/5
so that
f_X(x) = (6/5)x + 2/5 for 0 ≤ x ≤ 1; 0 otherwise.
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀¹ (6/5)(x + y²) dx = (6/5)y² + 3/5
so that
f_Y(y) = (6/5)y² + 3/5 for 0 ≤ y ≤ 1; 0 otherwise.
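The closed form for f_X can be spot-checked by numerically integrating the joint PDF over y (a small Python sketch; f_Y can be checked the same way):

```python
def f_joint(x, y):
    """Joint density of Example-7: (6/5)(x + y^2) on the unit square."""
    return 1.2 * (x + y * y) if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_x(x, n=2000):
    """f_X(x) = integral of f_joint(x, y) over y in [0, 1] (midpoint rule)."""
    h = 1.0 / n
    return sum(f_joint(x, (j + 0.5) * h) for j in range(n)) * h

# closed form derived above: f_X(x) = (6/5)x + 2/5
for x in (0.1, 0.5, 0.9):
    assert abs(marginal_x(x) - (1.2 * x + 0.4)) < 1e-6
```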
10.4.2 Expected Value of a PDF
Let X and Y be continuous random variables with joint PDF f(x, y), and let g be some function. Then
E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy
Example-10
The length of a thread is 1 mm, and two points are chosen uniformly and independently along the thread. Find the expected distance between these two points.
Solution:
Let U and V be the positions of the two chosen points. The joint PDF of U and V is
f(u, v) = 1 for 0 ≤ u, v ≤ 1; 0 otherwise.
E[|U − V|] = ∫₀¹ ∫₀¹ |u − v| du dv
= ∫₀¹ (∫_v^1 (u − v) du) dv + ∫₀¹ (∫_0^v (v − u) du) dv = 1/6 + 1/6
E[|U − V|] = 1/3
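A seeded Monte Carlo draw gives a quick sanity check of the 1/3 answer (Python sketch, illustrative only):

```python
import random

random.seed(0)                      # seeded for reproducibility
n = 200_000
# simulate two independent Uniform(0, 1) points and average their distance
est = sum(abs(random.random() - random.random()) for _ in range(n)) / n
assert abs(est - 1 / 3) < 0.01      # exact answer is 1/3
```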
Example 11
The joint PDF of X and Y is given by
f(x, y) = (3/7)(x + y) for 0 ≤ x ≤ 1, 1 ≤ y ≤ 2; 0 otherwise.
Find the expected value of X / Y².
Solution:
E[X / Y²] = ∫₁² ∫₀¹ (3x(x + y))/(7y²) dx dy
= (3/7) ∫₁² (1/(3y²) + 1/(2y)) dy
= (3/7)(1/6 + (ln 2)/2) = 1/14 + (3 ln 2)/14 ≈ 0.22. Answer.
10.4.3 Conditional Probability Distributions
The conditional PDF of X given Y = y is
f_{X|Y}(x | y) = f(x, y) / f_Y(y)
and the conditional expected value of X given Y = y is given by
E[X | Y = y] = ∫ x f_{X|Y}(x | y) dx
Similarly, one can define the conditional PDF and expected value of Y given X = x by interchanging the roles of X and Y.
Properties of Conditional PDFs
The conditional PDF of X given Y = y is a valid PDF if two conditions are satisfied:
(a) f_{X|Y}(x | y) ≥ 0
(b) ∫ f_{X|Y}(x | y) dx = 1
Note that, in general, the conditional distribution of X given Y does not equal the conditional distribution of Y given X, i.e. f_{X|Y}(x | y) ≠ f_{Y|X}(y | x).
Example 12
The joint PDF of U and V is given by
f(u, v) = (2/3)(u + 2v) for 0 ≤ u ≤ 1, 0 ≤ v ≤ 1; 0 otherwise.
Find E[U | V = 1/2].
Solution:
f(u, 1/2) = (2/3)(u + 1), and f_V(1/2) = ∫₀¹ (2/3)(u + 1) du = 1, so that
f_{U|V}(u | 1/2) = (2/3)(u + 1) for 0 ≤ u ≤ 1; 0 otherwise.
Hence,
E[U | V = 1/2] = ∫₀¹ u · (2/3)(u + 1) du = (2/3)(1/3 + 1/2) = 5/9
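Assuming the conditional density derived above, f_{U|V}(u | 1/2) = (2/3)(u + 1), a midpoint-rule check of both its normalisation and the conditional mean (Python sketch):

```python
# conditional density derived above: f_{U|V}(u | 1/2) = (2/3)(u + 1) on [0, 1]
n = 100_000
h = 1.0 / n
mids = [(i + 0.5) * h for i in range(n)]     # midpoint grid on [0, 1]

total = sum((2 / 3) * (u + 1) for u in mids) * h    # should integrate to 1
e_u = sum(u * (2 / 3) * (u + 1) for u in mids) * h  # E[U | V = 1/2]

assert abs(total - 1.0) < 1e-9
assert abs(e_u - 5 / 9) < 1e-9
```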
IN-TEXT QUESTIONS
(b) a = 1, b = 4
(c) a = 1, b = −1
(d) a = 0, b = 0
10. Two random variables X and Y are distributed according to
f_{X,Y}(x, y) = x + y for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 otherwise.
The probability P(X + Y ≤ 1) is
(a) 0.66
(b) 0.33
(c) 0.5
(d) 0.1
11. What are the two important conditions that must be satisfied for f(x, y) to be a legitimate PDF?
12. When does the conditional density function reduce to the marginal density function?
(a) Only if the random variables exhibit statistical dependence.
(b) Only if the random variables exhibit statistical independence.
(c) Only if the random variable exhibits deviation from its mean value.
(d) None of the above.
13. Let U and V be jointly distributed continuous random variables whose joint PDF is given as
(d) 0 ≤ P(x, y) ≤ 1
(e) ∑_x ∑_y P(x, y) = 1
10.5 SUMMARY
Each of these concepts has a counterpart for continuous random variables. Conditional probability is the probability of one event occurring given that the other event has already occurred. X and Y are called independent if the joint p.d.f. is the product of the individual p.d.f.'s, i.e., if f(x, y) = f_X(x) · f_Y(y) for all x, y.
10.6 GLOSSARY
Conditional Probability: a measure of the probability of an event occurring given that another event has already occurred.
Independence of Random Variables: X and Y are independent if P_XY(x, y) = P_X(x) P_Y(y) for all x, y.
Marginal Probability Density Function: obtained by integrating the joint PDF over the other variable.
Expected Value of a PDF: E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy
10.7 ANSWERS TO IN-TEXT QUESTIONS
1. d    2. b    3. a    4. False    5. a    6. c    7. a
8. a    9. a    10. b    11. Refer to Example-7    12. b
13. a) Yes, b) Yes, c) Yes
10.8 SELF-ASSESSMENT QUESTIONS
(a) P(X + Y ≤ 1)
(b) P(2X − Y ≥ 0)
3. Let X and Y be two jointly distributed continuous random variables with joint PDF
f_{X,Y}(x, y) = 6xy for 0 ≤ x ≤ 1, 0 ≤ y ≤ x; 0 otherwise.
(a) Find f_X(x) and f_Y(y)
(b) Are X and Y independent?
(c) Find the conditional PDF of X given Y
(d) Find E[X | Y = y] for 0 ≤ y ≤ 1
4. The joint PDF of two random variables U and V is given by
f(u, v) = 24uv for 0 < u < 1, 0 < v < 1, u + v < 1; 0 otherwise.
Find P(U + V < 1/2).
10.9 REFERENCES
• Devore, J. L. (2012). Probability and Statistics for Engineering and the Sciences (8th ed.; first Indian reprint 2012). Brooks/Cole Cengage Learning.
• Rice, J. A. (2007). Mathematical Statistics and Data Analysis (3rd ed.). Thomson/Brooks/Cole.
• Johnson, R. A. (2017). Miller & Freund's Probability and Statistics for Engineers (9th ed., Global Edition). Pearson Education.
• Miller, I., & Miller, M. (2017). John E. Freund's Mathematical Statistics with Applications (8th ed.). Pearson.
• Hogg, R. V., Tanis, E. A., & Zimmerman, D. L. (2021). Probability and Statistical Inference (10th ed.). Pearson.
• McClave, J., Benson, P. G., & Sincich, T. (2017). Statistics for Business and Economics. Pearson.
10.10 SUGGESTED READING