PRP Module 2
(15B11MA301)
Lecture-6
(Course content covered: One dimensional discrete random variable)
Department of Mathematics
Jaypee Institute of Information Technology, Noida
Random Variable
• A random variable X is a function that assigns a real number to each
outcome of the sample space S, i.e., X: S→R is a mapping from the
sample space S to the set of real numbers R.
• The sample space S is the domain of the random variable and the set of
all values taken on by the random variable is the range of the random
variable X.
• A random variable is usually denoted by an uppercase letter such as X
and the corresponding lowercase letter, such as x, denotes a possible
value of the random variable X.
• If only one characteristic is considered corresponding to the sample
space of a random experiment, then the random variable is called a one
dimensional random variable.
Random Variable Contd..
• More than one characteristic may be considered corresponding to the
sample space of a random experiment. For example, suppose that S consists
of a large group of students of a college, and let the experiment consist
of choosing a student at random. Let X denote the weight of the student
and Y denote the student's height. Then (X, Y) is a two dimensional
random variable. The same idea may be extended further.
Example
The “number of heads obtained” in an experiment of tossing three
unbiased coins simultaneously is a random variable X that can take the
values 0, 1, 2 and 3, i.e.,
Domain of X = S = {TTT, TTH, THT, HTT, HHT, HTH, THH, HHH},
and
Range of X = {0, 1, 2, 3}
Discrete Random Variable
• A random variable which can assume only a finite or countably infinite
set of values is called a discrete random variable.
• The main characteristic of a discrete random variable is that the set of
possible values in the range can all be listed and the list may be a finite
list or a countably infinite list.
Examples
• Number of heads obtained when 10 coins are tossed.
• Number of phone calls received per day at a telephone booth.
• Number of sixes obtained when a pair of dice is thrown.
• Number of spade cards when 4 cards are chosen from a well shuffled
pack of 52 playing cards.
• The number of odd numbers selected out of a set of positive
integers.
Continuous Random Variable
• A random variable X which can take any value in an interval of real
numbers is said to be a continuous random variable.
• The range of X can take infinitely many real values within one or
more intervals of real numbers.
• The set of all possible values in the range cannot be listed in case
of a continuous variable as the list is uncountably infinite.
Examples
• The duration of a phone call received.
• The time to failure of a machine.
• The temperature gained by an electric motor after one hour of
operation.
• The amount of rainfall in a day.
Probability Mass Function
Let a discrete random variable X take the values x1, x2, x3, . . ., xn.
Then the probability function or probability mass function (PMF) of
X is given by
f(xi) = P(X = xi) = pi ; for i = 1, 2, 3, . . ., n,
i.e.,
X : x1 x2 x3 x4 . . . xn
P(X = xi) : p1 p2 p3 p4 . . . pn
such that
(i) P(X = xi) ≥ 0 for all i;
(ii) Σi P(X = xi) = p1 + p2 + p3 + p4 + . . . + pn = 1.
Probability Distribution
If pi represents the probability corresponding to X= xi , for i=1,
2, 3,…,n; then the collection of pairs (xi , pi) is called the
probability distribution of the discrete random variable X.
Example. Let X represents the number of heads when three fair
coins are tossed. Find
(a) the probability distribution of the number of heads,
(b) P(0< X <3), (c) P(X >1), (d) P(X ≤ 2).
Sol.: S = {TTT, TTH, THT, HTT, HHT, HTH, THH, HHH}
(a) The probability distribution is as follows:
X (Number of Heads): 0 1 2 3
Probability: 1/8 3/8 3/8 1/8
(b) P(0 < X < 3) = P(X = 1) + P(X = 2) = 3/8 + 3/8 = 3/4,
(c) P(X > 1) = P(X = 2) + P(X = 3) = 3/8 + 1/8 = 1/2,
(d) P(X ≤ 2) = 1 − P(X = 3) = 7/8.
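The distribution and the three probabilities can be cross-checked by brute-force enumeration; a minimal Python sketch (the outcome labels are illustrative):

```python
# Enumerate the 8 equally likely outcomes of tossing three fair coins
# and build the PMF of X = number of heads with exact fractions.
from itertools import product
from fractions import Fraction

pmf = {}
for w in product("HT", repeat=3):          # the 8 equally likely outcomes
    x = w.count("H")                       # X = number of heads
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, 8)

print(sorted(pmf.items()))                 # distribution of X
print(pmf[1] + pmf[2], pmf[2] + pmf[3], 1 - pmf[3])   # 3/4 1/2 7/8
```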
Example. The probability mass function of a random variable X is as
follows:
X : 1 2 3 4 5 6
P(X = x) : k 2k/3 3k 1/3 k/3 1/6
Find (a) k, (b) F(4), (c) F(6), (d) P(X = 3).
Solution:
(a) Since k + (2k/3) + 3k + (1/3) + (k/3) + (1/6) =1, so
k =1/10 =0.1,
(b) F(4) = P(X ≤ 4) = P(1)+ P(2)+ P(3)+P(4)= 0.8,
(c) F(6) = P(X ≤ 6)= P(1)+ P(2)+ P(3)+P(4) + P(5)+P(6)=1,
(d) P(X = 3) = 3k = 0.3, or P(X = 3) = F(3)-F(2)=0.3.
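The value of k and the CDF values can be verified with exact fractions; a small sketch (writing each probability as coeff·k + const is just one way to organize the bookkeeping):

```python
# Solve a*k + b = 1 exactly, where each p(x) = coeff*k + const.
from fractions import Fraction as F

coeffs = {1: (F(1), F(0)), 2: (F(2, 3), F(0)), 3: (F(3), F(0)),
          4: (F(0), F(1, 3)), 5: (F(1, 3), F(0)), 6: (F(0), F(1, 6))}

a = sum(c for c, _ in coeffs.values())     # total coefficient of k
b = sum(d for _, d in coeffs.values())     # constant part
k = (1 - b) / a                            # normalization: a*k + b = 1
pmf = {x: c * k + d for x, (c, d) in coeffs.items()}

F4 = sum(pmf[x] for x in range(1, 5))      # F(4) = P(X <= 4)
print(k, F4, pmf[3])                       # 1/10 4/5 3/10
```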
References/Further Reading
1. Veerarajan, T., Probability, Statistics and Random Processes, 3rd
Ed. Tata McGraw-Hill, 2008.
2. Ghahramani, S., Fundamentals of Probability with Stochastic
Processes, Pearson, 2005.
3. Papoulis, A. and Pillai, S.U., Probability, Random Variables and
Stochastic Processes, Tata McGraw-Hill, 2002.
4. Miller, S., Childers, D., Probability and Random Processes,
Academic Press, 2012.
5. Johnson, R.A., Miller & Freund's Probability and Statistics for
Engineers, Pearson, 2002.
6. Spiegel, M.R., Statistics, Schaum's Outline Series, McGraw-Hill.
7. Walpole, R.E., Myers, R.H., Myers, S.L., Ye, K., Probability and
Statistics for Engineers and Scientists, 7th Ed., Pearson, 2002.
8. https://nptel.ac.in/courses/117/105/117105085/
Probability and Random Processes
(15B11MA301)
Lecture-7
Department of Mathematics
Jaypee Institute of Information Technology
Noida, India
Continuous Random Variable
A function f(x) is the probability density function (PDF) of a
continuous random variable X if
(i) f(x) ≥ 0, −∞ < x < ∞ (or x ∈ RX),
(ii) ∫ f(x) dx = 1, the integral being taken over (−∞, ∞) (or over RX).
Example
• The diameter of an electric cable (X) is
assumed to be a continuous random variable
with probability function
f(x)= 6x(1-x), 0<x<1
(i) Check that the above is a valid PDF.
(ii) Determine a number b such that
P(X < b) = P(X > b).
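A numerical sketch of both parts, using a simple midpoint rule for the CDF and bisection for b (analytically, by the symmetry of f about x = 1/2, the answer is b = 1/2):

```python
# Verify that f(x) = 6x(1-x) integrates to 1 on (0,1), then solve F(b) = 1/2.
def f(x):
    return 6 * x * (1 - x)

def F(b, n=50000):                         # F(b) = P(X < b), midpoint rule on (0, b)
    h = b / n
    return sum(f((i + 0.5) * h) for i in range(n)) * h

assert abs(F(1.0) - 1.0) < 1e-6            # (i) f >= 0 on (0,1) and integrates to 1

lo, hi = 0.0, 1.0                          # (ii) bisection for F(b) = 1/2
for _ in range(50):
    mid = (lo + hi) / 2
    if F(mid) < 0.5:
        lo = mid
    else:
        hi = mid
b = (lo + hi) / 2
print(round(b, 6))                         # 0.5
```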
Exercise
• Experience has shown that while walking in a certain
park, the time X in minutes between seeing two
people smoking has a density function of the form
f(x) = λ x e^(−x), x > 0,
f(x) = 0, otherwise.
• (a) Calculate the value of λ.
• (b) What is the probability that Jeff, who has just seen a
person smoking, will see another person smoking in
the next 2 to 5 minutes?
• (c) In at least 7 minutes?
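Assuming the reconstructed density f(x) = λxe^(−x), x > 0 (normalization then forces λ = 1, and integration by parts gives F(x) = 1 − (1 + x)e^(−x)), parts (b) and (c) can be sketched as:

```python
# f(x) = x e^{-x} (lambda = 1); F(x) = 1 - (1 + x) e^{-x} by parts.
import math

lam = 1.0                                  # (a): ∫_0^∞ x e^{-x} dx = 1 forces λ = 1

def F(x):                                  # CDF of f(x) = x e^{-x}
    return 1 - (1 + x) * math.exp(-x)

p_2_5 = F(5) - F(2)                        # (b) between 2 and 5 minutes
p_7 = 1 - F(7)                             # (c) at least 7 minutes
print(round(p_2_5, 4), round(p_7, 4))      # 0.3656 0.0073
```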
Cumulative Distribution Function (cdf)
If X is discrete, F(x) = P(X ≤ x) = Σ_{xj ≤ x} pj.
If X is continuous, F(x) = P(−∞ < X ≤ x) = ∫_{−∞}^{x} f(t) dt.
Properties of Cumulative Distribution
Function (cdf)
1. 0 ≤ F(x) ≤ 1 for all x.
2. F(x) is a non-decreasing function of x.
3. F(−∞) = 0 and F(∞) = 1.
4. F(x) is right continuous.
5. P(a < X ≤ b) = F(b) − F(a).
Example. Suppose that the error in the reaction temperature, in °C, for a
controlled laboratory experiment is a continuous random variable X
having the probability density function
f(x) = x²/3, if −1 < x < 2,
f(x) = 0, otherwise.
(a) Verify that f(x) is a density function. (b) Find P(0 < X ≤ 1).
Solution: (a) f(x) ≥ 0 and ∫_{−1}^{2} x²/3 dx = [x³/9] from −1 to 2
= 8/9 + 1/9 = 1.
(b) P(0 < X ≤ 1) = ∫_{0}^{1} x²/3 dx = 1/9.
Example. For the continuous random variable X having the probability
density function
f(x) = x²/3, if −1 < x < 2,
f(x) = 0, otherwise,
find the cumulative distribution function F(x) and use it to evaluate
P(0 < X ≤ 1).
Solution (a): For −1 ≤ x < 2, F(x) = ∫_{−1}^{x} t²/3 dt = (x³ + 1)/9.
So F(x) = 0 for x < −1, F(x) = (x³ + 1)/9 for −1 ≤ x < 2, and F(x) = 1
for x ≥ 2.
(b) P(0 < X ≤ 1) = F(1) − F(0) = 2/9 − 1/9 = 1/9.
Thank You
Probability and Random Processes
(15B11MA301)
Lecture-8
Department of Mathematics
Jaypee Institute of Information Technology
Noida, India
Mean and Variance of discrete random variable
Mean: m_X = E[X] = Σ_{xj} xj f(xj) = Σ_{xj} xj P(X = xj).
Variance: Var[X] = E[(X − m_X)²] = Σ_{xj} (xj − m_X)² f(xj)
= Σ_{xj} (xj − m_X)² P(X = xj).
The variance is denoted by σ_X², Var[X] or E[(X − m_X)²]
(read as the expectation of (X − m_X) squared).
Mean and Variance of continuous random variable
Mean: m_X = E[X] = ∫_{−∞}^{∞} x f(x) dx.
Variance: Var[X] = E[(X − m_X)²] = ∫_{−∞}^{∞} (x − m_X)² f(x) dx.
Properties of Variance
If X is a random variable (discrete or continuous), then
1. Var[X] = E[(X − m_X)²] = E[X²] − (E[X])²,
provided E[X²] exists.
2. Var[aX + b] = a² Var[X].
Mean and Variance of discrete random variable
Example: Let X be the total of the two dice in the experiment of
tossing two balanced dice. Find mean and variance of X.
S = {(i, j) : i, j = 1, 2, . . ., 6}, the 36 equally likely outcomes
of the two dice, and
X(S) = {i + j} = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}.
X 2 3 4 5 6 7 8 9 10 11 12
f(x) 1/36 2/36 3/36 4/36 5/36 6/36 5/36 4/36 3/36 2/36 1/36
Mean: E[X] = Σ_{xj} xj f(xj) = 7.
Solution: E[X²] = Σ_{xj} xj² f(xj) = 329/6, so
Var(X) = E[X²] − (E[X])² = 329/6 − 49 = 35/6 ≈ 5.83.
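The mean 7 and variance 35/6 can be confirmed by enumerating all 36 outcomes:

```python
# Build the exact PMF of X = sum of two fair dice, then compute mean/variance.
from fractions import Fraction

pmf = {}
for i in range(1, 7):
    for j in range(1, 7):
        pmf[i + j] = pmf.get(i + j, Fraction(0)) + Fraction(1, 36)

mean = sum(x * p for x, p in pmf.items())
var = sum(x * x * p for x, p in pmf.items()) - mean ** 2
print(mean, var)                           # 7 35/6
```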
Thank You
Probability and Random Processes
(15B11MA301)
Lecture- 9
Department of Mathematics
Jaypee Institute of Information Technology
Noida, India
Contents:
• Two dimensional RVs
• Marginal Distributions
• Examples
Two dimensional random variable
If X and Y are two random variables defined on the same sample space S,
then the pair (X, Y) is called a two dimensional random variable.
JOINT PROBABILITY MASS FUNCTION (PMF)
Mathematically, pmf f(x,y) of bivariate discrete random variables is a real valued
function that satisfies the following properties:
1. f(x, y) ∈ [0, 1] for all x, y ∈ R,
2. Σ_{(x, y)} f(x, y) = 1.
BIVARIATE JOINT PROBABILITY DENSITY FUNCTION (PDF)
1. f(x, y) ≥ 0 for all x, y ∈ R,
2. ∫∫ f(x, y) dx dy = 1, the integral being taken over the whole plane.
Joint Cumulative Distribution Function
1. F(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(u, v) du dv,
2. f(x, y) = ∂²F(x, y)/∂x∂y.
Marginal probability distribution
P(X = x) = P(X = x, Y = y1) + P(X = x, Y = y2) + · · ·
It is also written as fX(x) = Σ_y f(x, y).
Thus the marginal probability function of X, fX(x), is obtained from the joint probability
function of X and Y by summing f(x, y) over the possible values of Y.
Also
P(Y = y) = P(X = x1, Y = y) + P(X = x2, Y = y) + · · ·
It is also written as fY(y) = Σ_x f(x, y).
Thus the marginal probability function of Y, fY(y), is obtained from the joint probability
function of X and Y by summing f(x, y) over the possible values of X.
Marginal probability distribution for Continuous
Random Variable
Definition: Let (X ,Y) denote two dimensional continuous random variables with
joint probability density function f(x,y) then
the marginal density of X is
fX(x) = ∫_{−∞}^{∞} f(x, y) dy,
and the marginal density of Y is
fY(y) = ∫_{−∞}^{∞} f(x, y) dx.
f(y|x) = P[Y = y|X = x] is called the conditional probability
function of Y given X = x
Note:
P(X = x | Y = y) = P(X = x and Y = y) / P(Y = y)
and
P(Y = y | X = x) = P(X = x and Y = y) / P(X = x).
Conditional probability distribution for Continuous
Random Variable
f(x | y) = f(x, y) / fY(y).
Example : Given the following probability distribution.
(i) Find the marginal distributions of X and Y.
(ii) Find the conditional distribution of X given Y= 2.
Solution:
Marginal distribution of X:
Marginal distribution of Y:
Solution (ii): The conditional distribution of X given Y = 2 is
X : −1 0 1
f(x | Y = 2) : 2/5 1/5 2/5
Probability and Random Processes
(15B11MA301)
Lecture- 10
Department of Mathematics
Jaypee Institute of Information Technology
Noida, India
Contents:
• Independent Random Variables
• Conditional Means
• Conditional Variances
• Covariance
• Examples
Independent Random Variables
Two random variables X and Y are independent if and only if
f(x, y) = fX(x) · fY(y) for all x, y,
or equivalently, f(y | x) = fY(y).
Example: If X, Y have the joint PDF
f(x, y) = x + y, 0 < x < 1, 0 < y < 1,
f(x, y) = 0, otherwise,
check whether X and Y are independent or not.
Solution: fX(x) = ∫_{0}^{1} (x + y) dy = x + 1/2, 0 < x < 1, and
similarly fY(y) = y + 1/2, 0 < y < 1.
Now,
fX(x) · fY(y) = (x + 1/2)(y + 1/2) ≠ f(x, y),
so X and Y are not independent.
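A quick numerical sanity check of the same conclusion, computing the marginal by a midpoint rule (the test point (0.3, 0.6) is arbitrary):

```python
# Marginal of f(x,y) = x + y on the unit square, and a pointwise check
# that fX(x)*fY(y) differs from f(x,y), so X and Y are dependent.
def f(x, y):
    return x + y

def fX(x, n=20000):                        # fX(x) = ∫_0^1 (x + y) dy = x + 1/2
    h = 1.0 / n
    return sum(f(x, (j + 0.5) * h) for j in range(n)) * h

x0, y0 = 0.3, 0.6
assert abs(fX(x0) - (x0 + 0.5)) < 1e-6     # matches the analytic marginal
prod = fX(x0) * fX(y0)                     # by symmetry fY(y) = y + 1/2
print(abs(prod - f(x0, y0)) > 1e-3)        # True: product != joint density
```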
Example: For the joint distribution given below, check whether X and Y
are independent.
X=2 X= 4
Y=1 0.10 0.15
Y=3 0.20 0.30
Y=5 0.10 0.15
Here fX(2) = 0.4, fX(4) = 0.6 and fY(1) = 0.25, fY(3) = 0.5,
fY(5) = 0.25. Since fX(x) · fY(y) = f(x, y) for every cell, X and Y
are independent.
Expected values of two-dimensional random variable
If (X,Y) is a two-dimensional random variable, then the mean or expectation of (X,Y)
is defined as follows
• Case 1: when X, Y are discrete random variables, then
E[g(X, Y)] = Σ_x Σ_y g(x, y) f(x, y).
• Case 2: when X, Y are continuous random variables, then
E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy.
Conditional Expected Values
E[Y | X = x] = Σ_y y f(y | x) (discrete) or ∫ y f(y | x) dy (continuous),
and the conditional variance is
Var[Y | X = x] = E[Y² | X = x] − (E[Y | X = x])².
Covariance
Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y].
Example 1: The joint distribution of X and Y is given by
f(x, y) = (x + y)/21, x = 1, 2, 3; y = 1, 2.
Find the marginal distributions of X and Y. Find the mean of X and Y also.
Solution: fX(x) = Σ_{y=1}^{2} (x + y)/21 = (2x + 3)/21, so
fX(1) = 5/21, fX(2) = 7/21, fX(3) = 9/21.
fY(y) = Σ_{x=1}^{3} (x + y)/21 = (6 + 3y)/21, so fY(1) = 9/21,
fY(2) = 12/21.
E[X] = (5 + 14 + 27)/21 = 46/21, E[Y] = (9 + 24)/21 = 11/7.
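The marginals and means can be verified with exact arithmetic:

```python
# Exact marginals and means for f(x,y) = (x+y)/21, x in {1,2,3}, y in {1,2}.
from fractions import Fraction

f = {(x, y): Fraction(x + y, 21) for x in (1, 2, 3) for y in (1, 2)}
fX = {x: sum(f[(x, y)] for y in (1, 2)) for x in (1, 2, 3)}
fY = {y: sum(f[(x, y)] for x in (1, 2, 3)) for y in (1, 2)}

EX = sum(x * p for x, p in fX.items())
EY = sum(y * p for y, p in fY.items())
print(EX, EY)                              # 46/21 11/7
```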
Example 2: The joint PDF of (X,Y) is given by
f(x, y) = 24xy, 0 < x, 0 < y, x + y < 1,
f(x, y) = 0, otherwise.
Find the conditional mean and variance of Y given X.
Solution: fX(x) = ∫_{0}^{1−x} 24xy dy = 12x(1 − x)², 0 < x < 1,
so f(y | x) = f(x, y)/fX(x) = 2y/(1 − x)², 0 < y < 1 − x.
E[Y | X = x] = ∫_{0}^{1−x} y · 2y/(1 − x)² dy = 2(1 − x)/3,
E[Y² | X = x] = ∫_{0}^{1−x} y² · 2y/(1 − x)² dy = (1 − x)²/2,
Var[Y | X = x] = (1 − x)²/2 − 4(1 − x)²/9 = (1 − x)²/18.
Probability and Random Processes
(15B11MA301)
Lecture-11
Department of Mathematics
Jaypee Institute of Information Technology, Noida
Contents of the Lecture:
Moments
Related Results
Examples
Moments
If X is a random variable, discrete or continuous, the rth moment about the origin, denoted by μ′r, is defined
as
μ′r = E[X^r], r = 1, 2, 3, …
The moments about the mean, or central moments, denoted by μr, are defined as μr = E[(X − X̄)^r], r = 1, 2, 3, …
If X is a discrete random variable which can assume the values x1, …, xn with respective
probabilities p(x1), p(x2), …, p(xn), then
μ′r = E[X^r] = Σ_i x_i^r p(x_i)
and μr = E[(X − X̄)^r] = Σ_i (x_i − X̄)^r p(x_i).
If X is a random variable, discrete or continuous, then the first moment about the origin is
μ′1 = E[X] = X̄.
Since the moments about the mean are μr = E[(X − X̄)^r], r = 1, 2, 3, …,
μ1 = E[X − X̄] = μ′1 − μ′1 = 0.
Similarly, μ2 = E[(X − X̄)²] = μ′2 − (μ′1)², i.e.
Var(X) = E[X²] − (E[X])².
μ3 = E[(X − X̄)³] = μ′3 − 3μ′2 μ′1 + 2(μ′1)³,
μ4 = E[(X − X̄)⁴] = μ′4 − 4μ′3 μ′1 + 6μ′2 (μ′1)² − 3(μ′1)⁴, and so on.
Properties of Moments
1. If X is a random variable, then E[aX + b] = a E[X] + b.
Proof: By definition, E[aX + b] = Σ (ax + b) p(x)
= a Σ x p(x) + b Σ p(x)
= a E[X] + b, (since Σ p(x) = 1).
Therefore, E[aX + b] = a E[X] + b.
Remark: E[X ± Y] = E[X] ± E[Y].
2. If Y = aX + b, then Ȳ = aX̄ + b, so
E[(Y − Ȳ)²] = E[a²(X − X̄)²] = a² E[(X − X̄)²], i.e.
Var[Y] = a² Var[X].
Hence, Var(aX + b) = a² Var(X) { as Var(b) = 0 }.
Properties of Moments (Continued)
3. If X and Y are independent random variables, then
Var[aX ± bY] = a² Var[X] + b² Var[Y].
4. If X and Y are independent random variables, then
E[XY] = E[X] E[Y].
5. If X and Y are random variables such that Y ≤ X, then E[Y] ≤ E[X].
Proof: Given Y ≤ X
⇒ X − Y ≥ 0
⇒ E[X − Y] ≥ 0
⇒ E[X] − E[Y] ≥ 0
⇒ E[X] ≥ E[Y].
Example 1: Find the first four moments about the origin for a random variable X having the density
function f(x) = 4x(9 − x²)/81, 0 ≤ x ≤ 3.
Solution: By the definition of moments,
μ′r = ∫_{0}^{3} x^r · 4x(9 − x²)/81 dx
= (4/81) [9 · 3^(r+2)/(r + 2) − 3^(r+4)/(r + 4)],
which gives μ′1 = 8/5, μ′2 = 3, μ′3 = 216/35, μ′4 = 27/2.
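A numerical cross-check of the four moments (midpoint rule; the grid size is arbitrary):

```python
# Midpoint-rule moments of f(x) = 4x(9 - x^2)/81 on [0, 3].
def mu(r, n=200000):
    h = 3.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += x ** r * (4 * x * (9 - x * x) / 81) * h
    return total

ms = [mu(r) for r in range(1, 5)]
print([round(m, 4) for m in ms])           # [1.6, 3.0, 6.1714, 13.5]
```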
Example 2: If a random variable X has the probability density function given by
f(x) = (x + 1)/2, −1 < x < 1,
f(x) = 0, otherwise,
then
E(X²) = ∫_{−∞}^{∞} x² f(x) dx = ∫_{−1}^{1} x² (x + 1)/2 dx = 1/3.
Example 3: The monthly demand for fossil watches is known to have following probability distribution
Demand 1 2 3 4 5 6 7 8
Probability 0.08 0.12 0.19 0.24 0.16 0.10 0.07 0.04
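Assuming the example asks for the mean and variance of the demand (the question itself did not survive on the slide), both follow directly from the table:

```python
# Mean and variance of the monthly demand from the tabulated PMF.
demand = [1, 2, 3, 4, 5, 6, 7, 8]
prob = [0.08, 0.12, 0.19, 0.24, 0.16, 0.10, 0.07, 0.04]

assert abs(sum(prob) - 1.0) < 1e-12        # valid PMF
mean = sum(d * p for d, p in zip(demand, prob))
var = sum(d * d * p for d, p in zip(demand, prob)) - mean ** 2
print(round(mean, 2), round(var, 4))       # 4.06 3.2164
```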
Example 4: If X and Y are independent random variables with means 2, 3 and variances 1, 2 respectively,
find the mean and variance of the random variable Z = 2X − 5Y.
Now, Z = 2X − 5Y, so
E[Z] = 2E[X] − 5E[Y] = 2(2) − 5(3) = −11,
Var[Z] = 2² Var[X] + 5² Var[Y] = 4(1) + 25(2) = 54.
Example 5: The cumulative distribution function (CDF) of a random variable is F(x) = 1 − (1 + x)e^(−x),
x > 0. Find the probability density function of X, and its mean and variance.
Solution: (i) f(x) = dF(x)/dx = x e^(−x), x > 0.
(ii) Mean = E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫_{0}^{∞} x · x e^(−x) dx = 2.
(iii) Variance = E(X²) − (E(X))²
= ∫_{0}^{∞} x² · x e^(−x) dx − 4 = 6 − 4 = 2.
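The mean and variance can be confirmed numerically (the truncation point 60 for the (0, ∞) integral is an arbitrary safe choice):

```python
# f(x) = x e^{-x}: check mean 2 and variance 2 by numerical integration.
import math

def moment(r, upper=60.0, n=300000):       # ∫_0^upper x^r · x e^{-x} dx
    h = upper / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += x ** r * x * math.exp(-x) * h
    return total

mean = moment(1)
var = moment(2) - mean ** 2
print(round(mean, 4), round(var, 4))       # 2.0 2.0
```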
Thank you
Probability and Random Processes
(15B11MA301)
Lecture-12
Department of Mathematics
Jaypee Institute of Information Technology, Noida
Contents of the Lecture:
Moment generating Function (MGF)
Properties of MGF
Solved Examples
References
Moment Generating Function
The moment generating function of a random variable X, denoted by MX(t), is
defined as
MX(t) = E[e^(tX)], where t is a real variable, i.e.
MX(t) = E[e^(tX)] = ∫_{−∞}^{∞} e^(tx) f(x) dx (X continuous)
= Σ_x e^(tx) p(x) (X discrete).
Properties of Moment Generating Function
1. The coefficient of t^r/r! in MX(t) is μ′r = E[X^r], r = 1, 2, 3, …,
the rth moment about the origin.
Proof: We know that MX(t) = E[e^(tX)]
= E[1 + tX/1! + (tX)²/2! + ⋯]
= 1 + (t/1!) E[X] + (t²/2!) E[X²] + ⋯ + (t^r/r!) E[X^r] + ⋯
= 1 + (t/1!) μ′1 + (t²/2!) μ′2 + ⋯ + (t^r/r!) μ′r + ⋯
Hence, MX(t) = Σ_{r=0}^{∞} (t^r/r!) μ′r ..............(1)
This gives the MGF in terms of the moments.
2. The moment generating function of the sum of n independent random variables is equal to the
product of their respective moment generating functions, i.e.
M_{X1 + X2 + ⋯ + Xn}(t) = M_{X1}(t) · M_{X2}(t) ⋯ M_{Xn}(t).
3. The MGF of a distribution (RV), if it exists, is unique: two RVs X1
and X2 with pdfs f1 and f2 are identically distributed iff their MGFs
are the same.
4. If Y = (X − a)/h, then
MY(t) = E[e^(t(X − a)/h)] = e^(−at/h) E[e^((t/h)X)] = e^(−at/h) MX(t/h).
Example 1: If a random variable X has the MGF MX(t) = 3/(3 − t), obtain the standard deviation of X.
Solution: MX(t) = 3/(3 − t) = (1 − t/3)^(−1) = 1 + t/3 + t²/9 + ⋯
E(X) = coefficient of t/1! = 1/3,
E(X²) = coefficient of t²/2! = 2/9,
Var(X) = E(X²) − (E(X))² = 2/9 − 1/9 = 1/9,
so the standard deviation is σ = 1/3.
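The same moments can be recovered without the series expansion, by differentiating M numerically at t = 0 (the step size h is illustrative):

```python
# E[X] = M'(0), E[X^2] = M''(0) via central differences for M(t) = 3/(3 - t).
def M(t):
    return 3.0 / (3.0 - t)

h = 1e-4
EX = (M(h) - M(-h)) / (2 * h)              # central first difference
EX2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2   # central second difference
var = EX2 - EX ** 2
print(round(EX, 4), round(EX2, 4), round(var, 4))   # 0.3333 0.2222 0.1111
```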
Example 2: A random variable X has the probability mass function
P(X = x) = 1/2^x, x = 1, 2, 3, … Find the MGF of X and hence its mean.
Solution: MX(t) = E[e^(tX)] = Σ_{x=1}^{∞} e^(tx) P(X = x)
= Σ_{x=1}^{∞} (e^t/2)^x = e^t/(2 − e^t), for e^t < 2.
Mean = μ′1 = (d/dt) MX(t) at t = 0
= [2e^t/(2 − e^t)²] at t = 0, so Mean = 2.
Example 3: A random variable X has the PDF given by f(x) = 2e^(−2x), x ≥ 0, and 0 if x < 0. Find its MGF.
Solution: MX(t) = ∫_{0}^{∞} 2e^(−2x) e^(tx) dx = [2e^(−(2−t)x)/(−(2 − t))] from 0 to ∞.
Therefore, MX(t) = 2/(2 − t), for t < 2.
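A numerical check that the integral really gives 2/(2 − t) for a few values of t < 2:

```python
# Check numerically that ∫_0^∞ 2 e^{-2x} e^{tx} dx equals 2/(2 - t).
import math

def mgf(t, upper=40.0, n=200000):          # truncated midpoint rule on (0, upper)
    h = upper / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += 2 * math.exp(-(2 - t) * x) * h
    return total

for t in (0.0, 0.5, 1.0):
    assert abs(mgf(t) - 2 / (2 - t)) < 1e-4
print("matches 2/(2 - t) for t < 2")
```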
We know that MX(t) = Σ_{r=0}^{∞} (t^r/r!) μ′r = 2/(2 − t) = (1 − t/2)^(−1), so
1 + (t/1!) μ′1 + (t²/2!) μ′2 + ⋯ + (t^r/r!) μ′r + ⋯ = (1 − t/2)^(−1)
= 1 + t/2 + t²/2² + t³/2³ + t⁴/2⁴ + ⋯
= 1 + (1/2)(t/1!) + (2!/2²)(t²/2!) + ⋯
On equating the coefficients of t/1!, t²/2!, and so on, we have
μ′1 = 1/2, μ′2 = 1/2, μ′3 = 3/4, μ′4 = 3/2.
References
1. A. M. Mood, F. A. Graybill and D. C. Boes, Introduction to the Theory of
Statistics, 3rd Indian Ed., McGraw-Hill, 1973.
2. R. V. Hogg and A. T. Craig, Introduction to Mathematical Statistics,
Macmillan, 1995.
3. V. K. Rohatgi, An Introduction to Probability Theory and Mathematical
Statistics, Wiley Eastern, 1984.
4. S. M. Ross, A First Course in Probability, 6th Ed., Pearson Education Asia,
2002.
5. S. Palaniammal, Probability and Random Processes, PHI Learning Private
Limited, 2012.
6. T. Veerarajan, Probability, Statistics and Random Processes, 3rd Ed., Tata
McGraw-Hill, 2008.
7. R. E. Walpole, R. H. Myers, S. L. Myers, and K. Ye, Probability & Statistics
for Engineers & Scientists, 9th Ed., Pearson Education Limited, 2016.
8. I. Miller and M. Miller, John E. Freund's Mathematical Statistics with
Applications, 8th Ed., Pearson Education Limited, 2014.
Probability and Random Processes
(15B11MA301)
Lecture-13
Department of Mathematics
Jaypee Institute of Information Technology, Noida
Contents of the Lecture:
Characteristic Function
Properties
Solved Examples
Limitation of MGF
The MGF may not exist for some distributions, as the
integral ∫ e^(tx) f(x) dx or the series Σ_x e^(tx) p(x) may not
converge absolutely.
E.g., for the continuous distribution given by f(x) = c/(1 + x²)^m,
m ≥ 1, the MGF does not exist, since the integral
∫ c e^(tx)/(1 + x²)^m dx does not converge.
Also, for the discrete distribution
p(x) = 6/(π² x²), x = 1, 2, …,
p(x) = 0, else,
the series Σ_x 6e^(tx)/(π² x²) is not convergent for t > 0
(D'Alembert's ratio test), so the MGF does not exist.
• So, a more serviceable function than the MGF is needed.
It is known as the characteristic function.
Characteristic Function
The characteristic function of a random variable X is defined by
φX(ω) = E(e^(iωX)).
Since |e^(iωx)| = |cos ωx + i sin ωx| = (cos² ωx + sin² ωx)^(1/2) = 1,
|φX(ω)| = |∫ e^(iωx) f(x) dx| ≤ ∫ |e^(iωx)| f(x) dx = ∫ f(x) dx = 1,
so the characteristic function always exists.
Properties of Characteristic Function
1. μ′n = E(X^n) is the coefficient of (iω)^n/n! in the
expansion of φX(ω) in a series of ascending powers of iω:
φX(ω) = E(e^(iωX)) = Σ_x e^(iωx) f(x)
= Σ_x [1 + iωx + (iωx)²/2! + (iωx)³/3! + ⋯] f(x)
= Σ_x f(x) + iω Σ_x x f(x) + ((iω)²/2!) Σ_x x² f(x) + ⋯
2. μ′n = (1/i^n) [d^n φX(ω)/dω^n] at ω = 0.
3. If the characteristic function of a RV X
is φX(ω) and if Y = aX + b, then φY(ω) = e^(ibω) φX(aω).
Example: For the two-sided (Laplace) density f(x) = (λ/2) e^(−λ|x|),
−∞ < x < ∞,
φX(ω) = (λ/2) [∫_{0}^{∞} e^(−(λ − iω)x) dx + ∫_{0}^{∞} e^(−(λ + iω)x) dx]
= (λ/2) [1/(λ − iω) + 1/(λ + iω)] = λ²/(λ² + ω²).
Ex. The characteristic function of a random variable X is
given by
φX(ω) = 1 − |ω|, |ω| ≤ 1,
φX(ω) = 0, |ω| > 1.
Find the pdf of X.
Sol:
By the inversion formula, the pdf of X is
f(x) = (1/2π) ∫_{−∞}^{∞} φX(ω) e^(−iωx) dω
= (1/2π) ∫_{−1}^{1} (1 − |ω|) e^(−iωx) dω
= (1/2π) [∫_{−1}^{0} (1 + ω) e^(−iωx) dω + ∫_{0}^{1} (1 − ω) e^(−iωx) dω]
= (1/2πx²) (2 − e^(ix) − e^(−ix)) = (1 − cos x)/(π x²).
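The inversion result can be checked numerically: integrating the real part of (1 − |ω|)e^(−iωx) over [−1, 1] (the imaginary part cancels by symmetry) should reproduce (1 − cos x)/(πx²):

```python
# (1/2π) ∫_{-1}^{1} (1 - |w|) cos(wx) dw vs the closed form (1 - cos x)/(π x²).
import math

def f_inv(x, n=100000):
    h = 2.0 / n
    s = 0.0
    for i in range(n):
        w = -1.0 + (i + 0.5) * h           # midpoint rule over [-1, 1]
        s += (1 - abs(w)) * math.cos(w * x) * h
    return s / (2 * math.pi)

for x in (0.5, 1.0, 2.0):
    assert abs(f_inv(x) - (1 - math.cos(x)) / (math.pi * x * x)) < 1e-7
print("pdf matches (1 - cos x)/(pi x^2)")
```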
Joint Characteristic Function
φXY(ω1, ω2) = E[e^(i(ω1 X + ω2 Y))]
= ∫∫ e^(i(ω1 x + ω2 y)) f(x, y) dx dy (continuous case)
= Σ_i Σ_j e^(i(ω1 xi + ω2 yj)) p(xi, yj) (discrete case).
Properties
(i) φXY(0, 0) = 1.
(ii) E[X^m Y^n] = (1/i^(m+n)) [∂^(m+n) φXY(ω1, ω2) / ∂ω1^m ∂ω2^n]
at ω1 = 0, ω2 = 0.
Example: Let φXY(ω1, ω2) = e^(−2ω1² − 8ω2²). Then
E(X) = (1/i) [∂φXY/∂ω1] at (0, 0)
= (1/i) [−4ω1 e^(−2ω1² − 8ω2²)] at (0, 0) = 0,
E(Y) = (1/i) [∂φXY/∂ω2] at (0, 0)
= (1/i) [−16ω2 e^(−2ω1² − 8ω2²)] at (0, 0) = 0,
E(XY) = (1/i²) [∂²φXY/∂ω1∂ω2] at (0, 0)
= −[64 ω1 ω2 e^(−2ω1² − 8ω2²)] at (0, 0) = 0.