Expectation and Moment Generating Function

Paper: Probability and Statistics

Lesson: Expectation and Moment Generating Function

Lesson Developer: Dr. Shiv Kumar Kaushik

College/Department: Department of Mathematics, Kirori Mal College, University of Delhi

Institute of Lifelong Learning, University of Delhi



Contents

1. Introduction
2. Definition and Examples of Expectation
   Solved Problems
3. Some Special Expectations
   Chebyshev's Theorem
   Some Solved Problems
4. Moment Generating Function
Exercises
References


1. Introduction
In the previous chapter, we studied random (non-deterministic) experiments, followed by the classical definition of the probability function and the axiomatic approach to probability theory. We also studied univariate distribution theory, in which discrete and continuous random variables and the distribution functions associated with them were introduced, together with various examples and some important properties of these concepts.

In the present chapter, we will study the mathematical expectation of a random variable along with some special expectations: moments of order $r$, including the mean, variance and standard deviation of a random variable. Further, the moment generating function and the characteristic function will be studied.

2. Definition and Examples of Expectation


We begin this section with the definition of the expectation of a continuous and of a discrete random variable, followed by various results and examples.

Definition 2.1 Let $X$ be a continuous random variable with probability density function (p.d.f.) $f(x)$. If the integral $\int_{-\infty}^{\infty} |x| f(x)\,dx$ is finite (i.e., convergent), the expectation of $X$, denoted by $E(X)$, is defined as
$$E(X) = \int_{-\infty}^{\infty} x f(x)\,dx.$$
If $X$ is a discrete random variable with probability mass function (p.m.f.) $p(x)$ and the series $\sum_x |x|\,p(x)$ is finite (i.e., convergent), the expectation of $X$, denoted by $E(X)$, is defined as
$$E(X) = \sum_x x\,p(x).$$

The expectation of a random variable $X$ is also called the mathematical expectation of $X$, the expected value of $X$, or the mean of $X$, and is also denoted by $\mu$.

Note that if $\sum_x x\,p(x)$ is conditionally convergent (i.e., convergent but not absolutely convergent), $E(X)$ does not exist. The condition of absolute convergence (i.e., convergence of $\sum_x |x|\,p(x)$) is therefore essential for the existence of $E(X)$. Thus, $E(X)$ exists if and only if $E(|X|)$ exists.

Institute of Lifelong Learning, University of Delhi 3


Expectation and Moment Generating Function

Let us see some examples of the expectation of a random variable $X$.

Example 2.2 Let $X$ be a discrete random variable whose p.m.f. $p(x)$ is positive only when $x$ is the square of one of the first four natural numbers; that is, $p(x) = 0$ if $x$ is other than a member of $\{1, 4, 9, 16\}$. Now, since the sum $\sum_x |x|\,p(x)$ is finite, the expectation $E(X) = \sum_x x\,p(x)$ exists and is obtained by direct computation.
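The specific probabilities in Example 2.2 did not survive extraction; a common textbook choice with support on the squares of the first four naturals, assumed here purely for illustration, puts mass $p(x) = x/30$ on $x \in \{1, 4, 9, 16\}$ (note $1 + 4 + 9 + 16 = 30$). A short check with exact rational arithmetic:

```python
from fractions import Fraction as F

# Hypothetical p.m.f.: p(x) = x/30 for x in {1, 4, 9, 16} (assumed, not from the text).
pmf = {x * x: F(x * x, 30) for x in range(1, 5)}

# The masses sum to 1, so this is a valid p.m.f.
assert sum(pmf.values()) == 1

# E(X) = sum of x * p(x) over the support.
E = sum(x * p for x, p in pmf.items())
assert E == F(354, 30)   # = 59/5 = 11.8
```

Exact fractions avoid any floating-point doubt about whether the masses really sum to one.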

Example 2.3 Let $X$ be a continuous random variable with a given p.d.f. $f(x)$. Since
$$\int_{-\infty}^{\infty} |x| f(x)\,dx < \infty,$$
the expectation $E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$ exists and is evaluated by direct integration.

In the next example, we show that the expectation of a constant random variable is again that constant.

Example 2.4 Consider a constant random variable $X = c$, i.e., a random variable having all its mass at a constant $c$. Clearly, it is a discrete random variable with p.m.f. $p(c) = 1$ and $p(x) = 0$ for $x \ne c$. Since $\sum_x |x|\,p(x) = |c| < \infty$,


the expectation is $E(X) = c \cdot 1 = c$.

Similarly, consider a continuous random variable $X$ with p.d.f. $f(x)$ and the constant function $g(X) = c$. Since
$$\int_{-\infty}^{\infty} |c|\, f(x)\,dx = |c| < \infty \qquad \left(\text{since } \int_{-\infty}^{\infty} f(x)\,dx = 1\right),$$
the expectation is given by
$$E(c) = \int_{-\infty}^{\infty} c\, f(x)\,dx = c \int_{-\infty}^{\infty} f(x)\,dx = c.$$

Thus, the expectation of a constant is again a constant.

In the next theorem, we determine the expectation of a function $g(X)$ of a discrete random variable $X$ using the distribution of $X$.

Theorem 2.5 Let $X$ be a discrete random variable with p.m.f. $p(x)$ and let $g$ be any real-valued function of $X$. Then the expected value of $g(X)$ is given by
$$E\big(g(X)\big) = \sum_x g(x)\,p(x).$$

Proof. Let $X$ be a discrete random variable and suppose first that $X$ assumes a finite number of values. Let $y_1, y_2, \ldots, y_m$ be the possible values of $g(X)$. For each $j$, $1 \le j \le m$, let $x_{j1}, x_{j2}, \ldots$ denote the values of $x$ such that $g(x_{ji}) = y_j$. Then
$$P\big(g(X) = y_j\big) = \sum_i p(x_{ji}).$$
Therefore, we have
$$E\big(g(X)\big) = \sum_j y_j\, P\big(g(X) = y_j\big) = \sum_j \sum_i g(x_{ji})\, p(x_{ji}) = \sum_x g(x)\,p(x).$$


If $X$ takes countably infinitely many values with positive probability, the properties of absolutely convergent series allow the same conclusion.
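Theorem 2.5 is easy to check numerically on a small discrete distribution; the p.m.f. below is hypothetical, chosen only for illustration:

```python
# Hypothetical p.m.f.: X takes the values 1..4 with the probabilities shown.
pmf = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

def expect(g, pmf):
    """E[g(X)] = sum over x of g(x) * p(x), as in Theorem 2.5."""
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x, pmf)          # E(X)   = 0.1 + 0.4 + 0.9 + 1.6 = 3.0
mean_sq = expect(lambda x: x * x, pmf)   # E(X^2) = 0.1 + 0.8 + 2.7 + 6.4 = 10.0
```

The same `expect` helper computes $E(g(X))$ for any real-valued $g$ without first deriving the distribution of $g(X)$, which is exactly the content of the theorem.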

The next result gives a geometrical interpretation of $E(X)$.

Theorem 2.6 Let $X$ be a continuous random variable with p.d.f. $f(x)$. Then
$$E(X) = \int_0^{\infty} \big(1 - F(x)\big)\,dx - \int_{-\infty}^{0} F(x)\,dx, \qquad (1)$$
where $F$ denotes the distribution function of $X$.

Proof. By definition, we have
$$E(X) = \int_{-\infty}^{\infty} x f(x)\,dx = \int_{-\infty}^{0} x f(x)\,dx + \int_0^{\infty} x f(x)\,dx.$$
We know that
$$1 - F(x) = \int_x^{\infty} f(t)\,dt \quad\text{and}\quad F(x) = \int_{-\infty}^{x} f(t)\,dt.$$
Now, consider
$$\int_0^{\infty} \big(1 - F(x)\big)\,dx = \int_0^{\infty}\!\int_x^{\infty} f(t)\,dt\,dx = \int_0^{\infty}\!\int_0^{t} f(t)\,dx\,dt = \int_0^{\infty} t f(t)\,dt$$
(by change of order of integration in the region $0 \le x \le t < \infty$).


Therefore,
$$\int_0^{\infty} \big(1 - F(x)\big)\,dx = \int_0^{\infty} t f(t)\,dt. \qquad (2)$$
Similarly, consider
$$\int_{-\infty}^{0} F(x)\,dx = \int_{-\infty}^{0}\!\int_{-\infty}^{x} f(t)\,dt\,dx = \int_{-\infty}^{0}\!\int_t^{0} f(t)\,dx\,dt = -\int_{-\infty}^{0} t f(t)\,dt$$
(by change of order of integration in the region $-\infty < t \le x \le 0$). Therefore,
$$\int_{-\infty}^{0} F(x)\,dx = -\int_{-\infty}^{0} t f(t)\,dt. \qquad (3)$$
From (2) and (3), we have
$$\int_0^{\infty} \big(1 - F(x)\big)\,dx - \int_{-\infty}^{0} F(x)\,dx = \int_0^{\infty} t f(t)\,dt + \int_{-\infty}^{0} t f(t)\,dt = E(X),$$
which proves (1).

In the next theorem, we determine the expectation of a function $g(X)$ of a continuous random variable $X$ using the distribution of $X$.

Theorem 2.7 Let $X$ be a continuous random variable with p.d.f. $f(x)$ and let $g$ be any real-valued function of $X$. Then the expected value of $g(X)$ is given by
$$E\big(g(X)\big) = \int_{-\infty}^{\infty} g(x) f(x)\,dx.$$


Proof. Consider $Y = g(X)$. Replacing $X$ by $Y$ in (1) of Theorem 2.6 and denoting the sets $A_y = \{x : g(x) > y\}$ and $B_y = \{x : g(x) \le y\}$, we have
$$E\big(g(X)\big) = \int_0^{\infty} P\big(g(X) > y\big)\,dy - \int_{-\infty}^{0} P\big(g(X) \le y\big)\,dy = \int_0^{\infty}\!\int_{A_y} f(x)\,dx\,dy - \int_{-\infty}^{0}\!\int_{B_y} f(x)\,dx\,dy.$$
Interchanging the order of integration (the interchange of limits is justified by absolute convergence), the inner integrals over $y$ contribute the positive and negative parts of $g(x)$, and we obtain
$$E\big(g(X)\big) = \int_{-\infty}^{\infty} g(x) f(x)\,dx.$$

Does there exist any random variable whose expected value is not finite? Yes; the answer to this question is addressed in the next example.

Example 2.8 Consider, for instance, a continuous random variable $X$ with the heavy-tailed p.d.f.
$$f(x) = \frac{1}{x^2}, \quad x \ge 1; \qquad f(x) = 0 \text{ otherwise}.$$
Then, since
$$\int_1^{\infty} \frac{1}{x^2}\,dx = 1,$$
$f$ is a p.d.f. of $X$. We have
$$\int_1^{\infty} x f(x)\,dx = \int_1^{\infty} \frac{1}{x}\,dx = \infty,$$
so the expected value of $X$ is not finite.


In the next example, we show that even when the mean $\mu$ is finite, $E(X^2)$ may not be finite, i.e., $\mathrm{Var}(X)$ does not exist.

Example 2.9 Let $X$ be a continuous random variable with, for instance, the p.d.f.
$$f(x) = \frac{2}{x^3}, \quad x \ge 1; \qquad f(x) = 0 \text{ otherwise}.$$
Therefore,
$$E(X) = \int_1^{\infty} x \cdot \frac{2}{x^3}\,dx = \int_1^{\infty} \frac{2}{x^2}\,dx = 2.$$
But
$$E(X^2) = \int_1^{\infty} x^2 \cdot \frac{2}{x^3}\,dx = \int_1^{\infty} \frac{2}{x}\,dx = \infty,$$
so the variance of $X$ does not exist.
Next, we give some properties of expectation in terms of the following results.

Theorem 2.10 Let $g_1$ and $g_2$ be two real-valued functions and let $a, b$ be arbitrary real numbers. Then
$$E\big(a\,g_1(X) + b\,g_2(X)\big) = a\,E\big(g_1(X)\big) + b\,E\big(g_2(X)\big). \qquad (1)$$

Proof. Suppose that $X$ is a continuous random variable with p.d.f. $f(x)$. Then, using the definition, we have
$$E\big(a\,g_1(X) + b\,g_2(X)\big) = \int_{-\infty}^{\infty} \big(a\,g_1(x) + b\,g_2(x)\big) f(x)\,dx = a\!\int_{-\infty}^{\infty}\! g_1(x) f(x)\,dx + b\!\int_{-\infty}^{\infty}\! g_2(x) f(x)\,dx,$$
which equals $a\,E(g_1(X)) + b\,E(g_2(X))$.

Corollary 2.11 If $a$ and $b$ are constants, then $E(aX + b) = a\,E(X) + b$.


Proof. Replacing $g_1(X)$ by $X$ and $g_2(X)$ by $1$ in (1) of Theorem 2.10, we obtain the required result.

Corollary 2.12 If $b$ is a constant, then $E(b) = b$.

Proof. If we take $a = 0$ in Corollary 2.11, then $E(b) = 0 \cdot E(X) + b = b$.

Corollary 2.13 If $a$ is a constant, then $E(aX) = a\,E(X)$.

Proof. If we take $b = 0$ in Corollary 2.11, then $E(aX) = a\,E(X)$.
Theorem 2.14 If $X \ge 0$, then $E(X) \ge 0$.

Proof. Let $X$ be a continuous random variable with p.d.f. $f(x)$. Since $X$ is non-negative, $f(x) = 0$ for $x < 0$. Hence, provided $E(X)$ exists,
$$E(X) = \int_0^{\infty} x f(x)\,dx \ge 0.$$

Theorem 2.15 The expected value of a bounded random variable always exists.

Proof. Let $X$ be a continuous random variable and suppose $X$ is bounded, so that $|X| \le M$ for some $M > 0$; thus $f(x) = 0$ for $|x| > M$. Now, consider
$$\int_{-\infty}^{\infty} |x| f(x)\,dx = \int_{-M}^{M} |x| f(x)\,dx \le M \int_{-M}^{M} f(x)\,dx = M < \infty.$$

If $X$ is discrete and bounded, i.e., $|x_i| \le M$ for all $i$, then
$$\sum_i |x_i|\,p(x_i) \le M \sum_i p(x_i) = M < \infty.$$

Thus, in either case, the defining sum or integral is absolutely convergent, and therefore the expectation necessarily exists.

Theorem 2.16 Let $X \le Y$; then $E(X) \le E(Y)$.

Proof. Since $X \le Y$, we have $Y - X \ge 0$, so by Theorem 2.14, $E(Y - X) \ge 0$.

This gives, by linearity of $E$, $E(Y) - E(X) \ge 0$.


Thus, $E(X) \le E(Y)$.

Solved Problems
Problem 1 Prove that the expected value is not defined for each of the following random variables:

a) the discrete variate with p.m.f. $p(x) = \dfrac{6}{\pi^2 x^2}$, $x = 1, 2, 3, \ldots$;

b) the continuous variate with p.d.f. $f(x) = \dfrac{1}{\pi(1 + x^2)}$, $-\infty < x < \infty$.

Solution. a) We have
$$\sum_{x=1}^{\infty} x\,p(x) = \frac{6}{\pi^2} \sum_{x=1}^{\infty} \frac{1}{x}.$$
The $p$-series $\sum 1/x^p$ is divergent (i.e., not convergent) if $p \le 1$. Hence $\sum_x x\,p(x)$ does not converge, and therefore $E(X)$ is not defined.

b) We have
$$\int_{-\infty}^{\infty} |x| f(x)\,dx = \frac{2}{\pi} \int_0^{\infty} \frac{x}{1 + x^2}\,dx = \frac{1}{\pi} \Big[\ln(1 + x^2)\Big]_0^{\infty} = \infty.$$
Since the integral does not converge, $E(X)$ does not exist for the given p.d.f.
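The divergence in part b) can be seen numerically. Assuming the Cauchy-type density $f(x) = 1/(\pi(1 + x^2))$ as a stand-in, the partial integrals $\int_0^N x f(x)\,dx = \ln(1 + N^2)/(2\pi)$ keep growing instead of settling to a limit:

```python
import math

# Partial absolute-moment integral for f(x) = 1/(pi(1+x^2)):
#   integral from 0 to N of x/(pi(1+x^2)) dx = ln(1 + N^2) / (2*pi),
# which is unbounded as N -> infinity, so E|X| diverges.
def tail_integral(N):
    return math.log(1 + N**2) / (2 * math.pi)

values = [tail_integral(10**k) for k in (1, 3, 5, 7)]

# The partial integrals grow without bound rather than converging.
assert all(b > a for a, b in zip(values, values[1:]))
```

The deterministic antiderivative is used here rather than random sampling, so the check is reproducible.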

Problem 2 Let $X$ be a random variable with p.d.f. $f$ such that $f(x) > 0$ on a given interval and $f(x) = 0$ otherwise.

(i) If the value of $E(X)$ is given, then find the constant $k$ appearing in $f$.

(ii) If the values of $E(X)$ and $E(X^2)$ are given, then find the constants $a$ and $b$ appearing in $f$.

Solution. (i) We have $E(X) = \int x f(x)\,dx$; equating this to the given value determines $k$.


(ii) Similarly, the two given conditions yield equations (1) and (2); solving them simultaneously gives the values of $a$ and $b$.

Problem 3 Let $X$ be a discrete random variable with p.m.f. given by a table of values $x_i$ and probabilities $p(x_i)$. Then find the expected value of $X$.

Solution. We have $E(X) = \sum_i x_i\,p(x_i)$, summing the product of each tabulated value with its probability.

3. Some Special Expectations

In this section, we study the moments of order $r$ of a random variable, including the variance and standard deviation, their properties and some useful results, followed by various examples and solved problems.

Definition 3.1 The moment of order $r$ of a random variable $X$ about a constant $A$ is defined by
$$E\big((X - A)^r\big) = \int_{-\infty}^{\infty} (x - A)^r f(x)\,dx$$


for $r = 1, 2, \ldots$ when $X$ is continuous with p.d.f. $f(x)$, and
$$E\big((X - A)^r\big) = \sum_x (x - A)^r\,p(x)$$
when $X$ is discrete with p.m.f. $p(x)$.

Recall that $E\big((X - A)^r\big)$ exists if and only if $E\big(|X - A|^r\big)$ exists.

If $A = 0$, then $\mu_r' = E(X^r)$ is called the moment of order $r$ about the origin, and in particular, if $r = 1$, then $\mu_1' = E(X)$ is known as the mean of $X$ or the expected value of $X$.

Definition 3.2 The moment of order $r$ of a random variable $X$ about the mean $\mu$, denoted by $\mu_r$, is defined by
$$\mu_r = E\big((X - \mu)^r\big) = \int_{-\infty}^{\infty} (x - \mu)^r f(x)\,dx$$
for $r = 1, 2, \ldots$ when $X$ is continuous with p.d.f. $f(x)$, and
$$\mu_r = \sum_x (x - \mu)^r\,p(x)$$
when $X$ is discrete with p.m.f. $p(x)$.

Note that if $r = 1$, then $\mu_1 = E(X - \mu) = E(X) - \mu = 0$, and if $r = 2$, then $\mu_2 = E\big((X - \mu)^2\big)$.

Definition 3.3 Let $X$ be a random variable with mean $\mu$; then the variance of $X$, written as $\mathrm{Var}(X)$ or $\sigma^2$, is defined as $\sigma^2 = E\big((X - \mu)^2\big)$, and its positive square root $\sigma$ is the standard deviation of $X$.

Thus, if $X$ is discrete with p.m.f. $p(x)$, then
$$\mathrm{Var}(X) = \sum_x (x - \mu)^2\,p(x),$$
and if $X$ is continuous with p.d.f. $f(x)$, then
$$\mathrm{Var}(X) = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx.$$

Next, we give some properties of $\mathrm{Var}(X)$ in the form of the following result.

Theorem 3.4 Let $X$ be a random variable with mean $\mu$. Then
(i) $\mathrm{Var}(X) = E(X^2) - \mu^2$,
(ii) $\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X)$ for constants $a$ and $b$, and,


for another random variable $Y$ with mean $\mu_Y$,
(iii) $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,E\big((X - \mu)(Y - \mu_Y)\big)$.

Proof. (i) Consider
$$\mathrm{Var}(X) = E\big((X - \mu)^2\big) = E\big(X^2 - 2\mu X + \mu^2\big) = E(X^2) - 2\mu\,E(X) + \mu^2 = E(X^2) - \mu^2$$
(using linearity of $E$).

(ii) Consider
$$\mathrm{Var}(aX + b) = E\big((aX + b - a\mu - b)^2\big) = a^2\,E\big((X - \mu)^2\big) = a^2\,\mathrm{Var}(X)$$
(using linearity of $E$ and Definition 3.3).

(iii) Consider
$$\mathrm{Var}(X + Y) = E\Big(\big((X - \mu) + (Y - \mu_Y)\big)^2\Big),$$
where $E(X) = \mu$ and $E(Y) = \mu_Y$.

This gives $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,E\big((X - \mu)(Y - \mu_Y)\big)$ on expanding the square (using linearity of $E$ and Definition 3.3).
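The scaling property $\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X)$ can be verified on a small discrete distribution; the p.m.f. below is hypothetical, chosen only for illustration:

```python
# Hypothetical p.m.f. for X.
pmf = {-1: 0.25, 0: 0.5, 2: 0.25}

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def var(pmf):
    # Var(X) = E[(X - mu)^2], computed directly from the definition.
    m = mean(pmf)
    return sum((x - m) ** 2 * p for x, p in pmf.items())

a, b = 3.0, 7.0
# Distribution of aX + b: each value x is mapped to a*x + b, same probabilities.
shifted = {a * x + b: p for x, p in pmf.items()}

assert abs(var(shifted) - a**2 * var(pmf)) < 1e-12
```

The shift $b$ drops out entirely, while the scale $a$ enters squared, exactly as in part (ii).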

Value Addition: Minimal Property of Variance

Consider, for any constant $b$,
$$E\big((X - b)^2\big) = E\Big(\big((X - \mu) + (\mu - b)\big)^2\Big) = \mathrm{Var}(X) + (\mu - b)^2.$$
This gives
$$E\big((X - b)^2\big) \ge \mathrm{Var}(X),$$
i.e., $\mathrm{Var}(X)$ is the smallest second-order moment: $E\big((X - b)^2\big)$ is minimum when $b = \mu$.


In the next theorem, we show that the probability that the random variable $X$ takes a value within $k$ standard deviations of the mean is at least $1 - 1/k^2$. In other words, we establish that the variance, or standard deviation, measures the spread or dispersion of the distribution of a random variable.

Chebyshev's Theorem
Theorem 3.5 Let $\mu$ and $\sigma$ be the mean and the standard deviation of a random variable $X$ with p.d.f. $f(x)$. Then for a constant $k > 0$, we have
$$P\big(|X - \mu| < k\sigma\big) \ge 1 - \frac{1}{k^2}.$$

Proof. Consider
$$\sigma^2 = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx = \int_{|x - \mu| < k\sigma} (x - \mu)^2 f(x)\,dx + \int_{|x - \mu| \ge k\sigma} (x - \mu)^2 f(x)\,dx.$$
Since $(x - \mu)^2 f(x) \ge 0$,
$$\sigma^2 \ge \int_{|x - \mu| \ge k\sigma} (x - \mu)^2 f(x)\,dx.$$
Now, since $(x - \mu)^2 \ge k^2\sigma^2$ for $x \le \mu - k\sigma$ or $x \ge \mu + k\sigma$, it follows that
$$\sigma^2 \ge k^2\sigma^2 \int_{|x - \mu| \ge k\sigma} f(x)\,dx = k^2\sigma^2\, P\big(|X - \mu| \ge k\sigma\big).$$
This gives
$$P\big(|X - \mu| \ge k\sigma\big) \le \frac{1}{k^2},$$
provided that $\sigma > 0$. Thus,
$$1 - P\big(|X - \mu| < k\sigma\big) \le \frac{1}{k^2},$$
and hence it follows that
$$P\big(|X - \mu| < k\sigma\big) \ge 1 - \frac{1}{k^2}.$$


Example 3.6 For a given discrete random variable $X$, compute the mean $\mu$ and the standard deviation $\sigma$ directly from the p.m.f. Applying Chebyshev's Theorem with a suitable constant $k$ gives the bound
$$P\big(|X - \mu| \ge k\sigma\big) \le \frac{1}{k^2}. \qquad (1)$$
Also, computing the probability exactly from the p.m.f. gives
$$P\big(|X - \mu| \ge k\sigma\big) = \frac{1}{k^2}. \qquad (2)$$
Since the results given by (1) and (2) coincide, Chebyshev's inequality cannot, in general, be improved.
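The p.m.f. of Example 3.6 did not survive extraction; the classical distribution for which Chebyshev's bound is attained (assumed here purely for illustration) places mass $1/8$ at $\pm 1$ and $6/8$ at $0$. A quick check with exact rational arithmetic:

```python
from fractions import Fraction as F

# Assumed p.m.f. (the classical tight case for Chebyshev, k = 2):
pmf = {-1: F(1, 8), 0: F(6, 8), 1: F(1, 8)}

mu = sum(x * p for x, p in pmf.items())                  # mean = 0
sigma2 = sum((x - mu) ** 2 * p for x, p in pmf.items())  # variance = 1/4
sigma = F(1, 2)                                          # sqrt(1/4)
k = 2

# Exact probability that |X - mu| >= k*sigma:
prob = sum(p for x, p in pmf.items() if abs(x - mu) >= k * sigma)

# The Chebyshev bound 1/k^2 is attained exactly, so the inequality is sharp.
assert prob == F(1, k**2)
```

Because the bound is attained by this distribution, no inequality of the same form with a smaller constant can hold for all random variables.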

Some Solved Problems

Problem 4 If $E(X^2)$ exists, then show that

(i) $E(X)$ exists and $(E(X))^2 \le E(X^2)$,

(ii) $\mathrm{Var}(X)$ exists and $\mathrm{Var}(X) \le E(X^2)$.


Solution. Since $|x| \le \tfrac{1}{2}(1 + x^2)$ for every real $x$, using the p.d.f. $f$ we have
$$\int_{-\infty}^{\infty} |x| f(x)\,dx \le \frac{1}{2} \int_{-\infty}^{\infty} (1 + x^2) f(x)\,dx = \frac{1}{2}\big(1 + E(X^2)\big) < \infty.$$
This gives the absolute convergence of the defining integral. Thus, $E(X)$ exists.

Further, $\mathrm{Var}(X) = E(X^2) - (E(X))^2 \ge 0$. Thus, $(E(X))^2 \le E(X^2)$.

By the Minimal Property of Variance, we have $\mathrm{Var}(X) \le E\big((X - b)^2\big)$ for every constant $b$. Let $b = 0$; then it follows that $\mathrm{Var}(X)$ exists and that $\mathrm{Var}(X) \le E(X^2)$.

Problem 5 Let the distribution of $X$ be given by a p.m.f. depending on a parameter $p$, for $0 \le p \le 1$. Then find the value of $p$ for which $\mathrm{Var}(X)$ is maximum.

Solution. From the p.m.f. we compute $E(X)$ and $E(X^2)$. Using Theorem 3.4, we have
$$\mathrm{Var}(X) = E(X^2) - \big(E(X)\big)^2,$$
a function of $p$. Thus, setting its derivative with respect to $p$ equal to zero and checking the sign change gives the value of $p$ at which $\mathrm{Var}(X)$ is maximum.

Problem 6 Let $X$ be a random variable with given mean $E(X)$ and given $E(X^2)$. Then show that the stated probability bound holds.


Solution. Consider $\sigma^2 = E(X^2) - (E(X))^2$, computed from the given values.

Now, write the required event in the form $|X - \mu| < k\sigma$ for the appropriate constant $k$.

Using Chebyshev's Theorem with this $k$, we have
$$P\big(|X - \mu| < k\sigma\big) \ge 1 - \frac{1}{k^2}.$$

Therefore, the required bound follows.

4. Moment Generating Function

In this section, we study an alternative method for calculating the moments of discrete and continuous distributions. This method employs moment generating functions; their properties and examples are given below.

Definition 4.1 The moment generating function (m.g.f.) of a random variable $X$ about the point $a$, denoted by $M_X(t)$, is, where it exists, given by
$$M_X(t) = E\big(e^{t(X - a)}\big).$$
About the origin, i.e., at $a = 0$, the m.g.f. is defined as
$$M_X(t) = E\big(e^{tX}\big) = \sum_x e^{tx}\,p(x)$$
if $X$ is a discrete random variable with p.m.f. $p(x)$, and
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx$$
if $X$ is a continuous random variable with p.d.f. $f(x)$.


Why do we call this function the moment generating function?

Let us replace $e^{tx}$ in the formula for the m.g.f. of a continuous random variable by its Maclaurin series expansion,
$$e^{tx} = 1 + tx + \frac{t^2 x^2}{2!} + \frac{t^3 x^3}{3!} + \cdots,$$
then
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx = 1 + t\,\mu_1' + \frac{t^2}{2!}\,\mu_2' + \frac{t^3}{3!}\,\mu_3' + \cdots. \qquad (1)$$
Now, note that the coefficient of $t^r/r!$ in the Maclaurin series expansion of the m.g.f. of $X$ is $\mu_r' = E(X^r)$, i.e., the moment of order $r$ about the origin.

Thus, we see that the function given by (1) generates moments, and that is why it is called the moment generating function. The same argument applies in the discrete case.

In the next result, we show that the $r$th derivative of the moment generating function with respect to $t$ at $t = 0$ is the same as the coefficient of $t^r/r!$ in the Maclaurin series expansion of the moment generating function of $X$.

Theorem 4.2 Let $M_X(t)$, the moment generating function associated with the variate $X$, exist. Then
$$\left.\frac{d^r}{dt^r} M_X(t)\right|_{t = 0} = \mu_r' = E(X^r).$$

Proof. Since $M_X(t)$ exists, $M_X$ is continuously differentiable in some neighborhood of the origin.

Then, using (1), we have
$$M_X(t) = 1 + t\,\mu_1' + \frac{t^2}{2!}\,\mu_2' + \cdots + \frac{t^r}{r!}\,\mu_r' + \cdots.$$


On differentiating $r$ times w.r.t. $t$, we have
$$\frac{d^r}{dt^r} M_X(t) = \mu_r' + t\,\mu_{r+1}' + \frac{t^2}{2!}\,\mu_{r+2}' + \cdots.$$
Taking $t = 0$, we have
$$\left.\frac{d^r}{dt^r} M_X(t)\right|_{t = 0} = \mu_r'.$$
Hence the theorem.

Note that the above theorem serves as a convenient method of calculating moments.
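Theorem 4.2 can be checked numerically. A minimal sketch, assuming the standard exponential density $f(x) = \lambda e^{-\lambda x}$, $x > 0$ (not an example from this lesson), whose m.g.f. is $M_X(t) = \lambda/(\lambda - t)$ for $t < \lambda$: finite differences of $M_X$ at $t = 0$ should recover $E(X) = 1/\lambda$ and $E(X^2) = 2/\lambda^2$.

```python
lam = 2.0

def M(t):
    # m.g.f. of the exponential distribution with rate lam, valid for t < lam.
    return lam / (lam - t)

h = 1e-4
# Central differences approximate the first and second derivatives at t = 0.
m1 = (M(h) - M(-h)) / (2 * h)           # ~ M'(0)  = E(X)   = 1/lam
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2   # ~ M''(0) = E(X^2) = 2/lam^2

assert abs(m1 - 1 / lam) < 1e-6
assert abs(m2 - 2 / lam**2) < 1e-3
```

In practice one differentiates the closed form symbolically; the finite differences here simply make the "derivatives at zero are moments" statement tangible.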

Next, we give some properties of the moment generating function (m.g.f.).

Theorem 4.3 Let $M_X(t)$, the moment generating function associated with the variate $X$, exist. Then

(i) $M_{cX}(t) = M_X(ct)$, where $c$ is any constant,

(ii) $M_{X + c}(t) = e^{ct}\,M_X(t)$.

Proof. (i) We have
$$M_{cX}(t) = E\big(e^{t(cX)}\big).$$
Now,
$$E\big(e^{(ct)X}\big) = M_X(ct).$$

(ii) We have
$$M_{X + c}(t) = E\big(e^{t(X + c)}\big) = e^{ct}\,E\big(e^{tX}\big) = e^{ct}\,M_X(t).$$

Theorem 4.4 Let $X$ and $Y$ be two independent random variables. Then
$$M_{X + Y}(t) = M_X(t)\,M_Y(t).$$

Proof. Consider
$$M_{X + Y}(t) = E\big(e^{t(X + Y)}\big) = E\big(e^{tX} e^{tY}\big) = E\big(e^{tX}\big)\,E\big(e^{tY}\big),$$
using the independence of $X$ and $Y$.


Hence $M_{X + Y}(t) = M_X(t)\,M_Y(t)$.

What is the effect of a change of origin and scale on the moment generating function?

Let $U = \dfrac{X - a}{h}$ (where $a$ and $h$ are constants) be the transformation corresponding to a change of origin and scale.

Then, the m.g.f. of $U$ is given by
$$M_U(t) = E\big(e^{tU}\big) = E\big(e^{t(X - a)/h}\big) = e^{-at/h}\,E\big(e^{(t/h)X}\big).$$
Thus,
$$M_U(t) = e^{-at/h}\,M_X\!\left(\frac{t}{h}\right).$$

Next, we give an example of a random variable with p.d.f. $f$ such that $M_X(t)$ does not exist.

Example 4.5 Consider, for instance, a continuous random variable $X$ with the Cauchy p.d.f.
$$f(x) = \frac{1}{\pi(1 + x^2)}, \qquad -\infty < x < \infty.$$
Now,
$$\int_{-\infty}^{\infty} f(x)\,dx = 1,$$
so $f$ is a valid p.d.f. The m.g.f. of $X$ is given by
$$M_X(t) = \int_{-\infty}^{\infty} \frac{e^{tx}}{\pi(1 + x^2)}\,dx.$$
For any $t \ne 0$, $e^{tx}$ grows exponentially in one tail while the density decays only like $1/x^2$, so the integral diverges. Clearly, $M_X(t)$ does not exist for any $t \ne 0$.

Let us see another example of an m.g.f.

Example 4.6 Let us toss a fair coin until the first head appears; the sample space is of the form $S = \{H,\ TH,\ TTH,\ TTTH,\ \ldots\}$.

If $X$ denotes the number of tosses required, then $X$ takes the values $1, 2, 3, \ldots$

Clearly,
$$P(X = n) = \left(\frac{1}{2}\right)^{n - 1} \cdot \frac{1}{2} = \left(\frac{1}{2}\right)^n, \qquad n = 1, 2, 3, \ldots$$


Note that
$$\sum_{n = 1}^{\infty} \left(\frac{1}{2}\right)^n = 1.$$
So, the probability mass function of $X$ is given by $p(n) = (1/2)^n$, $n = 1, 2, 3, \ldots$

Now, the m.g.f. of $X$ is given by
$$M_X(t) = \sum_{n = 1}^{\infty} e^{tn} \left(\frac{1}{2}\right)^n = \frac{e^t/2}{1 - e^t/2} = \frac{e^t}{2 - e^t}, \qquad t < \ln 2.$$
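For this fair-coin experiment, the m.g.f. works out to the geometric series $\sum_n e^{tn}(1/2)^n = (e^t/2)/(1 - e^t/2)$ for $t < \ln 2$; a short numerical sketch checks the closed form against the defining series and recovers $E(X) = 2$ via Theorem 4.2:

```python
import math

def M_closed(t):
    # Closed form of the geometric-series m.g.f., valid for t < ln 2.
    r = math.exp(t) / 2
    return r / (1 - r)

def M_series(t, terms=200):
    # Direct evaluation of sum over n of e^{tn} (1/2)^n, truncated.
    return sum(math.exp(t * n) * 0.5**n for n in range(1, terms + 1))

t = 0.1
assert abs(M_closed(t) - M_series(t)) < 1e-10

# M'(0) = E(X) = 2 (expected number of tosses to see the first head),
# approximated here by a central difference.
h = 1e-5
assert abs((M_closed(h) - M_closed(-h)) / (2 * h) - 2.0) < 1e-6
```

Two hundred terms are ample here, since the summand decays geometrically with ratio $e^{t}/2 < 1$.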

Value Addition
Note that there are several distributions for which the moment generating function does not exist, but the function $E\big(e^{itX}\big)$ (where $i$ denotes the imaginary unit and $t$ is an arbitrary real number) exists for every distribution. This function is known as the characteristic function of the distribution, denoted $\varphi_X(t)$.

If $X$ is continuous, we have
$$\varphi_X(t) = E\big(e^{itX}\big) = \int_{-\infty}^{\infty} e^{itx} f(x)\,dx.$$
Now, since the p.d.f. $f$ is non-negative and $|e^{itx}| = 1$,
$$|\varphi_X(t)| \le \int_{-\infty}^{\infty} |e^{itx}|\, f(x)\,dx = \int_{-\infty}^{\infty} f(x)\,dx = 1.$$
This gives $|\varphi_X(t)| \le 1 < \infty$.

Thus, $\varphi_X(t)$ exists for all $t$.

A similar argument applies in the discrete case.


Exercises
1. Let $X$ denote the absolute difference of the numbers on the upturned faces in the experiment of tossing two dice. Find $E(X)$ and $\mathrm{Var}(X)$.
2. Let the p.d.f. of $X$ be given; find $E(X)$.
3. A person draws cards one by one from a pack until he has drawn all the aces. How many cards may he be expected to draw?
4. Show that the expected number of throws of a coin necessary to produce a head is $2$.
5. The p.m.f. of a variate $X$ is given. Does $E(X)$ exist?
6. In a game of chance, a man is allowed to throw a coin indefinitely. He receives a stated prize in rupees if he throws a head at certain trials. If the entry fee to participate in the game is a stated amount in rupees, then show that the expected value of the net gain is zero.
7. Show that if a random variable $X$ is bounded, it has moments of every order.
8. Let the p.d.f. of $X$ be given. Find the m.g.f. of $X$ and hence show that $E(X)$ and $\mathrm{Var}(X)$ take the stated values. Also, obtain $E(X^r)$.
9. Let a function $f$ depending on a constant $k$ be given; then
   i. find $k$ if $f$ is a probability density function;
   ii. find the cumulative distribution function;
   iii. find the probability that $X$ lies in a stated interval.
10. Suppose that a pair of dice is thrown once. If $X$ denotes the sum of the numbers showing up, then prove the bound given by Chebyshev's inequality for $P(|X - 7| \ge k)$. Compare this value with the exact probability.
11. Suppose we toss two balls into five bags in such a way that each ball is equally likely to fall into any bag. If $X$ denotes the number of balls in the first bag, then
   i. what is the density function of $X$?
   ii. find the mean and variance of $X$;
   iii. find the m.g.f. of $X$.
12. Can the given function be the m.g.f. of some random variable?

References
1. Robert V. Hogg, Joseph W. McKean and Allen T. Craig, Introduction to Mathematical
Statistics, Pearson Education, Asia, 2007.
2. Irwin Miller and Marylees Miller, John E. Freund’s Mathematical Statistics
with Applications (7th Edition), Pearson Education, Asia, 2006.
3. Sheldon Ross, Introduction to Probability Models (9th Edition), Academic Press,
Indian Reprint, 2007.

