CHAPTER TWO
Random Variables
Outline
Introduction
Random Variables
DISTRIBUTION FUNCTION
Discrete and Continuous Random Variables
MEAN AND VARIANCE
SOME SPECIAL DISTRIBUTIONS
CONDITIONAL DISTRIBUTIONS
Introduction
Numerical measurements or observations whose values vary unpredictably (e.g. thermal
noise, voltages, loads on an electric power grid) are called random variables.
In order to exploit the axioms and properties of
probability, we need to define random variables in terms
of an underlying sample space.
Probabilities Involving Random Variables
A real-valued function X(ω) defined for points ω in
a sample space is called a random variable.
Example: Toss three distinct coins once, so that the sample space is
S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}. Then a random variable X can be
defined by X(s) = the number of heads (H's) in s.
Example: Roll two distinct dice once. The sample space is
S = {(1, 1), (1, 2), ..., (2, 1), ..., (6, 1), (6, 2), ..., (6, 6)};
a r.v. X of interest may be defined by X(s) = the sum of the two numbers in s.
Example: Consider the number of heads in 3 coin tosses.
The sample space is {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}.
Since we are interested in the number of heads, we define X(ω) = the number of heads in ω,
so X takes the values 0, 1, 2, 3.
The sample space S is termed the domain of the r.v. X, and the collection of all
numbers [values of X(ω)] is termed the range of the r.v. X. Thus the range of X
is a certain subset of the set of all real numbers.
Contd…
Assuming a fair experiment, each of the eight outcomes has probability 1/8, so the
distribution of the random variable is
P(X = 0) = 1/8, P(X = 1) = 3/8, P(X = 2) = 3/8, P(X = 3) = 1/8
Example: What is the probability that the number of
heads is less than 2?
Solution:
P(X(ω) < 2) = P(X(ω) = 1) + P(X(ω) = 0)
P(X(ω) < 2) = 3/8 + 1/8 = 1/2
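The same probability can be checked by enumerating the sample space; the following is a minimal Python sketch (my own illustration, not part of the slides):

from itertools import product

# Enumerate the 8 equally likely outcomes of tossing three coins.
outcomes = list(product("HT", repeat=3))      # ('H','H','H'), ('H','H','T'), ...
X = [w.count("H") for w in outcomes]          # X(w) = number of heads in w

# P(X < 2) = (number of favorable outcomes) / (total outcomes)
p = sum(1 for x in X if x < 2) / len(outcomes)
print(p)                                      # 0.5, matching 3/8 + 1/8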
EXAMPLE 4.1a Letting X denote the random variable that is defined as the
sum of two fair dice, then
P{X = 2} = P{(1, 1)} = 1/36
P{X = 3} = P{(1, 2), (2, 1)} = 2/36
P{X = 4} = P{(1, 3), (2, 2), (3, 1)} = 3/36
B. Events Defined by Random Variables
If X is a r.v. and x is a fixed real number, we can define the event (X = x) as
(X = x) = {ω: X(ω) = x}
Similarly, for fixed numbers x, x1 and x2, we can define the following events:
(X ≤ x) = {ω: X(ω) ≤ x}
(X > x) = {ω: X(ω) > x}
(x1 < X ≤ x2) = {ω: x1 < X(ω) ≤ x2}
These events have probabilities that are denoted by
P(X = x) = P{ω: X(ω) = x}
P(X ≤ x) = P{ω: X(ω) ≤ x}
P(X > x) = P{ω: X(ω) > x}
P(x1 < X ≤ x2) = P{ω: x1 < X(ω) ≤ x2}
EXAMPLE 2.2 In the experiment of tossing a fair coin three times (Prob. 1.1), the
sample space S consists of eight equally likely sample points, S = {HHH, ..., TTT}.
If X is the r.v. giving the number of heads obtained, find
(a) P(X = 2); (b) P(X < 2).
(a) Let A ⊂ S be the event defined by X = 2. Then, from Prob. 1.1, we have
A = (X = 2) = {ω: X(ω) = 2} = {HHT, HTH, THH}
Since the sample points are equally likely, we have
P(X = 2) = P(A) = 3/8
(b) Let B ⊂ S be the event defined by X < 2. Then
B = (X < 2) = {ω: X(ω) < 2} = {HTT, THT, TTH, TTT}
and P(X < 2) = P(B) = 4/8 = 1/2
DISTRIBUTION FUNCTION
A. Definition:
The distribution function [or cumulative distribution function (cdf)] of X is the
function defined by
F_X(x) = P(X ≤ x),   -∞ < x < ∞
Most of the information about a random experiment described by the r.v. X is
determined by the behavior of F_X(x).
B. Properties of F_X(x)
Several properties of F_X(x) follow directly from its definition:
1. 0 ≤ F_X(x) ≤ 1, because F_X(x) is a probability.
2. F_X(x1) ≤ F_X(x2) if x1 < x2; that is, F_X(x) is a nondecreasing function.
3. lim_{x→∞} F_X(x) = F_X(∞) = 1
4. lim_{x→-∞} F_X(x) = F_X(-∞) = 0
Properties 3 and 4 follow from
F_X(∞) = P(X ≤ ∞) = P(S) = 1
F_X(-∞) = P(X ≤ -∞) = P(∅) = 0
Two useful consequences are P(X > a) = 1 - F_X(a) and P(a < X ≤ b) = F_X(b) - F_X(a).
EXAMPLE 2.3 Consider the r.v. X defined in Example 2.2. Find and
sketch the cdf FX(x) of X.
Table 2.1 gives F_X(x) = P(X ≤ x) for x = -1, 0, 1, 2, 3, 4; the values are
0, 1/8, 1/2, 7/8, 1, and 1, respectively.
Since the value of X must be an integer, the value of F_X(x) for non-integer
values of x must be the same as the value of F_X(x) for the nearest
smaller integer value of x.
Note that F_X(x) has jumps at x = 0, 1, 2, 3, and that at each jump the
upper value is the correct value of F_X(x).
Example: roll of a die
Cumulative distribution function (CDF)
[Figure: the CDF of a fair die is a staircase that jumps by 1/6 at each of x = 1, 2, 3, 4, 5, 6, taking the values 1/6, 1/3, 1/2, 2/3, 5/6, 1.]
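The same staircase can be tabulated numerically; the following short sketch (my own illustration, not from the slides) prints F_X(x) for a fair die:

# CDF of a fair six-sided die: F_X(x) = P(X <= x)
faces = range(1, 7)
for x in faces:
    F = sum(1 for k in faces if k <= x) / 6
    print(f"F_X({x}) = {F:.4f}")   # 1/6, 2/6, ..., 6/6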
EXAMPLE 4.1c Suppose the random variable X has distribution function
F_X(x) = 0 for x ≤ 0 and F_X(x) = 1 - e^(-x^2) for x > 0.
What is the probability that X exceeds 1?
SOLUTION The desired probability is computed as follows:
P{X > 1} = 1 - P{X ≤ 1}
= 1 - F_X(1)
= e^(-1)
≈ .368
Discrete Random Variables
A random variable X is called discrete (or of the discrete type)
if X takes on a finite or countably infinite number of values; that
is, either finitely many values such as x1, . . . , xn, or countably
infinitely many values such as x0, x1, x2, . . . .
We can describe a discrete random variable as one that:
Typically takes integer values (like 0, 1, 2, 3, etc.)
Takes a finite or countably infinite number of values
Jumps from one value to the next and cannot take any values in
between.
Contd…
Examples of discrete random variables: the number of heads in three coin tosses,
the number of defective computers in a purchased batch, the number of telephone
calls arriving at a switchboard in a 10-minute period, the outcome of a roll of a die.
Discrete example: roll of a die. X takes the values 1, 2, 3, 4, 5, 6, each with probability 1/6.
Probability of Discrete Random Variable
If X is a discrete random variable, the function given
by f(x) = P(X = x) for each x within the range of X is
called the probability distribution or probability mass
function of X.
The probability distribution (mass) function f(x) of a discrete random variable X
satisfies the following two conditions:
1. f(x) ≥ 0 for every x in the range of X;
2. the sum of f(x) over all values x in the range of X equals 1.
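As a quick sketch (my own, not part of the slides), these two conditions can be checked for a candidate p.m.f. stored as a Python dictionary; the helper name is_valid_pmf is hypothetical:

def is_valid_pmf(f, tol=1e-9):
    """Check that f(x) >= 0 for all x and that the values sum to 1."""
    return all(p >= 0 for p in f.values()) and abs(sum(f.values()) - 1) < tol

# Example: the number of heads in three fair coin tosses
print(is_valid_pmf({0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}))   # True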
EXAMPLE : For the sample space
S = {HHH , HHT , HT H , HTT , T HH , T HT , TT H , TTT} ,
with X(s) = the number of Heads ,
P_X(0) ≡ P({TTT}) = 1/8
P_X(1) ≡ P({HTT, THT, TTH}) = 3/8
P_X(2) ≡ P({HHT, HTH, THH}) = 3/8
P_X(3) ≡ P({HHH}) = 1/8
where
P_X(0) + P_X(1) + P_X(2) + P_X(3) = 1
The events {X = 0}, {X = 1}, {X = 2}, {X = 3} are disjoint since X(s) is a function!
(X : S → R must be defined for all s ∈ S and must be single-
valued.)
Contd…
Example: A shipment of 20 similar laptop computers to
a retail outlet contains 3 that are defective. If a school
makes a random purchase of 2 of these computers, find
the probability distribution (p.m.f.) for the number of
defectives. Check that f(x) defines a p.m.f.
Solution: Let X be a random variable whose values x
are the possible numbers of defective computers
purchased by the school. Then x can only take the
numbers 0, 1, and 2. Now
f(0) = P(X = 0) = C(3,0)·C(17,2) / C(20,2) = 68/95
f(1) = P(X = 1) = C(3,1)·C(17,1) / C(20,2) = 51/190
f(2) = P(X = 2) = C(3,2)·C(17,0) / C(20,2) = 3/190
Since each value is nonnegative and f(0) + f(1) + f(2) = 136/190 + 51/190 + 3/190 = 1,
f(x) defines a valid p.m.f.
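These values can be double-checked with a short computation (a sketch of mine using Python's math.comb, not part of the slides):

from math import comb

# X = number of defective computers among 2 chosen from 20 (3 defective, 17 good)
total = comb(20, 2)                       # 190 equally likely pairs
for x in range(3):
    f = comb(3, x) * comb(17, 2 - x) / total
    print(f"f({x}) = {f:.4f}")            # 0.7158, 0.2684, 0.0158
# The three probabilities are nonnegative and sum to 1, so f is a valid p.m.f.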
1. Cumulative Distribution Functions for Discrete Random
Variables
For a discrete r.v. X with p.m.f. f(x), the cumulative distribution function is
F_X(x) = P(X ≤ x) = Σ_{t ≤ x} f(t),
where the sum runs over all values t of X that do not exceed x.
Example 10: Find the cumulative distribution function
of the random variable X , if the following information
is given as follows f(0)= 1/16, f(1) = 1/4, f(2)= 3/8, f(3)=
1/4, and f(4)= 1/16. Therefore,
F_X(x) = 0 for x < 0
F_X(x) = 1/16 for 0 ≤ x < 1
F_X(x) = 5/16 for 1 ≤ x < 2
F_X(x) = 11/16 for 2 ≤ x < 3
F_X(x) = 15/16 for 3 ≤ x < 4
F_X(x) = 1 for x ≥ 4
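A short sketch (my own, not from the slides) reproduces this staircase by accumulating the given p.m.f. values:

# Build F_X(x) for Example 10 by accumulating the p.m.f. values
f = {0: 1/16, 1: 1/4, 2: 3/8, 3: 1/4, 4: 1/16}
F, running = {}, 0.0
for x in sorted(f):
    running += f[x]
    F[x] = running
print(F)   # {0: 0.0625, 1: 0.3125, 2: 0.6875, 3: 0.9375, 4: 1.0}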
Continuous Random Variables
A r.v. X is called continuous (or of the continuous type)
if X takes all values in a proper interval I ⊆ ℝ. We can also
describe continuous random variables as follows:
They take whole or fractional numbers.
They are obtained by measuring.
They take infinitely many values in an interval.
Example: Recording the lifetime of an electronic
device, or of an electrical appliance. Here S is the interval
(0, T), and a r.v. X of interest is X(s) = s, s ∈ S. The
random variable so defined is continuous.
Contd…
Examples of continuous random variables include the lifetime of a device (above)
and measurement-type quantities such as:
• Noise due to thermal excitation.
• Power fluctuations in a power station.
• Voltage measurements on a resistor.
Probability Density Functions
The probability density function (p.d.f.) is also referred to as the probability
density, the probability function, or simply the density. It is the continuous
counterpart of the probability mass function of a discrete r.v.
Let
f_X(x) = dF_X(x)/dx
The function f_X(x) is called the probability density function (pdf) of the continuous
r.v. X.
Properties of f_X(x):
1. f_X(x) ≥ 0
2. ∫_{-∞}^{∞} f_X(x) dx = 1
3. f_X(x) is piecewise continuous.
4. P(a < X ≤ b) = ∫_a^b f_X(x) dx
The cdf F_X(x) of a continuous r.v. X can be obtained by
F_X(x) = P(X ≤ x) = ∫_{-∞}^{x} f_X(t) dt
EXAMPLE 4.2b Suppose that X is a continuous random variable
whose probability density function is given by
f(x) = C(4x - 2x^2) for 0 < x < 2, and f(x) = 0 otherwise.
(a) What is the value of C?
(b) Find P{X > 1}.
SOLUTION (a) Since f is a probability density function, we must have
∫_0^2 C(4x - 2x^2) dx = 1, implying that
C [2x^2 - (2/3)x^3] from 0 to 2 = 1
C (8 - 16/3) = 1
C = 3/8
(b) Hence
P{X > 1} = ∫_1^2 (3/8)(4x - 2x^2) dx
= (3/8) [2x^2 - (2/3)x^3] from 1 to 2
= (3/8)(8/3 - 4/3) = 1/2
For example, if the density is p(x) = e^(-x) for x > 0, the probability of X falling
between 1 and 2 is
P(1 ≤ X ≤ 2) = ∫_1^2 e^(-x) dx = [-e^(-x)] from 1 to 2 = e^(-1) - e^(-2) ≈ .368 - .135 ≈ .23
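A quick numerical sketch (mine, not from the slides) confirms this value both in closed form and with a crude Riemann sum:

from math import exp

# P(1 <= X <= 2) for the density p(x) = e^(-x), x > 0
exact = exp(-1) - exp(-2)            # closed form: e^(-1) - e^(-2)

# Midpoint Riemann-sum check of the same integral
n = 100000
h = 1.0 / n
approx = sum(exp(-(1 + (i + 0.5) * h)) for i in range(n)) * h

print(round(exact, 3), round(approx, 3))   # 0.233 0.233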
1
2. Cumulative Distribution Functions (CDFs) of
Continuous Random Variables
For a continuous r.v. X, the cdf is obtained from the pdf by
F_X(x) = ∫_{-∞}^{x} f_X(t) dt,
and conversely the pdf is recovered as f_X(x) = dF_X(x)/dx at every point where the
derivative exists.
MEAN AND VARIANCE
A. Mean
Expected value is just the weighted average or mean (µ) of the random variable X.
Imagine placing the masses p(x) at the points x on a beam; the balance point of
the beam is the expected value of X.
The mean (or expected value) of a r.v. X, denoted by µ_X or E(X), is defined by
E(X) = Σ_x x p(x) if X is discrete, and E(X) = ∫_{-∞}^{∞} x f_X(x) dx if X is continuous.
Properties of Mean
E[aX] = a E[X] ,
E[aX + b] = a E[X] + b
E(X+Y)= E(X) + E(Y), This works even if X and Y are dependent
EXAMPLE: The expected value of rolling a die is E(X) = (1 + 2 + 3 + 4 + 5 + 6)·(1/6) = 3.5
EXAMPLE 4.4b If I is an indicator random variable for the event A, that is, if I = 1
when A occurs and I = 0 when A does not occur, then E[I] = 1·P(A) + 0·(1 - P(A)) = P(A).
Example: If X is a random integer between 1 and 10, what’s the expected
value of X?
E(X) = Σ_{i=1}^{10} i·(1/10) = (1/10)·(10·11/2) = 55·(0.1) = 5.5
B. Moment:
The nth moment of a r.v. X is defined by E[X^n] = Σ_x x^n p(x) in the discrete case
(and E[X^n] = ∫ x^n f_X(x) dx in the continuous case).
Note that the mean of X is the first moment of X.
EXAMPLE Suppose X has the following probability mass function:
p(0) = .2, p(1) = .5, p(2) = .3
Calculate E[X^2].
Solution: E[X^2] = 0^2(.2) + 1^2(.5) + 2^2(.3) = 1.7
C. Variance
“The average (expected) squared distance (or deviation) from the mean”:
Var(X) = E[(X - µ_X)^2]
The standard deviation of a r.v. X, denoted by σ_X, is the positive square root of
Var(X).
Cont…
• Handy calculation formula (if you ever need to calculate by hand!):
Var(X) = E[X^2] - (E[X])^2
which is a useful formula for determining the variance. (For a constant c, note that
E(c) = c; this is used in the properties below.)
Some properties of variance
If c is a constant (i.e., not a variable) and X and Y are random variables, then
Var(c) = 0, Constants don’t vary!
Var (c+X)= Var(X)
Var(cX)= c2Var(X)
Var(X+Y)= Var(X) + Var(Y) ONLY IF X and Y are independent!!!!
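A short sketch (mine, reusing the p.m.f. from the moment example above) computes E[X], E[X^2], and Var(X) with the handy formula:

# Mean, second moment, and variance for p(0)=.2, p(1)=.5, p(2)=.3
pmf = {0: 0.2, 1: 0.5, 2: 0.3}
mean = sum(x * p for x, p in pmf.items())          # E[X]   = 1.1
second = sum(x**2 * p for x, p in pmf.items())     # E[X^2] = 1.7
var = second - mean**2                             # Var(X) = E[X^2] - (E[X])^2
print(round(mean, 2), round(second, 2), round(var, 2))   # 1.1 1.7 0.49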
SOME SPECIAL DISTRIBUTIONS
A. Bernoulli Distribution:
Suppose that a trial, or an experiment, whose outcome can be classified as either a
“success” or a “failure” is performed, and that the probability of a success is p and the
probability of a failure is 1 - p. Such experiments are called Bernoulli trials.
If X = 1 for a success and X = 0 for a failure, then the probability mass function of X
is given by P(X = 1) = p and P(X = 0) = 1 - p.
A Bernoulli trial has
only two outcomes ,
with probability
P(X = 1) = p ,
P(X = 0) = 1 - p
B. Binomial Distribution: Perform a Bernoulli trial n times in sequence .
Assume the individual trials are independent .
Suppose now that n independent trials, each of which results in a “success” with
probability p and in a “failure” with probability 1 - p, are to be performed. If X
represents the number of successes that occur in the n trials, then X is said to be a
binomial random variable with parameters (n, p), and its probability mass function is
P(X = k) = C(n, k) p^k (1 - p)^(n - k),   k = 0, 1, ..., n.
EXAMPLE A binary source generates digits 1 and 0 randomly with probabilities
0.6 and 0.4, respectively.
(a) What is the probability that two 1s and three 0s will occur in a five-digit
sequence?
(b) What is the probability that at least three 1s will occur in a five-digit sequence?
Solution: (a) Let X be the r.v. denoting the number of 1s generated in a five-digit
sequence. Since there are only two possible outcomes (1 or 0), the probability of
generating a 1 is constant, and there are five digits, it is
clear that X is a binomial r.v. with parameters (n, p) = (5, 0.6).
The probability that two 1s and three 0s will occur in a five-digit sequence is
P(X = 2) = C(5, 2)(0.6)^2(0.4)^3 = 0.2304
(b) The probability that at least three 1s will occur is
P(X ≥ 3) = C(5, 3)(0.6)^3(0.4)^2 + C(5, 4)(0.6)^4(0.4) + C(5, 5)(0.6)^5
= 0.3456 + 0.2592 + 0.0778 ≈ 0.683
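As a check (a sketch of mine using Python's math.comb, not part of the slides), both parts can be computed from the binomial p.m.f. with n = 5 and p = 0.6:

from math import comb

n, p = 5, 0.6
def binom_pmf(k):
    # P(X = k) = C(n, k) p^k (1 - p)^(n - k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

# (a) exactly two 1s (and three 0s)
print(round(binom_pmf(2), 4))                             # 0.2304

# (b) at least three 1s
print(round(sum(binom_pmf(k) for k in range(3, 6)), 4))   # 0.6826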
Exercise 2.1a It is known that disks produced by a certain company will be
defective with probability .01 independently of each other. The company sells the
disks in packages of 10 and offers a money-back guarantee that at most 1 of the 10
disks is defective. What proportion of packages is returned? If someone buys three
packages, what is the probability that exactly one of them will be returned?
Mean and variance of the Binomial random variable: E[X] = np and Var(X) = np(1 - p).
The Poisson Random Variable
The Poisson random variable approximates the Binomial random variable.
This approximation is accurate if n is large and p is small. Recall that for
the Binomial random variable
E[X] = np, and Var(X) = np(1 - p) ≈ np when p is small.
Indeed, for the Poisson random variable we will show that
E[X] = λ and Var(X) = λ .
NOTE : Unlike the Binomial random variable, the Poisson random
variable can have an arbitrarily large integer value k.
1. Importance of Poisson Random Variables
• Is used to model many different physical phenomena, such
as:
• Photoelectric effect
• Radioactive decay
• Computer message traffic arriving at a queue
• Its probability mass function is (λ > 0)
P(X = k) = e^(-λ) λ^k / k!,   k = 0, 1, 2, ...
EXAMPLE The number of telephone calls arriving at a switchboard during any 10-
minute period is known to be a Poisson r.v. X with λ = 2.
(a) Find the probability that more than three calls will arrive during any 10-minute
period.
(b) Find the probability that no calls will arrive during any 10-minute period.
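A short sketch (my own illustration, not from the slides) computes both answers from the Poisson p.m.f. with λ = 2:

from math import exp, factorial

lam = 2.0
def poisson_pmf(k):
    # P(X = k) = e^(-lambda) * lambda^k / k!
    return exp(-lam) * lam**k / factorial(k)

# (a) P(X > 3) = 1 - P(X <= 3)
print(round(1 - sum(poisson_pmf(k) for k in range(4)), 3))   # 0.143

# (b) P(X = 0) = e^(-2)
print(round(poisson_pmf(0), 3))                              # 0.135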
Uniform Distribution
A r.v. X is said to be uniformly distributed over an interval (a, b) if its pdf is
constant there:
f_X(x) = 1/(b - a) for a < x < b, and f_X(x) = 0 otherwise.
Its mean is E[X] = (a + b)/2 and its variance is Var(X) = (b - a)^2/12.
3. Exponential random variable
An exponential random variable with parameter λ > 0
has probability density function
f_X(x) = λ e^(-λx) for x ≥ 0, and f_X(x) = 0 for x < 0.
Examples:
• To model lifetimes.
• How long it takes for a computer to transmit a message
from one node to another.
5. Gaussian (Normal) Random Variable
One of the most important continuous random variables.
It models most of the noise in communication and
control systems.
Its pdf is given in terms of its mean (m) and standard
deviation (σ) by
f_X(x) = (1/(σ√(2π))) e^(-(x - m)^2/(2σ^2)),   -∞ < x < ∞
where m = E[X] and σ^2 = Var(X).
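As a small sketch (mine, not from the slides), the Gaussian pdf above and the corresponding probabilities can be evaluated with the standard-library error function math.erf:

from math import sqrt, pi, exp, erf

def gaussian_pdf(x, m=0.0, sigma=1.0):
    # f_X(x) = (1 / (sigma * sqrt(2*pi))) * exp(-(x - m)^2 / (2 * sigma^2))
    return exp(-(x - m)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

def gaussian_cdf(x, m=0.0, sigma=1.0):
    # P(X <= x) expressed through the error function
    return 0.5 * (1 + erf((x - m) / (sigma * sqrt(2))))

print(round(gaussian_pdf(0.0), 4))     # 0.3989 for the standard normal
print(round(gaussian_cdf(1.0), 4))     # 0.8413, i.e. P(X <= m + sigma)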
CONDITIONAL DISTRIBUTIONS
The conditional probability of an event A given an event B is defined as
P(A | B) = P(A ∩ B) / P(B),   P(B) > 0.
Correspondingly, the conditional cdf of a r.v. X given the event B is
F_X(x | B) = P(X ≤ x | B) = P({X ≤ x} ∩ B) / P(B).
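As a small illustration (my own, reusing the three-coin experiment from earlier in the chapter), the definition can be applied directly:

from itertools import product

outcomes = list(product("HT", repeat=3))              # 8 equally likely outcomes
A = {w for w in outcomes if w.count("H") == 2}        # event {X = 2}
B = {w for w in outcomes if w[0] == "H"}              # event "first toss is a head"

def P(E):
    return len(E) / len(outcomes)

print(P(A & B) / P(B))    # P(A | B) = (2/8) / (4/8) = 0.5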
END