
Uniform and Normal Distribution

1. Uniform or Rectangular Distribution

Let α and β be two real numbers such that −∞ < α < β < ∞. A continuous random
variable X is said to have a uniform (or rectangular) distribution over the interval (α, β)
(written as X ∼ U(α, β)) if the probability density function of X is given by

$$f_X(x) = \begin{cases} \dfrac{1}{\beta-\alpha}, & \text{if } \alpha < x < \beta \\[4pt] 0, & \text{otherwise.} \end{cases}$$

Now, the r-th moment of X ∼ U(α, β) is

$$E(X^r) = \int_{-\infty}^{\infty} x^r f_X(x)\,dx = \int_{\alpha}^{\beta} \frac{x^r}{\beta-\alpha}\,dx = \frac{\beta^{r+1}-\alpha^{r+1}}{(r+1)(\beta-\alpha)} = \frac{\beta^r + \beta^{r-1}\alpha + \cdots + \beta\alpha^{r-1} + \alpha^r}{r+1}.$$

Hence

$$E(X) = \frac{\alpha+\beta}{2}; \quad E(X^2) = \frac{\beta^2+\beta\alpha+\alpha^2}{3}; \quad \mathrm{Var}(X) = E(X^2) - (E(X))^2 = \frac{(\beta-\alpha)^2}{12}.$$
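These two formulas are easy to sanity-check numerically. The following is a quick Monte Carlo sketch, with α = 2 and β = 5 chosen purely as example values:

```python
# Quick Monte Carlo sanity check of E(X) = (α+β)/2 and Var(X) = (β−α)²/12
# for X ~ U(α, β); the values α = 2, β = 5 are arbitrary example choices.
import random

alpha, beta = 2.0, 5.0
random.seed(0)
samples = [random.uniform(alpha, beta) for _ in range(200_000)]

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)

assert abs(mean - (alpha + beta) / 2) < 0.02       # theoretical mean 3.5
assert abs(var - (beta - alpha) ** 2 / 12) < 0.02  # theoretical variance 0.75
```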

The m.g.f. of X ∼ U(α, β) is

$$M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx = \int_{\alpha}^{\beta} \frac{e^{tx}}{\beta-\alpha}\,dx = \begin{cases} \dfrac{e^{t\beta}-e^{t\alpha}}{(\beta-\alpha)t}, & \text{if } t \neq 0 \\[4pt] 1, & \text{if } t = 0. \end{cases}$$
The d.f. of X ∼ U(α, β) is

$$F_X(x) = \int_{-\infty}^{x} f_X(t)\,dt = \begin{cases} 0, & \text{if } x < \alpha \\[2pt] \dfrac{x-\alpha}{\beta-\alpha}, & \text{if } \alpha \le x < \beta \\[4pt] 1, & \text{if } x \ge \beta. \end{cases}$$
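The piecewise d.f. above translates directly into code; a minimal sketch (function name `uniform_cdf` is my own):

```python
# Piecewise c.d.f. of X ~ U(α, β), following the formula above.
def uniform_cdf(x, alpha, beta):
    if x < alpha:
        return 0.0
    if x < beta:
        return (x - alpha) / (beta - alpha)
    return 1.0

assert uniform_cdf(1.0, 2.0, 5.0) == 0.0   # below the interval
assert uniform_cdf(3.5, 2.0, 5.0) == 0.5   # midpoint of (2, 5)
assert uniform_cdf(6.0, 2.0, 5.0) == 1.0   # above the interval
```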

Remark 1. Let X ∼ U(α, β) and Y = (X − α)/(β − α). Then the d.f. of Y is

$$F_Y(y) = P(Y \le y) = P(X \le \alpha + (\beta-\alpha)y) = \begin{cases} 0, & \text{if } \alpha + (\beta-\alpha)y < \alpha \\[2pt] \dfrac{\alpha + (\beta-\alpha)y - \alpha}{\beta-\alpha}, & \text{if } \alpha \le \alpha + (\beta-\alpha)y < \beta \\[4pt] 1, & \text{if } \alpha + (\beta-\alpha)y \ge \beta \end{cases} = \begin{cases} 0, & \text{if } y < 0 \\ y, & \text{if } 0 \le y < 1 \\ 1, & \text{if } y \ge 1. \end{cases}$$

Clearly, F_Y is not differentiable at 0 and 1. Hence, the p.d.f. of Y is

$$f_Y(y) = \begin{cases} 1, & \text{if } 0 < y < 1 \\ 0, & \text{otherwise.} \end{cases}$$

Therefore, Y ∼ U(0, 1).
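The remark can be checked empirically: after the transformation, the empirical d.f. of Y should be close to y on (0, 1). A sketch with an arbitrary example interval (−1, 4):

```python
# Empirical check of Remark 1: if X ~ U(α, β), then Y = (X−α)/(β−α) ~ U(0, 1),
# so the empirical c.d.f. of Y evaluated at y should be close to y itself.
import random

alpha, beta = -1.0, 4.0   # arbitrary example interval
random.seed(1)
ys = [(random.uniform(alpha, beta) - alpha) / (beta - alpha) for _ in range(100_000)]

for y in (0.25, 0.5, 0.75):
    empirical = sum(1 for v in ys if v <= y) / len(ys)
    assert abs(empirical - y) < 0.01   # F_Y(y) = y on (0, 1)
```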

Example 2. Let a > 0 be a real constant. A point X is chosen at random on the interval
(0, a) (i.e., X ∼ U(0, a)).

(1) If Y denotes the area of an equilateral triangle with sides of length X, find the
mean and variance of Y.
(2) If the point X divides the interval (0, a) into subintervals I₁ = (0, X) and I₂ =
[X, a), find the probability that the larger of these two subintervals is at least
double the size of the smaller one.

Solution:

(1) We have Y = (√3/4)X². Then, using E(Xʳ) = aʳ/(r + 1) for X ∼ U(0, a),

$$E(Y) = \frac{\sqrt{3}}{4}E(X^2) = \frac{\sqrt{3}}{12}a^2; \quad E(Y^2) = \frac{3}{16}E(X^4) = \frac{3}{80}a^4; \quad \mathrm{Var}(Y) = E(Y^2) - (E(Y))^2 = \frac{3}{80}a^4 - \frac{3}{144}a^4 = \frac{a^4}{60}.$$
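Part (1) can be verified by simulation; the sketch below uses a = 2 as an arbitrary example value:

```python
# Monte Carlo check of Example 2(1): Y = (√3/4) X² with X ~ U(0, a) has
# E(Y) = √3 a²/12 and Var(Y) = a⁴/60; a = 2 is an arbitrary example choice.
import math
import random

a = 2.0
random.seed(2)
ys = [math.sqrt(3) / 4 * random.uniform(0, a) ** 2 for _ in range(200_000)]

mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)

assert abs(mean - math.sqrt(3) * a**2 / 12) < 0.01
assert abs(var - a**4 / 60) < 0.01
```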
(2) The required probability is

$$p = P(\max(X, a-X) \ge 2\min(X, a-X)) = P\!\left(a-X \ge 2X,\ X \le \tfrac{a}{2}\right) + P\!\left(X \ge 2(a-X),\ X > \tfrac{a}{2}\right)$$

$$= P\!\left(X \le \tfrac{a}{3}\right) + P\!\left(X \ge \tfrac{2a}{3}\right) = F_X\!\left(\tfrac{a}{3}\right) + 1 - F_X\!\left(\tfrac{2a}{3}\right) = \frac{1}{3} + 1 - \frac{2}{3} = \frac{2}{3}.$$

2. Normal or Gaussian Distribution


(1) Let µ ∈ ℝ and σ > 0 be real constants. A continuous random variable X is said
to have a normal (or Gaussian) distribution with parameters µ and σ² (written as
X ∼ N(µ, σ²)) if the probability density function of X is given by

$$f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}, \quad -\infty < x < \infty.$$

(2) The N(0, 1) distribution is called the standard normal distribution. The p.d.f.
and the d.f. of the N(0, 1) distribution will be denoted by φ and Φ respectively, i.e.,

$$\varphi(z) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}, \quad -\infty < z < \infty,$$

$$\Phi(z) = \int_{-\infty}^{z} \varphi(x)\,dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{z} e^{-\frac{x^2}{2}}\,dx.$$

(3) We know that $\int_{-\infty}^{\infty} e^{-x^2}\,dx = \sqrt{\pi}$ and $\int_{-\infty}^{\infty} e^{-\frac{x^2}{2}}\,dx = \sqrt{2\pi}$.

Clearly, if X ∼ N(µ, σ²), then

$$f_X(\mu - x) = f_X(\mu + x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{x^2}{2\sigma^2}}, \quad \forall\, x \in \mathbb{R}.$$

Thus the distribution of X is symmetric about µ. Hence,

$$X \sim N(\mu, \sigma^2) \;\Rightarrow\; F_X(\mu - x) + F_X(\mu + x) = 1,\ \forall\, x \in \mathbb{R}, \quad \text{and} \quad F_X(\mu) = \frac{1}{2}.$$

In particular,

$$\Phi(-z) = 1 - \Phi(z),\ \forall\, z \in \mathbb{R}, \quad \text{and} \quad \Phi(0) = \frac{1}{2}.$$
Suppose that X ∼ N(µ, σ²). Then the p.d.f. of Z = (X − µ)/σ is given by

$$f_Z(z) = f_X(\mu + \sigma z)\,|\sigma| = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}, \quad -\infty < z < \infty,$$

i.e.,

$$X \sim N(\mu, \sigma^2) \;\Rightarrow\; Z = \frac{X-\mu}{\sigma} \sim N(0, 1).$$

Thus

$$X \sim N(\mu, \sigma^2) \;\Rightarrow\; F_X(x) = P(X \le x) = P\!\left(Z \le \frac{x-\mu}{\sigma}\right) = \Phi\!\left(\frac{x-\mu}{\sigma}\right), \quad \forall\, x \in \mathbb{R}.$$
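The standardization identity F_X(x) = Φ((x − µ)/σ) can be verified by integrating the p.d.f. numerically; the sketch below uses a simple midpoint Riemann sum, with µ = 1, σ = 2 as arbitrary example values:

```python
# Check F_X(x) = Φ((x−µ)/σ) for X ~ N(µ, σ²) by integrating the p.d.f.
# with a midpoint Riemann sum; µ = 1, σ = 2 are arbitrary example values.
import math

mu, sigma = 1.0, 2.0

def pdf(x):
    return math.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

def Phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

x_hi = 2.5
lo, n = mu - 10 * sigma, 200_000   # mass below µ − 10σ is negligible
h = (x_hi - lo) / n
riemann = sum(pdf(lo + (i + 0.5) * h) for i in range(n)) * h

assert abs(riemann - Phi((x_hi - mu) / sigma)) < 1e-6
```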
Now, the m.g.f. of X ∼ N(µ, σ²) is

$$M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx} e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx$$

$$= \frac{e^{\mu t}}{\sqrt{\pi}} \int_{-\infty}^{\infty} e^{-y^2 + \sqrt{2}\,\sigma t y}\,dy \quad \left(\text{by putting } \frac{x-\mu}{\sqrt{2}\,\sigma} = y\right)$$

$$= \frac{e^{\mu t + \frac{\sigma^2 t^2}{2}}}{\sqrt{\pi}} \int_{-\infty}^{\infty} e^{-\left(y - \frac{\sigma t}{\sqrt{2}}\right)^2}\,dy = e^{\mu t + \frac{\sigma^2 t^2}{2}}, \quad \forall\, t \in \mathbb{R}.$$
Therefore,

$$M_X^{(1)}(t) = (\mu + \sigma^2 t)\, e^{\mu t + \frac{\sigma^2 t^2}{2}}, \quad \forall\, t \in \mathbb{R};$$

$$M_X^{(2)}(t) = \left(\sigma^2 + (\mu + \sigma^2 t)^2\right) e^{\mu t + \frac{\sigma^2 t^2}{2}}, \quad \forall\, t \in \mathbb{R};$$

so that

$$E(X) = M_X^{(1)}(0) = \mu; \quad E(X^2) = M_X^{(2)}(0) = \mu^2 + \sigma^2; \quad \mathrm{Var}(X) = E(X^2) - (E(X))^2 = \sigma^2.$$
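The derivative formulas can be sanity-checked by differentiating M_X numerically at t = 0 with finite differences; µ = 1, σ = 2 are arbitrary example values:

```python
# Numerical check that M_X'(0) = µ and M_X''(0) = µ² + σ² for the m.g.f.
# M_X(t) = exp(µt + σ²t²/2); µ = 1, σ = 2 are arbitrary example values.
import math

mu, sigma = 1.0, 2.0

def M(t):
    return math.exp(mu * t + sigma**2 * t**2 / 2)

h = 1e-4
d1 = (M(h) - M(-h)) / (2 * h)            # central difference ≈ M'(0) = µ
d2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2  # second difference ≈ M''(0) = µ² + σ²

assert abs(d1 - mu) < 1e-6
assert abs(d2 - (mu**2 + sigma**2)) < 1e-4
```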
