
Introduction to Communication System
Unit 5: Noise

Taught by:
Dr. Prateek Verma
Assistant Professor
DYPIU Akurdi
Introduction to Noise
• Noise is an unwanted electrical disturbance that gives rise to audible or visual disturbances in communication systems.

• In any communication system, some unwanted signal gets introduced during transmission or reception of the signal, making it unpleasant for the receiver and degrading the quality of the communication. Such a disturbance is called noise.

• Noise is an unwanted signal that interferes with the original message signal and corrupts the parameters of the message signal.

• This alteration of the communication process leads to the message getting altered. Noise most likely enters at the channel or the receiver.
Noise as a Signal
• Signals that can be described by fixed mathematical equations are called deterministic signals.

• There is another class of signals whose behavior cannot be predicted. Such signals are called random signals; noise in communication systems is an example.

• This means that the noise interference during transmission is totally unpredictable.

• Such signals cannot be modelled deterministically, e.g. thermal noise in the receiver.

• It is, however, possible to analyze these signals using probability theory.
Probability Distribution Functions (PDF)
• A probability distribution is a statistical function that describes all the possible values a random variable can take within a given range, and their likelihoods.

• In statistical terms, a distribution function is a mathematical expression that describes the probability of the different possible outcomes of an experiment.

• For example, say we run an experiment of tossing a fair coin. The possible events are heads and tails. If we use X to denote the outcome, the probability distribution of X takes the value 0.5 for X = heads and 0.5 for X = tails.

• The most common probability distribution is the normal or Gaussian distribution ("bell curve"), although several other distributions are commonly used, such as the chi-square, binomial, and Poisson distributions.
Contd.
• Probability distributions can also be used to create Cumulative Distribution Functions (CDFs), which add up the probabilities of occurrences cumulatively; a CDF always starts at zero and ends at 100%.

• A discrete random variable is described by a Probability Mass Function (PMF), and a continuous random variable by a Probability Density Function (PDF). Both give rise to a Cumulative Distribution Function (CDF).
Probability Mass Function (PMF)
• Let us take the example of a die. It has 6 possible outcomes, and the PMF assigns each outcome the probability 1/6 ≈ 0.167.
Cumulative Distribution Function (CDF)
• Now, can we draw the cumulative distribution function using the PMF? Yes: the CDF at a point is the running sum of the PMF up to that point. For the die,

P(X ≤ 3) = P(X = 1) + P(X = 2) + P(X = 3)
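The PMF-to-CDF relation above can be sketched in a few lines of Python (a minimal illustration for a fair die):

```python
from fractions import Fraction

# PMF of a fair six-sided die: each face has probability 1/6.
pmf = {face: Fraction(1, 6) for face in range(1, 7)}

# CDF: running sum of the PMF, F(x) = P(X <= x).
cdf, running = {}, Fraction(0)
for face in sorted(pmf):
    running += pmf[face]
    cdf[face] = running

# P(X <= 3) = P(X=1) + P(X=2) + P(X=3) = 1/2
print(cdf[3])   # 1/2
print(cdf[6])   # 1 (a CDF always ends at 100%)
```

Note that the CDF of the largest outcome is always 1, matching the statement that a CDF ends at 100%.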
Probability Density Function (PDF)
• Let us take a continuous random variable, for example the height of the boys/girls in a class.

• Now, the important question is how we can obtain the CDF from the PDF, or vice versa.
Contd.
(Figure: CDF of heights; the first case marks the point below which 50% of the data lies.)

• From the CDF, the higher the gradient, the higher the density of points.

• The gradient is very low at the lower and higher ends of the CDF.
Contd.
(Figures: gradient of the CDF, and area to the left under the PDF.)

• The method to find the gradient is shown in the first figure. Here, "gradient" means differentiation and "area to the left" means integration.

• The y-axis in the PDF graph represents the gradient value of the CDF:

F_X(x) = ∫_{−∞}^{x} f_X(t) dt
Probability Density Function Properties
• Let X be a continuous random variable with density function f(x). The probability density function must satisfy the following conditions:

a) For a continuous random variable taking some value between certain limits, say a and b, the probability is calculated by finding the area under the curve of f(x) between the lower limit (a) and the upper limit (b). Thus,

P(a < X < b) = ∫_a^b f(x) dx

b) The probability density function is non-negative for all possible values, i.e. f(x) ≥ 0 for all x.

c) The total area between the density curve and the horizontal x-axis is equal to 1, i.e.

∫_{−∞}^{∞} f(x) dx = 1
Tutorial
1. Let X be a continuous random variable with the PDF given by:

f(x) = x for 0 < x < 1; 2 − x for 1 < x < 2; 0 for x > 2.   Find P(0.5 < X < 1.5).

Normalization check: ∫_0^1 x dx + ∫_1^2 (2 − x) dx + ∫_2^∞ 0 dx = 1

P(0.5 < X < 1.5) = ∫_0.5^1 x dx + ∫_1^1.5 (2 − x) dx        Ans: 3/4

2. Let X be a continuous random variable with the PDF given by:

f(x) = α(2x − x²) for 0 ≤ x ≤ 2; 0 otherwise.   Find (a) α and (b) P(X > 1).

Ans: (a) α = 3/4   (b) P(X > 1) = 1/2
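As a quick sanity check, problem 1 can be verified numerically (a sketch using a simple midpoint-rule integrator; the function names are illustrative):

```python
def f(x):
    # Triangular PDF from tutorial problem 1.
    if 0 < x < 1:
        return x
    if 1 <= x < 2:
        return 2 - x
    return 0.0

def integrate(g, a, b, n=100_000):
    # Midpoint-rule numerical integration of g over [a, b].
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, -1, 3)    # normalization: should be ~1
p = integrate(f, 0.5, 1.5)     # should be ~0.75 (= 3/4)
print(round(total, 4), round(p, 4))
```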
Contd.
3. Let X be a continuous random variable with the PDF given by:

f(x) = k x² for −3 ≤ x ≤ 3; 0 otherwise.   Find (a) k, (b) P(1 ≤ X ≤ 2), (c) P(X ≤ 2), (d) P(X ≥ 1).

Ans: (a) 1/18   (b) 7/54   (c) 35/54   (d) 26/54

4. Let X be a continuous random variable with the PDF given by:

f(x) = x for 0 ≤ x ≤ 1; 2 − x for 1 ≤ x ≤ 2; 0 otherwise.   Find (a) P(X ≥ 1.5) and (b) the CDF.
Contd.
f(x) = x for 0 ≤ x ≤ 1; 2 − x for 1 ≤ x ≤ 2; 0 otherwise.

Ans: (a) P(X ≥ 1.5) = 0.125

Now we will find the CDF of the given PDF.

Case 1: x ≤ 0
F(x) = ∫_{−∞}^{x} f(t) dt = 0

Case 2: 0 ≤ x ≤ 1
F(x) = ∫_{−∞}^{0} f(t) dt + ∫_{0}^{x} t dt = 0 + x²/2

Case 3: 1 ≤ x ≤ 2
F(x) = ∫_{−∞}^{0} f(t) dt + ∫_{0}^{1} t dt + ∫_{1}^{x} (2 − t) dt
     = 0 + [t²/2]_0^1 + [2t − t²/2]_1^x = 1/2 + (2x − x²/2) − (2 − 1/2) = −1 + 2x − x²/2

Hence,
F_X(x) = 0 for x ≤ 0; x²/2 for 0 ≤ x ≤ 1; −1 + 2x − x²/2 for 1 ≤ x ≤ 2; 1 for x ≥ 2
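The piecewise CDF derived above can be checked directly (a small sketch; `cdf` is an illustrative name):

```python
def cdf(x):
    # Piecewise CDF derived above for the triangular PDF.
    if x <= 0:
        return 0.0
    if x <= 1:
        return x**2 / 2
    if x <= 2:
        return -1 + 2*x - x**2 / 2
    return 1.0

print(cdf(1.0))        # 0.5 (the two middle pieces agree at x = 1)
print(1 - cdf(1.5))    # P(X >= 1.5) = 0.125
print(cdf(2.0))        # 1.0 (total probability)
```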
Types of Noise
• Noise can be divided into two categories: external noise and internal noise.

• External noise cannot be reduced except by changing the location of the receiver or of the entire system.

• Internal noise can be reduced to an extent by proper design of the instruments.
External Noise
• External noise is noise whose sources are external to the communication system.

• External noise can be divided into three categories: atmospheric noise, man-made/industrial noise, and extraterrestrial noise.

• The natural phenomena that give rise to atmospheric noise are lightning discharges in thunderstorms and other natural electrical disturbances occurring in the atmosphere.

• These electrical disturbances are random in nature.
Contd.
• Hence, the noise received by the receiving antenna due to atmospheric noise can only be reduced by repositioning the antenna.

• The noise originating from the Sun and outer space is called extraterrestrial noise.

• Extraterrestrial noise can further be sub-divided into two classes: solar noise and cosmic noise.

• Cosmic noise is also called black-body noise, and it is distributed uniformly over the entire sky.
Man-made Sources (Industrial Noise)
• Man-made noise is created due to the wear and tear of electrical machinery used in transmission systems.

• It is generated by the make-and-break process in a current-carrying circuit.

• Typical sources are electric motors, welding machines, fluorescent lights, ignition circuits, and switching elements that cause spark gaps.
Fundamental or Internal Sources of Noise
• Internal noise originates within the electronic equipment itself.

• These are called fundamental sources because they are an integral part of the physical nature of the materials used for making electronic components.

• This type of noise follows certain rules.

• Hence, it can be reduced by properly designing the electronic circuits and equipment.
Different Types of Internal Noise
• The fundamental noise sources produce different types of noise. These can be classified as:

• Shot noise
• Partition noise
• Flicker or low-frequency noise
• Thermal noise
• Transit-time or high-frequency noise
Flicker or Low-Frequency Noise
• Flicker noise appears at frequencies below a few kilohertz, i.e. at low audio frequencies.

• It is inversely proportional to the frequency and hence is also called 1/f noise.

• In semiconductor devices, flicker noise is generated by fluctuations in the carrier density, which result in fluctuations in the conductivity of the material.

• This produces a fluctuating voltage drop when a direct current flows through the device. This fluctuating voltage is called the flicker noise voltage.

• It may be neglected at frequencies above about 500 Hz; therefore, it poses no serious problem.
Thermal Noise or Johnson Noise
• Thermal noise is continuous in nature. It occurs at all frequencies.

• It is predictable, additive, and present in all devices. Hence, thermal noise is the most significant of all noise sources.

• The free electrons within a conductor are always in random motion due to the thermal energy they receive.

• The distribution of these free electrons is not uniform at a given instant of time. Hence, it is highly likely that an excess number of electrons appears at one end or the other of the conductor.

• There is some finite value of average power associated with this, which is called the thermal noise power.
Contd.
• The average noise power is given by:

Pn = kTB watts

where k = Boltzmann's constant = 1.38 × 10⁻²³ J/K
      B = bandwidth of the noise spectrum (Hz)
      T = temperature of the conductor (K)

• We also know the relation between power and voltage, i.e. Pn = V²/R_L = En²/(4R), where En is the rms noise voltage and R is the resistance of the conductor.

• Equating, kTB = En²/(4R), which gives:

En = √(4RkTB)

• Similarly, for the rms noise current:

In = √(4GkTB), where G = conductance = 1/R
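These formulas are easy to wrap in a couple of helper functions (a sketch; the function names are illustrative):

```python
import math

K = 1.38e-23  # Boltzmann's constant, J/K

def thermal_noise_power(T, B):
    # Pn = k T B (watts), T in kelvin, B in hertz
    return K * T * B

def thermal_noise_voltage(R, T, B):
    # En = sqrt(4 R k T B) (volts rms)
    return math.sqrt(4 * R * K * T * B)

# Example: R = 10 kOhm, B = 4 MHz, T = 25 C = 298 K
En = thermal_noise_voltage(10e3, 298, 4e6)
print(f"{En * 1e6:.2f} uV")   # ≈ 25.65 uV
```

This matches the answer to tutorial problem 3 on the next slide.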
Transit-Time or High-Frequency Noise
• Transit time is the duration of time it takes for a current carrier, such as an electron or hole, to move from the input to the output of a device.

• The devices themselves are very tiny, so the distances involved are minimal.

• Yet the time it takes for the current carriers to move even a short distance is finite. At low frequencies this time is negligible.

• But when the frequency of operation is high, so that the period of the signal being processed is of the same order of magnitude as the transit time, problems can occur.

• The transit time shows up as a kind of random noise within the device, and this noise is directly proportional to the frequency of operation.
Partition Noise
• Partition noise occurs whenever current has to divide between two or more paths, and results from the random fluctuations in the division.

• It would therefore be expected that a diode would be less noisy than a transistor (all other factors being equal), since in a transistor the third electrode also draws current (i.e. the base current), providing an extra path for the current to divide between.
Shot Noise
• The most common type of noise is referred to as shot noise. It is produced by the random arrival of electrons or holes at the output element: at the plate in a tube, or at the collector or drain in a transistor.

• Shot noise is also produced by the random movement of electrons or holes across a PN junction.

• The formula for the mean-square shot-noise current can be obtained analytically only for diodes and is given by:

In² = 2(I + 2I₀)qB  A²

where I = direct current across the junction (A)
      I₀ = reverse saturation current (A)
      q = electronic charge = 1.6 × 10⁻¹⁹ C
      B = effective noise bandwidth (Hz)
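Tutorial problem 2 on the next slide can be solved by inverting this formula (a sketch; I₀ is taken as negligible, as the problem implies):

```python
import math

Q = 1.6e-19  # electronic charge, C

def shot_noise_rms(I, I0, B):
    # In = sqrt(2 (I + 2 I0) q B), rms shot-noise current of a diode
    return math.sqrt(2 * (I + 2 * I0) * Q * B)

# Problem data: En = 15 uV across 75 Ohm, B = 200 kHz, I0 ~ 0.
In = 15e-6 / 75                  # required rms noise current = 0.2 uA
I = In**2 / (2 * Q * 200e3)      # invert In^2 = 2 I q B
print(I)                         # ≈ 0.625 A
```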
Tutorial
1. A receiver has a noise power bandwidth of 12 kHz. What is the noise power contributed by a resistor in the receiver bandwidth when T = 30 °C?

Ans: Pn = 5.01768 × 10⁻¹⁷ W

2. A noise generator using a diode is required to produce 15 μV of noise voltage in a receiver which has an input impedance of 75 Ω. The receiver has a noise power bandwidth of 200 kHz. Calculate the current through the diode.

Ans: I = 0.625 A

3. An amplifier has a bandwidth of 4 MHz with a 10 kΩ input resistor. Calculate the rms noise voltage at the input to this amplifier if the room temperature is 25 °C.

Ans: En = 25.65 μV
Contd.
4. Calculate the thermal noise power available from any resistor at room temperature (290 K) for a bandwidth of 2 MHz. Also calculate the corresponding noise voltage, given that R = 100 Ω.

Ans: Pn = 8 × 10⁻¹⁵ W; voltage = 0.894 μV

5. Calculate the rms noise voltage at the input of a receiver RF amplifier using a device that has a 100 Ω equivalent noise resistance and a 200 Ω input resistor. The bandwidth of the amplifier is 1 MHz and the temperature is 25 °C.

Ans: En = 2.2 μV
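These answers can be reproduced with Pn = kTB and En = √(4RkTB) (a sketch; in problem 4 the quoted voltage is interpreted as the voltage delivered to a matched load, √(Pn·R)):

```python
import math

K = 1.38e-23  # Boltzmann's constant, J/K

# Problem 1: B = 12 kHz, T = 30 C = 303 K
p1 = K * 303 * 12e3                              # ≈ 5.02e-17 W

# Problem 4: B = 2 MHz, T = 290 K, R = 100 Ohm
p4 = K * 290 * 2e6                               # ≈ 8e-15 W
v4 = math.sqrt(p4 * 100)                         # ≈ 0.894 uV (matched load)

# Problem 5: the equivalent noise resistance adds to the input resistance
e5 = math.sqrt(4 * K * 298 * 1e6 * (100 + 200))  # ≈ 2.22 uV

print(p1, p4, v4, e5)
```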
Thermal Noise Calculations
• Since resistors act as sources of thermal noise, it is important to observe the effect of connecting two noise sources in series and in parallel.

• Case 1: resistors connected in series.

• The equivalent resistance is R = R1 + R2. The noise voltage generated by the resistor R is then given by:

En² = 4kTBR = 4kTB(R1 + R2) = 4kTBR1 + 4kTBR2

En² = En1² + En2², i.e. En = √(En1² + En2²)
Contd.
• Case 2: resistors connected in parallel.

• As the two conductances G1 and G2 are connected in parallel, the effective conductance is given by:

Gp = G1 + G2

• The noise current generated by the conductance Gp is given by:

In² = 4kTBGp = 4kTB(G1 + G2) = 4kTBG1 + 4kTBG2

In² = In1² + In2², i.e. In = √(In1² + In2²)

• Equivalently, En² = 4kTB·Req, with Req = R1R2/(R1 + R2).
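Both cases can be checked numerically from En = √(4RkTB) (a sketch with illustrative values):

```python
import math

K = 1.38e-23  # Boltzmann's constant, J/K

def en(R, T, B):
    # rms thermal-noise voltage of a resistor: En = sqrt(4 R k T B)
    return math.sqrt(4 * R * K * T * B)

T, B = 290, 1e6
R1, R2 = 100, 200

# Series: mean-square voltages add, so en(R1 + R2) = sqrt(En1^2 + En2^2)
series = math.sqrt(en(R1, T, B)**2 + en(R2, T, B)**2)

# Parallel: equivalent to a single resistor Req = R1*R2/(R1 + R2)
parallel = en(R1 * R2 / (R1 + R2), T, B)

print(series, en(R1 + R2, T, B), parallel)
```

Note that the parallel combination is always quieter than the series one, since Req is smaller than R1 + R2.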
White Gaussian Noise
• White noise is noise whose power spectral density is uniform over the entire frequency range of interest.

• White noise contains all frequency components in equal proportion. This is analogous to white light, which is a superposition of all visible spectral components.

• White noise has a Gaussian distribution, which means that the pdf of white noise has the shape of the Gaussian pdf. Hence, it is also called Gaussian noise.

• The power spectral density of white noise is given by:

Sn(f) = N0/2
Contd.
• The equation suggests that the psd of white noise is independent of frequency.

• As N0 is constant, the psd is uniform over the entire frequency range, including the positive as well as the negative frequencies.

• N0 can be defined as N0 = kTe, where k is Boltzmann's constant and Te is the equivalent noise temperature of the system.

• The best example of white noise is thermal or Johnson noise.
Signal to Noise Ratio (SNR)
• In communication systems, the comparison of the signal power with the noise power at the same point is important.

• It is required to ensure that the noise at that point is not excessively large.

• SNR is defined as the ratio of signal power to noise power at the same point:

S/N = Ps/Pn

where Ps = signal power and Pn = noise power at the same point.

• SNR is normally expressed in dB, and typical values range from about 10 dB to 90 dB.
Contd.
• The higher the value of SNR, the better the system performance in the presence of noise.

• In dB, it is given by:

(S/N)(dB) = 10 log10(Ps/Pn)

• If we want to express it in terms of the signal and noise voltages:

Ps = Vs²/R and Pn = Vn²/R, where Vs = signal voltage and Vn = noise voltage.

Hence,

S/N = (Vs²/R)/(Vn²/R) = (Vs/Vn)²

• The signal to noise ratio in dB is therefore given by:

(S/N)(dB) = 10 log10[(Vs/Vn)²] = 20 log10(Vs/Vn)
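The power and voltage forms give the same result, as a quick sketch shows (illustrative values):

```python
import math

def snr_db_from_power(ps, pn):
    # SNR(dB) = 10 log10(Ps / Pn)
    return 10 * math.log10(ps / pn)

def snr_db_from_voltage(vs, vn):
    # The common R cancels, so SNR(dB) = 20 log10(Vs / Vn)
    return 20 * math.log10(vs / vn)

R = 50.0
vs, vn = 2.0, 0.002                              # 1000:1 voltage ratio
print(snr_db_from_power(vs**2 / R, vn**2 / R))   # ≈ 60 dB
print(snr_db_from_voltage(vs, vn))               # ≈ 60 dB
```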
Contd.
• Efforts are made to keep the SNR as high as possible, under all operating conditions.

• But sometimes it is difficult to measure the SNR directly; therefore, in practice, another ratio, (S+N)/N, is measured instead.

• There is one more variation of SNR, called SINAD (signal, noise and distortion), given by:

SINAD = (S + N + D)/(N + D)

where S = signal power, N = noise power, D = distortion power.

• It is generally used in the specifications of FM receivers.


Noise Factor
• The noise factor (F) of an amplifier or any network is defined in terms of the SNR at the
input and output of the system.
𝑆
• It is defined as 𝑟𝑎𝑡𝑖𝑜 𝑎𝑡 𝑡ℎ𝑒 𝑖𝑛𝑝𝑢𝑡 𝑃𝑠𝑖 𝑃𝑛𝑜
𝐹= 𝑁 = ∗
𝑆
𝑟𝑎𝑡𝑖𝑜 𝑎𝑡 𝑡ℎ𝑒 𝑜𝑢𝑡𝑝𝑢𝑡 𝑃𝑛𝑖 𝑃𝑠𝑜
𝑁

Where 𝑃𝑠𝑖 and 𝑃𝑛𝑖 = signal and noise power at the input
𝑃𝑠𝑜 and 𝑃𝑛𝑜 = signal and noise power at the output

• The temperature to calculate the noise power is assumed to be the room temperature.

• The S/N at the input will always be greater than the S/N at the output; hence its value will
always be greater than one (ideal value being 1).
36
Noise Figure
• Sometimes the noise factor is expressed in decibels.

• When the noise factor is expressed in decibels, it is known as the noise figure:

FdB = 10 log10 F

Noise figure = 10 log10[(S/N)i / (S/N)o] = 10 log10(S/N)i − 10 log10(S/N)o

Hence, FdB = (S/N)i in dB − (S/N)o in dB.

• The ideal value of the noise figure is 0 dB.
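The relations above can be sketched as follows (illustrative numbers; a network that degrades the SNR by a factor of 4):

```python
import math

def noise_factor(snr_in, snr_out):
    # F = (S/N at input) / (S/N at output); always >= 1 for a real network
    return snr_in / snr_out

def noise_figure_db(F):
    # The noise figure is the noise factor expressed in dB
    return 10 * math.log10(F)

F = noise_factor(100.0, 25.0)
print(F)                     # 4.0
print(noise_figure_db(F))    # ≈ 6.02 dB
print(noise_figure_db(1.0))  # 0.0 dB (ideal, noiseless network)
```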
Random Process or Stochastic Process
• Let us consider a random variable X which represents the temperature of a city at 12 pm. Here, temperature is a random variable and can take a different value every day.

• To get a clearer picture, we record the temperature over many days. The sample space of this random variable is shown here.

• Each sample point corresponds to the temperature at 12 pm on a particular day, e.g. S1 corresponds to day 1, S2 corresponds to day 2, and so on.

• But the temperature is not a function of the day alone; it is a function of time also.
Contd.
• Hence, the temperature at 1 am will have a different distribution of values from the distribution of the temperature at 12 pm.

• We have to define another random variable to record the temperature at 1 am, as shown in the figure here.

• This sample space has the sample points S1′, S2′, etc., which are completely different from S1, S2, …

• Therefore, we can say that each random variable has its own sample space and sample points.
Sample Functions
• The next step is to record the temperature at each value of time, i.e. 12 pm, 1 pm, 2 pm, etc., every day, and plot the waveforms x(t, λi) as shown in the figure.

• Here, λi represents the day, i.e. day 1, day 2, etc.

• Each waveform in the figure is called a sample function.
Ensemble
• Ensemble means family or collection. Hence, the collection of all the possible sample functions is called an ensemble.

• The sample space is the collection of all possible sample points, and the ensemble is the collection of all possible sample functions.

• The ensemble comprising functions of time is called a random or stochastic process.

• A random process is denoted by X(t); if we denote a random variable by X(s), then the random process can be written as X(t, s).
Ensemble Average or Ensemble Mean
• The ensemble mean or ensemble average is taken over the ensemble of waveforms at a fixed instant of time; e.g. the ensemble mean taken at t = t1 will consist of all the values taken at t = t1 over all days.

• Mathematically, the ensemble mean can be written as:

mx(t) = ∫_{−∞}^{∞} x f_X(x, t) dx

• Similarly, the ensemble mean can be obtained at other instants of time also.

• The value of the ensemble mean will, in general, be different at different instants of time.
Autocorrelation Function of a Random Process
• Autocorrelation represents the degree of similarity between a given time series and a lagged version of itself over successive time intervals.

• Autocorrelation measures the relationship between a variable's current value and its past values.

• An autocorrelation of +1 represents a perfect positive correlation, while an autocorrelation of −1 represents a perfect negative correlation.

• It is conceptually similar to the correlation between two different time series, but autocorrelation uses the same time series twice: once in its original form and once lagged by one or more time periods.
Contd.
• The autocorrelation function of a random process X(t) is defined as:

R_X(t1, t2) = E[X(t1) X(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 f_{X1X2}(x1, x2) dx1 dx2

• The autocorrelation function indicates the similarity between the amplitudes X(t1) and X(t2) of the random process X(t) at the time instants t1 and t2.

• The value of R_X(t1, t2) is obtained by taking the product of the values of the sample functions at the instants t1 and t2 and then taking the ensemble mean of the product.

• f_{X1X2}(x1, x2) is the second-order PDF of the random process X(t).
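The "product, then ensemble mean" recipe can be sketched with a toy process (the process itself, a random amplitude times a fixed ramp, is a hypothetical choice for illustration):

```python
import random

random.seed(0)

def sample_function(n=10):
    # Toy process X(t) = a * t with random amplitude a ~ N(0, 1).
    a = random.gauss(0, 1)
    return [a * t for t in range(n)]

ensemble = [sample_function() for _ in range(100_000)]
t1, t2 = 2, 5

# Ensemble-average estimate of R_X(t1, t2) = E[X(t1) X(t2)].
R = sum(x[t1] * x[t2] for x in ensemble) / len(ensemble)
print(round(R, 1))   # ≈ E[a^2] * t1 * t2 = 10 for this toy process
```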
Time Average
• The ensemble averages are obtained at constant values of t, whereas the time averages are obtained by varying the time t.

• The time average of a random process is defined as the statistical average obtained by considering time t as the variable.

• The time average of a sample function can be defined as:

⟨mx⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X(t) dt

• The time average of sample function 1 need not be identical to that of sample function 2 or 3.
Classification of Random Processes
• Random processes are broadly classified into two categories: stationary random processes and non-stationary random processes.

• Stationary random processes are further divided into wide-sense (or weakly) stationary processes and ergodic processes.
Contd.
• A random process whose statistical characteristics (e.g. mean) do not change with time is known as a stationary random process, e.g. the noise at the point in a channel where it is introduced into the signal.

• Hence, a shift of the time origin has no effect on a stationary random process.

• A non-stationary process, by contrast, has statistics that change with time, e.g. the temperature measurements of a city.

• For a stationary random process, the pdf at time t1 equals the pdf at time t2:

f_X(x, t1) = f_X(x, t2) = f_X(x)

• However, a truly stationary random process does not exist in practical life.
Wide-Sense (or Weakly) Stationary Processes
• A process is wide-sense stationary if its mean and autocorrelation are independent of a shift of the time origin.

• That is, a random process is wide-sense stationary if its ensemble mean is constant, mx(t) = constant, and its autocorrelation depends only on the time difference, R_X(t1, t2) = R_X(t2 − t1).

• All stationary processes are wide-sense stationary, but the converse need not be true.

• However, a truly stationary process cannot occur in real life.

• For an ergodic process, the ensemble averages are equal to the corresponding time averages of any sample function.
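A classic ergodic example is a random-phase sinusoid, X(t) = cos(ωt + φ) with φ uniform on [0, 2π): its time average and its ensemble average both equal zero. A quick Monte Carlo sketch (illustrative parameters):

```python
import math
import random

random.seed(1)

w = 2 * math.pi * 5      # 5 Hz sinusoid
N = 100_000

# Time average over one long sample function (one fixed phase)
phi = random.uniform(0, 2 * math.pi)
dt = 1e-3
time_avg = sum(math.cos(w * k * dt + phi) for k in range(N)) / N

# Ensemble average at a fixed time, over many sample functions (many phases)
t = 0.3
ens_avg = sum(math.cos(w * t + random.uniform(0, 2 * math.pi))
              for _ in range(N)) / N

print(round(time_avg, 3), round(ens_avg, 3))   # both ≈ 0
```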