Unit 5: Noise
Communication System
Taught by:
Dr. Prateek Verma
Assistant Professor
DYPIU Akurdi
Introduction to Noise
• Noise is an unwanted electrical disturbance that gives rise to audible or visual disturbances in communication systems.
• In any communication system, some unwanted signal gets introduced during transmission or reception, making the received signal unpleasant for the receiver and degrading the quality of the communication. Such a disturbance is called noise.
• Noise is an unwanted signal that interferes with the original message signal and corrupts the parameters of the message signal.
• This alteration of the communication process leads to the message itself being altered. Noise most commonly enters at the channel or the receiver.
Noise as a Signal
• The signals which can be described by some fixed mathematical equations are called
deterministic signals.
• There is one more class of signals, the behavior of which cannot be predicted. Such type
of signals are called random signals such as noise in communication systems.
• This means that the noise interference during transmission is totally unpredictable.
• Such signals cannot be modelled mathematically e.g. thermal noise in the receiver.
• For example, Let us say we are running an experiment of tossing a fair coin. The
possible events are Heads, Tails. And for instance, if we use X to denote the events, the
probability distribution of X would take the value 0.5 for X=heads, and 0.5 for X=tails.
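As a quick illustration of the coin-toss example, the short Python sketch below (added here for illustration; it is not part of the original slides) simulates a fair coin and shows the empirical frequencies approaching the PMF values of 0.5.

```python
import random

# Simulate N tosses of a fair coin and compare the empirical
# frequencies with the theoretical PMF P(heads) = P(tails) = 0.5.
N = 100_000
tosses = [random.choice(["heads", "tails"]) for _ in range(N)]

for outcome in ("heads", "tails"):
    freq = tosses.count(outcome) / N
    print(f"P(X = {outcome}) ~ {freq:.3f}  (theoretical 0.5)")
```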
Description of random variables (figure):
• Discrete random variable → Probability Mass Function (PMF), Cumulative Distribution Function (CDF)
• Continuous random variable → Probability Density Function (PDF), CDF
Probability Mass Function (PMF)
• Consider the example of a fair die. It has 6 possible outcomes, and the PMF assigns each outcome the same probability:
$$P(X = k) = \frac{1}{6} \approx 0.167, \qquad k = 1, 2, \dots, 6$$
Cumulative Distribution Function (CDF)
• Now, can we draw the cumulative distribution function using the PMF? (Figure: the PMF of the die and the corresponding CDF.) The CDF is the running sum of the PMF, for example:
$$P(X \le 3) = P(X = 1) + P(X = 2) + P(X = 3)$$
The sketch below builds the CDF of the die as exactly such a running sum.
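A minimal sketch (an added illustration, not from the slides) of how the CDF of the fair die is obtained as a running sum of its PMF:

```python
from fractions import Fraction

# PMF of a fair die: each outcome 1..6 has probability 1/6.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

# CDF is the running sum of the PMF: F(k) = P(X <= k).
cdf = {}
running = Fraction(0)
for k in sorted(pmf):
    running += pmf[k]
    cdf[k] = running

print(cdf[3])          # 1/2, i.e. P(X<=3) = P(X=1) + P(X=2) + P(X=3)
print(float(cdf[6]))   # 1.0, the CDF reaches 1 at the last outcome
```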
Probability Density Function (PDF)
• Let us take a continuous random variable, for example the height of boys/girls in a class.
• Now, the important question is how we can obtain the CDF from the PDF, or vice versa.
Contd.
(Figure: 1st case – 50% of the data lies to the left of the chosen point.)
Contd.
(Figure: gradient of the CDF and area to the left under the PDF.)
• The method to find the gradient is shown in the first figure. Here, gradient means differentiation, and area to the left means integration.
• The y-axis of the PDF graph represents the gradient (slope) of the CDF (see the sketch below):
$$F_X(x) = \int_{-\infty}^{x} f_X(u)\,du, \qquad f_X(x) = \frac{d}{dx} F_X(x)$$
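The gradient/area relationship can be checked numerically. The sketch below is an added illustration that assumes a standard Gaussian PDF purely as an example: it integrates the PDF to obtain the CDF and differentiates the CDF to recover the PDF.

```python
import numpy as np

# Standard Gaussian PDF on a fine grid (an assumed example distribution).
x = np.linspace(-5, 5, 2001)
pdf = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# "Area to the left" (integration): cumulative sum of the PDF gives the CDF.
dx = x[1] - x[0]
cdf = np.cumsum(pdf) * dx

# "Gradient" (differentiation): numerical derivative of the CDF recovers the PDF.
pdf_recovered = np.gradient(cdf, dx)

print(round(cdf[len(x) // 2], 3))                     # CDF at x = 0 is ~0.5
print(round(np.max(np.abs(pdf - pdf_recovered)), 4))  # small numerical error
```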
Probability Density Function Properties
• Let X be a continuous random variable with density function f(x). The probability density function must satisfy the following conditions:
a) For a continuous random variable that takes values between certain limits, say a and b, the probability is calculated by finding the area under the PDF curve between the lower limit a and the upper limit b. Thus,
$$P(a < X < b) = \int_{a}^{b} f(x)\,dx$$
b) The probability density function is non-negative for all the possible values, i.e. f(x)≥ 0,
for all x.
c) The area between the density curve and the horizontal X-axis is equal to 1, i.e.
$$\int_{-\infty}^{\infty} f(x)\,dx = 1$$
Tutorial
1. Let X be a continuous random variable with the PDF given by:
$$f(x) = \begin{cases} x; & 0 < x < 1 \\ 2 - x; & 1 < x < 2 \\ 0; & \text{otherwise} \end{cases}$$
Find P(0.5 < X < 1.5).
Check first that f(x) is a valid PDF:
$$\int_{0}^{1} x\,dx + \int_{1}^{2} (2 - x)\,dx + \int_{2}^{\infty} 0\,dx = 1$$
$$P(0.5 < X < 1.5) = \int_{0.5}^{1} x\,dx + \int_{1}^{1.5} (2 - x)\,dx = \frac{3}{4} \quad \text{(Ans)}$$
Derivation of the CDF for the PDF in Question 1:
Case 1: x < 0: F(x) = 0.
Case 2: 0 ≤ x ≤ 1:
$$F(x) = \int_{-\infty}^{0} f(x)\,dx + \int_{0}^{x} f(x)\,dx = 0 + \int_{0}^{x} x\,dx = \frac{x^2}{2}$$
Case 3: 1 ≤ x ≤ 2:
$$F(x) = \int_{-\infty}^{0} f(x)\,dx + \int_{0}^{1} f(x)\,dx + \int_{1}^{x} f(x)\,dx = 0 + \int_{0}^{1} x\,dx + \int_{1}^{x} (2 - x)\,dx$$
$$= \left[\frac{x^2}{2}\right]_0^1 + \left[2x - \frac{x^2}{2}\right]_1^x = \frac{1}{2} + \left(2x - \frac{x^2}{2}\right) - \left(2 - \frac{1}{2}\right) = -1 + 2x - \frac{x^2}{2}$$
Case 4: x > 2: F(x) = 1.
Hence,
$$F_X(x) = \begin{cases} 0; & x < 0 \\ \dfrac{x^2}{2}; & 0 \le x \le 1 \\ -1 + 2x - \dfrac{x^2}{2}; & 1 \le x \le 2 \\ 1; & x > 2 \end{cases}$$
A symbolic cross-check of these results is sketched below.
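The sketch below is an added illustration using SymPy (not part of the slides); it verifies that f(x) integrates to 1, that P(0.5 < X < 1.5) = 3/4, and evaluates the derived CDF at x = 1.5.

```python
import sympy as sp

x, t = sp.symbols('x t')

# Piecewise PDF from the tutorial: f(x) = x on (0,1), 2 - x on (1,2), 0 otherwise.
f = sp.Piecewise((x, (x > 0) & (x < 1)),
                 (2 - x, (x >= 1) & (x < 2)),
                 (0, True))

# (a) Valid PDF: total area is 1.
print(sp.integrate(f, (x, -sp.oo, sp.oo)))                          # 1

# (b) P(0.5 < X < 1.5) = 3/4.
print(sp.integrate(f, (x, sp.Rational(1, 2), sp.Rational(3, 2))))   # 3/4

# (c) CDF at x = 1.5, which should equal -1 + 2(1.5) - 1.5**2/2 = 7/8.
F_15 = sp.integrate(f.subs(x, t), (t, -sp.oo, sp.Rational(3, 2)))
print(F_15)                                                         # 7/8
```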
Types of Noise
• Noise can be divided into two categories: external noise and internal noise.
• External noise cannot be reduced except by changing the location of the receiver or the
entire system.
• Internal noise can be reduced up to an extent by proper design of the instruments.
External Noise
• External noise is noise whose sources are external to the communication system. Its main types are: atmospheric noise, man-made/industrial noise, and extraterrestrial noise.
• Atmospheric noise arises from natural phenomena such as lightning discharges in thunderstorms and other natural electrical disturbances occurring in the atmosphere.
• The noise originating from the Sun and outer space is called extraterrestrial noise.
• This noise is also called black-body noise, and it is distributed uniformly over the entire sky.
Man-made source (Industrial Noise)
• Man-made noise is created by the wear and tear of electrical machinery used in transmission systems.
• It is generated by the make-and-break process in current-carrying circuits.
• Typical sources are electrical motors, welding machines, fluorescent lights, ignition circuits, and switching elements that cause spark gaps.
Fundamental or Internal Sources of Noise
• Internal noise originates within the electronic equipment itself.
• These sources are called fundamental sources because they are an integral part of the physical nature of the materials used for making electronic components.
• Hence, internal noise can be reduced by properly designing the electronic circuits and equipment.
Different Types of Internal Noise
• The fundamental noise sources produce different types of noise. These can be classified as: shot noise, partition noise, flicker or low-frequency noise, thermal noise, and transit-time or high-frequency noise.
Flicker or Low Freq. Noise
• Flicker noise appears at frequencies below a few kilohertz, i.e. at low audio frequencies.
• It is inversely proportional to frequency and is hence also called 1/f noise.
• In semiconductor devices, flicker noise is generated by fluctuations in the carrier density, which result in fluctuations in the conductivity of the material.
• This produces a fluctuating voltage drop when a direct current flows through the device. This fluctuating voltage is called the flicker noise voltage.
• It may be neglected at frequencies above about 500 Hz and therefore poses no serious problem.
Thermal noise or Johnson Noise
• Thermal noise is continuous in nature and occurs at all frequencies.
• It is statistically predictable, additive, and present in all devices. Hence, thermal noise is the most significant of all noise sources.
• The free electrons within a conductor are always in random motion due to the thermal energy they receive.
• The distribution of these free electrons is not uniform at any given instant of time. Hence, an excess number of electrons may appear at one end of the conductor or the other.
• This gives rise to some finite value of average power, which is called the thermal noise power.
Contd.
• The average noise power is given by:
$$P_n = kTB \ \text{watts}$$
where k = Boltzmann's constant = $1.38 \times 10^{-23}$ Joules/Kelvin,
B = bandwidth of the noise spectrum (Hz),
T = temperature of the conductor (K).
• We also know the relation between power and voltage, i.e.
$$P_n = \frac{V^2}{R_L} = \frac{E_n^2}{4R}$$
where $E_n$ is the rms noise voltage and R is the resistance of the conductor. Equating the two expressions gives $E_n = \sqrt{4kTBR}$, which is used in the calculations below.
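A small numerical sketch of these formulas (added for illustration; the bandwidth, resistance, and temperature values are taken from Tutorial questions 3 and 4 below):

```python
import math

k = 1.38e-23        # Boltzmann's constant, J/K

def thermal_noise_power(T_kelvin, B_hz):
    """Available thermal noise power Pn = k*T*B (watts)."""
    return k * T_kelvin * B_hz

def thermal_noise_voltage(T_kelvin, B_hz, R_ohm):
    """RMS thermal noise voltage En = sqrt(4*k*T*B*R) (volts)."""
    return math.sqrt(4 * k * T_kelvin * B_hz * R_ohm)

# Tutorial Q4: T = 290 K, B = 2 MHz  ->  Pn ~ 8e-15 W.
print(thermal_noise_power(290, 2e6), "W")

# Tutorial Q3: B = 4 MHz, R = 10 kOhm, room temperature 25 degC ~ 298 K  ->  ~25.65 uV.
print(thermal_noise_voltage(25 + 273, 4e6, 10e3) * 1e6, "uV")
```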
Transit-Time or High-Frequency Noise
• The devices themselves are very tiny, so the distances involved are minimal.
• Yet the time it takes for the current carriers to move even a short distance is finite. At low frequencies this transit time is negligible.
• But when the frequency of operation is high and the period of the signal being processed is of the same order of magnitude as the transit time, problems can occur.
• The transit time shows up as a kind of random noise within the device, and this noise is directly proportional to the frequency of operation.
Partition Noise
• Partition noise occurs whenever a current has to divide between two or more paths, and results from the random fluctuations in that division.
• It would therefore be expected that a diode is less noisy than a transistor (all other factors being equal), since in a transistor the third electrode also draws current (i.e. the base current).
Shot noise
• The most common type of noise is shot noise, which is produced by the random arrival of electrons or holes at the output element: at the plate in a vacuum tube, or at the collector or drain in a transistor.
• Shot noise is also produced by the random movement of electrons or holes across a PN junction.
• The formula for the mean square shot noise current can be obtained only for diodes and is given by:
$$\overline{i_n^2} = 2q\,(I + 2I_0)\,B$$
where I = direct current across the junction (A), q = electronic charge = $1.6 \times 10^{-19}$ C, $I_0$ = reverse saturation current (A), and B = effective noise bandwidth (Hz). A numerical sketch follows.
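A quick numerical sketch of the diode shot-noise expression above (added for illustration; the 1 mA current and 1 MHz bandwidth are hypothetical values):

```python
import math

q = 1.6e-19   # electronic charge, C

def shot_noise_rms(I_dc, I_0, B_hz):
    """RMS shot-noise current for a diode: sqrt(2*q*(I + 2*I_0)*B)."""
    return math.sqrt(2 * q * (I_dc + 2 * I_0) * B_hz)

# Hypothetical values: 1 mA forward current, negligible reverse
# saturation current, 1 MHz noise bandwidth.
print(shot_noise_rms(1e-3, 0.0, 1e6) * 1e9, "nA rms")   # ~17.9 nA
```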
Tutorial
1. A receiver has a noise power bandwidth of 12 kHz. What is the noise power
contributed by this resistor in the receiver bandwidth when 𝑇 = 30°𝐶?
Ans - 𝐼 = 0.625 A
3. An amplifier has a bandwidth of 4 MHz with 10 KOhm as the input resistor. Calculate
the rms noise voltage at the input to this amplifier if the room temperature is 25°𝐶.
Ans - En = 25.65𝜇𝑉
Contd.
4. Calculate the thermal noise power available from any resistor at room temperature
(290°K) for a bandwidth of 2 MHz. Also calculate the corresponding noise voltage
given that R = 100 Ohm.
Ans - 𝑃𝑛 = 8 ∗ 10−15 W; Voltage = 0.894 𝜇𝑉
5. Calculate the rms noise voltage at the input of a receiver RF amplifier using a device
that has a 100 Ohm equivalent noise resistance and a 200 Ohm input resistor. The
bandwidth of the amplifier is 1 MHz, the temperature is 25°𝐶.
Ans - En = 2.2𝜇𝑉
Thermal Noise Calculations
• Since resistors act as sources of thermal noise, it is important to observe the effect of connecting two such noise sources in series and in parallel.
• Case 1: When resistors are connected in series, the mean square noise voltages add:
$$E_n^2 = E_{n1}^2 + E_{n2}^2 = 4kTB(R_1 + R_2) \quad\Rightarrow\quad E_n = \sqrt{E_{n1}^2 + E_{n2}^2}$$
Contd.
• Case 2: When resistors are connected in parallel, the mean square noise currents add, and the equivalent resistance is used for the noise voltage:
$$E_n^2 = 4kTBR_{eq}, \qquad I_n^2 = I_{n1}^2 + I_{n2}^2 \quad\Rightarrow\quad I_n = \sqrt{I_{n1}^2 + I_{n2}^2}$$
where $R_{eq} = \dfrac{R_1 R_2}{R_1 + R_2}$. A short numerical sketch of both cases is given below.
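An added numerical sketch of the two cases (the series case reuses the values of Tutorial question 5 above: 100 Ω and 200 Ω, 1 MHz bandwidth, 25 °C; the parallel case uses the same resistors for illustration):

```python
import math

k = 1.38e-23   # Boltzmann's constant, J/K

def en(R, T, B):
    """RMS thermal noise voltage of a single resistance: sqrt(4*k*T*B*R)."""
    return math.sqrt(4 * k * T * B * R)

# Values from Tutorial question 5: 100-ohm and 200-ohm resistances, 1 MHz, 25 degC.
T, B = 25 + 273, 1e6
R1, R2 = 100.0, 200.0

# Case 1: series -- the mean square noise voltages add (same as one resistor R1 + R2).
en_series = math.sqrt(en(R1, T, B) ** 2 + en(R2, T, B) ** 2)
print(round(en_series * 1e6, 2), "uV")    # ~2.22 uV, matching the tutorial answer

# Case 2: parallel -- use the equivalent resistance Req = R1*R2/(R1 + R2).
Req = R1 * R2 / (R1 + R2)
print(round(en(Req, T, B) * 1e6, 2), "uV")
```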
White Gaussian Noise
• White noise is noise whose power spectral density is uniform over the entire frequency range of interest.
• White noise contains all frequency components in equal proportion. This is analogous to white light, which is a superposition of all visible spectral components.
• The noise amplitude has a Gaussian distribution, i.e. the PDF of white noise has the shape of a Gaussian PDF. Hence, it is called Gaussian noise.
Contd.
• The power spectral density of white noise is written as $S_n(f) = \dfrac{N_0}{2}$, which shows that the psd of white noise is independent of frequency.
• As $N_0$ is a constant, the psd is uniform over the entire frequency range, including the positive as well as the negative frequencies.
Signal to Noise Ratio (SNR)
• In communication systems, it is important to compare the signal power with the noise power at the same point.
• This comparison ensures that the noise at that point is not excessively large.
• The signal-to-noise ratio (SNR) is defined as the ratio of signal power to noise power at the same point.
• SNR is normally expressed in dB, i.e. $\text{SNR}_{dB} = 10\log_{10}(S/N)$, and typical values range from about 10 dB to 90 dB.
Contd.
• The higher the value of SNR, the better the system performance in the presence of noise.
• There is one more variation of SNR, called SINAD (signal-to-noise-and-distortion ratio), given by
$$\text{SINAD} = \frac{S + N + D}{N + D}$$
• The noise factor F compares the SNR at the input of a device with the SNR at its output:
$$F = \frac{(S/N)_i}{(S/N)_o} = \frac{P_{si}/P_{ni}}{P_{so}/P_{no}}$$
where $P_{si}$ and $P_{ni}$ = signal and noise power at the input, and $P_{so}$ and $P_{no}$ = signal and noise power at the output.
• The temperature used to calculate the noise power is assumed to be room temperature.
• The S/N at the input will always be greater than the S/N at the output; hence the noise factor will always be greater than one (the ideal value being 1).
Noise Figure
• Sometimes the noise factor is expressed in decibels; this is called the noise figure:
$$F_{dB} = 10 \log_{10} F$$
$$\text{Noise Figure} = 10 \log_{10}\!\left[\frac{(S/N)\ \text{ratio at the input}}{(S/N)\ \text{ratio at the output}}\right] = 10 \log_{10}\!\left(\frac{S}{N}\right)_i - 10 \log_{10}\!\left(\frac{S}{N}\right)_o$$
Hence,
$$F_{dB} = \left(\frac{S}{N}\right)_{i(dB)} - \left(\frac{S}{N}\right)_{o(dB)}$$
A numerical example of this relation is sketched below.
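A minimal sketch of the decibel relation above (added for illustration; the input and output SNR values are hypothetical):

```python
import math

def db(ratio):
    """Convert a power ratio to decibels."""
    return 10 * math.log10(ratio)

# Hypothetical example: SNR of 1000 (30 dB) at the input, 250 (~24 dB) at the output.
snr_in, snr_out = 1000.0, 250.0

F = snr_in / snr_out                 # noise factor (a ratio greater than 1)
F_dB = db(F)                         # noise figure in dB
print(F_dB)                          # ~6.02 dB
print(db(snr_in) - db(snr_out))      # same value: (S/N)i[dB] - (S/N)o[dB]
```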
Random Processes
• To get a clearer picture, we will record the temperature over many days. The sample space of this random variable is shown in the figure.
• This sample space has the sample points $S_1'$, $S_2'$, ..., which are completely different from $S_1$, $S_2$, ...
Sample Functions
• The next step is to record the temperature at each value of time, i.e. 12 pm, 1 pm, 2 pm, ... on every day, and plot the waveforms $x(t, \lambda_i)$ as shown in the figure.
Ensemble
• Ensemble means family or collection. Hence, the collection of all possible sample functions is called an ensemble.
• The sample space is the collection of all possible sample points, and the ensemble is the collection of all possible sample functions.
• A random process is denoted by $X(t)$; if we denote a random variable by $X(s)$, then the random process can be written as $X(t, s)$.
Ensemble Average or Ensemble Mean
• The ensemble mean or ensemble average is taken over the ensemble of waveforms at a fixed instant of time, e.g. the ensemble mean at $t = t_1$ is the average of the values taken at $t = t_1$ over all days.
• Mathematically, the ensemble mean can be written as
$$m_X(t) = \int_{-\infty}^{\infty} x\, f_X(x, t)\,dx$$
Autocorrelation
• Autocorrelation measures the relationship between a variable's current value and its past values.
• It is conceptually similar to the correlation between two different time series, but autocorrelation uses the same time series twice: once in its original form and once lagged by one or more time periods.
Contd.
• The autocorrelation function of a random process X(t) is defined as:
$$R_X(t_1, t_2) = E[X(t_1)\,X(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\, f_{X_1 X_2}(x_1, x_2)\,dx_1\,dx_2$$
• The autocorrelation function indicates the similarity between the amplitudes $X(t_1)$ and $X(t_2)$ of the random process $X(t)$ at the time instants $t_1$ and $t_2$.
• The value of $R_X(t_1, t_2)$ is obtained by taking the product of the values of the sample functions at instants $t_1$ and $t_2$ and then taking the mean of that product over the ensemble (see the sketch below).
• $f_{X_1 X_2}(x_1, x_2)$ is the second-order PDF of the random process $X(t)$.
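To make the ensemble-average definitions concrete, the sketch below (an added illustration; the random-phase sinusoid is a hypothetical example process) estimates $R_X(t_1, t_2) = E[X(t_1)X(t_2)]$ by averaging the product over many sample functions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of sample functions: x(t, phi) = cos(2*pi*f0*t + phi), phi uniform.
# (A hypothetical process chosen only to illustrate ensemble averaging.)
f0 = 1.0
t = np.linspace(0, 2, 201)
phases = rng.uniform(0, 2 * np.pi, size=5000)
ensemble = np.cos(2 * np.pi * f0 * t[None, :] + phases[:, None])  # shape (5000, 201)

# Ensemble mean at each instant t (close to 0 for this process).
m_x = ensemble.mean(axis=0)

# Autocorrelation R_X(t1, t2): average of X(t1)*X(t2) over the ensemble.
i1, i2 = 20, 70            # indices of t1 and t2
R = np.mean(ensemble[:, i1] * ensemble[:, i2])

tau = t[i2] - t[i1]
print(round(R, 3))                                    # ensemble estimate of R_X(t1, t2)
print(round(0.5 * np.cos(2 * np.pi * f0 * tau), 3))   # theory: 0.5*cos(2*pi*f0*(t2 - t1))
```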
Time Average
• The ensemble averages are obtained at constant values of $t$, whereas the time averages are obtained by averaging a single sample function over time $t$.
• Random processes can further be classified as: stationary processes, wide-sense (or weakly) stationary processes, and ergodic processes.
Contd.
• A random process whose statistical characteristics (e.g. mean) do not change with time is known as a stationary random process, e.g. the noise at the point in a channel where it is introduced into the signal.
• Hence, a shift of the time origin has no effect on a stationary random process.
• If the ensemble mean of a random process is constant and its autocorrelation depends only on the time difference, it is called a wide-sense stationary process, i.e. $m_X(t) = \text{constant}$ and $R_X(t_1, t_2) = R_X(t_2 - t_1)$.
• All stationary processes are wide-sense stationary, but the converse may not be true.
• For an ergodic process, the ensemble averages are equal to the corresponding time averages of any sample function (a numerical illustration is sketched below).
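As a closing illustration of ergodicity (an added sketch, not from the slides; zero-mean white Gaussian noise is used as the example process), the time average of one long sample function is compared with the ensemble average over many realizations; for an ergodic process the two agree.

```python
import numpy as np

rng = np.random.default_rng(1)

# Zero-mean white Gaussian noise: a standard model for thermal noise.
sigma = 1.0

# Ensemble average: mean over many realizations at one fixed time instant.
ensemble = rng.normal(0.0, sigma, size=(10000, 100))   # 10000 sample functions
ensemble_mean_at_t0 = ensemble[:, 0].mean()

# Time average: mean over time of a single long sample function.
one_sample_function = rng.normal(0.0, sigma, size=1_000_000)
time_mean = one_sample_function.mean()

# For this (ergodic) process both estimates are close to the true mean, 0.
print(round(ensemble_mean_at_t0, 3))
print(round(time_mean, 3))
```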