
Chapter-6

PROBABILITY AND RANDOM PROCESSES

 Random Experiment: An experiment is a random experiment if its outcome cannot be predicted precisely. One out of a
number of outcomes is possible in a random experiment. A single performance of the random experiment is called a trial.
 Sample Space: The sample space S is the collection of all possible outcomes of a random experiment. The elements of S
are called sample points.

 A sample space may be finite, countably infinite or uncountable.


 A finite or countably infinite sample space is called a discrete sample space.
 An uncountable sample space is called a continuous sample space.

 Event: An event A is a subset of the sample space S such that a probability can be assigned to it. Thus A ⊆ S.

 For a discrete sample space, all subsets are events.
 S is the certain event (sure to occur) and ∅ is the impossible event.
Classical definition of probability (Laplace 1812):

Consider a random experiment with a finite number N of outcomes. If all the outcomes of the experiment are equally likely, the probability of an event A is defined by

P(A) = N_A / N,

where N_A is the number of outcomes favourable to A.

Discussion

 The classical definition is limited to a random experiment which has only a finite number of outcomes. In many experiments, the sample space is finite and each outcome may be assumed 'equally likely'; in such cases, the counting method can be used to compute probabilities of events (see the sketch following this list).
 Consider the experiment of tossing a fair coin until a 'head' appears. As we have discussed earlier, there are countably infinitely many outcomes. Can all these outcomes be equally likely? Clearly they cannot, so the classical definition does not apply to this experiment.
 The notion of 'equally likely' is important here. Equally likely means equally probable. Thus this definition presupposes that all outcomes occur with equal probability, which makes the definition circular: it relies on the very concept it is trying to define.
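A minimal counting sketch in Python (the two-dice experiment and the event "sum equals 7" are illustrative choices, not from the text):

```python
from itertools import product
from fractions import Fraction

# Classical (counting) definition: with N equally likely outcomes,
# P(A) = N_A / N.  Sample space: all ordered results of rolling two dice.
sample_space = list(product(range(1, 7), repeat=2))     # N = 36 outcomes

event = [s for s in sample_space if s[0] + s[1] == 7]   # A: "sum equals 7"
p_A = Fraction(len(event), len(sample_space))           # N_A / N
print(p_A)                                              # 1/6
```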

Relative-frequency based definition of probability (von Mises, 1919):

 If an experiment is repeated n times under similar conditions and the event A occurs n_A times, then the probability of A is defined as

P(A) = lim (n→∞) n_A / n.
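A short simulation sketch of this limiting ratio, assuming a fair coin (illustrative only):

```python
import random

# Relative-frequency sketch: repeat the experiment n times and track
# n_A / n, the fraction of trials in which the event A = "heads" occurs.
random.seed(1)                       # fixed seed for reproducibility
for n in (100, 10_000, 1_000_000):
    n_A = sum(random.random() < 0.5 for _ in range(n))   # count heads
    print(n, n_A / n)                # the ratio approaches P(A) = 0.5
```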

Basic results of probability:

1. P(∅) = 0.
Suppose A = ∅. Then S = S ∪ ∅ and S ∩ ∅ = ∅.
Therefore P(S) = P(S ∪ ∅) = P(S) + P(∅).
Thus 1 = 1 + P(∅), which is possible only if P(∅) = 0.

2. If A ⊆ B, then P(A) ≤ P(B).
We have B = A ∪ (B − A), a union of disjoint events, so P(B) = P(A) + P(B − A) ≥ P(A).

3. P(Aᶜ) = 1 − P(A), where Aᶜ = S − A is the complement of A.
We have S = A ∪ Aᶜ with A ∩ Aᶜ = ∅, so 1 = P(S) = P(A) + P(Aᶜ).

4. If A and B are any two events, P(A − B) = P(A) − P(A ∩ B).
We have A = (A − B) ∪ (A ∩ B), a union of disjoint events, so P(A) = P(A − B) + P(A ∩ B).
We can similarly show that P(B − A) = P(B) − P(A ∩ B).

5. If A and B are any two events, P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
We have A ∪ B = (A − B) ∪ (A ∩ B) ∪ (B − A), a union of disjoint events, and the result follows from 4.

6. We can apply the properties of sets to establish the following result for n events A₁, A₂, …, Aₙ (inclusion-exclusion):
P(A₁ ∪ A₂ ∪ … ∪ Aₙ) = Σᵢ P(Aᵢ) − Σᵢ<ⱼ P(Aᵢ ∩ Aⱼ) + … + (−1)ⁿ⁺¹ P(A₁ ∩ A₂ ∩ … ∩ Aₙ).

Probability assignment in a discrete sample space

Consider a finite sample space S = {s₁, s₂, …, sₙ}. Then the sigma algebra is defined by the power set of S. To each elementary event {sᵢ} we can assign a probability P(sᵢ) such that

P(sᵢ) ≥ 0 and Σᵢ P(sᵢ) = 1.

For any event A, we can define the probability

P(A) = Σ over sᵢ ∈ A of P(sᵢ).

In the special case when the outcomes are equi-probable, we can assign the equal probability p = 1/n to each elementary event.
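A minimal sketch of such an assignment (the sample-point names and probabilities below are hypothetical):

```python
# Discrete probability assignment: P(s_i) >= 0, sum_i P(s_i) = 1, and
# P(A) is the sum of P(s_i) over the sample points s_i in the event A.
P = {"s1": 0.5, "s2": 0.25, "s3": 0.125, "s4": 0.125}
assert abs(sum(P.values()) - 1.0) < 1e-12   # normalization check

def prob(event):
    """Probability of an event given as a set of sample points."""
    return sum(P[s] for s in event)

print(prob({"s1", "s3"}))   # P({s1, s3}) = 0.625
```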

Probability assignment in a continuous space


Suppose the sample space S is continuous and uncountable. Such a sample space arises when the outcomes of
an experiment are numbers; for example, when the experiment consists in measuring the voltage, the
current or the resistance. In such a case, the sigma algebra consists of the Borel sets on the real line.
Suppose S = ℝ and f is a non-negative integrable function such that

∫ from −∞ to ∞ of f(x) dx = 1.

For any Borel set B,

P(B) = ∫ over B of f(x) dx

defines the probability on the Borel sigma-algebra B.

We can similarly define probability on the continuous spaces ℝ², ℝ³, etc.
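A numerical sketch of this assignment, assuming SciPy is available and taking the standard normal density as an arbitrary illustrative choice of f:

```python
import math
from scipy.integrate import quad  # assumes SciPy is installed

# Continuous probability assignment: pick a non-negative integrable
# density f with total integral 1 (here the standard normal density)
# and assign P(B) as the integral of f over the Borel set B.
f = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

total, _ = quad(f, -math.inf, math.inf)   # should be 1
p_B, _ = quad(f, -1.0, 1.0)               # P(B) for B = [-1, 1]
print(round(total, 6), round(p_B, 4))     # 1.0  0.6827
```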

Conditional probability

Consider the probability space (S, F, P). Let A and B be two events in F. We ask the following question:
given that A has occurred, what is the probability of B?
The answer is the conditional probability of B given A, denoted by P(B|A). We shall develop the concept of the
conditional probability and explain under what condition this conditional probability is the same as P(B).
Let us consider the case of equiprobable outcomes discussed earlier. Out of N equally likely sample points, let N_A be favourable for A and N_AB be favourable for the joint event A ∩ B. The proportion of the outcomes in A that also belong to B is then

N_AB / N_A = (N_AB / N) / (N_A / N) = P(A ∩ B) / P(A).

This suggests how to define conditional probability. The probability of an event B under the condition that another event A
has occurred is called the conditional probability of B given A and defined by

P(B|A) = P(A ∩ B) / P(A), provided P(A) > 0.

We can similarly define the conditional probability of A given B, denoted by P(A|B). From the definition of conditional
probability, we have the joint probability of two events A and B as follows:

P(A ∩ B) = P(B|A) P(A) = P(A|B) P(B).
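The counting argument above can be checked directly on a small equally-likely sample space (two fair dice; the events are illustrative):

```python
from itertools import product
from fractions import Fraction

# Counting check of P(B|A) = P(A ∩ B) / P(A) on two fair dice.
S = list(product(range(1, 7), repeat=2))
P = lambda E: Fraction(len(E), len(S))      # equally likely outcomes

A = {s for s in S if s[0] + s[1] >= 10}     # A: sum is at least 10
B = {s for s in S if s[0] == 6}             # B: first die shows 6

print(P(A & B) / P(A))   # P(B|A) = (3/36) / (6/36) = 1/2
```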

Properties of Conditional Probabilities

If A ⊆ B, then P(B|A) = 1.
We have A ∩ B = A, so P(B|A) = P(A ∩ B)/P(A) = P(A)/P(A) = 1.
Independent events

Two events are called independent if the probability of occurrence of one event does not affect the probability of
occurrence of the other. Thus the events A and B are independent if

P(B|A) = P(B) and P(A|B) = P(A),

where P(A) and P(B) are assumed to be non-zero.

Equivalently, if A and B are independent, we have

P(A ∩ B) = P(A) P(B).

Two events A and B are called statistically dependent if they are not independent. Similarly, we can define the independence of n
events. The events A₁, A₂, …, Aₙ are called independent if and only if

P(Aᵢ₁ ∩ Aᵢ₂ ∩ … ∩ Aᵢₖ) = P(Aᵢ₁) P(Aᵢ₂) … P(Aᵢₖ)

for every subset {i₁, i₂, …, iₖ} of {1, 2, …, n}.
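A quick numerical check of the product rule (illustrative events on two fair dice, which are physically independent):

```python
from itertools import product
from fractions import Fraction

# Product-rule check: events defined on different dice should satisfy
# P(A ∩ B) = P(A) P(B) exactly, i.e. be independent.
S = list(product(range(1, 7), repeat=2))
P = lambda E: Fraction(len(E), len(S))

A = {s for s in S if s[0] % 2 == 0}   # A: first die shows an even number
B = {s for s in S if s[1] >= 5}       # B: second die shows 5 or 6

print(P(A & B) == P(A) * P(B))        # True
```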


Bayes' Theorem:
Let A₁, A₂, …, Aₙ be mutually exclusive events that partition the sample space S, and let B be an event with P(B) > 0. Then

P(Aᵢ|B) = P(B|Aᵢ) P(Aᵢ) / Σⱼ P(B|Aⱼ) P(Aⱼ).

This result is known as Bayes' theorem. The probability P(Aᵢ) is called the a priori probability and P(Aᵢ|B) is called the a
posteriori probability. Thus Bayes' theorem enables us to determine the a posteriori probability from the observation
that B has occurred. This result is of practical importance and is at the heart of Bayesian classification, Bayesian estimation, etc.
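A minimal numerical sketch of the theorem (all numbers are hypothetical; A0 and A1 are assumed to partition the sample space, e.g. "0 sent" and "1 sent" over a binary channel, with B = "1 received"):

```python
# Hypothetical priors and likelihoods for a two-event partition {A0, A1}
# and an observed event B.  Bayes' theorem: the posterior P(Ai|B) is the
# prior-weighted likelihood normalized by the total probability P(B).
prior = {"A0": 0.6, "A1": 0.4}          # a priori probabilities P(Ai)
likelihood = {"A0": 0.1, "A1": 0.9}     # P(B | Ai)

P_B = sum(prior[a] * likelihood[a] for a in prior)            # P(B) = 0.42
posterior = {a: prior[a] * likelihood[a] / P_B for a in prior}

print(posterior)  # a posteriori probabilities P(Ai | B): ~0.143 and ~0.857
```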

Random Variables:

In applications of probability, we are often concerned with numerical values which are random in nature. For
example, we may consider the number of customers arriving at a service station in a particular interval of time or the
transmission time of a message in a communication system. These random quantities may be considered as real-valued
functions on the sample space. Such a real-valued function is called a real random variable and plays an important role in
describing random data. A random variable X associates each point s in the sample space S with a real number X(s).
Consider the probability space (S, F, P) and a function X: S → ℝ mapping the sample space into the real line. Let
us define the probability of a subset B ⊆ ℝ by

P({X ∈ B}) = P(X⁻¹(B)) = P({s | X(s) ∈ B}).

Such a definition will be valid only if X⁻¹(B) is a valid event. If S is a discrete sample space, X⁻¹(B) is always a valid event, but the same may not be true if S is infinite.
The concept of sigma algebra is again necessary to overcome this difficulty. We also need the Borel sigma algebra B, the
sigma algebra defined on the real line. The function X is called a random variable if X⁻¹(B) is an event in F for every Borel set B.
Probability Distribution Function:

It can be observed that the events {s | X(s) ≤ x} and {X ≤ x} are equivalent, so P({s | X(s) ≤ x}) = P({X ≤ x}). The underlying
sample space is omitted in notation and we simply write {X ≤ x} and P({X ≤ x}) instead of {s | X(s) ≤ x} and P({s | X(s) ≤ x})
respectively.
Consider the Borel set (−∞, x], where x represents any real number. The equivalent event X⁻¹((−∞, x]) = {s | X(s) ≤ x}
is denoted as {X ≤ x}. The event {X ≤ x} can be taken as a representative event in studying the probability description of a random
variable X. Any other event can be represented in terms of this event.
The probability P({X ≤ x}) is called the probability distribution function (also called the cumulative
distribution function, abbreviated as CDF) of X and denoted by F_X(x). Thus

F_X(x) = P({X ≤ x}).

Properties of the Distribution Function:

 0 ≤ F_X(x) ≤ 1. This follows from the fact that F_X(x) is a probability and its value should lie between 0 and 1.

 F_X(x) is a non-decreasing function of x. Thus, if x₁ < x₂, then F_X(x₁) ≤ F_X(x₂).

 F_X(x) is right continuous.

 F_X(−∞) = 0 and F_X(∞) = 1.

 P(x₁ < X ≤ x₂) = F_X(x₂) − F_X(x₁).
We have {x₁ < X ≤ x₂} = {X ≤ x₂} − {X ≤ x₁}, so P(x₁ < X ≤ x₂) = F_X(x₂) − F_X(x₁).

Thus we have seen that, given F_X(x), we can determine the probability of any event involving values of the random
variable X. Thus F_X(x) is a complete description of the random variable X.
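A small sketch of using F_X as a complete description, taking the standard normal CDF as an illustrative choice:

```python
import math

# F_X here is the standard normal CDF (an illustrative choice).  Given a
# CDF, P(x1 < X <= x2) = F_X(x2) - F_X(x1) for any x1 < x2.
F = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

print(F(1.0) - F(-1.0))   # P(-1 < X <= 1) ~ 0.6827
print(F(-40.0), F(40.0))  # limiting behaviour: -> 0 and -> 1
```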
[Fig: equivalent model for the physical propagation channel, including the noise generated by the receiver front end]
WHITE NOISE PROCESS:

One of the very important random processes is the white noise process. Noises in many practical situations are
approximated by the white noise process. A white noise process X(t) is a random process that has constant power spectral
density at all frequencies:

S_X(f) = N₀/2 for all f,

where the constant N₀/2 is real and is called the intensity of the white noise.
The corresponding autocorrelation function is given by

R_X(τ) = (N₀/2) δ(τ),

where δ(τ) is the Dirac delta.

The average power of white noise,

P_avg = E[X²(t)] = ∫ from −∞ to ∞ of (N₀/2) df,

is infinite. The autocorrelation function and the PSD of a white noise process are shown in Figure 1 below.

[Figure 1: the autocorrelation of white noise, an impulse of weight N₀/2 at τ = 0, and its PSD, a flat spectrum of height N₀/2]

 The term white noise is analogous to white light, which contains all visible light frequencies.
 A white noise is generally assumed to be zero-mean. A white noise process is unpredictable, as the noise samples at
different instants of time are uncorrelated.
 Thus the samples of a white noise process are uncorrelated no matter how closely the samples are placed; consequently, a
white noise has an infinite variance.
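A discrete-time sketch of these properties (i.i.d. zero-mean Gaussian samples as a stand-in for white noise; NumPy assumed available):

```python
import numpy as np

# Discrete-time stand-in for white noise: i.i.d. zero-mean, unit-variance
# Gaussian samples.  The sample autocorrelation should be ~1 at lag 0 and
# ~0 at every other lag (the analogue of R(tau) = (N0/2) delta(tau)), and
# the periodogram should be roughly flat (a constant PSD).
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=100_000)

for lag in (0, 1, 10, 100):
    r = float(np.mean(x[: x.size - lag] * x[lag:]))
    print(lag, round(r, 4))            # ~1.0 at lag 0, ~0.0 elsewhere

psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
print(round(float(psd.mean()), 3))     # roughly the sample variance, ~1
```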
Probability & Random Variables Previous GATE Objective Questions
GATE-2001-1mark

1). The PDF of a Gaussian random variable X is given by p_X(x) = (1/(3√(2π))) e^(−(x−4)²/18). The probability of the event {X = 4} is
(a) 1/2 (b) 1/(3√(2π))
(c) 0 (d) 1/4

GATE-2001-2mark

2). During transmission over a communication channel, bit errors occur independently with probability p. If a block of n bits is transmitted, the
probability of at most one bit error is equal to
(a) 1 − (1 − p)^n (b) p + (n − 1)(1 − p)
(c) np(1 − p)^n (d) (1 − p)^n + np(1 − p)^(n−1)

GATE-2001-2mark
3). The PSD and the power of a signal g(t) are, respectively, S_g(ω) and P_g. The PSD and the power of the signal ag(t) are, respectively,
(a) a²S_g(ω) and a²P_g (b) a²S_g(ω) and aP_g
(c) aS_g(ω) and a²P_g (d) aS_g(ω) and aP_g

GATE-2001-1mark

4). Line-of-sight communication requires the transmit and receive antennas to face each other. If the transmit antenna is vertically polarized, for
best reception the receiver antenna should be
(a) horizontally polarized (b) vertically polarized
(c) at 45° with respect to horizontal polarization
(d) at 45° with respect to vertical polarization

GATE-2002-2mark
5). If the variance σ_d² of d(n) = x(n) − x(n − 1) is one-tenth the variance σ_x² of a stationary zero-mean discrete-time signal x(n), then the normalized
autocorrelation function R_xx(k)/σ_x² at k = 1 is
(a) 0.95 (b) 0.90
(c) 0.10 (d) 0.05
GATE-2003-1mark

6). The input to a coherent detector is a DSB-SC signal plus noise. The noise at the detector output is
(a) the in-phase component (b) the quadrature component
(c) zero (d) the envelope

GATE-2003-1mark

7). The noise at the input to an ideal frequency detector is white. The detector is operating above threshold. The power spectral density of the noise at
the output is
(a) raised-cosine (b) flat
(c) Parabolic (d) Gaussian

GATE-2003-2mark
8). Let X and Y be two statistically independent random variables uniformly distributed in the ranges (−1, 1) and (−2, 1) respectively. Let Z = X + Y. Then P(Z ≤ −2) is
(a) zero (b) 1/6
(c) 1/3 (d) 1/12

GATE-2003-2mark
Common Data for Questions 9 and 10.

Let X(t) be a zero-mean Gaussian random process with autocorrelation function R_XX(τ) = 4(e^(−0.2|τ|) + 1), and let X be the Gaussian random variable obtained by sampling the process at t = t₁.

9). The probability that [X ≤ 1] is
(a) 1 − Q(0.5) (b) Q(0.5) (c) Q(1/(2√2)) (d) 1 − Q(1/(2√2))

10). Let Y and Z be the random variables obtained by sampling X(t) at t = 2 and t = 4 respectively. Let W = Y − Z. The variance of W is
(a) 13.36 (b) 9.36
(c) 2.64 (d) 8.00
GATE-2004-2mark

11). The distribution function F_X(x) of a random variable X is shown in the figure. The probability that X = 1 is

(a) zero (b) 0.25 (c) 0.55 (d) 0.30

GATE-2004-2mark

12). A random variable X with uniform density in the interval 0 to 1 is quantized as follows:
If 0 ≤ X ≤ 0.3, X_q = 0
If 0.3 < X ≤ 1, X_q = 0.7, where X_q is the quantized value of X.
The root-mean-square value of the quantization noise is
(a) 0.573 (b) 0.198
(c) 2.205 (d) 0.266

GATE-2005-2mark

13). Noise with uniform power spectral density of N₀ W/Hz is passed through a filter H(ω) = 2 exp(−jωt_d) followed by an ideal low pass filter of
bandwidth B Hz. The output noise power in watts is
(a) 2N₀B (b) 4N₀B
(c) 8N₀B (d) 16N₀B
GATE-2005-2mark

14). The output of a communication channel is a random variable V with the probability density function as shown in the figure. The mean square value
of V is

(a) 4 (b) 6 (c) 8 (d) 9

GATE-2005-2mark
Common Data for Questions 15 and 16.
A symmetric three-level midtread quantizer is to be designed assuming equiprobable occurrence of all quantization levels.

15. If the probability density function is divided into three regions as shown in the figure, the value of a in the figure is
(a) 1/3 (b) 2/3
(c) 1/2 (d) 1/4

16. The quantization noise power for the quantization region between −a and +a in the figure is
(a) 4/81 (b) 1/9
(c) 5/81 (d) 2/81

GATE-2006-1mark

17. A source generates three symbols with probabilities 0.25, 0.25, 0.50 at a rate of 3000 symbols per second. Assuming independent generation of
symbols, the most efficient source encoder would have an average bit rate of
(a) 6000 bits/sec (b) 4500 bits/sec
(c) 3000 bits/sec (d) 1500 bits/sec

GATE-2007-2mark
Common Data for Questions 18 & 19:
The following two questions refer to wide sense stationary stochastic processes.
18. It is desired to generate a stochastic process (as a voltage process) with power spectral density S(ω) = 16/(16 + ω²) by driving a Linear Time-Invariant
system with zero-mean white noise (as a voltage process) with power spectral density being constant, equal to 1. The system which can
perform the desired task could be
(a) first order low pass R-L filter
(b) first order high pass R-C filter
(c) tuned L-C filter
(d) series R-L-C filter

19. The parameters of the system obtained in Q.18 would be

(a) first order R-L low pass filter with R = 4 Ω, L = 1 H
(b) first order R-C high pass filter with R = 4 Ω, C = 0.25 F
(c) tuned L-C filter with L = 4 H, C = 4 F
(d) series R-L-C low pass filter with R = 1 Ω, L = 4 H, C = 4 F

GATE-2007-1mark
20. If E denotes expectation, the variance of a random variable X is given by
(a) E[X²] − E²[X] (b) E[X²] + E²[X]
(c) E[X²] (d) E²[X]
GATE-2007-1mark
21. During transmission over a certain binary communication channel, bit errors occur independently with probability p. The probability of at most
one bit in error in a block of n bits is given by
(a) p^n (b) 1 − p^n (c) np(1 − p)^(n−1) + (1 − p)^n (d) 1 − (1 − p)^n
GATE-2008-2mark
22. A memoryless source emits n symbols each with a probability p. The entropy of the source as a function of n
(a) increases as log n (b) decreases as log(1/n)
(c) increases as n (d) increases as n log n

GATE-2008-2mark
23. Noise with double-sided power spectral density of K over all frequencies is passed through an RC low pass filter with 3 dB cut-off
frequency of f_c. The noise power at the filter output is
(a) K (b) Kf_c (c) Kπf_c (d) ∞

GATE-2008-2mark

24. Consider a Binary Symmetric Channel (BSC) with probability of error being p. To transmit a bit, say 1, we transmit a sequence of three 1s. The
receiver will interpret the received sequence to represent 1 if at least two bits are 1. The probability that the transmitted bit will be received in error
is
(a) p³ + 3p²(1 − p) (b) p³ (c) (1 − p)³ (d) p³ + p²(1 − p)

25. Consider two independent random variables X and Y with identical distributions. The variables X and Y take values 0, 1 and 2 with probabilities
1/2, 1/4 and 1/4 respectively. What is the conditional probability P(X + Y = 2 | X − Y = 0)?
(a) 0 (b) 1/16 (c) 1/6 (d) 1
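A quick Monte Carlo check of Q.25 (an added illustration, not part of the original paper):

```python
import random

# Monte Carlo estimate of P(X + Y = 2 | X - Y = 0) with X, Y i.i.d. on
# {0, 1, 2} with probabilities 1/2, 1/4, 1/4 (weights 2:1:1 below).
random.seed(0)
draw = lambda: random.choices([0, 1, 2], weights=[2, 1, 1])[0]

hits = trials = 0
for _ in range(200_000):
    x, y = draw(), draw()
    if x == y:                 # condition on the event X - Y = 0
        trials += 1
        hits += (x + y == 2)   # within that event, X + Y = 2 means X = Y = 1
print(hits / trials)           # ~1/6 = 0.1667, matching option (c)
```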

26. A discrete random variable X takes values from 1 to 5 with probabilities as shown in the table. A student calculates the mean of X as 3.5 and her
teacher calculates the variance of X as 1.5. Which of the following statements is true?
(a) Both the student and the teacher are right
(b) Both the student and the teacher are wrong
(c) The student is wrong but the teacher is right
(d) The student is right but the teacher is wrong
Probability & Random Variables Chapter-6 GATE Previous Questions Answers
Qno 1 2 3 4 5 6 7 8 9 10 11
Ans C D A B A A C D D C D
Qno 12 13 14 15 16 17 18 19 20 21 22
Ans B B C B A B A A A C A
Qno 23 24 25 26
Ans C A C B

Probability & Random Variables Previous IES Objective Questions


IES-1999

Q.1 A source delivers symbols X₁, X₂, X₃ and X₄ with probabilities 1/2, 1/4, 1/8 and 1/8 respectively. The entropy of the system is
(a) 1.75 bits per second (b) 1.75 bits per symbol
(c) 1.75 symbols per second (d) 1.75 symbols per bit

IES-2001
Q.2 To permit the selection of 1 out of 16 equiprobable events, the number of bits required is
(a) 2 (b) log10 16 (c) 8 (d) 4

Q.3 Which one of the following types of noise gains importance at high frequencies?
(a) Shot noise (b) Random noise (c) Impulse noise (d) Transit-time noise

IES-2003

Q.4 Thermal noise is passed through an ideal low-pass filter having cut-off frequency f_c = W Hz. The autocorrelation value of the noise at the output of the
filter is given as
(a) A delta function at t = 0
(b) Gaussian over the range −∞ < t < ∞
(c) Sinc function over the range −∞ < t < ∞
(d) Triangular function over the range −1/(2W) ≤ t ≤ 1/(2W)
Q.5 A random process obeys Poisson distribution. It is given that the mean of the process is 5. Then the variance of the process is
(a) 5 (b) 0.5 (c) 25 (d) 0

IES-2004

Q.6 A ternary source produces alphabets A, B and C with probabilities P_A = P_B = p and P_C = 1 − 2p. Which one of the following gives the correct values for
the maximum value of the entropy of the source, the corresponding value of p and the range of p?
(a) 1.58, 0.33, (0, 0.5) (b) 1.0, 0.5, (0, 1)
(c) 3.0, 0.67, (0, 0.5) (d) 2.0, 4.2, (0, 0.3)

Q.7 Match List-I (Type of Random Process) with List-II (Property of the Random Process) and select the correct answer using the codes given
below the lists:
List-I
A. Stationary process
B. Ergodic process
C. Wide sense stationary process
D. Cyclostationary process
List-II
1. Statistical averages periodic in time
2. Statistical averages are independent of time
3. Mean and autocorrelation are independent of time
4. Time averages equal corresponding ensemble averages
Codes:
    A B C D
(a) 3 1 2 4
(b) 2 4 3 1
(c) 3 4 2 1
(d) 2 1 3 4
IES-2006

Q.8 Match List-I (Type of signal) with List-II (Application) and select the correct answer using the codes given below the lists:
List-I
A. Speech signal
B. Non-stationary signal
C. Random signal
D. Chaotic signal

List-II
1. The received signal of a radar system monitoring variations in prevalent weather conditions
2. One-dimensional signal whose amplitude varies with time
3. Signals of a coupled system of non-linear difference equations
4. Ensemble of unpredictable waveforms
Codes:
    A B C D
(a) 2 1 4 3
(b) 4 3 2 1
(c) 2 3 4 1
(d) 4 1 2 3

Q.9 A source produces 26 symbols with equal probability. What is the average information produced by the source?
(a) < 4 bits/symbol (b) 6 bits/symbol
(c) 8 bits/symbol (d) Between 4 and 6 bits/symbol

IES-2008

Q.10 Match List-I (Type of noise) with List-II (Its property) and select the correct answer using the codes given below the lists:
List-I
A. Shot noise
B. Thermal noise
C. White noise
D. Narrow-band noise
List-II
1. Noise generated in a resistor
2. Power spectral density is independent of frequency
3. Temperature-limited diode
4. Noise at the output of a filter
Codes:
    A B C D
(a) 2 4 3 1
(b) 3 1 2 4
(c) 2 1 3 4
(d) 3 4 2 1

Q.11 Which one of the following is the correct statement?

If the value of a resistor creating thermal noise is doubled, the noise power generated is
(a) halved (b) doubled
(c) unchanged (d) slightly changed

Q.12 The outputs of two noise sources, each producing uniformly distributed noise over the range −a to +a, are added. What is the p.d.f. of the added
noise?
(a) Uniformly distributed over the range −2a to +2a
(b) Triangular over the range −2a to +2a
(c) Gaussian over the range −∞ to ∞
(d) None of the above

Probability & Random Variables Chapter-6 IES Previous Questions Answers


Qno 1 2 3 4 5 6 7 8 9 10 11
Ans B D D C A A B C D B C
Qno 12
Ans B
