
Unit-2

Source coding
Encoding of the source output
• Source encoding is the process by which the output of an information source is converted into an r-ary sequence, where r is the number of different symbols used in this transformation process.
• If r = 2, the output is a binary sequence
Source encoding
• Source encoding is the process by which the
output of an information source is converted
into a binary sequence
• The functional block that performs this task is
called the source encoder
Types/properties of codes
• Block codes
• Non-singular codes
• Uniquely decodable codes
• Instantaneous codes
• Optimal codes

• With an example, explain the different types/properties of source codes
Block codes
• In this type of code, each symbol from the source alphabet is mapped to a finite sequence of code symbols from the code alphabet.
• Block codes can be of fixed length or variable length
Block codes
• Consider a source S emitting only four symbols which are to be encoded with binary coding
• Then S = {s1, s2, s3, s4}
• And X = {0, 1}

Symbol | Fixed-length block code | Variable-length block code
  s1   |           00            |             1
  s2   |           01            |             01
  s3   |           10            |             110
  s4   |           11            |             111
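As an illustration, a minimal Python sketch (the message sequence is an assumed example, not from the slides) showing how the two block codes in the table encode the same message:

```python
# Encoding a symbol stream with the two block codes from the table above.
fixed_length_code = {"s1": "00", "s2": "01", "s3": "10", "s4": "11"}
variable_length_code = {"s1": "1", "s2": "01", "s3": "110", "s4": "111"}

def encode(symbols, code):
    """Concatenate the code word of each source symbol."""
    return "".join(code[s] for s in symbols)

message = ["s1", "s3", "s2", "s1"]            # assumed example sequence
print(encode(message, fixed_length_code))     # 00100100 (4 symbols x 2 bits)
print(encode(message, variable_length_code))  # 1110011  (7 bits)
```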
Non-singular codes
• A block code is said to be non-singular if all the code words are distinct and easily distinguishable from each other
Non-singular codes
• Suppose a source emits four symbols S = {s1, s2, s3, s4}, and let these symbols be encoded by code A and code B

Symbol | Code-A | Code-B
  s1   |   00   |   11
  s2   |   01   |   10
  s3   |   10   |   11
  s4   |   11   |   001

• Code A: non-singular (all code words are distinct)
• Code B: singular (s1 and s3 are assigned the same code word, 11)
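A short Python check of this property, using the two codes from the table (a sketch):

```python
# A code is non-singular when every source symbol gets a distinct code word.
def is_non_singular(code):
    codewords = list(code.values())
    return len(codewords) == len(set(codewords))

code_a = {"s1": "00", "s2": "01", "s3": "10", "s4": "11"}
code_b = {"s1": "11", "s2": "10", "s3": "11", "s4": "001"}
print(is_non_singular(code_a))  # True  -> non-singular
print(is_non_singular(code_b))  # False -> singular ("11" is used twice)
```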
Uniquely decodable codes
• A block code is said to be uniquely decodable if its nth-order extension is also non-singular for all finite values of n
Consider the code A given in the table
If code B is used to decode the received bit sequence 001100
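A brute-force Python check of this definition for small n (a sketch; the second code is an assumed prefix-free example, not from the slides):

```python
from itertools import product

# Unique decodability requires every finite extension of the code to be
# non-singular: all concatenations of n code words must be distinct strings.
def extension_is_non_singular(code, n):
    extended = ["".join(words) for words in product(code.values(), repeat=n)]
    return len(extended) == len(set(extended))

code_b = {"s1": "11", "s2": "10", "s3": "11", "s4": "001"}
code_c = {"s1": "1", "s2": "01", "s3": "001", "s4": "0001"}  # assumed prefix-free code
print(extension_is_non_singular(code_b, 2))  # False: e.g. s1s1 and s1s3 both give "1111"
print(extension_is_non_singular(code_c, 2))  # True
```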
Instantaneous codes
• A uniquely decodable code is said to be an instantaneous code if it is possible to recognize the end of any code word in a received sequence without reference to the succeeding symbols
Optimal codes
• Instantaneous codes are called optimal codes if the lengths of the code words assigned to the symbols are minimum
Circle diagram
• The following coding scheme has been used to encode a sequence. Draw the decision tree.

Source symbol | Code-B
      A       | 1
      H       | 01
      I       | 001
      S       | 0001
      T       | 00001
      -       | 00000

• If the receiver receives the following bit stream, decode the information
• 000010100100010000000100010000001001000100
000011000010000000000101001000100000011000
0100000001000100000010010001
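Because this code is prefix-free, the received stream can be decoded greedily, one code word at a time. A minimal Python decoder sketch (the round-trip message below is an assumed example; apply decode() to the full bit stream above to recover the transmitted text):

```python
# Greedy decoder for the prefix-free code in the table above.
CODE = {"A": "1", "H": "01", "I": "001", "S": "0001", "T": "00001", "-": "00000"}
DECODE = {v: k for k, v in CODE.items()}  # code word -> symbol

def decode(bits):
    """Scan left to right; emit a symbol as soon as a code word matches."""
    symbols, current = [], ""
    for b in bits:
        current += b
        if current in DECODE:
            symbols.append(DECODE[current])
            current = ""
    return "".join(symbols)

example = "".join(CODE[c] for c in "THIS-IS")  # encode an assumed short message
print(decode(example))                         # THIS-IS
```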
Prefix of a code
• Let X = x1x2x3…xm be a code word of some code. Then the sequences x1x2x3…xj, for all j ≤ m, are called "prefixes" of the code word X
Test for instantaneous property
(Prefix property)
• A necessary and sufficient condition for a
uniquely decodable code to be instantaneous
is that “no complete word of a code be a
prefix of any other code-word”
• The above test for the instantaneous property (prefix property) can be applied to any code to determine whether the code is instantaneous or not
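A direct Python check of the prefix property (a sketch; the two codes are taken from the earlier tables):

```python
# Prefix-property test: no code word may be a prefix of any other code word.
def is_instantaneous(code):
    words = list(code.values())
    for i, w in enumerate(words):
        for j, other in enumerate(words):
            if i != j and other.startswith(w):
                return False
    return True

# Variable-length block code from the earlier table: "1" is a prefix of "110" and "111"
print(is_instantaneous({"s1": "1", "s2": "01", "s3": "110", "s4": "111"}))  # False
# Code from the decoding exercise: no code word is a prefix of another
print(is_instantaneous({"A": "1", "H": "01", "I": "001", "S": "0001", "T": "00001", "-": "00000"}))  # True
```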
Kraft–McMillan inequality (Kraft’s
inequality)
• In coding theory, the Kraft–McMillan inequality gives a necessary and sufficient condition for the existence of a prefix code, or an instantaneous code, for a given set of code-word lengths.

• State Kraft’s inequality. Explain the same with an example


Kraft–McMillan inequality (Kraft’s
inequality)
• A necessary and sufficient condition for the existence of an instantaneous code with word lengths $l_1, l_2, \ldots, l_q$ is that

$$\sum_{i=1}^{q} r^{-l_i} \le 1$$

• Where
r = number of different symbols used in the code alphabet X
$l_i$ = word length (number of code symbols) of the code word corresponding to the ith source symbol
q = number of source symbols
Kraft’s inequality
• For binary codes, we have r = 2, so that Kraft's inequality becomes

$$\sum_{i=1}^{q} 2^{-l_i} \le 1$$
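A quick numerical check of the binary form, using the code-word lengths of the codes seen earlier (a sketch):

```python
from fractions import Fraction

# Evaluate the Kraft sum  sum_i r^(-l_i)  exactly and compare it with 1.
def kraft_sum(lengths, r=2):
    return sum(Fraction(1, r**l) for l in lengths)

# Lengths of the variable-length block code {1, 01, 110, 111}
print(kraft_sum([1, 2, 3, 3]))        # 1 -> inequality satisfied
# Lengths of the decoding-exercise code {1, 01, 001, 0001, 00001, 00000}
print(kraft_sum([1, 2, 3, 4, 5, 5]))  # 1 -> inequality satisfied
# An impossible length set for a binary instantaneous code
print(kraft_sum([1, 1, 2]) <= 1)      # False -> no such prefix code exists
```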
Consider the 5 different codes given in the table. Check if they satisfy Kraft’s inequality
Construct the decision trees for code-L, M and N given in the table below
SHANNON'S FIRST THEOREM
(NOISELESS CODING THEOREM)

"Shannon's noiseless coding theorem" places an upper and a lower bound on the minimal possible expected length of code words, as a function of the entropy of the input word and of the size of the target alphabet.
SHANNON'S FIRST THEOREM
(NOISELESS CODING THEOREM)

"Shannon's first theorem", also called "Shannon's fundamental theorem" or the "noiseless coding theorem", states that:
"Given a code alphabet with r symbols and a source alphabet of q symbols, the average length of code words can be made as close to $H_r(S)$ as desired by increasing the order of the extension of the source."

State and prove Shannon’s NOISELESS CODING THEOREM


SHANNON'S FIRST THEOREM
(NOISELESS CODING THEOREM)

Shannon suggested that the length $l_i$ of the code word for the ith source symbol can be found using the formula

$$l_i = \log_r \frac{1}{p_i}$$

The equation above signifies that the larger the symbol probability $p_i$, the smaller the value of $l_i$, and the fewer encoding digits will be needed.
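As a quick worked illustration (an assumed example, not from the slides), take a binary code ($r = 2$) with probabilities $p = \{1/2, 1/4, 1/8, 1/8\}$:

$$l_1 = \log_2 2 = 1,\qquad l_2 = \log_2 4 = 2,\qquad l_3 = l_4 = \log_2 8 = 3$$

so the more probable symbols receive shorter code words.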
SHANNON'S FIRST THEOREM
(NOISELESS CODING THEOREM)

If $l_i$ happens to be a fraction as calculated from the equation above, it is rounded up to the next integer, so that

$$\log_r \frac{1}{p_i} \le l_i < 1 + \log_r \frac{1}{p_i}$$

• Using the property of logarithms ($\log_r x = \log_2 x / \log_2 r$),

$$\frac{\log_2 \frac{1}{p_i}}{\log_2 r} \le l_i < 1 + \frac{\log_2 \frac{1}{p_i}}{\log_2 r}$$
Multiplying throughout by $p_i$ and taking the summation for all i varying from 1 to q, we get

$$\sum_{i=1}^{q} p_i \log_r \frac{1}{p_i} \;\le\; \sum_{i=1}^{q} p_i l_i \;<\; 1 + \sum_{i=1}^{q} p_i \log_r \frac{1}{p_i}$$

But $\sum_{i=1}^{q} p_i \log_r \frac{1}{p_i} = H_r(S)$, and $\sum_{i=1}^{q} p_i l_i = L$, the average code-word length, so

$$H_r(S) \le L < 1 + H_r(S)$$

For the nth extension code, the same argument gives

$$H_r(S^n) \le L_n < 1 + H_r(S^n)$$

We have $H_r(S^n) = n\,H_r(S)$.

Therefore

$$n\,H_r(S) \le L_n < 1 + n\,H_r(S) \qquad \text{.........(1)}$$

Dividing throughout by n,

$$H_r(S) \le \frac{L_n}{n} < \frac{1}{n} + H_r(S) \qquad \text{.....(2)}$$
Equations (1) and (2) are usually called the "noiseless coding theorem" and are the basis for "Shannon's first theorem" or "fundamental theorem".

Notice that in the above discussion we have not considered the effect of noise on the codes, and hence the theorem is termed "noiseless".

The emphasis is only on how to encode the source most efficiently and to improve the code efficiency to the maximum possible extent.
When L is the average length of code words for the basic source S, then for all values of the extension n it is true that

$$\frac{L_n}{n} \le L$$
Shannon’s encoding algorithm
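In outline, the Shannon encoding procedure (as it is usually stated, for the binary case) is: arrange the symbols in non-increasing order of probability; form the cumulative probabilities $\alpha_i = p_1 + p_2 + \ldots + p_{i-1}$ (with $\alpha_1 = 0$); choose $l_i = \lceil \log_2(1/p_i) \rceil$; and take the first $l_i$ bits of the binary expansion of $\alpha_i$ as the code word. A minimal Python sketch of this procedure (the example source is assumed):

```python
import math

# Shannon encoding (binary): sort by probability, accumulate probabilities,
# take l_i = ceil(log2(1/p_i)) bits of the binary expansion of the cumulative sum.
def shannon_code(prob):
    """prob: dict symbol -> probability. Returns dict symbol -> code word."""
    items = sorted(prob.items(), key=lambda kv: kv[1], reverse=True)
    code, cumulative = {}, 0.0
    for symbol, p in items:
        length = math.ceil(math.log2(1 / p))
        bits, frac = "", cumulative
        for _ in range(length):          # binary expansion of the cumulative probability
            frac *= 2
            bits += "1" if frac >= 1 else "0"
            frac -= int(frac)
        code[symbol] = bits
        cumulative += p
    return code

# Assumed example source (not from the slides)
print(shannon_code({"s1": 0.5, "s2": 0.25, "s3": 0.125, "s4": 0.125}))
# {'s1': '0', 's2': '10', 's3': '110', 's4': '111'}
```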
SHANNON-FANO ENCODING
ALGORITHM
Which is less than the efficiency of the 2nd extended source
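The Shannon–Fano procedure splits the probability-sorted symbol list into two groups of nearly equal total probability, assigns 0 to one group and 1 to the other, and recurses within each group. A minimal Python sketch (the example probabilities are assumed, not taken from the slides' worked example):

```python
# Binary Shannon-Fano coding: split the probability-sorted list into two parts
# of nearly equal total probability, prefix one part with '0' and the other
# with '1', and recurse.  The split rule below is a simple running-sum heuristic.
def shannon_fano(symbols, code=None, prefix=""):
    """symbols: list of (symbol, probability) sorted by decreasing probability."""
    if code is None:
        code = {}
    if len(symbols) == 1:
        code[symbols[0][0]] = prefix or "0"
        return code
    total = sum(p for _, p in symbols)
    running, split = 0.0, 1
    for i, (_, p) in enumerate(symbols[:-1], start=1):
        running += p
        if running >= total / 2:
            split = i
            break
    shannon_fano(symbols[:split], code, prefix + "0")
    shannon_fano(symbols[split:], code, prefix + "1")
    return code

source = [("s1", 0.4), ("s2", 0.2), ("s3", 0.2), ("s4", 0.1), ("s5", 0.1)]
print(shannon_fano(source))
# {'s1': '00', 's2': '01', 's3': '10', 's4': '110', 's5': '111'}
```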
