Unit IV - Information Theory and Coding
• Introduction
• Information Measure
• Function Determination for Information
• Average Information per Symbol
• Information Rate
• Coding
• Shannon-Fano Coding
Source coding
1.1 Introduction to information theory
• Information theory and coding answer two fundamental questions:
• What is the irreducible complexity below which a signal cannot be compressed?
• What is the ultimate transmission rate for reliable communication over a noisy channel?
• The answers lie in the entropy of a source and the capacity of a channel, respectively.
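Both quantities have compact definitions. As a sketch in standard Shannon notation (the symbols H, C, and I(X;Y) are assumptions here, not drawn from these notes):

% Entropy of a discrete source S emitting K symbols with probabilities p_k:
H(S) = \sum_{k=0}^{K-1} p_k \log_2 \frac{1}{p_k} \quad \text{(bits/symbol)}
% Capacity of a channel with input X and output Y, maximized over all
% input distributions p(x):
C = \max_{p(x)} I(X;Y) \quad \text{(bits/channel use)}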
• Advantages
– Error probability is reduced
– Excellent channel coding techniques are available
– Full channel capacity can be utilized
– Coding techniques for forward error correction and automatic repeat
request are available
• Disadvantages
– All coding techniques add redundancy bits for error detection and
correction
– Computations in encoding and decoding introduce delay in transmission and reception
• Entropy is defined in terms of the probabilistic behavior of a source of information.
• Capacity is defined as the intrinsic ability of a channel to convey information.
Uncertainty, Information, Entropy
• The symbol probabilities of a discrete source sum to one:
$\sum_{k=0}^{K-1} p_k = 1$
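A minimal sketch in Python of such a source, with an assumed four-symbol alphabet (the symbol names and probabilities are illustrative, not taken from these notes); it checks the normalization and computes the average information per symbol:

import math

# Hypothetical discrete source: four symbols with assumed probabilities.
# The only hard requirement is that the probabilities sum to 1.
probabilities = {"s0": 0.5, "s1": 0.25, "s2": 0.125, "s3": 0.125}

assert abs(sum(probabilities.values()) - 1.0) < 1e-12  # sum of p_k = 1

# Average information per symbol (entropy): H = sum of p_k * log2(1/p_k)
H = sum(p * math.log2(1.0 / p) for p in probabilities.values())
print(f"H = {H} bits/symbol")  # prints H = 1.75 bits/symbol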
Information Measure
• The question is: which function can we use to measure information?
• Information should be a function of the reciprocal of probability: I(s_k) = f(1/p_k)
Requirements the function must satisfy:
1. Its output must be a non-negative quantity.
2. Its minimum value is 0.
3. It should turn a product into a summation (see the sketch after this list).
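Requirement 3 is what singles out the logarithm: for independent symbols the joint probability is a product, and their informations should add. A short derivation in standard notation (the symbols $s_i$, $s_j$ are assumed here for illustration):

% For independent symbols s_i and s_j, the joint probability is p_i * p_j,
% and the logarithm turns this product into a sum of informations:
I(s_i s_j) = \log_b \frac{1}{p_i p_j}
           = \log_b \frac{1}{p_i} + \log_b \frac{1}{p_j}
           = I(s_i) + I(s_j)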
The logarithm satisfies all three requirements, so $I(s_k) = \log_b(1/p_k)$, where b may be 2, e, or 10:
If b = 2, the unit is bits
If b = e, the unit is nats
If b = 10, the unit is decits (hartleys)
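A quick numerical check in Python, with an assumed probability p = 0.25 (illustrative only), expressing the same information content in all three units:

import math

p = 0.25  # assumed symbol probability, for illustration only

# I(s) = log_b(1/p) for the three common bases
bits = math.log2(1.0 / p)      # b = 2  -> bits
nats = math.log(1.0 / p)       # b = e  -> nats
decits = math.log10(1.0 / p)   # b = 10 -> decits (hartleys)

print(bits, nats, decits)  # 2.0, ~1.386, ~0.602
# Change of base: 1 nat = log2(e) bits; 1 decit = log2(10) bits.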
Information Measure - Properties
• We can also state three laws from intuition:
1. A certain event conveys no information. Mathematically, $I(s_k) \to 0$ as $p_k \to 1$.
2. Information is never negative, though it may be zero. Mathematically, $I(s_k) \ge 0$ for $0 \le p_k \le 1$.
3. The less probable an event, the more information it carries. Mathematically, $I(s_k) > I(s_j)$ if $p_k < p_j$.
[Figure: information measure plotted against probability p, for 0 ≤ p ≤ 1; x-axis: Probability (p)]
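To reproduce the trend the figure shows, a short Python sketch (assuming the curve is the self-information I(p) = log2(1/p); the exact function plotted is not recoverable from these notes) tabulates I over the probability axis:

import math

# Self-information I(p) = log2(1/p) over the plotted probability axis.
# Assumption: the figure shows this decreasing curve.
for p in [0.1, 0.25, 0.5, 0.75, 1.0]:
    print(f"p = {p}: I = {math.log2(1.0 / p):.3f} bits")
# I decreases as p increases, and I = 0 at p = 1 (a certain event),
# matching properties 1 and 3 above.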