
Information Theory and Coding

Course Name : Information Theory and Coding


Course Code : 19ECE312
Semester : 6th Sem – ECE

Ms. Latha
Assistant Professor (SG)
Department of ECE,
Amrita School of Engineering, Bengaluru



Digital communication system

Course Outcomes
CO1: Able to understand the fundamentals of information theory and the fundamental limits of communication systems
CO2: Able to analyse the basic types of codes and understand source coding algorithms
CO3: Able to derive the channel capacity of communication channel models
CO4: Able to understand the encoding and decoding techniques of linear block codes, cyclic codes and convolutional codes
CO5: Able to implement different source coding and channel coding algorithms



Syllabus
• Unit 1 : Introduction to Information Theory: Modeling of information sources – uncertainty and
information – Entropy – information measures for continuous random variable – source coding
theorem – Kraft inequality – source coding algorithms: Huffman coding – arithmetic coding –
Lempel-Ziv algorithm. Rate distortion function – Entropy rate of stochastic process - Modeling
of communication channels - Binary symmetric channel – binary erasure channel – channel
capacity – noisy channel coding theorem – Information capacity theorem – Shannon’s limit –
bounds on communication.

• Unit 2 : Linear block codes - structure – matrix description – Hamming codes - Standard array –
Arithmetic of Galois fields - Integer ring – finite fields based on integer ring – polynomial rings –
finite fields based on polynomial rings – primitive elements - Structure of finite fields - Cyclic
codes - Structure of cyclic codes – encoding and decoding of cyclic codes.

• Unit 3 : BCH codes - Generator polynomials in terms of minimal polynomial – Decoding of BCH
codes – Reed-Solomon codes – Peterson-Gorenstein-Zierler decoder – Introduction to low
density parity check codes - Convolutional Codes: Introduction to Convolutional Codes – Basics of
Convolutional Code encoding and decoding – Sequential decoding – Viterbi decoding –
Introduction to Turbo codes
Textbooks/References
Text Books
• Ranjan Bose, “Information Theory, Coding and Cryptography”, Tata McGraw Hill, 2nd edition.
• Thomas M. Cover and Joy A. Thomas, “Elements of Information Theory”, John Wiley, 2nd edition, 2006.
References
• J. Proakis and M. Salehi, “Fundamentals of Communication Systems”, Pearson Education, 2nd edition, 2005.
• Shu Lin and Daniel J. Costello, “Error Control Coding – Fundamentals and Applications”, Pearson, 2nd edition, 2004.
• P. S. Satyanarayana, “Concepts of Information Theory and Coding”, Dynaram Publication, 2005.
• K. Giridhar, “Information Theory and Coding”. (For problems)
Father of Digital Communication

The roots of modern digital communication stem from the ground-breaking paper “A Mathematical Theory of Communication” by Claude Elwood Shannon, published in 1948.
Shannon’s Definition of Communication

“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.”

“Frequently the messages have meaning”

“... [which is] irrelevant to the engineering problem.”


Shannon Wants to…
• Shannon wants to find a way of “reliably” transmitting data through the channel at the “maximal” possible rate.
• He later found a solution and published it in his 1948 paper.
• In his 1948 paper he built a rich theory for the problem of reliable communication, now called “Information Theory” or “The Shannon Theory” in his honor.

(Block diagram: Information Source → Coding → Communication Channel → Decoding → Destination)



Shannon Theory
• The original 1948 Shannon Theory contains:

1. Measurement of Information
2. Source Coding Theory
3. Channel Coding Theory



Model of a Digital Communication System

Block diagram: Information Source → Coding → Communication Channel → Decoding → Destination

• Message: e.g. English symbols
• Encoder: e.g. English to 0,1 sequence
• Communication Channel: can have noise or distortion
• Decoder: e.g. 0,1 sequence back to English
Shannon’s Vision

Data → Source Encoding → Channel Encoding → Channel → Channel Decoding → Source Decoding → User


Example: Data → Zip → Add CRC → Channel → Verify CRC → Unzip → User
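As a rough sketch of this pipeline (my own illustration, not from the slides), the Python snippet below uses zlib compression in the role of source encoding (“Zip”) and a CRC-32 checksum in the role of the channel code’s check bits (“Add CRC”); the data string is made up for the example.

import zlib

# Transmitter side
data = b"information theory and coding"
compressed = zlib.compress(data)              # source encoding: remove redundancy ("Zip")
crc = zlib.crc32(compressed)                  # channel encoding: append check bits ("Add CRC")
frame = compressed + crc.to_bytes(4, "big")

# Receiver side (assuming the channel delivered the frame unchanged)
payload, received_crc = frame[:-4], int.from_bytes(frame[-4:], "big")
assert zlib.crc32(payload) == received_crc    # "Verify CRC"
assert zlib.decompress(payload) == data       # "Unzip" recovers the original data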



Introduction to Information Theory and Coding

• Information theory deals with the information generated by a source and with how that information is represented. Coding techniques deal with the different techniques used to encode the information travelling through the channel.
• These are divided into two categories: Source Encoding and Channel Encoding.



Introduction (Contd…)
• The meaning of the word information in information theory is “message” or “intelligence”
• A source which produces these messages is called an “information source”

Information Source Classification and Examples

• Analog information source: emits one or more continuous electrical signals with respect to time. The message can be an electrical signal such as voltage amplitude, current or power, or speech, music or images. Examples: a microphone, a TV camera scanning a scene.
• An analog information source can be transformed into a discrete information source through analog-to-digital converters (ADC).
• Discrete information source: a teletype or the numerical output of a computer, whose output consists of a sequence of letters, etc.



Introduction (Contd…)
• Discrete Information Sources
• Example of source alphabet ( discrete information source) :
• a teletype having 26 letters of the English alphabet plus several special characters such as full stop,
comma, etc., along with numerals.

• Output of a quantizer: If the quantizer is of 256 levels, then we say that the source emits 256
symbols.

• Output of M-ary modulator: Here each symbol is a block of k binary bits.

• These sources are characterized by

• (1) source alphabet (2) symbol rate (3) source alphabet probabilities (4) probability dependence of
symbols in sequence



Introduction (Contd…)
• Discrete Information Sources

• The symbol rate is the rate at which the source generates the symbols measured as
“symbols per second”.
• Examples:

• If a teletype operates at the speed of 10 characters/sec , then the symbol rate is said to be

10 symbols/sec.

• If sampling rate is 8000 samples/sec, then symbol rate is 8000 symbols per sec.

• If bit rate is 10 kbps and M=32, then symbol rate is 2000 symbols/sec
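The last example follows from the relation symbol rate = bit rate / log2(M), since each M-ary symbol carries log2(M) bits; a quick check (my own illustration):

from math import log2

def symbol_rate(bit_rate_bps, M):
    # Each M-ary symbol carries log2(M) bits, so divide the bit rate by that.
    return bit_rate_bps / log2(M)

print(symbol_rate(10_000, 32))   # 10 kbps with M = 32 -> 2000.0 symbols/sec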



Introduction (Contd…)

• In the block diagram of the information system shown, let us assume that the information source is a discrete source emitting discrete message symbols S1, S2, …, Sq with probabilities of occurrence P1, P2, …, Pq respectively.
• The sum of all these probabilities must be equal to 1, since any symbol emitted by the source has to be one of S1, S2, …, Sq and not any other symbol.
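Written out (my rendering of the statement above):

\sum_{i=1}^{q} P_i = 1, \qquad 0 \le P_i \le 1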



Introduction (Contd…)
Source Encoder
Major function of source encoder is information compression which is
achieved by removing data redundancy
• The source encoder converts the symbol sequence into a binary sequence of “0’s “ and “1’s”
by assigning the code words to the symbols in the input sequence.
• Binary coding is preferred because of its high efficiency of transmission.
• Binary/Ternary/Quaternary are the other types of coding.
• Fixed length / Variable length encoding
• The important parameters of a source encoder are : block size , length of code words,
average data rate and encoder efficiency .
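As a minimal sketch of these parameters (my own example; the symbols, probabilities and codewords are made up), the snippet below compares a fixed-length and a variable-length assignment and computes the average code length and encoder efficiency:

from math import log2

# Hypothetical source: four symbols with the given probabilities
probs = {"S1": 0.5, "S2": 0.25, "S3": 0.125, "S4": 0.125}

fixed    = {"S1": "00", "S2": "01", "S3": "10", "S4": "11"}       # fixed length: 2 bits each
variable = {"S1": "0",  "S2": "10", "S3": "110", "S4": "111"}     # variable length, prefix-free

def average_length(code):
    # Average number of code bits per source symbol
    return sum(p * len(code[s]) for s, p in probs.items())

entropy = -sum(p * log2(p) for p in probs.values())               # 1.75 bits/symbol here

print(average_length(fixed))                  # 2.0 bits/symbol
print(average_length(variable))               # 1.75 bits/symbol
print(entropy / average_length(variable))     # encoder efficiency = 1.0 for this source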



Introduction (Contd…)

• If I toss a die 1,000,000 times and record the value from each trial: 1,3,4,6,2,5,2,4,5,2,4,5,6,1,…
• In principle, I need 3 bits for storing each outcome, since 3 bits cover the values 1-8. So I need 3,000,000 bits for storing the information.
• Using an ASCII representation, a computer needs 8 bits = 1 byte for storing each outcome, so the resulting file has size 8,000,000 bits.

Introduction (Contd…)
But Shannon said:
• You only need 2.585 bits for storing each outcome.
• So, the file can be compressed to 2.585 x 1,000,000 = 2,585,000 bits.
• The optimal compression ratio is 2,585,000 / 8,000,000 = 0.3231 = 32.31%.

Let’s Do Some Test!

Method            File Size          Compression Ratio
No Compression    8,000,000 bits     100%
Shannon           2,585,000 bits     32.31%
WinZip            2,930,736 bits     36.63%
WinRAR            2,859,336 bits     35.74%
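The 2.585 bits/outcome figure is the entropy of a fair six-sided die, log2(6); a quick check of the slide’s numbers (my own illustration):

from math import log2

bits_per_outcome = log2(6)                  # ~2.585 bits for a fair die
n_tosses = 1_000_000
print(bits_per_outcome)                     # 2.584962500721156
print(bits_per_outcome * n_tosses)          # ~2,585,000 bits, as claimed above
print(bits_per_outcome * n_tosses / 8e6)    # ~0.3231 optimal compression ratio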



Introduction (Contd…)
Channel Encoder
• Error control codes or channel codes are used to detect and correct errors.
• How do error control codes work? The basic idea is to add redundancy in the form of “check bits” or “parity bits” to make communication more robust, as in the sketch below.
• The extra bits reduce the net communication rate and increase the bandwidth requirement.
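A minimal sketch of the “add redundancy” idea (my own example, not a code from the syllabus), using a single even-parity check bit, which can detect, but not correct, any single bit error:

def add_parity(bits):
    # Append one even-parity check bit to a list of 0/1 message bits.
    return bits + [sum(bits) % 2]

def parity_ok(codeword):
    # True if the received word still has even parity, i.e. no single-bit error is detected.
    return sum(codeword) % 2 == 0

sent = add_parity([1, 0, 1, 1])          # [1, 0, 1, 1, 1]
print(parity_ok(sent))                   # True: nothing detected
corrupted = sent[:]
corrupted[2] ^= 1                        # flip one bit in the channel
print(parity_ok(corrupted))              # False: the single error is detected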



Introduction (Contd…)
Channel
• It provides the electrical connection between source and destination.
• Due to physical limitations, communication channels have only finite
bandwidth and the information bearing signal suffers amplitude and phase
distortion as it travels over the channel.
• In addition to distortion, the signal power also decreases due to attenuation
of the channel.
• Furthermore, the signal is corrupted by unwanted, unpredictable electrical signals referred to as noise.



Introduction (Contd…)
Decoding and Receiver
• The source decoder converts the binary output of the channel decoder into a
symbol sequence
• The decoder for fixed-length code words is quite simple, but the decoder for a system using variable-length code words will be far more complex, as the sketch after this list illustrates.
• The function of the decoder is to convert the corrupted signals into a symbol sequence.
• The function of the receiver is to identify the symbol sequence and match it with the correct sequence.
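To see one reason variable-length decoding is harder, here is a small sketch (my own example, with a made-up prefix-free code) that must walk the received bits one at a time to find codeword boundaries; a fixed-length decoder could simply slice the stream every n bits:

# Hypothetical prefix-free code: no codeword is a prefix of another
code = {"S1": "0", "S2": "10", "S3": "110", "S4": "111"}
decode_table = {cw: sym for sym, cw in code.items()}

def decode(bitstring):
    symbols, current = [], ""
    for bit in bitstring:
        current += bit
        if current in decode_table:      # a complete codeword has been read
            symbols.append(decode_table[current])
            current = ""
    return symbols

print(decode("0101100"))   # ['S1', 'S2', 'S3', 'S1']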

