Introduction to ITC
Ms. Latha
Assistant Professor (SG)
Department of ECE,
Amrita School of Engineering, Bengaluru
1/22/2025
Course Outcomes
CO1: Able to understand the fundamentals of information theory and the
fundamental limits of communication systems
CO2: Able to analyse the basic types of codes and understand the source coding
algorithms
CO3: Able to derive the channel capacity of communication channel models
CO4: Able to understand the encoding and decoding techniques of linear
block codes, cyclic codes, and convolutional codes
CO5: Able to implement different source coding and channel
coding algorithms
• Unit 2 : Linear block codes - structure – matrix description – Hamming codes - Standard array –
arithmetic of Galois fields - Integer ring – finite fields based on integer ring – polynomial rings –
finite fields based on polynomial rings – primitive elements - Structure of finite fields - Cyclic
codes - Structure of cyclic codes – encoding and decoding of cyclic codes.
• Unit 3 : BCH codes - Generator polynomials in terms of minimal polynomials – Decoding of BCH
codes – Reed-Solomon codes – Peterson-Gorenstein-Zierler decoder – Introduction to low-density
parity-check (LDPC) codes - Convolutional Codes: Introduction to Convolutional Codes – Basics of
Convolutional Code encoding and decoding – Sequential decoding – Viterbi decoding –
Introduction to Turbo codes
22-01-2025 Latha, ASE Bangalore
Textbooks/References
Text Books
• Ranjan Bose, “Information Theory, Coding and Cryptography”, Tata McGraw
Hill, 2nd edition.
• Thomas M Cover and Joy A Thomas, “Elements of Information Theory”, Second
Edition John Wiley, 2006.
References
• J. Proakis, M. Salehi, “Fundamentals of Communication Systems”, Pearson
Education, Second Edition, 2005.
• Shu Lin and Daniel J. Costello, “Error Control Coding – Fundamentals and
Applications”, Pearson, Second Edition, 2004.
• P. S. Satyanarayana, “Concepts of Information Theory and Coding”, Dynaram
Publication, 2005.
• K. Giridhar, “Information Theory and Coding”. (For problems)
Father of Digital Communication: Claude E. Shannon
1. Measurement of Information
2. Source Coding Theory
3. Channel Coding Theory

[Block diagram] Information Source → Coding → Communication Channel → Decoding → Destination
(Message: e.g. English symbols; Encoder: e.g. English to 0,1 sequence)

[Block diagram] Data → Source Encoding → Channel Encoding → Channel → Channel Decoding → Source Decoding → User
• Information theory characterizes the information generated by a source and how that
information can be represented. Coding techniques deal with the different methods used
to encode the information travelling through the channel.
• These are divided into two categories:
Source Encoding and Channel Encoding
• Output of a quantizer: if the quantizer has 256 levels, then we say that the source emits 256
symbols.
• A discrete source is characterized by: (1) source alphabet, (2) symbol rate, (3) source alphabet
probabilities, (4) probabilistic dependence of symbols in a sequence.
• The symbol rate is the rate at which the source generates symbols, measured in
“symbols per second”.
• Examples:
• If a teletype operates at a speed of 10 characters/sec, then the symbol rate is
10 symbols/sec.
• If the sampling rate is 8000 samples/sec, then the symbol rate is 8000 symbols/sec.
• If the bit rate is 10 kbps and M = 32 (i.e. log2(32) = 5 bits per symbol), then the symbol
rate is 10,000/5 = 2000 symbols/sec.
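The last example can be checked with a short sketch (the function name `symbol_rate` is just for illustration):

```python
import math

def symbol_rate(bit_rate_bps: float, alphabet_size: int) -> float:
    """Symbols/sec = (bits/sec) / (bits/symbol), with bits/symbol = log2(M)."""
    return bit_rate_bps / math.log2(alphabet_size)

# 10 kbps with M = 32 symbols (5 bits each):
print(symbol_rate(10_000, 32))  # -> 2000.0 symbols/sec
```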
• If I toss a die 1,000,000 times and record the value from each trial:
1, 3, 4, 6, 2, 5, 2, 4, 5, 2, 4, 5, 6, 1, …
• In principle, I need 3 bits for storing each outcome, since 3 bits can distinguish
2^3 = 8 values, enough for the six faces. So I need 3,000,000 bits for storing the
whole record.
• Using an ASCII representation, a computer needs 8 bits = 1 byte for storing each
outcome,
• so the resulting file has size 8,000,000 bits.
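A quick sanity check of these storage counts (variable names are illustrative):

```python
import math

n_trials = 1_000_000  # number of recorded die tosses
faces = 6

# Fixed-length binary code: ceil(log2(6)) = 3 bits distinguish the six faces.
bits_fixed = math.ceil(math.log2(faces))
print(bits_fixed * n_trials)  # -> 3000000 bits

# ASCII stores each outcome as one 8-bit character.
print(8 * n_trials)           # -> 8000000 bits
```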
Introduction (Contd…)
But Shannon said:
• You only need 2.585 bits for storing each outcome.
• So, the file can be compressed to a size of 2.585 x 1,000,000 = 2,585,000 bits.
• The optimal compression ratio is 32.31%.

Let’s Do Some Test!

                 File Size        Compression Ratio
No Compression   8,000,000 bits   100%
Shannon          2,585,000 bits   32.31%
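Shannon’s figure is the entropy of a fair die, H = log2(6) ≈ 2.585 bits per outcome; a small sketch reproduces the table’s numbers (variable names are illustrative):

```python
import math

n_trials = 1_000_000
faces = 6

# Entropy of a fair die: H = log2(6) bits per outcome.
entropy = math.log2(faces)                 # ~2.585 bits
shannon_bits = entropy * n_trials          # ~2,585,000 bits
ratio = shannon_bits / (8 * n_trials)      # relative to the 8,000,000-bit ASCII file

print(round(entropy, 3))  # -> 2.585
print(f"{ratio:.2%}")     # -> 32.31%
```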