Nitte Meenakshi Institute of Technology
Assignment Questions on
INFORMATION THEORY AND CODING
Huffman Coding & Entropies
Submitted by
ASHUTOSH SINGH (1NT16EC023)
Course Co-ordinator
Mrs. Pramodini
Assistant Professor, Dept. of E&C
Nitte Meenakshi Institute of Technology
Bangalore – 64
Certificate
This is to certify that the Mini-Project entitled "Huffman Coding & Entropies" is the bona fide work of ASHUTOSH SINGH (1NT16EC023), for the course on "ELECTRONICS AND COMMUNICATION ENGINEERING – Information Theory and Coding" during the Academic Year 2018-19.
………………………………………………………………..
Mrs. Pramodini
Assistant Professor, Dept. of E&C
Nitte Meenakshi Institute of Technology
Bangalore – 64
ACKNOWLEDGEMENT
Nothing is envisaged without the help and guidance of a person experienced and respected in the field of the concerned subject. Though the benefits received from them can never be adequately valued, we would like to express our hearty gratitude towards them.
We would like to take this opportunity to thank Mrs. Pramodini, Assistant Professor, Dept. of E&C, for her valuable suggestions and support throughout the course of Information Theory and Coding.
INFORMATION THEORY AND CODING
Question No. 1
Determination of various entropies and mutual information of the given BSC channel.
Aim: Write a program for the determination of various entropies and mutual information of a given binary symmetric channel (BSC).
Apparatus: PC, MATLAB/C
Theory:
1. Explain the BSC in detail with a neat diagram.
2. Find the capacity of the BSC (a short sketch follows below).
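For quick verification of item 2, the capacity of a BSC follows from C = 1 - H(p), where H(p) is the binary entropy function of the crossover probability p. A minimal MATLAB sketch (p = 0.1 is an assumed example value, not part of the assignment program):
% Capacity of a BSC: C = 1 - H(p)
p = 0.1;                             % assumed example crossover probability
Hp = -p*log2(p) - (1-p)*log2(1-p);   % binary entropy function H(p)
C = 1 - Hp                           % capacity in bits/channel use (about 0.531 here)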
Algorithm:
I) Entropies:
1. Input the number of inputs of the channel.
2. Input the number of outputs of the channel.
3. Input the channel matrix P(Y/X). Check that the entries in each row sum to 1.
4. Input the channel input probabilities, i.e. P(X).
5. Calculate the entropy of the channel input, i.e. H(X).
6. Calculate the joint probability matrix by multiplying the input probability matrix, in diagonal form, by the channel matrix: P(X,Y) = diag(P(X)) · P(Y/X).
7. Calculate the joint entropy using
   H(X,Y) = ΣΣ P(x,y) log2(1/P(x,y))
8. Calculate the conditional entropies H(Y/X) and H(X/Y) using
   H(Y/X) = ΣΣ P(x,y) log2(1/P(y/x))
   H(X/Y) = ΣΣ P(x,y) log2(1/P(x/y))
9. Calculate the output probabilities P(Y) as the column sums of P(X,Y), and the output entropy H(Y).
10. Calculate the mutual information I(X;Y) = H(Y) - H(Y/X).
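For instance, with P(X) = [0.75 0.25] (the input used in the sample run below), H(X) = 0.75 log2(4/3) + 0.25 log2(4) = 0.3113 + 0.5000 = 0.8113 bits/symbol, which matches the program output.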
Conclusion:
Program:
clc;
clear all;
close all;
p = input('Enter input probabilities P(X) = ');   % e.g. [0.75 0.25]
q = input('Enter channel matrix P(Y/X) = ');      % e.g. [1/3 2/3; 2/3 1/3]
i = length(p);                                    % number of symbols
% entropy H(X)
sum = 0;
for n = 1:i
    H = sum + (p(n)*log2(1/p(n)));
    sum = H;
end
disp('H(X):');
disp(H);
% joint probability matrix P(X,Y) = diag(P(X))*P(Y/X)
for n = 1:i
    for m = 1:i
        a(n,m) = q(n,m)*p(n);
    end
end
disp('P(X,Y):');
disp(a);
% conditional entropy H(Y/X)
d = 0;
for n = 1:i
    for m = 1:i
        H1 = d + (a(n,m)*log2(1/q(n,m)));
        d = H1;
    end
end
disp('H(Y/X):');
disp(H1);
% output probabilities P(Y): column sums of P(X,Y)
for n = 1:i
    w = 0;
    for m = 1:i
        s(n) = w + a(m,n);
        w = s(n);
    end
end
disp('P(Y):');
disp(s);
% entropy H(Y)
k = 0;
for n = 1:i
    H2 = k + (s(n)*log2(1/s(n)));
    k = H2;
end
disp('H(Y):');
disp(H2);
% mutual information I(X;Y) = H(Y) - H(Y/X)
MI = H2 - H1;
disp('MI=');
disp(MI);
Output:
H(X):
0.8113
P(X,Y):
0.2500 0.5000
0.1667 0.0833
H(Y/X):
0.9183
P(Y):
0.4167 0.5833
H(Y):
0.9799
MI=
0.0616
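As a cross-check (a sketch using the values printed above, not part of the original program), the same mutual information follows from the identity I(X;Y) = H(X) + H(Y) - H(X,Y):
% Verify MI via the joint entropy identity
a = [0.25 0.5; 1/6 1/12];            % joint probability matrix P(X,Y) from the run above
Hxy = sum(sum(a .* log2(1 ./ a)));   % joint entropy H(X,Y), about 1.7296 bits
MI = 0.8113 + 0.9799 - Hxy           % about 0.0616, matching the program output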
Question No. 2
Encoding and decoding of Huffman code
Aim: Write a program for the generation and evaluation of variable-length source coding using Huffman encoding and decoding.
Apparatus: PC, MATLAB/C
Theory:
1. Explain variable-length coding (a note on the Kraft inequality follows below).
2. Solve the given example theoretically and verify it using a MATLAB program.
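In a variable-length code, frequent symbols receive short code words and rare symbols long ones. As a standard fact (not part of the original assignment), any uniquely decodable code must satisfy the Kraft inequality Σ 2^(-li) ≤ 1 over the code word lengths li. For the code derived below, with lengths 1, 2, 3, 4, 4: 2^-1 + 2^-2 + 2^-3 + 2^-4 + 2^-4 = 0.5 + 0.25 + 0.125 + 0.0625 + 0.0625 = 1, so the code is complete.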
Algorithm:
1. Start.
2. Input the symbols and their probabilities.
3. Sort the probabilities in descending order.
4. Add the two lowest probabilities.
5. Replace the two lowest probabilities by their sum.
6. With this addition and the other probabilities, sort the total probabilities again.
7. If the addition result is equal to the probability of an existing symbol, put it on top.
8. Repeat steps 4-7 until only two probabilities remain.
9. Assign 0 and 1 to the two final branches and trace back through the reductions to read off each code word.
10. Find the entropy, average code word length and efficiency.
11. Stop.
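A worked trace of these steps, using the probabilities from the program below (0.1, 0.1, 0.4, 0.3, 0.1 for symbols 1-5):
Sorted: 0.4, 0.3, 0.1, 0.1, 0.1
Reduction 1: 0.1 + 0.1 = 0.2 → 0.4, 0.3, 0.2, 0.1
Reduction 2: 0.2 + 0.1 = 0.3 → 0.4, 0.3, 0.3
Reduction 3: 0.3 + 0.3 = 0.6 → 0.6, 0.4
Assigning 0 and 1 at each split and tracing back gives the code words 1, 01, 001, 0001 and 0000, so the average length is 0.4(1) + 0.3(2) + 0.1(3) + 0.1(4) + 0.1(4) = 2.1 bits/symbol, as the program reports.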
Conclusion:
Program:
I: Huffman coding: numeric symbols
clc;
clear all;
close all;
symbols = [1 2 3 4 5];                    % source symbols
p = [0.1 0.1 0.4 0.3 0.1];                % symbol probabilities (these reproduce the output below)
[dict, avglen] = huffmandict(symbols, p)  % build the Huffman dictionary
dict{1,:}
dict{2,:}
dict{3,:}
dict{4,:}
dict{5,:}
samplecode = dict{5,2}                    % code word of symbol 5
sig = [1 2 3 4 5];                        % message to encode
hcode = huffmanenco(sig, dict);           % Huffman encoding
disp('encoded msg:');
disp(hcode);
dhsig = huffmandeco(hcode, dict);         % Huffman decoding
disp('decoded msg:');
disp(dhsig);
code_length = length(hcode)
% source entropy H(X)
Hx = 0;
for m = 1:5
    H = Hx + (p(m)*log2(1/p(m)));
    Hx = H;
end
disp('Hx=');
disp(H);
Efficiency = (Hx/avglen)*100
Output:
dict =
    [1]    [1x4 double]
    [2]    [1x4 double]
    [3]    [         1]
    [4]    [1x2 double]
    [5]    [1x3 double]
avglen =
    2.1000
ans = 1
ans = 0 0 0 1
ans = 2
ans = 0 0 0 0
ans = 3
ans = 1
ans = 4
ans = 0 1
ans = 5
ans = 0 0 1
samplecode =
    0 0 1
encoded msg:
0 0 0 1 0 0 0 0 1 0 1 0 0 1
decoded msg:
1 2 3 4 5
code_length =
    14
Hx=
    2.0464
Efficiency =
    97.4495
II: Huffman coding: string
clc;
clear all;
close all;
msg = 'TEECT'                             % message string
symbols = {'T','E','C'};                  % distinct symbols in msg
p = [0.4 0.4 0.2];                        % relative frequencies: T = 2/5, E = 2/5, C = 1/5
[dict, avglen] = huffmandict(symbols, p)  % build the Huffman dictionary
dict{1,:}
dict{2,:}
dict{3,:}
sig = num2cell(msg);                      % string as a cell array of symbols
hcode = huffmanenco(sig, dict);           % Huffman encoding
disp('encoded msg:');
disp(hcode);
dhsig = huffmandeco(hcode, dict);         % Huffman decoding
disp('decoded msg:');
disp(dhsig);
Output:
msg =
TEECT
dict =
    'T'    [1x2 double]
    'E'    [         1]
    'C'    [1x2 double]
avglen =
    1.6000
ans = T
ans = 0 0
ans = E
ans = 1
ans = C
ans = 0 1
encoded msg:
0 0 1 1 0 1 0 0
decoded msg:
'T' 'E' 'E' 'C' 'T'
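As a quick check from the relative frequencies above (T = 2/5, E = 2/5, C = 1/5), the source entropy is H = 2(0.4) log2(2.5) + 0.2 log2(5) ≈ 1.5219 bits/symbol, so the efficiency of this code is 1.5219/1.6 ≈ 95.1%.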