Channel Coding Exam
Introduction
This coursework contributes 5% of your mark for ELEC1011 Communications
and Control. It is therefore worth half a credit and should take you up to 5
hours to complete. Please do not spend any more time than this, since I'm
sure you have better things to be doing!
Apart from discussing ideas with your classmates, you are required to work
entirely on your own to produce a write-up, which only needs to include the
figures, calculations and comments that are directly requested in the exercises
below.
Your write-up can contain a combination of hand-written work, computer printouts and pages from this document. When you are finished, you need to print
out a receipt from C-BASS
https://handin.ecs.soton.ac.uk/handin/0910/ELEC1011/2/
You then need to staple your receipt together with your write-up and submit it
at the ECS reception before 4pm on 05/05/2010.
Here are some tips that will help you get a high mark in this coursework:
- Include your workings for each calculation in your write-up.
- Each figure in your write-up should have well-labelled and accurately
  annotated axes, as well as a relevant title and/or legend that identifies
  which signals are shown.
- If you draw your figures by hand, make sure you use a ruler.
- You may like to use Matlab to help you complete the exercises and to
  produce the figures.
If you notice any mistakes in this document or have any queries about it,
please email me at [email protected]
Have fun, Rob Maunder.
Filters
[Figure 1: a filter circuit with input vi(t) and output vo(t), comprising
C = 1 nF, L = 1 mH and R = 100 kΩ.]
Huffman Coding
Table 1 provides the probabilities of occurrence for the letters of the alphabet,
when used in the English language. A Huffman codebook has been designed
based on these probabilities of occurrence and the resultant codewords are also
shown in Table 1.
a) Calculate the entropy H of the letters of the alphabet, when used in the
English language.
b) Calculate the average codeword length L for the Huffman codebook provided
in Table 1.
c) Calculate the Huffman coding efficiency R.
d) Determine the bit sequence that results from the Huffman coding of the
letter sequence HUFFMAN.
Letter i   Probability of occurrence pi   Huffman codeword ci
A          0.0856                         1111
B          0.0139                         010110
C          0.0279                         01010
D          0.0378                         11010
E          0.1304                         100
F          0.0289                         10100
G          0.0199                         110111
H          0.0528                         0100
I          0.0627                         0111
J          0.0013                         1101101110
K          0.0042                         11011010
L          0.0339                         10101
M          0.0249                         00010
N          0.0707                         1100
O          0.0797                         1110
P          0.0199                         00000
Q          0.0012                         1101101101
R          0.0677                         1011
S          0.0607                         0110
T          0.1045                         001
U          0.0249                         00011
V          0.0092                         1101100
W          0.0149                         010111
X          0.0017                         1101101111
Y          0.0199                         00001
Z          0.0008                         1101101100
Table 1: The letters of the alphabet, their probabilities of occurrence when used
in the English language and the allocated Huffman codewords.
e) Complete the binary tree shown in Figure 2 so that it describes the Huffman
codebook of Table 1.
f) Decode the bit sequence 0110111101111100001011010100001010111. You
may find your binary tree useful for this.
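If you use software to check your workings for parts a)–c), the calculations follow the pattern sketched below. The sketch uses Python with a toy four-symbol codebook of my own choosing, not Table 1, so the numbers it produces are illustrative and are not the answers to the exercise.

```python
import math

# Toy source (NOT Table 1): dyadic probabilities, so Huffman is exactly optimal.
probs = {'A': 0.5, 'B': 0.25, 'C': 0.125, 'D': 0.125}
codebook = {'A': '0', 'B': '10', 'C': '110', 'D': '111'}

# a) Entropy H = -sum(p_i * log2(p_i)), in bits per symbol.
H = -sum(p * math.log2(p) for p in probs.values())

# b) Average codeword length L = sum(p_i * |c_i|), in bits per symbol.
L = sum(p * len(codebook[s]) for s, p in probs.items())

# c) Coding efficiency R = H / L.
R = H / L
print(H, L, R)  # 1.75 1.75 1.0 for this dyadic source

# d) Encoding a letter sequence is simply concatenation of the codewords.
encoded = ''.join(codebook[s] for s in 'BAD')
print(encoded)  # '10' + '0' + '111' = '100111'
```

Because the toy probabilities are powers of one half, L equals H exactly and the efficiency is 1; with the probabilities of Table 1 the efficiency will be slightly below 1.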
Arithmetic Coding
A particular source generates the symbols [A, C, G, T] with the probabilities [0.4,
0.3, 0.2, 0.1], respectively. An arithmetic coding scheme is used to encode and
decode these symbols. Figure 3 provides a diagram that may be used to assist
the encoding and decoding of sequences comprising seven symbols.
a) Draw diagonal lines between the horizontal bars of Figure 3 to show the
range that is expanded during the encoding of each symbol in the sequence
GATTACA. Using at least six decimal places, label each of the vertical lines
in Figure 3 with the values that separate the adjacent ranges. State the
final range that results from the encoding of the symbol sequence GATTACA.
b) Draw a binary tree to determine the shortest bit sequence that represents a
decimal value in the final range that you identified for the above symbol
sequence.
c) Calculate, to at least six decimal places, the decimal value that is represented
by the arithmetic encoded bit sequence 11101101111111.
d) Using a copy of Figure 3, determine the seven symbols that are represented
by the above bit sequence.
e) Explain why arithmetic coding can potentially achieve a higher coding efficiency than Huffman coding.
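The range-narrowing recursion behind parts a)–d) can be sketched in Python for the [A, C, G, T] source with probabilities [0.4, 0.3, 0.2, 0.1] given above. The sequence CAT and the bit string 101 below are illustrative choices of mine, not the exam inputs, so this does not reproduce the exercise answers.

```python
# Range narrowing for arithmetic encoding of the [A, C, G, T] source with
# probabilities [0.4, 0.3, 0.2, 0.1]. The sub-ranges within [0, 1) are
# A -> [0, 0.4), C -> [0.4, 0.7), G -> [0.7, 0.9), T -> [0.9, 1.0).
symbols = ['A', 'C', 'G', 'T']
probs = [0.4, 0.3, 0.2, 0.1]

cum = {}          # lower boundary of each symbol's sub-range
running = 0.0
for sym, p in zip(symbols, probs):
    cum[sym] = running
    running += p

def encode_range(sequence):
    """Return the final [low, high) range for the symbol sequence."""
    low, width = 0.0, 1.0
    for sym in sequence:
        low += width * cum[sym]               # move into the symbol's sub-range
        width *= probs[symbols.index(sym)]    # shrink by its probability
    return low, low + width

def bits_to_decimal(bits):
    """Decimal value of the binary fraction 0.b1b2b3... (useful for b and c)."""
    return sum(int(b) / 2 ** (i + 1) for i, b in enumerate(bits))

low, high = encode_range('CAT')
print(low, high)               # the range [0.508, 0.520), up to rounding
print(bits_to_decimal('101'))  # 1/2 + 1/8 = 0.625
```

Decoding reverses the process: locate which sub-range contains the decimal value, emit that symbol, and repeat on the expanded sub-range.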
Hamming Coding
    [ 1 1 0 1 ]
    [ 1 0 1 1 ]
    [ 1 0 0 0 ]
G = [ 0 1 1 1 ]
    [ 0 1 0 0 ]
    [ 0 0 1 0 ]
    [ 0 0 0 1 ]

    [ 0 0 0 1 1 1 1 ]
H = [ 0 1 1 0 0 1 1 ]
    [ 1 0 1 0 1 0 1 ]
[Figure 3: Diagram that may be used to assist the arithmetic encoding and
decoding of sequences comprising seven symbols.]
 i    xi             yi
 1    [ 0 0 0 0 ]T   [ 0 0 0 0 0 0 0 ]T
 2    [ 0 0 0 1 ]T   [               ]T
 3    [ 0 0 1 0 ]T   [               ]T
 4    [ 0 0 1 1 ]T   [               ]T
 5    [ 0 1 0 0 ]T   [               ]T
 6    [ 0 1 0 1 ]T   [               ]T
 7    [ 0 1 1 0 ]T   [               ]T
 8    [ 0 1 1 1 ]T   [               ]T
 9    [ 1 0 0 0 ]T   [               ]T
10    [ 1 0 0 1 ]T   [               ]T
11    [ 1 0 1 0 ]T   [               ]T
12    [ 1 0 1 1 ]T   [               ]T
13    [ 1 1 0 0 ]T   [               ]T
14    [ 1 1 0 1 ]T   [               ]T
15    [ 1 1 1 0 ]T   [               ]T
16    [ 1 1 1 1 ]T   [               ]T
[Triangular table of the pairwise Hamming distances d(yi, yj) between the
codewords yi and yj, for i, j = 1, …, 16.]
j) Explain how the erroneous bit in a received word ỹ = [ 1 … 0 ]T containing
a single bit error can be identified using the syndrome s.
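The encoding and syndrome-decoding mechanism behind part j) can be sketched as follows, with all arithmetic modulo 2. The message x below is an illustrative choice of mine, not a value taken from the exercise.

```python
# Hamming (7,4) encoding and syndrome decoding with the G and H given above.
G = [[1, 1, 0, 1],
     [1, 0, 1, 1],
     [1, 0, 0, 0],
     [0, 1, 1, 1],
     [0, 1, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1]]
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def mat_vec_mod2(M, v):
    """Matrix-vector product over GF(2)."""
    return [sum(m * x for m, x in zip(row, v)) % 2 for row in M]

x = [1, 0, 1, 1]            # illustrative 4-bit message
y = mat_vec_mod2(G, x)      # codeword y = G x

y_err = y[:]
y_err[4] ^= 1               # flip bit 5 to simulate a single channel error

s = mat_vec_mod2(H, y_err)  # syndrome s = H y~
# Column j of H is the binary representation of j (most significant bit on
# top), so the syndrome read as a binary number gives the 1-indexed position
# of the erroneous bit; s = [0, 0, 0] would mean no detectable error.
pos = s[0] * 4 + s[1] * 2 + s[2]
print(pos)  # 5
```

Flipping the identified bit back then recovers the transmitted codeword, which is why a (7,4) Hamming code corrects any single bit error.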