This document provides an overview of error control coding. It discusses error-detecting codes such as cyclic redundancy codes (CRC), and error-correcting codes such as block codes, convolutional codes, and turbo codes, which can both detect and correct errors. It explains how linear block codes use generator and parity check matrices to encode messages into codewords in a systematic way, and how syndromes computed from the parity check matrix can be used to detect errors in received codewords.

Error Control Coding

Wireless Information Transmission System Lab.


Institute of Communications Engineering
National Sun Yat-sen University
Introduction

◊ Error Detecting Codes: Capability of detecting errors so
  that re-transmission or dropping can be done.
  ◊ Cyclic Redundancy Code (CRC)
◊ Error Correcting Codes: Capability of detecting and
  correcting errors.
  ◊ Block Codes: Cyclic code, BCH code, RS code, etc.
  ◊ Convolutional code
  ◊ Turbo code
  ◊ Low Density Parity Check (LDPC) Code

2
Introduction

◊ In general, if the channel quality is good (e.g. line
  transmission), error detection is preferred. In this case,
  most of the packets can be received correctly, and the
  backward error correction (BEC) scheme can be adopted,
  i.e. the packet is re-transmitted once an error is detected.

◊ If the channel quality is poor (e.g. wireless transmission),
  error correction is preferred. In this case, many of the
  packets are received erroneously, and the forward error
  correction (FEC) scheme can be adopted, i.e. the error
  correction scheme is applied to every packet.

3
Cyclic Redundancy Code (CRC)

◊ The sender and receiver must agree upon a generator
  polynomial, G(x), in advance.

Cyclic Redundancy Code (CRC)

◊ Examples of CRCs used in practice:

◊ A 16-bit CRC catches all single and double errors, all
  errors with an odd number of bits, all burst errors of length
  16 or less, 99.997% of 17-bit error bursts, and 99.998% of
  18-bit and longer bursts.
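The division behind a CRC can be sketched in a few lines. The following is a minimal illustration of GF(2) polynomial long division, not a production CRC (real implementations use table-driven or reflected bitwise variants); the generator G(x) = x^3 + x + 1 and the message bits are hypothetical choices, since the slides do not fix a specific polynomial:

```python
def crc_remainder(bits, poly):
    """Remainder of bits(x) * x^deg(G) divided by G(x), all over GF(2).

    `bits` and `poly` are lists of 0/1 values, most significant bit first."""
    work = bits + [0] * (len(poly) - 1)        # append deg(G) zero bits
    for i in range(len(bits)):
        if work[i]:                            # leading term present: XOR in G(x)
            for j, p in enumerate(poly):
                work[i + j] ^= p
    return work[len(bits):]                    # last deg(G) bits are the remainder

# Hypothetical generator G(x) = x^3 + x + 1 and message (not from the slides).
poly = [1, 0, 1, 1]
msg = [1, 0, 1, 1, 0, 1]
crc = crc_remainder(msg, poly)
frame = msg + crc                              # transmitted frame: message + CRC bits
# The receiver accepts the frame iff its remainder is all-zero.
assert crc_remainder(frame, poly) == [0, 0, 0]
```

A received frame with any burst error shorter than deg(G) will leave a non-zero remainder, which is what gives the CRC its burst-detection guarantees.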

6
Linear Block Codes

◊ The encoder transforms a block of k successive binary digits
  into a longer block of n (n > k) binary digits.
  ◊ Called an (n, k) code.
  ◊ Redundancy = n - k; Code Rate = k/n.
◊ There are 2^k possible messages.
◊ There are 2^k possible code words corresponding to the
  messages.
◊ A Code Word (or code vector) is an n-tuple from the
  space Vn of all n-tuples.
◊ Storing the 2^k code vectors in a dictionary is prohibitive
  for large k.

7
Vector Spaces

◊ The set of all binary n-tuples, Vn, is called a vector


space over GF (2).
◊ GF: Galois Field.
◊ Two operations are defined:
  ◊ Addition: V + U = (V1 + U1, V2 + U2, ..., Vn + Un)
  ◊ Scalar Multiplication: aV = (aV1, aV2, ..., aVn)
◊ Example: Vector Space V4
  ◊ 0000 0001 0010 0011 0100 0101 0110 0111
    1000 1001 1010 1011 1100 1101 1110 1111
  ◊ (0101) + (1110) = (0+1, 1+1, 0+1, 1+0) = (1, 0, 1, 1)
  ◊ 1·(1010) = (1·1, 1·0, 1·1, 1·0) = (1, 0, 1, 0)
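Over GF(2) the two operations reduce to bitwise XOR and AND, so they are trivial to sketch in code (vectors represented as 0/1 lists; the function names are illustrative):

```python
def gf2_add(v, u):
    """Componentwise addition over GF(2): (v1+u1, ..., vn+un), i.e. bitwise XOR."""
    return [a ^ b for a, b in zip(v, u)]

def gf2_scale(a, v):
    """Scalar multiplication over GF(2): the scalar a is 0 or 1."""
    return [a & b for b in v]

# The slide's examples in V4:
assert gf2_add([0, 1, 0, 1], [1, 1, 1, 0]) == [1, 0, 1, 1]
assert gf2_scale(1, [1, 0, 1, 0]) == [1, 0, 1, 0]
```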
8
Subspaces

◊ A subset S of Vn is a subspace if:
  ◊ the all-zero vector is in S;
  ◊ the sum of any two vectors in S is also in S.

◊ Example of S:  V0 = 0000
                 V1 = 0101
                 V2 = 1010
                 V3 = 1111

9
Reducing Encoding Complexity

◊ Key feature of linear block codes: the 2^k code vectors
  form a k-dimensional subspace of the space of all n-tuples.
◊ Example: k = 3, 2^k = 8, n = 6, i.e. a (6, 3) code:

    Message   Code Word
    000       000000
    100       110100
    010       011010
    110       101110
    001       101001
    101       011101
    011       110011
    111       000111

  These eight code words form a 3-dimensional subspace of
  the vector space of all 6-tuples.

10
Reducing Encoding Complexity

◊ It is possible to find a set of k linearly independent n-tuples
  v1, v2, ..., vk such that each n-tuple of the subspace is a
  linear combination of v1, v2, ..., vk.

◊ Code word u = m1·v1 + m2·v2 + ... + mk·vk,
  where mi = 0 or 1, i = 1, ..., k.

11
Generator Matrix

        ⎡ v1 ⎤   ⎡ v11  v12  ...  v1n ⎤
    G = ⎢ v2 ⎥ = ⎢ v21  v22  ...  v2n ⎥  =  k × n Generator Matrix
        ⎢ ⋮  ⎥   ⎢  ⋮    ⋮          ⋮  ⎥
        ⎣ vk ⎦   ⎣ vk1  vk2  ...  vkn ⎦

◊ The 2^k code vectors can be described by a set of k linearly
  independent code vectors.
◊ Let m = [m1, m2, ..., mk] be a message.
◊ The code word corresponding to message m is obtained by:

                                ⎡ v1 ⎤
    u = mG = [m1 m2 ... mk] ·   ⎢ v2 ⎥
                                ⎢ ⋮  ⎥
                                ⎣ vk ⎦
12
Generator Matrix
◊ Storage is greatly reduced.
◊ The encoder needs to store only the k rows of G instead of
  the 2^k code vectors of the code.
◊ For example:

            ⎡ v1 ⎤   ⎡1 1 0 1 0 0⎤
    Let G = ⎢ v2 ⎥ = ⎢0 1 1 0 1 0⎥   and   m = [1 1 0]
            ⎣ v3 ⎦   ⎣1 0 1 0 0 1⎦

    Then
    u = mG = 1·v1 + 1·v2 + 0·v3
           = 1·[110100] + 1·[011010] + 0·[101001]
           = [1 0 1 1 1 0],  the code vector for m = [110].
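The encoding rule u = mG can be sketched directly, XOR-ing together the rows of G selected by the message bits (a minimal illustration; the function name is ours, not the slides'):

```python
def encode(m, G):
    """Code word u = m1*v1 + ... + mk*vk over GF(2), i.e. u = mG."""
    u = [0] * len(G[0])
    for mi, row in zip(m, G):
        if mi:                                   # add row vi when mi = 1
            u = [a ^ b for a, b in zip(u, row)]
    return u

# The slide's (6, 3) generator matrix:
G = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 1, 0, 0, 1]]
assert encode([1, 1, 0], G) == [1, 0, 1, 1, 1, 0]   # matches the worked example
```

Only the k rows of G are stored, which is exactly the storage reduction the slide describes.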
13
Systematic Code

14
Parity Check Matrix

◊ For each generator matrix G, there exists a parity check matrix H
  such that the rows of G are orthogonal to the rows of H (u·h = 0):

        ⎡ h1     ⎤   ⎡ h11      h12      ...  h1n     ⎤
    H = ⎢ h2     ⎥ = ⎢ h21      h22      ...  h2n     ⎥
        ⎢ ⋮      ⎥   ⎢ ⋮         ⋮            ⋮       ⎥
        ⎣ h(n-k) ⎦   ⎣ h(n-k)1  h(n-k)2  ...  h(n-k)n ⎦

◊ For u = (u1, u2, ..., un):

    uH^T = 0,  i.e.  u1·hi1 + u2·hi2 + ... + un·hin = 0
    for i = 1, 2, ..., n - k

◊ u is a code word generated by matrix G if and only if uH^T = 0.

15
Parity Check Matrix and Syndrome

◊ In a systematic code, with r = n - k parity bits:

    G = [P(k×r)  I(k×k)]
    H = [I(r×r)  P^T(r×k)]

◊ Received vector = Code vector + Error vector  (r = u + e)

◊ The syndrome of r is used for error detection and correction:

    s = rH^T

◊ Syndrome s = 0 if r is a code vector; s ≠ 0 otherwise.
16
Example of Syndrome Test

◊ With G in systematic form [P | Ik] and H = [I(n-k) | P^T]:

        ⎡1 1 0 1 0 0⎤           ⎡1 0 0 1 0 1⎤
    G = ⎢0 1 1 0 1 0⎥       H = ⎢0 1 0 1 1 0⎥
        ⎣1 0 1 0 0 1⎦           ⎣0 0 1 0 1 1⎦
          P     Ik

◊ The 6-tuple 1 0 1 1 1 0 is the code vector corresponding to the
  message 1 1 0:

                                 ⎡1 0 0⎤
                                 ⎢0 1 0⎥
    s = u·H^T = [1 0 1 1 1 0] ·  ⎢0 0 1⎥ = [0 0 0]
                                 ⎢1 1 0⎥
                                 ⎢0 1 1⎥
                                 ⎣1 0 1⎦

◊ Compute the syndrome for the non-code-vector 0 0 1 1 1 0:

    s = [0 0 1 1 1 0]·H^T = [1 0 0]
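The syndrome test s = rH^T is a handful of GF(2) dot products, one per row of H; a minimal sketch reproducing the slide's two checks (the function name is illustrative):

```python
def syndrome(r, H):
    """s = r H^T over GF(2): component i is the dot product of r with row i of H."""
    return [sum(ri & hi for ri, hi in zip(r, row)) % 2 for row in H]

# The slide's parity check matrix for the (6, 3) code:
H = [[1, 0, 0, 1, 0, 1],
     [0, 1, 0, 1, 1, 0],
     [0, 0, 1, 0, 1, 1]]

assert syndrome([1, 0, 1, 1, 1, 0], H) == [0, 0, 0]   # code vector: zero syndrome
assert syndrome([0, 0, 1, 1, 1, 0], H) == [1, 0, 0]   # non-code vector: non-zero
```

A non-zero syndrome flags an error; for single-bit errors the syndrome equals the column of H at the error position, which is what makes correction possible.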
17
Weight and Distance of Binary Vectors

◊ Hamming Weight of a vector:
  ◊ w(v) = number of non-zero bits in the vector.
◊ Hamming Distance between 2 vectors:
  ◊ d(u, v) = number of bits in which they differ.
  ◊ For example:  u = 10010110001
                  v = 11001010101
                  d(u, v) = 5.
◊ d(u, v) = w(u + v)
  ◊ The Hamming Distance between 2 vectors is equal to the
    Hamming Weight of their vector sum.
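Both quantities, and the identity relating them, can be checked in a few lines (a small sketch; the names are ours):

```python
def weight(v):
    """Hamming weight: number of non-zero bits in the vector."""
    return sum(v)

def distance(u, v):
    """Hamming distance: number of positions in which u and v differ."""
    return sum(a ^ b for a, b in zip(u, v))

# The slide's example:
u = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1]
v = [1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 1]
assert distance(u, v) == 5
# d(u, v) = w(u + v): the distance equals the weight of the GF(2) sum.
assert distance(u, v) == weight([a ^ b for a, b in zip(u, v)])
```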

18
Minimum Distance of a Linear Code

◊ The set of all code vectors of a linear code forms a
  subspace of the n-tuple space.
◊ If u and v are two code vectors, then u + v must also be a
  code vector.
◊ Therefore, the distance d(u, v) between two code vectors
  equals the weight of a third code vector w = u + v:
  d(u, v) = w(u + v) = w(w)
◊ Thus, the minimum distance of a linear code equals the
  minimum weight of its non-zero code vectors.
◊ A code with minimum distance dmin can be shown to
  correct ⌊(dmin - 1)/2⌋ erroneous bits and detect (dmin - 1)
  erroneous bits.
19
Example of Minimum Distance

dmin=3
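The slide's figure is not reproduced here, but dmin = 3 can be verified by enumerating the (6, 3) code from slide 10 and taking the minimum non-zero weight (brute force is fine for a toy code, though infeasible for large k):

```python
from itertools import product

G = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 1, 0, 0, 1]]

def encode(m, G):
    """u = mG over GF(2)."""
    u = [0] * len(G[0])
    for mi, row in zip(m, G):
        if mi:
            u = [a ^ b for a, b in zip(u, row)]
    return u

# Minimum distance of a linear code = minimum weight of its non-zero code vectors.
codewords = [encode(m, G) for m in product([0, 1], repeat=3)]
d_min = min(sum(c) for c in codewords if any(c))
assert d_min == 3
assert (d_min - 1) // 2 == 1     # corrects 1 error, detects d_min - 1 = 2 errors
```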
20
Example of Error Correction and Detection Capability

(Figure: two code vectors u and v with d_min(u, v) = 7.)

    t_max = ⌊(d_min - 1)/2⌋ : Error Correcting Strength

    e_max = d_min - 1 : Error Detecting Strength
21
Convolutional Code Structure

(Figure: a convolutional encoder built from K shift-register stages of k bits
each; k bits are shifted in at a time, and n modulo-2 adders form the n
output bits.)

22
Convoltuional Code

◊ Convolutional codes:
  ◊ k = number of bits shifted into the encoder at one time
    ◊ k = 1 is usually used.
  ◊ n = number of encoder output bits corresponding to the k
    information bits
  ◊ r = k/n = code rate
  ◊ K = constraint length (encoder memory)
  ◊ Each encoded bit is a function of the present input bits and
    the past ones.

23
Generator Sequence

◊ Encoder with input u, shift registers r0, r1, r2, and output v:

    g0(1) = 1, g1(1) = 0, g2(1) = 1, and g3(1) = 1.
    Generator sequence: g(1) = (1 0 1 1)

◊ Encoder with input u, shift registers r0, r1, r2, r3, and output v:

    g0(2) = 1, g1(2) = 1, g2(2) = 1, g3(2) = 0, and g4(2) = 1.
    Generator sequence: g(2) = (1 1 1 0 1)

24
Convolutional Codes
An Example (rate = 1/2 with K = 2)

    G1(x) = 1 + x^2
    G2(x) = 1 + x + x^2

    Input  Present State  Next State  Output
      0         00            00        00
      1         00            10        11
      0         01            00        11
      1         01            10        00
      0         10            01        01
      1         10            11        10
      0         11            01        10
      1         11            11        01

(Figure: the corresponding state diagram over states 00, 01, 10, 11, with
each branch labeled input(output), e.g. 0(00) for the self-loop at state 00.)
25
Trellis Diagram Representation

Trellis termination: K tail bits with value 0 are usually added to the end of the code.
26
Encoding Process

Input: 1 0 1 1 1 0 0
Output: 11 01 00 10 01 10 11
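This encoding can be reproduced with a short sketch of the rate-1/2 encoder (the function and tap names are ours; the taps come from G1(x) = 1 + x^2 and G2(x) = 1 + x + x^2):

```python
def conv_encode(bits, g1=(1, 0, 1), g2=(1, 1, 1)):
    """Rate-1/2 convolutional encoder for the slides' example code.

    The state is the register contents (x1, x2); each input bit u produces
    two output bits: u*g[0] + x1*g[1] + x2*g[2] over GF(2) for each branch."""
    x1 = x2 = 0
    out = []
    for u in bits:
        out.append(u & g1[0] ^ x1 & g1[1] ^ x2 & g1[2])   # G1 branch: u + x2
        out.append(u & g2[0] ^ x1 & g2[1] ^ x2 & g2[2])   # G2 branch: u + x1 + x2
        x1, x2 = u, x1                                    # shift the register
    return out

# Input 1 0 1 1 1 0 0 (the two trailing zeros terminate the trellis):
assert conv_encode([1, 0, 1, 1, 1, 0, 0]) == [
    1, 1, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1]            # 11 01 00 10 01 10 11
```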

27
Viterbi Decoding Algorithm

◊ Maximum Likelihood (ML) decoding rule:
  given the received sequence r, choose the detected sequence d
  that minimizes the distance d(d, r).

◊ Viterbi Decoding Algorithm:
  ◊ An efficient search algorithm
  ◊ Performs the ML decoding rule.
  ◊ Reduces the computational complexity.

28
Viterbi Decoding Algorithm

◊ Basic concept:
  ◊ Generate the code trellis at the decoder.
  ◊ The decoder penetrates through the code trellis level by level
    in search of the transmitted code sequence.
  ◊ At each level of the trellis, the decoder computes and compares
    the metrics of all the partial paths entering a node.
  ◊ The decoder stores the partial path with the best metric and
    eliminates all the other partial paths. The stored partial path
    is called the survivor.

29
Viterbi Decoding Algorithm

(Trellis figures, slides 30-37: the decoder is stepped through the received
sequence one level per slide.)

    Code sequence: 11 01 00 10 01 10 11
    Received:      11 11 00 10 01 11 11

At each level, the accumulated Hamming metric of the survivor entering each
state (00, 01, 10, 11) is updated; for example, after the second level the
survivor metrics are 4, 1, 2, 1. The trellis is terminated with tail zeros,
so the path must end in state 00, where the final metric is 2. Tracing that
survivor back yields the decision 11 01 00 10 01 10 11, i.e. both channel
errors are corrected.
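The whole trellis search above can be condensed into a short hard-decision Viterbi sketch for this toy code (a dictionary-based illustration with our own names, not an optimized decoder):

```python
def viterbi_decode(received, n_bits):
    """Hard-decision Viterbi decoding for the slides' rate-1/2, K=2 code
    (G1 = 1 + x^2, G2 = 1 + x + x^2). `received` is a flat bit list;
    `n_bits` is the number of input bits, including the two tail zeros."""
    def step(state, u):
        # From state (x1, x2) with input u: next state and output bit pair.
        x1, x2 = state
        return (u, x1), (u ^ x2, u ^ x1 ^ x2)

    INF = float("inf")
    metric = {(0, 0): 0, (0, 1): INF, (1, 0): INF, (1, 1): INF}
    paths = {s: [] for s in metric}
    for i in range(n_bits):
        r = received[2 * i: 2 * i + 2]
        new_metric = {s: INF for s in metric}
        new_paths = {}
        for s in metric:
            if metric[s] == INF:
                continue
            for u in (0, 1):
                ns, out = step(s, u)
                m = metric[s] + (out[0] ^ r[0]) + (out[1] ^ r[1])
                if m < new_metric[ns]:          # keep only the survivor
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [u]
        metric, paths = new_metric, new_paths
    # The trellis is terminated with tail zeros, so the path ends in state 00.
    return paths[(0, 0)], metric[(0, 0)]

rx = [1, 1, 1, 1, 0, 0, 1, 0, 0, 1, 1, 1, 1, 1]   # received: 11 11 00 10 01 11 11
bits, m = viterbi_decode(rx, 7)
assert bits == [1, 0, 1, 1, 1, 0, 0] and m == 2    # both errors corrected
```

The final metric of 2 matches the survivor metric at state 00 in the last trellis slide, and the decoded bits reproduce the original input 1 0 1 1 1 0 0.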
