DISCRETE
MATHEMATICS
and
ITS APPLICATIONS Series Editor
Kenneth H. Rosen, Ph.D.
AT&T Laboratories
Middletown, New Jersey

Abstract Algebra Applications with Maple,


Richard E. Klima, Ernest Stitzinger, and Neil P. Sigmon
Algebraic Number Theory, Richard A. Mollin
An Atlas of The Smaller Maps in Orientable and Nonorientable Surfaces,
David M. Jackson and Terry I. Visentin
An Introduction to Cryptography, Richard A. Mollin
Combinatorial Algorithms: Generation, Enumeration, and Search,
Donald L. Kreher and Douglas R. Stinson
The CRC Handbook of Combinatorial Designs,
Charles J. Colbourn and Jeffrey H. Dinitz
Cryptography: Theory and Practice, Second Edition, Douglas R. Stinson
Design Theory, Charles C. Lindner and Christopher A. Rodgers
Frames and Resolvable Designs: Uses, Constructions, and Existence,
Steven Furino, Ying Miao, and Jianxing Yin
Fundamental Number Theory with Applications, Richard A. Mollin
Graph Theory and Its Applications, Jonathan Gross and Jay Yellen
Handbook of Applied Cryptography,
Alfred J. Menezes, Paul C. van Oorschot, and Scott A. Vanstone
Handbook of Constrained Optimization,
Herbert B. Shulman and Venkat Venkateswaran
Handbook of Discrete and Combinatorial Mathematics, Kenneth H. Rosen
Handbook of Discrete and Computational Geometry,
Jacob E. Goodman and Joseph O’Rourke
Introduction to Information Theory and Data Compression,
Darrel R. Hankerson, Greg A. Harris, and Peter D. Johnson
Continued Titles
Network Reliability: Experiments with a Symbolic Algebra Environment,
Daryl D. Harms, Miroslav Kraetzl, Charles J. Colbourn, and John S. Devitt
RSA and Public-Key Cryptography,
Richard A. Mollin
Quadratics, Richard A. Mollin
Verification of Computer Codes in Computational Science and Engineering,
Patrick Knupp and Kambiz Salari
DISCRETE MATHEMATICS AND ITS APPLICATIONS
Series Editor KENNETH H. ROSEN

FUNDAMENTALS of
INFORMATION THEORY
and CODING DESIGN

Roberto Togneri
Christopher J.S. deSilva

CHAPMAN & HALL/CRC


A CRC Press Company
Boca Raton London New York Washington, D.C.
Library of Congress Cataloging-in-Publication Data

Togneri, Roberto.
Fundamentals of information theory and coding design / Roberto Togneri and
Christopher J.S. deSilva.
p. cm. — (Discrete mathematics and its applications)
Includes bibliographical references and index.
ISBN (alk. paper)
1. Information theory. 2. Coding theory. I. DeSilva, Christopher J.S. II. Title.
III. CRC Press series on discrete mathematics and its applications.

This edition published in the Taylor & Francis e-Library, 2006.


“To purchase your own copy of this or any of Taylor & Francis or Routledge’s
collection of thousands of eBooks please go to www.eBookstore.tandf.co.uk.”

This book contains information obtained from authentic and highly regarded sources. Reprinted material
is quoted with permission, and sources are indicated. A wide variety of references are listed. Reasonable
efforts have been made to publish reliable data and information, but the author and the publisher cannot
assume responsibility for the validity of all materials or for the consequences of their use.

Neither this book nor any part may be reproduced or transmitted in any form or by any means, electronic
or mechanical, including photocopying, microfilming, and recording, or by any information storage or
retrieval system, without prior permission in writing from the publisher.
The consent of CRC Press LLC does not extend to copying for general distribution, for promotion, for
creating new works, or for resale. Specific permission must be obtained in writing from CRC Press LLC
for such copying.
Direct all inquiries to CRC Press LLC, N.W. Corporate Blvd., Boca Raton, Florida.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are
used only for identification and explanation, without intent to infringe.

Visit the CRC Press Web site at www.crcpress.com

© by Chapman & Hall/CRC

No claim to original U.S. Government works
Printed in the United States of America
Printed on acid-free paper
Preface

What is information? How do we quantify or measure the amount of information


that is present in a file of data, or a string of text? How do we encode the information
so that it can be stored efficiently, or transmitted reliably?
The main concepts and principles of information theory were developed by Claude E.
Shannon in the 1940s. Yet only now, and thanks to the emergence of the information
age and digital communication, are the ideas of information theory being looked at
again in a new light. Because of information theory and the results arising from
coding theory we now know how to quantify information, how we can efficiently
encode it and how reliably we can transmit it.
This book introduces the main concepts behind how we model information sources
and channels, how we code sources for efficient storage and transmission, and the
fundamentals of coding theory and applications to state-of-the-art error correcting
and error detecting codes.
This textbook has been written for upper level undergraduate students and graduate
students in mathematics, engineering and computer science. Most of the material
presented in this text was developed over many years at The University of West-
ern Australia in the unit Information Theory and Coding 314, which was a core unit
for students majoring in Communications and Electrical and Electronic Engineering,
and was a unit offered to students enrolled in the Master of Engineering by Course-
work and Dissertation in the Intelligent Information Processing Systems course.
The number of books on the market dealing with information theory and coding has
been on the rise over the past five years. However, very few, if any, of these books
have been able to cover the fundamentals of the theory without losing the reader in
complex mathematical abstractions. And fewer still are able to provide the
important theoretical framework when discussing the algorithms and implementa-
tion details of modern coding systems. This book does not abandon the theoretical
foundations of information and coding theory and presents working algorithms and
implementations which can be used to fabricate and design real systems. The main
emphasis is on the underlying concepts that govern information theory and the nec-
essary mathematical background that describe modern coding systems. One of the
strengths of the book is the many worked examples appearing throughout the text,
which allow the reader to immediately understand the concept being explained, or the
algorithm being described. These are backed up by fairly comprehensive exercise
sets at the end of each chapter (including exercises identified by an * which are more
advanced or challenging).


The material in the book has been selected for completeness and to present a balanced
coverage. There is discussion of cascading of information channels and additivity
of information which is rarely found in modern texts. Arithmetic coding is fully
explained with both worked examples for encoding and decoding. The connection
between coding of extensions and Markov modelling is clearly established (this is
usually not apparent in other textbooks). Three complete chapters are devoted to
block codes for error detection and correction. A large part of these chapters deals
with an exposition of the concepts from abstract algebra that underpin the design of
these codes. We decided that this material should form part of the main text (rather
than be relegated to an appendix) to emphasise the importance of understanding the
mathematics of these and other advanced coding strategies.

Chapter 1 introduces the concepts of entropy and information sources and explains
how information sources are modelled. In Chapter 2 this analysis is extended to
information channels where the concept of mutual information is introduced and
channel capacity is discussed. Chapter 3 covers source coding for efficient storage
and transmission with an introduction to the theory and main concepts, a discussion
of Shannon’s Noiseless Coding Theorem and details of the Huffman and arithmetic
coding algorithms. Chapter 4 provides the basic principles behind the various com-
pression algorithms including run-length coding and dictionary coders. Chapter 5
introduces the fundamental principles of channel coding, the importance of the Ham-
ming distance in the analysis and design of codes and a statement of what Shannon’s
Fundamental Coding Theorem tells us we can do with channel codes. Chapter 6
introduces the algebraic concepts of groups, rings, fields and linear spaces over the
binary field and introduces binary block codes. Chapter 7 provides the details of the
theory of rings of polynomials and cyclic codes and describes how to analyse and
design various linear cyclic codes including Hamming codes, Cyclic Redundancy
Codes and Reed-Muller codes. Chapter 8 deals with burst-correcting codes and de-
scribes the design of Fire codes, BCH codes and Reed-Solomon codes. Chapter 9
completes the discussion on channel coding by describing the convolutional encoder,
decoding of convolutional codes, trellis modulation and Turbo codes.

This book can be used as a textbook for a one semester undergraduate course in in-
formation theory and source coding (all of Chapters 1 to 4), a one semester graduate
course in coding theory (all of Chapters 5 to 9) or as part of a one semester under-
graduate course in communications systems covering information theory and coding
(selected material from Chapters 1, 2, 3, 5, 6 and 7).

We would like to thank Sean Davey and Nishith Arora for their help with the LaTeX
formatting of the manuscript. We would also like to thank Ken Rosen for his review
of our draft manuscript and his many helpful suggestions and Sunil Nair from CRC
Press for encouraging us to write this book in the first place!

Our examples on arithmetic coding were greatly facilitated by the use of the conver-
sion calculator (which is one of the few that can handle fractions!) made available
by www.math.com.

The manuscript was written in LaTeX and we are indebted to the open source software
community for developing such a powerful text processing environment. We are
especially grateful to the developers of LyX (www.lyx.org) for making writing the
document that much more enjoyable and to the makers of xfig (www.xfig.org) for
providing such an easy-to-use drawing package.

Roberto Togneri
Chris deSilva
Contents

1 Entropy and Information 1


1.1 Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Structure in Randomness . . . . . . . . . . . . . . . . . . . . . . . 2
1.3 First Concepts of Probability Theory . . . . . . . . . . . . . . . . 2
1.4 Surprise and Entropy . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.5 Units of Entropy . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
1.6 The Minimum and Maximum Values of Entropy . . . . . . . . . . 7
1.7 A Useful Inequality . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.8 Joint Probability Distribution Functions . . . . . . . . . . . . . . . 10
1.9 Conditional Probability and Bayes’ Theorem . . . . . . . . . . . . 12
1.10 Conditional Probability Distributions and Conditional Entropy . . . 14
1.11 Information Sources . . . . . . . . . . . . . . . . . . . . . . . . . 16
1.12 Memoryless Information Sources . . . . . . . . . . . . . . . . . . 18
1.13 Markov Sources and n-gram Models . . . . . . . . . . . . . . . . . 19
1.14 Stationary Distributions . . . . . . . . . . . . . . . . . . . . . . . 23
1.15 The Entropy of Markov Sources . . . . . . . . . . . . . . . . . . . 27
1.16 Sequences of Symbols . . . . . . . . . . . . . . . . . . . . . . . . 29
1.17 The Adjoint Source of a Markov Source . . . . . . . . . . . . . . . 31
1.18 Extensions of Sources . . . . . . . . . . . . . . . . . . . . . . . . 34
1.19 Infinite Sample Spaces . . . . . . . . . . . . . . . . . . . . . . . . 41
1.20 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
1.21 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49

2 Information Channels 51
2.1 What Are Information Channels? . . . . . . . . . . . . . . . . . . 51
2.2 BSC and BEC Channels . . . . . . . . . . . . . . . . . . . . . . . 54
2.3 Mutual Information . . . . . . . . . . . . . . . . . . . . . . . . . . 56
2.3.1 Importance of Mutual Information . . . . . . . . . . . . . . 61
2.3.2 Properties of the Mutual Information . . . . . . . . . . . . 61
2.4 Noiseless and Deterministic Channels . . . . . . . . . . . . . . . . 66
2.4.1 Noiseless Channels . . . . . . . . . . . . . . . . . . . . . . 66
2.4.2 Deterministic Channels . . . . . . . . . . . . . . . . . . . . 68
2.5 Cascaded Channels . . . . . . . . . . . . . . . . . . . . . . . . . . 69
2.6 Additivity of Mutual Information . . . . . . . . . . . . . . . . . . 73
2.7 Channel Capacity: Maximum Mutual Information . . . . . . . . . 78
2.7.1 Channel Capacity of a BSC . . . . . . . . . . . . . . . . . 79
2.7.2 Channel Capacity of a BEC . . . . . . . . . . . . . . . . . 80


2.7.3 Channel Capacity of Weakly Symmetric Channels . . . . . 81


2.8 Continuous Channels and Gaussian Channels . . . . . . . . . . . . 83
2.9 Information Capacity Theorem . . . . . . . . . . . . . . . . . . . . 85
2.10 Rate Distortion Theory . . . . . . . . . . . . . . . . . . . . . . . . 88
2.10.1 Properties of  . . . . . . . . . . . . . . . . . . . . . . 93
2.11 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
2.12 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103

3 Source Coding 105


3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
3.2 Instantaneous Codes . . . . . . . . . . . . . . . . . . . . . . . . . 107
3.2.1 Construction of Instantaneous Codes . . . . . . . . . . . . . 110
3.2.2 Decoding Instantaneous Codes . . . . . . . . . . . . . . . . 112
3.2.3 Properties of Instantaneous Codes . . . . . . . . . . . . . . 113
3.2.4 Sensitivity to Bit Errors . . . . . . . . . . . . . . . . . . . 113
3.3 The Kraft Inequality and McMillan’s Theorem . . . . . . . . . . . 115
3.3.1 The Kraft Inequality . . . . . . . . . . . . . . . . . . . . . 115
3.3.2 McMillan’s Theorem . . . . . . . . . . . . . . . . . . . . . 118
3.4 Average Length and Compact Codes . . . . . . . . . . . . . . . . . 121
3.4.1 Average Length . . . . . . . . . . . . . . . . . . . . . . . . 121
3.4.2 Lower Bound on Average Length . . . . . . . . . . . . . . 122
3.5 Shannon’s Noiseless Coding Theorem . . . . . . . . . . . . . . . . 125
3.5.1 Shannon’s Theorem for Zero-Memory Sources . . . . . . . 125
3.5.2 Shannon’s Theorem for Markov Sources . . . . . . . . . . 129
3.5.3 Code Efficiency and Channel Capacity . . . . . . . . . . . 131
3.6 Fano Coding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 133
3.7 Huffman Coding . . . . . . . . . . . . . . . . . . . . . . . . . . . 136
3.7.1 Huffman Codes . . . . . . . . . . . . . . . . . . . . . . . . 136
3.7.2 Binary Huffman Coding Algorithm . . . . . . . . . . . . . 137
3.7.3 Software Implementation of Binary Huffman Coding . . . . 142
3.7.4  -ary Huffman Codes . . . . . . . . . . . . . . . . . . . . 142
3.8 Arithmetic Coding . . . . . . . . . . . . . . . . . . . . . . . . . . 146
3.8.1 Encoding and Decoding Algorithms . . . . . . . . . . . . . 149
3.8.2 Encoding and Decoding with Scaling . . . . . . . . . . . . 154
3.8.3 Is Arithmetic Coding Better Than Huffman Coding? . . . . 156
3.9 Higher-order Modelling . . . . . . . . . . . . . . . . . . . . . . . 157
3.9.1 Higher-order Huffman Coding . . . . . . . . . . . . . . . . 158
3.9.2 Higher-order Arithmetic Coding . . . . . . . . . . . . . . . 159
3.10 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
3.11 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 168

4 Data Compression 171


4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
4.2 Basic Concepts of Data Compression . . . . . . . . . . . . . . . . 172
4.3 Run-length Coding . . . . . . . . . . . . . . . . . . . . . . . . . . 173

4.4 The CCITT Standard for Facsimile Transmission . . . . . . . . . . 174


4.5 Block-sorting Compression . . . . . . . . . . . . . . . . . . . . . 176
4.6 Dictionary Coding . . . . . . . . . . . . . . . . . . . . . . . . . . 179
4.7 Statistical Compression . . . . . . . . . . . . . . . . . . . . . . . 185
4.8 Prediction by Partial Matching . . . . . . . . . . . . . . . . . . . . 186
4.9 Image Coding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187
4.10 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 194
4.11 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196

5 Fundamentals of Channel Coding 199


5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199
5.2 Code Rate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
5.3 Decoding Rules . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
5.4 Hamming Distance . . . . . . . . . . . . . . . . . . . . . . . . . . 206
5.4.1 Hamming Distance Decoding Rule for BSCs . . . . . . . . 207
5.4.2 Error Detection/Correction Using the Hamming Distance . . 208

5.5 Bounds on , Maximal Codes and Perfect Codes . . . . . . . . . 213
5.5.1 Upper Bounds on and the Hamming Bound . . . . . . . 213
5.5.2 Maximal Codes and the Gilbert Bound . . . . . . . . . . . 217

5.5.3 Redundancy Requirements for -bit Error Correction . . . . 219

5.5.4 Perfect Codes for -bit Error Correction . . . . . . . . . . . 220
5.6 Error Probabilities . . . . . . . . . . . . . . . . . . . . . . . . . . 222
5.6.1 Bit and Block Error Probabilities and Code Rate . . . . . . 223
5.6.2 Probability of Undetected Block Error . . . . . . . . . . . . 225
5.7 Shannon’s Fundamental Coding Theorem . . . . . . . . . . . . . . 227
5.8 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
5.9 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 234

6 Error-Correcting Codes 235


6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
6.2 Groups . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 236
6.3 Rings and Fields . . . . . . . . . . . . . . . . . . . . . . . . . . . 242
6.4 Linear Spaces . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 246
6.5 Linear Spaces over the Binary Field . . . . . . . . . . . . . . . . . 251
6.6 Linear Codes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 255
6.7 Encoding and Decoding . . . . . . . . . . . . . . . . . . . . . . . 269
6.8 Codes Derived from Hadamard Matrices . . . . . . . . . . . . . . 272
6.9 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 274
6.10 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 278

7 Cyclic Codes 281


7.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 281
7.2 Rings of Polynomials . . . . . . . . . . . . . . . . . . . . . . . . . 281
7.3 Cyclic Codes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 291
7.4 Encoding and Decoding of Cyclic Codes . . . . . . . . . . . . . . 296

7.5 Encoding and Decoding Circuits for Cyclic Codes . . . . . . . . . 300


7.6 The Golay Code . . . . . . . . . . . . . . . . . . . . . . . . . . . 304
7.7 Hamming Codes . . . . . . . . . . . . . . . . . . . . . . . . . . . 304
7.8 Cyclic Redundancy Check Codes . . . . . . . . . . . . . . . . . . 307
7.9 Reed-Muller Codes . . . . . . . . . . . . . . . . . . . . . . . . . . 309
7.10 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
7.11 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 318

8 Burst-Correcting Codes 319


8.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 319
8.2 Finite Fields . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 319
8.3 Irreducible Polynomials . . . . . . . . . . . . . . . . . . . . . . . 321
8.4 Construction of Finite Fields . . . . . . . . . . . . . . . . . . . . . 327
8.5 Bursts of Errors . . . . . . . . . . . . . . . . . . . . . . . . . . . . 337
8.6 Fire Codes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 337
8.7 Minimum Polynomials . . . . . . . . . . . . . . . . . . . . . . . . 338
8.8 Bose-Chaudhuri-Hocquenghem Codes . . . . . . . . . . . . . . . 340
8.9 Other Fields . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 342
8.10 Reed-Solomon Codes . . . . . . . . . . . . . . . . . . . . . . . . 345
8.11 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 347
8.12 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349

9 Convolutional Codes 351


9.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 351
9.2 A Simple Example . . . . . . . . . . . . . . . . . . . . . . . . . . 351
9.3 Binary Convolutional Codes . . . . . . . . . . . . . . . . . . . . . 356
9.4 Decoding Convolutional Codes . . . . . . . . . . . . . . . . . . . 360
9.5 The Viterbi Algorithm . . . . . . . . . . . . . . . . . . . . . . . . 361
9.6 Sequential Decoding . . . . . . . . . . . . . . . . . . . . . . . . . 367
9.7 Trellis Modulation . . . . . . . . . . . . . . . . . . . . . . . . . . 371
9.8 Turbo Codes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 375
9.9 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 377
9.10 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 379

Index 381
Chapter 1
Entropy and Information

1.1 Structure

Structure is a concept of which we all have an intuitive understanding. However,


it is not easy to articulate that understanding and give a precise definition of what
structure is. We might try to explain structure in terms of such things as regularity,
predictability, symmetry and permanence. We might also try to describe what struc-
ture is not, using terms such as featureless, random, chaotic, transient and aleatory.
Part of the problem of trying to define structure is that there are many different kinds
of behaviour and phenomena which might be described as structured, and finding a
definition that covers all of them is very difficult.
Consider the distribution of the stars in the night sky. Overall, it would appear that
this distribution is random, without any structure. Yet people have found patterns in
the stars and imposed a structure on the distribution by naming constellations.
Again, consider what would happen if you took the pixels on the screen of your
computer when it was showing a complicated and colourful scene and strung them
out in a single row. The distribution of colours in this single row of pixels would
appear to be quite arbitrary, yet the complicated pattern of the two-dimensional array
of pixels would still be there.
These two examples illustrate the point that we must distinguish between the pres-
ence of structure and our perception of structure. In the case of the constellations,
the structure is imposed by our brains. In the case of the picture on our computer
screen, we can only see the pattern if the pixels are arranged in a certain way.
Structure relates to the way in which things are put together, the way in which the
parts make up the whole. Yet there is a difference between the structure of, say, a
bridge and that of a piece of music. The parts of the Golden Gate Bridge or the
Sydney Harbour Bridge are solid and fixed in relation to one another. Seeing one
part of the bridge gives you a good idea of what the rest of it looks like.
The structure of pieces of music is quite different. The notes of a melody can be
arranged according to the whim or the genius of the composer. Having heard part
of the melody you cannot be sure of what the next note is going to be, let alone


any other part of the melody. In fact, pieces of music often have a complicated,
multi-layered structure, which is not obvious to the casual listener.
In this book, we are going to be concerned with things that have structure. The kinds
of structure we will be concerned with will be like the structure of pieces of music.
They will not be fixed and obvious.

1.2 Structure in Randomness


Structure may be present in phenomena that appear to be random. When it is present,
it makes the phenomena more predictable. Nevertheless, the fact that randomness is
present means that we have to talk about the phenomena in terms of probabilities.
Let us consider a very simple example of how structure can make a random phe-
nomenon more predictable. Suppose we have a fair die. The probability of any face
coming up when the die is thrown is 1/6. In this case, it is not possible to predict
which face will come up more than one-sixth of the time, on average.
On the other hand, if we have a die that has been biased, this introduces some struc-
ture into the situation. Suppose that the biasing has the effect of making the probabil-
ity of the face with six spots coming up 55/100, the probability of the face with one
spot coming up 5/100 and the probability of any other face coming up 1/10. Then
the prediction that the face with six spots will come up will be right more than half
the time, on average.
Another example of structure in randomness that facilitates prediction arises from
phenomena that are correlated. If we have information about one of the phenomena,
we can make predictions about the other. For example, we know that the IQ of iden-
tical twins is highly correlated. In general, we cannot make any reliable prediction
about the IQ of one of a pair of twins. But if we know the IQ of one twin, we can
make a reliable prediction of the IQ of the other.
In order to talk about structure in randomness in quantitative terms, we need to use
probability theory.

1.3 First Concepts of Probability Theory


To describe a phenomenon in terms of probability theory, we need to define a set
of outcomes, which is called the sample space. For the present, we will restrict
consideration to sample spaces which are finite sets.

DEFINITION 1.1 Probability Distribution A probability distribution on a
sample space $S = \{s_1, s_2, \ldots, s_N\}$ is a function $P$ that assigns a probability
to each outcome in the sample space. $P$ is a map from $S$ to the unit interval,
$P: S \to [0,1]$, which must satisfy $\sum_{i=1}^{N} P(s_i) = 1$.

DEFINITION 1.2 Events Events are subsets of the sample space.

We can extend a probability distribution $P$ from $S$ to the set of all subsets of $S$,
which we denote by $2^S$, by setting $P(E) = \sum_{s_i \in E} P(s_i)$ for any $E \in 2^S$. Note
that $P(\emptyset) = 0$.
An event whose probability is $0$ is impossible and an event whose probability is $1$ is
certain to occur.
If $E$ and $F$ are events and $E \cap F = \emptyset$ then $P(E \cup F) = P(E) + P(F)$.
DEFINITION 1.3 Expected Value If $S = \{s_1, s_2, \ldots, s_N\}$ is a sample space
with probability distribution $P$, and $F: S \to V$ is a function from the sample space
to a vector space $V$, the expected value of $F$ is $E(F) = \sum_{i=1}^{N} P(s_i) F(s_i)$.

NOTE We will often have equations that involve summation over the elements
of a finite set. In the equations above, the set has been $S = \{s_1, s_2, \ldots, s_N\}$ and
the summation has been denoted by $\sum_{i=1}^{N}$. In other places in the text we will denote
such summations simply by $\sum_{s \in S}$.

1.4 Surprise and Entropy


In everyday life, events can surprise us. Usually, the more unlikely or unexpected
an event is, the more surprising it is. We can quantify this idea using a probability
distribution.

DEFINITION 1.4 Surprise If $E$ is an event in a sample space $S$, we define the
surprise of $E$ to be $s(E) = -\log P(E)$.
Events for which $P(E) = 1$, which are certain to occur, have zero surprise, as we
would expect, and events that are impossible, that is, for which $P(E) = 0$, have
infinite surprise.

Defining the surprise as the negative logarithm of the probability not only gives us the
appropriate limiting values as the probability tends to $0$ or $1$, it also makes surprise
additive. If several independent events occur in succession, the total surprise they
generate is the sum of their individual surprises.

DEFINITION 1.5 Entropy We can restrict the surprise to the sample space
and consider it to be a function from the sample space to the real numbers. The
expected value of the surprise is the entropy of the probability distribution.

If the sample space is $S = \{s_1, s_2, \ldots, s_N\}$, with probability distribution $P$, the
entropy of the probability distribution is given by

$H(P) = -\sum_{i=1}^{N} P(s_i) \log P(s_i)$   (1.1)




The concept of entropy was introduced into thermodynamics in the nineteenth cen-
tury. It was considered to be a measure of the extent to which a system was disor-
dered. The tendency of systems to become more disordered over time is described by
the Second Law of Thermodynamics, which states that the entropy of a system can-
not spontaneously decrease. In the 1940’s, Shannon [6] introduced the concept into
communications theory and founded the subject of information theory. It was then
realised that entropy is a property of any stochastic system and the concept is now
used widely in many fields. Today, information theory (as described in books such
as [1], [2], [3]) is still principally concerned with communications systems, but there
are widespread applications in statistics, information processing and computing (see
[2], [4], [5]).
Let us consider some examples of probability distributions and see how the entropy is
related to predictability. First, let us note the form of the function $-p \log p$,
where $0 \le p \le 1$ and $\log$ denotes the logarithm to base 2. (The actual base does not
matter, but we shall be using base 2 throughout the rest of this book, so we may as
well start here.) The graph of this function is shown in Figure 1.1.
Note that $-p \log p$ approaches $0$ as $p$ tends to $0$ and also as $p$ tends to $1$. This
means that outcomes that are almost certain to occur and outcomes that are unlikely
to occur both contribute little to the entropy. Outcomes whose probability is of
intermediate value make a comparatively large contribution to the entropy.

EXAMPLE 1.1
$S = \{s_1, s_2\}$ with $P(s_1) = 0.5$ and $P(s_2) = 0.5$. The entropy is

$H(P) = -0.5 \log(0.5) - 0.5 \log(0.5) = 1$ bit.

In this case, $s_1$ and $s_2$ are equally likely to occur and the situation is as unpredictable
as it can be.
FIGURE 1.1
The graph of $-p \log p$.

EXAMPLE 1.2
$S = \{s_1, s_2\}$ with $P(s_1)$ close to $1$ and $P(s_2)$ close to $0$. The entropy is

$H(P) = -P(s_1) \log P(s_1) - P(s_2) \log P(s_2)$,

which is a small positive number.
In this case, the situation is more predictable, with $s_1$ more than thirty times more
likely to occur than $s_2$. The entropy is close to zero.

EXAMPLE 1.3
$S = \{s_1, s_2\}$ with $P(s_1) = 1$ and $P(s_2) = 0$. Using the convention that
$0 \log 0 = 0$, the entropy is $0$. The situation is entirely predictable, as $s_1$ always
occurs.

EXAMPLE 1.4
$S = \{s_1, s_2, \ldots, s_N\}$, with $P(s_i) = 1/N$ for $i = 1, 2, \ldots, N$. The entropy is
$\log N$ and the situation is as unpredictable as it can be.

EXAMPLE 1.5
$S = \{s_1, s_2, \ldots, s_N\}$, with $P(s_1)$ close to $1$ and the remaining probability shared
equally among $s_2, \ldots, s_N$.
The entropy is small and the situation is fairly predictable as $s_1$ will occur far more
frequently than any other outcome.

EXAMPLE 1.6
$S = \{s_1, s_2, \ldots, s_N\}$, with $P(s_1) = P(s_2)$ close to $0.5$ and the remaining outcomes
sharing a very small probability. The entropy is close to $1$ and the situation is about as predictable as in
Example 1.1 above, with outcomes $s_1$ and $s_2$ equally likely to occur and the others
very unlikely to occur.

Roughly speaking, a system whose entropy is $H$ is about as unpredictable as a system
with $2^H$ equally likely outcomes.
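The entropy of Equation (1.1) is easy to compute directly. The following Python sketch (an added illustration, not part of the original text) evaluates it for the distributions of Examples 1.1, 1.3 and 1.4, using the convention $0 \log 0 = 0$.

```python
# A minimal sketch: the entropy of Equation (1.1), with 0*log(0) taken as 0.
import math

def entropy(probs, base=2.0):
    """Return -sum(p * log(p)) over the distribution."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))        # fair coin: 1.0 bit (Example 1.1)
print(entropy([1.0, 0.0]))        # certain outcome: 0.0 bits (Example 1.3)
N = 8
print(entropy([1.0 / N] * N))     # uniform over N outcomes: log2(N) = 3.0 bits (Example 1.4)
```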

1.5 Units of Entropy


The units in which entropy is measured depend on the base of the logarithms used
to calculate it. If we use logarithms to the base 2, then the unit is the bit. If we
use natural logarithms (base $e$), the entropy is measured in natural units, sometimes
referred to as nits. Converting between the different units is simple.

PROPOSITION 1.1
If $H_e$ is the entropy of a probability distribution measured using natural logarithms,
and $H_2$ is the entropy of the same probability distribution measured using logarithms
to the base 2, then

$H_2 = H_e / \ln 2$   (1.2)

PROOF Let the sample space be $S = \{s_1, s_2, \ldots, s_N\}$, with probability distri-
bution $P$. For any positive number $x$,

$\ln x = \ln 2 \, \log_2 x$   (1.3)

It follows that

$H_e = -\sum_{i=1}^{N} P(s_i) \ln P(s_i)$
$= -\sum_{i=1}^{N} P(s_i) \ln 2 \, \log_2 P(s_i)$
$= \ln 2 \left( -\sum_{i=1}^{N} P(s_i) \log_2 P(s_i) \right)$
$= \ln 2 \; H_2$   (1.4)

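As a quick numerical illustration of Proposition 1.1 (an added sketch, using an arbitrary distribution), the entropy computed in natural units divided by $\ln 2$ equals the entropy in bits:

```python
# A small check of Proposition 1.1: H_bits = H_nats / ln(2).
import math

def entropy(probs, base):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

p = [0.5, 0.25, 0.125, 0.125]
H_bits = entropy(p, 2)
H_nats = entropy(p, math.e)
print(H_bits, H_nats / math.log(2))   # both print 1.75
```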
1.6 The Minimum and Maximum Values of Entropy


If we have a sample space $S$ with $N$ elements, and probability distribution $P$ on $S$,
it is convenient to denote the probability of $s_i \in S$ by $p_i$. We can construct a vector
in $\mathbb{R}^N$ consisting of the probabilities:

$p = (p_1, p_2, \ldots, p_N)^T$.

Because the probabilities have to add up to unity, the set of all probability distribu-
tions forms a simplex in $\mathbb{R}^N$, namely

$\Lambda = \{\, p \in \mathbb{R}^N : \sum_{i=1}^{N} p_i = 1, \; p_i \ge 0 \,\}$.

We can consider the entropy to be a function defined on this simplex. Since it is
a continuous function, extreme values will occur at the vertices of this simplex, at
points where all except one of the probabilities are zero. If $v$ is a vertex, then the
entropy there will be

$H(v) = -1 \log(1) - (N-1) \cdot 0 \log(0)$.

The logarithm of zero is not defined, but the limit of $p \log p$ as $p$ tends to $0$ ex-
ists and is equal to zero. If we take the limiting values, we see that at any vertex,
$H(v) = 0$. This is the minimum value of the entropy function.
The entropy function has a maximum value at an interior point of the simplex. To
find it we can use Lagrange multipliers.

THEOREM 1.1
If we have a sample space with $N$ elements, the maximum value of the entropy
function is $\log N$.

PROOF We want to find the maximum value of

$H(p_1, \ldots, p_N) = -\sum_{i=1}^{N} p_i \log p_i$   (1.5)

subject to the constraint

$\sum_{i=1}^{N} p_i = 1$   (1.6)

We introduce the Lagrange multiplier $\lambda$, and put

$F(p_1, \ldots, p_N, \lambda) = -\sum_{i=1}^{N} p_i \log p_i + \lambda \sum_{i=1}^{N} p_i$   (1.7)

To find the maximum value we have to solve

$\frac{\partial F}{\partial p_i} = 0$   (1.8)

for $i = 1, 2, \ldots, N$ and

$\frac{\partial F}{\partial \lambda} = 0$   (1.9)

The first condition gives

$-\log p_i - \log e + \lambda = 0$   (1.10)

so

$p_i = 2^{\lambda - \log e}$   (1.11)

for each $i$. The remaining condition gives

$\sum_{i=1}^{N} 2^{\lambda - \log e} = 1$   (1.12)

which can be solved for $\lambda$, or can be used directly to give

$p_i = 1/N$   (1.13)

for all $i$. Using these values for the $p_i$, we get

$H = -\sum_{i=1}^{N} (1/N) \log(1/N) = \log N$   (1.14)

1.7 A Useful Inequality

LEMMA 1.1


If $x_1, x_2, \ldots, x_N$ and $y_1, y_2, \ldots, y_N$ are all non-negative numbers that satisfy the
conditions $\sum_{i=1}^{N} x_i = 1$ and $\sum_{i=1}^{N} y_i = 1$, then

$-\sum_{i=1}^{N} x_i \log x_i \le -\sum_{i=1}^{N} x_i \log y_i$   (1.15)

with equality if and only if $x_i = y_i$ for all $i$.

PROOF We prove the result for the natural logarithm; the result for any other base
follows immediately from the identity

$\log x = \ln x / \ln 2$   (1.16)

It is a standard result about the logarithm function that

$\ln x \le x - 1$   (1.17)

for $x > 0$, with equality if and only if $x = 1$. Substituting $x = y_i / x_i$, we get

$\ln(y_i / x_i) \le y_i / x_i - 1$   (1.18)

with equality if and only if $x_i = y_i$. This holds for all $i = 1, 2, \ldots, N$, so if we
multiply by $x_i$ and sum over the $i$, we get

$\sum_{i=1}^{N} x_i \ln(y_i / x_i) \le \sum_{i=1}^{N} (y_i - x_i) = 0$   (1.19)

with equality if and only if $x_i = y_i$ for all $i$. So

$\sum_{i=1}^{N} x_i \ln y_i - \sum_{i=1}^{N} x_i \ln x_i \le 0$   (1.20)

which is the required result.

The inequality can also be written in the form

$\sum_{i=1}^{N} x_i \log(y_i / x_i) \le 0$   (1.21)

with equality if and only if $x_i = y_i$ for all $i$.
Note that putting $y_i = 1/N$ for all $i$ in this inequality gives us an alternative proof
that the maximum value of the entropy function is $\log N$.
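The inequality of Lemma 1.1 is easy to check numerically. The sketch below (added here, with made-up distributions $x$ and $y$) compares $-\sum_i x_i \log x_i$ with $-\sum_i x_i \log y_i$:

```python
# Numerical illustration of Lemma 1.1 with two made-up distributions.
import math

def cross_term(x, y):
    """Return -sum(x_i * log2(y_i)); with y = x this is the entropy of x."""
    return -sum(xi * math.log2(yi) for xi, yi in zip(x, y) if xi > 0)

x = [0.7, 0.2, 0.1]
y = [0.4, 0.4, 0.2]
print(cross_term(x, x))   # the entropy of x, about 1.157
print(cross_term(x, y))   # strictly larger, about 1.422
```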

1.8 Joint Probability Distribution Functions

There are many situations in which it is useful to consider sample spaces that are the
Cartesian product of two or more sets.

DEFINITION 1.6 Cartesian Product Let $S = \{s_1, s_2, \ldots, s_N\}$ and
$T = \{t_1, t_2, \ldots, t_M\}$ be two sets. The Cartesian product of $S$ and $T$ is the set
$S \times T = \{(s_i, t_j) : i = 1, \ldots, N, \; j = 1, \ldots, M\}$.
The extension to the Cartesian product of more than two sets is immediate.

DEFINITION 1.7 Joint Probability Distribution A joint probability distribution


is a probability distribution on the Cartesian product of a number of sets.

If we have $S$ and $T$ as above, then a joint probability distribution function assigns a
probability to each pair $(s_i, t_j)$. We can denote this probability by $p_{ij}$. Since these
values form a probability distribution, we have

$0 \le p_{ij} \le 1$   (1.22)

for $i = 1, \ldots, N$, $j = 1, \ldots, M$, and

$\sum_{i=1}^{N} \sum_{j=1}^{M} p_{ij} = 1$   (1.23)

If $P$ is the joint probability distribution function on $S \times T$, the definition of entropy
becomes

$H(P) = -\sum_{i=1}^{N} \sum_{j=1}^{M} P(s_i, t_j) \log P(s_i, t_j)$   (1.24)

If we want to emphasise the spaces $S$ and $T$, we will denote the entropy of the joint
probability distribution on $S \times T$ by $H(S \times T)$ or simply by $H(S, T)$. This is known
as the joint entropy of $S$ and $T$.
If there are probability distributions $P_S$ and $P_T$ on $S$ and $T$, respectively, and these
are independent, the joint probability distribution on $S \times T$ is given by

$P(s_i, t_j) = P_S(s_i) P_T(t_j)$   (1.25)

for $i = 1, \ldots, N$, $j = 1, \ldots, M$. If there are correlations between the $s_i$ and $t_j$, then this
formula does not apply.

DEFINITION 1.8 Marginal Distribution If $P$ is a joint probability distribution
function on $S \times T$, the marginal distribution on $S$ is $P_S : S \to [0,1]$ given by

$P_S(s_i) = \sum_{j=1}^{M} P(s_i, t_j)$   (1.26)

for $i = 1, \ldots, N$, and the marginal distribution on $T$ is $P_T : T \to [0,1]$ given by

$P_T(t_j) = \sum_{i=1}^{N} P(s_i, t_j)$   (1.27)

for $j = 1, \ldots, M$.

There is a simple relationship between the entropy of the joint probability distribution
function and that of the marginal distribution functions.

THEOREM 1.2
If $P$ is a joint probability distribution function on $S \times T$, and $P_S$ and $P_T$ are the
marginal distributions on $S$ and $T$, respectively, then

$H(P) \le H(P_S) + H(P_T)$   (1.28)

with equality if and only if the marginal distributions are independent.

PROOF

$H(P_S) = -\sum_{i=1}^{N} P_S(s_i) \log P_S(s_i) = -\sum_{i=1}^{N} \sum_{j=1}^{M} P(s_i, t_j) \log P_S(s_i)$   (1.29)

and similarly

$H(P_T) = -\sum_{i=1}^{N} \sum_{j=1}^{M} P(s_i, t_j) \log P_T(t_j)$   (1.30)

So

$H(P_S) + H(P_T) = -\sum_{i=1}^{N} \sum_{j=1}^{M} P(s_i, t_j) \log(P_S(s_i) P_T(t_j))$   (1.31)

Also,

$H(P) = -\sum_{i=1}^{N} \sum_{j=1}^{M} P(s_i, t_j) \log P(s_i, t_j)$   (1.32)

Since

$\sum_{i=1}^{N} \sum_{j=1}^{M} P(s_i, t_j) = 1$   (1.33)

and

$\sum_{i=1}^{N} \sum_{j=1}^{M} P_S(s_i) P_T(t_j) = \sum_{i=1}^{N} P_S(s_i) \sum_{j=1}^{M} P_T(t_j) = 1$   (1.34)

we can use the inequality of Lemma 1.1 to conclude that

$H(P) \le H(P_S) + H(P_T)$   (1.35)

with equality if and only if $P(s_i, t_j) = P_S(s_i) P_T(t_j)$ for all $i$ and $j$, that is, if the
two marginal distributions are independent.
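A small numerical check of Theorem 1.2 (an added sketch using a made-up joint distribution) computes the joint entropy and the sum of the marginal entropies:

```python
# Sketch: joint entropy is at most the sum of the marginal entropies.
import numpy as np

P = np.array([[0.30, 0.10],      # rows: outcomes of S, columns: outcomes of T
              [0.05, 0.55]])

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

P_S = P.sum(axis=1)              # marginal distribution on S
P_T = P.sum(axis=0)              # marginal distribution on T
print(H(P.ravel()))              # joint entropy, about 1.54
print(H(P_S) + H(P_T))           # sum of marginal entropies, about 1.91
```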

1.9 Conditional Probability and Bayes’ Theorem

DEFINITION 1.9 Conditional Probability If $S$ is a sample space with a prob-
ability distribution function $P$, and $A$ and $B$ are events in $S$, the conditional prob-
ability of $A$ given $B$ is

$P(A|B) = \dfrac{P(A \cap B)}{P(B)}$   (1.36)
It is obvious that

$P(A \cap B) = P(A|B) P(B) = P(B|A) P(A)$   (1.37)
Almost as obvious is one form of Bayes’ Theorem:

THEOREM 1.3
If $S$ is a sample space with a probability distribution function $P$, and $A$ and $B$ are
events in $S$, then

$P(A|B) = \dfrac{P(B|A) P(A)}{P(B)}$   (1.38)

Bayes’ Theorem is important because it enables us to derive probabilities of hypothe-


ses from observations, as in the following example.

EXAMPLE 1.7
We have two jars, A and B. Jar A contains 8 green balls and 2 red balls. Jar B contains
3 green balls and 7 red balls. One jar is selected at random and a ball is drawn from
it.
We have probabilities as follows. The set of jars forms one sample space, $\{A, B\}$,
with

$P(A) = 0.5, \quad P(B) = 0.5$,

as one jar is as likely to be chosen as the other.
The set of colours forms another sample space, $\{G, R\}$. The probability of
drawing a green ball is

$P(G) = 11/20 = 0.55$,

as 11 of the 20 balls in the jars are green. Similarly,

$P(R) = 9/20 = 0.45$.

We have a joint probability distribution over the colours of the balls and the jars with
the probability of selecting Jar A and drawing a green ball being given by

$P(A, G) = 0.5 \times 0.8 = 0.4$.

Similarly, we have the probability of selecting Jar A and drawing a red ball

$P(A, R) = 0.5 \times 0.2 = 0.1$,

the probability of selecting Jar B and drawing a green ball

$P(B, G) = 0.5 \times 0.3 = 0.15$,

and the probability of selecting Jar B and drawing a red ball

$P(B, R) = 0.5 \times 0.7 = 0.35$.

We have the conditional probabilities: given that Jar A was selected, the probability
of drawing a green ball is

$P(G|A) = 0.8$,

and the probability of drawing a red ball is

$P(R|A) = 0.2$.

Given that Jar B was selected, the corresponding probabilities are:

$P(G|B) = 0.3$

and

$P(R|B) = 0.7$.

We can now use Bayes' Theorem to work out the probability of having drawn from
either jar, given the colour of the ball that was drawn. If a green ball was drawn, the
probability that it was drawn from Jar A is

$P(A|G) = \dfrac{P(G|A) P(A)}{P(G)} = \dfrac{0.8 \times 0.5}{0.55} = \dfrac{8}{11} \approx 0.73$,

while the probability that it was drawn from Jar B is

$P(B|G) = \dfrac{P(G|B) P(B)}{P(G)} = \dfrac{0.3 \times 0.5}{0.55} = \dfrac{3}{11} \approx 0.27$.

If a red ball was drawn, the probability that it was drawn from Jar A is

$P(A|R) = \dfrac{P(R|A) P(A)}{P(R)} = \dfrac{0.2 \times 0.5}{0.45} = \dfrac{2}{9} \approx 0.22$,

while the probability that it was drawn from Jar B is

$P(B|R) = \dfrac{P(R|B) P(B)}{P(R)} = \dfrac{0.7 \times 0.5}{0.45} = \dfrac{7}{9} \approx 0.78$.

(In this case, we could have derived these conditional probabilities from the joint
probability distribution, but we chose not to do so to illustrate how Bayes' Theorem
allows us to go from the conditional probabilities of the colours given the jar selected
to the conditional probabilities of the jars selected given the colours drawn.)
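The calculation of Example 1.7 can be reproduced in a few lines of Python (an added sketch; the probabilities are those given in the example):

```python
# Sketch of Example 1.7: Bayes' Theorem for the two-jar experiment.
# Jar A: 8 green, 2 red; Jar B: 3 green, 7 red; each jar chosen with probability 0.5.
P_A, P_B = 0.5, 0.5
P_G_given_A, P_R_given_A = 0.8, 0.2
P_G_given_B, P_R_given_B = 0.3, 0.7

P_G = P_G_given_A * P_A + P_G_given_B * P_B      # 0.55
P_R = P_R_given_A * P_A + P_R_given_B * P_B      # 0.45

print(P_G_given_A * P_A / P_G)   # P(A | green) = 8/11, about 0.73
print(P_G_given_B * P_B / P_G)   # P(B | green) = 3/11, about 0.27
print(P_R_given_A * P_A / P_R)   # P(A | red)   = 2/9,  about 0.22
print(P_R_given_B * P_B / P_R)   # P(B | red)   = 7/9,  about 0.78
```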

1.10 Conditional Probability Distributions and Conditional Entropy


In this section, we have a joint probability distribution on a Cartesian product
  , where           and          , with marginal distri-
butions  and  .

DEFINITION 1.10 Conditional Probability of $s_i$ given $t_j$ For $s_i \in S$ and
$t_j \in T$, the conditional probability of $s_i$ given $t_j$ is

$P(s_i | t_j) = \dfrac{P(s_i, t_j)}{P_T(t_j)} = \dfrac{P(s_i, t_j)}{\sum_{k=1}^{N} P(s_k, t_j)}$   (1.39)

DEFINITION 1.11 Conditional Probability Distribution given $t_j$ For a fixed
$t_j$, the conditional probabilities $P(s_i | t_j)$ sum to 1 over $i$, so they form a probability
distribution on $S$, the conditional probability distribution given $t_j$. We will denote
this by $P(S | t_j)$.

DEFINITION 1.12 Conditional Entropy given $t_j$ The conditional entropy
given $t_j$ is the entropy of the conditional probability distribution on $S$ given $t_j$. It
will be denoted $H(S | t_j)$:

$H(S | t_j) = -\sum_{i=1}^{N} P(s_i | t_j) \log P(s_i | t_j)$   (1.40)

DEFINITION 1.13 Conditional Probability Distribution on $S$ given $T$ The
conditional probability distribution on $S$ given $T$ is the weighted average of the
conditional probability distributions given $t_j$ for all $t_j \in T$. It will be denoted $P(S|T)$:

$P(S|T)(s_i) = \sum_{j=1}^{M} P_T(t_j) P(s_i | t_j)$   (1.41)

DEFINITION 1.14 Conditional Entropy given $T$ The conditional entropy given
$T$ is the weighted average of the conditional entropies on $S$ given $t_j$ for all $t_j \in T$.
It will be denoted $H(S|T)$:

$H(S|T) = \sum_{j=1}^{M} P_T(t_j) H(S|t_j) = -\sum_{j=1}^{M} \sum_{i=1}^{N} P_T(t_j) P(s_i | t_j) \log P(s_i | t_j)$   (1.42)

Since $P(s_i, t_j) = P(s_i | t_j) P_T(t_j)$, we can re-write this as

$H(S|T) = -\sum_{j=1}^{M} \sum_{i=1}^{N} P(s_i, t_j) \log P(s_i | t_j)$   (1.43)

We now prove two simple results about the conditional entropies.

THEOREM 1.4
$H(S, T) = H(T) + H(S|T) = H(S) + H(T|S)$.

PROOF

$H(S, T) = -\sum_{i=1}^{N} \sum_{j=1}^{M} P(s_i, t_j) \log P(s_i, t_j)$
$= -\sum_{i=1}^{N} \sum_{j=1}^{M} P(s_i, t_j) \log [P(s_i | t_j) P_T(t_j)]$
$= -\sum_{i=1}^{N} \sum_{j=1}^{M} P(s_i, t_j) \log P(s_i | t_j) - \sum_{i=1}^{N} \sum_{j=1}^{M} P(s_i, t_j) \log P_T(t_j)$
$= H(S|T) - \sum_{j=1}^{M} P_T(t_j) \log P_T(t_j)$
$= H(S|T) + H(T)$   (1.44)

The proof of the other equality is similar.

THEOREM 1.5
$H(S|T) \le H(S)$, with equality if and only if $S$ and $T$ are independent.

PROOF From the previous theorem, $H(S, T) = H(T) + H(S|T)$.
From Theorem 1.2, $H(S, T) \le H(S) + H(T)$, with equality if and only if $S$ and
$T$ are independent.
So $H(T) + H(S|T) \le H(S) + H(T)$.
Subtracting $H(T)$ from both sides we get $H(S|T) \le H(S)$, with equality if and
only if $S$ and $T$ are independent.

This result is obviously symmetric in $S$ and $T$; so we also have $H(T|S) \le H(T)$,
with equality if and only if $S$ and $T$ are independent. We can sum up this result
by saying that conditioning reduces entropy or conditioning reduces uncertainty.
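The chain rule of Theorem 1.4 and the inequality of Theorem 1.5 can also be checked numerically. The added sketch below uses the same made-up joint distribution as the earlier sketch:

```python
# Sketch: H(S,T) = H(T) + H(S|T), and H(S|T) <= H(S).
import numpy as np

P = np.array([[0.30, 0.10],
              [0.05, 0.55]])     # rows: S, columns: T

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

P_T = P.sum(axis=0)
P_S = P.sum(axis=1)
cond = P / P_T                   # column j holds P(s_i | t_j), Equation (1.39)
H_S_given_T = -np.sum(P * np.log2(cond))   # Equation (1.43)
print(H(P.ravel()), H(P_T) + H_S_given_T)  # both about 1.54
print(H_S_given_T, H(P_S))                 # about 0.61 <= about 0.97
```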

1.11 Information Sources


Most of this book will be concerned with random sequences. Depending on the
context, such sequences may be called time series, (discrete) stochastic processes or

signals. The first term is used by statisticians, the second by mathematicians and the
third by engineers. This may reflect differences in the way these people approach the
subject: statisticians are primarily interested in describing such sequences in terms
of probability theory, mathematicians are interested in the behaviour of such series
and the ways in which they may be generated and engineers are interested in ways of
using such sequences and processing them to extract useful information from them.
A device or situation that produces such a sequence is called an information source.
The elements of the sequence are usually drawn from a finite set, which may be
referred to as the alphabet. The source can be considered to be emitting an element
of the alphabet at each instant of a sequence of instants in time. The elements of the
alphabet are referred to as symbols.

EXAMPLE 1.8
Tossing a coin repeatedly and recording the outcomes as heads (H) or tails (T) gives
us a random sequence whose alphabet is $\{H, T\}$.

EXAMPLE 1.9
Throwing a die repeatedly and recording the number of spots on the uppermost face
gives us a random sequence whose alphabet is $\{1, 2, 3, 4, 5, 6\}$.

EXAMPLE 1.10
Computers and telecommunications equipment generate sequences of bits which are
random sequences whose alphabet is $\{0, 1\}$.

EXAMPLE 1.11
A text in the English language is a random sequence whose alphabet is the set con-
sisting of the letters of the alphabet, the digits and the punctuation marks. While we
normally consider text to be meaningful rather than random, it is only possible to
predict which letter will come next in the sequence in probabilistic terms, in general.

The last example above illustrates the point that a random sequence may not appear
to be random at first sight. The difference between the earlier examples and the final
example is that in the English language there are correlations between each letter in
the sequence and those that precede it. In contrast, there are no such correlations in
the cases of tossing a coin or throwing a die repeatedly. We will consider both kinds
of information sources below.

An obvious question that is raised by the term “information source” is: What is the
“information” that the source produces? A second question, perhaps less obvious,
is: How can we measure the information produced by an information source?
An information source generates a sequence of symbols which has a certain degree
of unpredictability. The more unpredictable the sequence is, the more information
is conveyed by each symbol. The information source may impose structure on the
sequence of symbols. This structure will increase the predictability of the sequence
and reduce the information carried by each symbol.
The random behaviour of the sequence may be described by probability distribu-
tions over the alphabet. If the elements of the sequence are uncorrelated, a simple
probability distribution over the alphabet may suffice. In other cases, conditional
probability distributions may be required.
We have already seen that entropy is a measure of predictability. For an information
source, the information content of the sequence that it generates is measured by the
entropy per symbol. We can compute this if we make assumptions about the kinds
of structure that the information source imposes upon its output sequences.
To describe an information source completely, we need to specify both the alpha-
bet and the probability distribution that governs the generation of sequences. The
entropy of the information source with alphabet $A$ and probability distribution $P$
will be denoted by $H(P)$ in the following sections, even though it is actually the
entropy of the distribution $P$ rather than of the source itself. Later on, we will wish to concentrate on the alphabet and will use $H(A)$
to denote the entropy of the information source, on the assumption that the alphabet
will have a probability distribution associated with it.

1.12 Memoryless Information Sources


For a memoryless information source, there are no correlations between the outputs
of the source at different times. For each instant at which an output is emitted, there
is a probability distribution over the alphabet that describes the probability of each
symbol being emitted at that instant. If all the probability distributions are the same,
the source is said to be stationary. If we know these probability distributions, we can
calculate the information content of the sequence.

EXAMPLE 1.12
Tossing a fair coin gives us an example of a stationary memoryless information source.
At any instant, the probability distribution is given by $P(H) = 0.5$, $P(T) = 0.5$.
This probability distribution has an entropy of 1 bit; so the information content is 1
bit/symbol.

EXAMPLE 1.13
As an example of a non-stationary memoryless information source, suppose we have a

fair coin and a die with H painted on four faces and T painted on two faces. Tossing the
coin and throwing the die in alternation will create a memoryless information source
with alphabet $\{H, T\}$. Every time the coin is tossed, the probability distribution of
the outcomes is $P(H) = 0.5$, $P(T) = 0.5$, and every time the die is thrown, the
probability distribution is $P(H) = 2/3$, $P(T) = 1/3$.

The probability distribution of the outcomes of tossing the coin has an entropy of 1
bit. The probability distribution of the outcomes of throwing the die has an entropy
of 0.918 bits. The information content of the sequence is the average entropy per
symbol, which is 0.959 bits/symbol.
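The average entropy per symbol in Example 1.13 can be computed directly (an added sketch):

```python
# Sketch of Example 1.13: one symbol from a fair coin alternates with one
# symbol from a biased die with probabilities 2/3 and 1/3.
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

H_coin = entropy([0.5, 0.5])          # 1.0 bit
H_die = entropy([2 / 3, 1 / 3])       # about 0.918 bits
print((H_coin + H_die) / 2)           # about 0.959 bits/symbol
```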

Memoryless information sources are relatively simple. More realistic information


sources have memory, which is the property that the emission of a symbol at any
instant depends on one or more of the symbols that were generated before it.

1.13 Markov Sources and n-gram Models

Markov sources and n-gram models are descriptions of a class of information sources
with memory.

DEFINITION 1.15 Markov Source A Markov source consists of an alphabet $A$,
a set of states $\Sigma$, a set of transitions between states, a set of labels for the transitions
and two sets of probabilities. The first set of probabilities is the initial probability
distribution on the set of states, which determines the probabilities of sequences
starting with each symbol in the alphabet. The second set of probabilities is a set
of transition probabilities. For each pair of states, $\sigma_i$ and $\sigma_j$, the probability of a
transition from $\sigma_i$ to $\sigma_j$ is $P(\sigma_j | \sigma_i)$. (Note that these probabilities are fixed and do not
depend on time, so that there is an implicit assumption of stationarity.) The labels
on the transitions are symbols from the alphabet.

To generate a sequence, a state is selected on the basis of the initial probability distri-
bution. A transition from this state to another state (or to the same state) is selected
on the basis of the transition probabilities, and the label of this transition is output.
This process is repeated to generate the sequence of output symbols.
It is convenient to represent Markov models diagrammatically in the form of a graph,
with the states represented by vertices and the transitions by edges, as in the follow-
ing example.

FIGURE 1.2
Diagrammatic representation of a Markov source.

EXAMPLE 1.14
Consider a Markov source with alphabet $\{0, 1\}$ and set of states $\Sigma = \{\sigma_1, \sigma_2, \sigma_3\}$.
Suppose there are four transitions:

1. $\sigma_1 \to \sigma_2$, with label 1 and $P(\sigma_2 | \sigma_1) = 1$;

2. $\sigma_2 \to \sigma_3$, with label 0 and $P(\sigma_3 | \sigma_2) = 1$;

3. $\sigma_3 \to \sigma_1$, with label 1 and transition probability $P(\sigma_1 | \sigma_3)$;

4. $\sigma_3 \to \sigma_2$, with label 0 and transition probability $P(\sigma_2 | \sigma_3) = 1 - P(\sigma_1 | \sigma_3)$.

The initial probability distribution assigns probabilities $P(\sigma_1)$, $P(\sigma_2)$ and $P(\sigma_3)$ to the three states.

The diagrammatic representation of this is shown in Figure 1.2.


The random sequences generated by this source all consist of subsequences of an
even number of 0’s separated by a pair of 1’s, except at the very beginning of the
sequence, where there may be a single 0 or 1.
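A Markov source of this kind is easy to simulate. The added sketch below follows the structure of Example 1.14; the split of probability between the two transitions leaving $\sigma_3$ and the initial distribution used below are illustrative assumptions, not values taken from the example.

```python
# Sketch: generating a sequence from a Markov source (states, transitions, labels).
import random

transitions = {
    1: [(1.0, "1", 2)],                      # sigma_1 -> sigma_2, label 1
    2: [(1.0, "0", 3)],                      # sigma_2 -> sigma_3, label 0
    3: [(0.5, "1", 1), (0.5, "0", 2)],       # sigma_3 -> sigma_1 or sigma_2 (assumed split)
}
initial = {1: 1 / 3, 2: 1 / 3, 3: 1 / 3}     # assumed initial distribution

def generate(length, rng=random):
    state = rng.choices(list(initial.keys()), weights=list(initial.values()))[0]
    out = []
    for _ in range(length):
        probs, labels, targets = zip(*transitions[state])
        i = rng.choices(range(len(probs)), weights=probs)[0]
        out.append(labels[i])
        state = targets[i]
    return "".join(out)

print(generate(30))   # prints a string of 30 symbols from {0, 1}
```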

It is possible to describe these sources without explicit reference to the states. In an


n-gram model, the description is solely in terms of the probabilities of all the possible
sequences of symbols of length $n$.

EXAMPLE 1.15
The following probabilities give us a 3-gram model on the language $\{0, 1\}$: a value
is assigned to each of the eight 3-grams,

$P(000), P(001), P(010), P(011), P(100), P(101), P(110), P(111)$,

and these eight values add up to 1.

To describe the relationship between n-gram models and Markov sources, we need
to look at special cases of Markov sources.

DEFINITION 1.16 $n$th-order Markov Source A Markov source whose states
are sequences of $n$ symbols from the alphabet is called an $n$th-order Markov
source.

When we have an $n$th-order Markov model, the transition probabilities are usually
given in terms of the probabilities of single symbols being emitted when the source
is in a given state. For example, in a second-order Markov model on $\{0, 1\}$, the
transition probability from $00$ to $01$, which would be represented by $P(01|00)$, would
be represented instead by the probability of emission of $1$ when in the state $00$, that is
$P(1|00)$. Obviously, some transitions are impossible. For example, it is not possible
to go from the state $00$ to the state $11$, as the state following $00$ must have $0$ as its
first symbol.
We can construct an $n$th-order Markov model from an $n$-gram model and an
$(n+1)$-gram model. The $n$-gram model gives us the probabilities of strings of length $n$,
such as $P(a_1 a_2 \cdots a_n)$. To find the emission probability of a symbol $b$ from this state, we
set

$P(b \,|\, a_1 a_2 \cdots a_n) = \dfrac{P(a_1 a_2 \cdots a_n b)}{P(a_1 a_2 \cdots a_n)}$   (1.45)

where the probability $P(a_1 a_2 \cdots a_n b)$ is given by the $(n+1)$-gram model.

EXAMPLE 1.16
In the previous example 1.15 we had a 3-gram model on the language $\{0, 1\}$ given
by the eight probabilities $P(000), P(001), \ldots, P(111)$.
FIGURE 1.3
Diagrammatic representation of a Markov source equivalent to a 3-gram model.
The states are 00, 01, 10 and 11, with transition probabilities $P(0|00)=0.8$,
$P(1|00)=0.2$, $P(0|01)=0.5$, $P(1|01)=0.5$, $P(0|10)=0.8$, $P(1|10)=0.2$,
$P(0|11)=0.6$ and $P(1|11)=0.4$.

If a 2-gram model for the same source is given by $P(00)$, $P(01)$, $P(10)$ and $P(11)$,
then we can construct a second-order Markov source as follows:

$P(0|00) = P(000)/P(00) = 0.8$

$P(1|00) = P(001)/P(00) = 0.2$

$P(0|01) = P(010)/P(01) = 0.5$

$P(1|01) = P(011)/P(01) = 0.5$

$P(0|10) = P(100)/P(10) = 0.8$

$P(1|10) = P(101)/P(10) = 0.2$

$P(0|11) = P(110)/P(11) = 0.6$

$P(1|11) = P(111)/P(11) = 0.4$

Figure 1.3 shows this Markov source.
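The construction of Equation (1.45) is mechanical. The following added sketch derives the emission probabilities of a second-order Markov source from a 3-gram model and a 2-gram model; the n-gram values shown are illustrative assumptions chosen to be consistent with the probabilities in Figure 1.3, not the values used in the example.

```python
# Sketch of Equation (1.45): P(b | ab-context) = P(context + b) / P(context).
two_gram = {"00": 0.5, "01": 0.2, "10": 0.2, "11": 0.1}            # assumed values
three_gram = {"000": 0.40, "001": 0.10, "010": 0.10, "011": 0.10,
              "100": 0.16, "101": 0.04, "110": 0.06, "111": 0.04}  # assumed values

emission = {}
for ctx in two_gram:
    for symbol in "01":
        emission[(symbol, ctx)] = three_gram[ctx + symbol] / two_gram[ctx]

print(emission[("0", "00")], emission[("1", "11")])   # 0.8 and 0.4, as in Figure 1.3
```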

To describe the behaviour of a Markov source mathematically, we use the transition


matrix of probabilities. If the set of states is

$\Sigma = \{\sigma_1, \sigma_2, \ldots, \sigma_N\}$,

the transition matrix is the matrix

$\Pi = \begin{bmatrix} P(\sigma_1|\sigma_1) & P(\sigma_1|\sigma_2) & \cdots & P(\sigma_1|\sigma_N) \\ P(\sigma_2|\sigma_1) & P(\sigma_2|\sigma_2) & \cdots & P(\sigma_2|\sigma_N) \\ \vdots & \vdots & \ddots & \vdots \\ P(\sigma_N|\sigma_1) & P(\sigma_N|\sigma_2) & \cdots & P(\sigma_N|\sigma_N) \end{bmatrix}$   (1.46)
The probability of the source being in a given state varies over time. Let Pt(σj) be the probability of the source being in state σj at time t, and set

Wt = ( Pt(σ1), Pt(σ2), ..., Pt(σN) )^T.     (1.47)

Then W0 is the initial probability distribution and

Wt+1 = Π Wt     (1.48)

and so, by induction,

Wt = Π^t W0.     (1.49)

Because they all represent probability distributions, each of the columns of Π must add up to 1, and all the Pt(σj) must add up to 1 for each t.
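Equations (1.48) and (1.49) are easy to iterate numerically. The sketch below (illustrative Python with a made-up two-state matrix, not one of the book's examples) applies Π repeatedly to an initial distribution:

def step(pi, w):
    """One application of equation (1.48): w_{t+1} = Pi w_t,
    where pi[i][j] = P(sigma_i | sigma_j) and w is the vector of state probabilities."""
    n = len(w)
    return [sum(pi[i][j] * w[j] for j in range(n)) for i in range(n)]

# Hypothetical two-state transition matrix and initial distribution (illustrative only).
pi = [[0.9, 0.3],
      [0.1, 0.7]]
w = [1.0, 0.0]          # W_0
for t in range(5):
    w = step(pi, w)     # W_1, W_2, ...; the iterates approach the stationary distribution
    print(t + 1, w)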

1.14 Stationary Distributions


The vectors Wt describe how the behaviour of the source changes over time. The asymptotic (long-term) behaviour of sources is of interest in some cases.

EXAMPLE 1.17
Consider a source with a 2 × 2 transition matrix Π. Suppose the initial probability distribution is some vector W0. Then

W1 = Π W0.

Similarly,

W2 = Π W1 = Π² W0,   W3 = Π W2 = Π³ W0,

and so on; in general each of these distributions differs from the one before it.

Suppose instead that the initial distribution W0 is chosen so that Π W0 = W0. Then

W1 = Π W0 = W0,

so that Wt = W0 for all t. This distribution will persist for all time.


In the example above, the initial distribution W has the property that

Π W = W     (1.50)

and persists for all time.

DEFINITION 1.17 Stationary Distribution A probability distribution W over the states of a Markov source with transition matrix Π that satisfies the equation ΠW = W is a stationary distribution.

As shown in the example, if W is a stationary distribution, it persists for all time, Wt = W for all t. The defining equation shows that a stationary distribution must be an eigenvector of Π with eigenvalue 1. To find a stationary distribution for Π, we must solve the equation ΠW = W together with the condition that the components of W add up to 1.
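One simple way to approximate such a W numerically is to iterate W ← ΠW until it stops changing, which converges for well-behaved (ergodic) sources. The sketch below is illustrative only; its three-state matrix is made up and is not the matrix of the next example.

def stationary(pi, iterations=1000):
    """Approximate a stationary distribution by repeatedly applying W <- Pi W.
    For an ergodic source this converges to the W satisfying Pi W = W."""
    n = len(pi)
    w = [1.0 / n] * n                       # start from the uniform distribution
    for _ in range(iterations):
        w = [sum(pi[i][j] * w[j] for j in range(n)) for i in range(n)]
    return w

pi = [[0.5, 0.25, 0.25],
      [0.25, 0.5, 0.25],
      [0.25, 0.25, 0.5]]
print(stationary(pi))   # approximately [1/3, 1/3, 1/3]; check that Pi W = W and the entries sum to 1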
EXAMPLE 1.18
Suppose a source has three states, σ1, σ2 and σ3, and transition matrix Π. Then the equation ΠW = W gives three equations:

P(σ1|σ1) W(σ1) + P(σ1|σ2) W(σ2) + P(σ1|σ3) W(σ3) = W(σ1)
P(σ2|σ1) W(σ1) + P(σ2|σ2) W(σ2) + P(σ2|σ3) W(σ3) = W(σ2)
P(σ3|σ1) W(σ1) + P(σ3|σ2) W(σ2) + P(σ3|σ3) W(σ3) = W(σ3).

The first equation can be used to express W(σ1) in terms of W(σ2), and the other two can be used to express W(σ3) in terms of W(σ2). Substituting these expressions in

W(σ1) + W(σ2) + W(σ3) = 1,

we get the value of W(σ2), and hence the values of W(σ1) and W(σ3). These three values make up the stationary distribution.


In the examples above, the source has a unique stationary distribution. This is not always the case.

EXAMPLE 1.19
Consider a source with four states, σ1, σ2, σ3 and σ4, whose probability transition matrix is such that the first and fourth states pass only to themselves, so that P(σ1|σ1) = 1 and P(σ4|σ4) = 1.

The diagrammatic representation of this source is shown in Figure 1.4.

[Figure 1.4: the state graph of this source.]
FIGURE 1.4
A source with two stationary distributions.

For this source, any distribution with W(σ2) = 0, W(σ3) = 0 and W(σ1) + W(σ4) = 1 satisfies the equation ΠW = W. However, inspection of the transition matrix shows that once the source enters either the first state or the fourth state, it cannot leave it. The only stationary distributions that can occur are W(σ1) = 1, W(σ2) = 0, W(σ3) = 0, W(σ4) = 0 or W(σ1) = 0, W(σ2) = 0, W(σ3) = 0, W(σ4) = 1.

Some Markov sources have the property that every sequence generated by the source
has the same statistical properties. That is, the various frequencies of occurrence of
symbols, pairs of symbols, and so on, obtained from any sequence generated by the
source will, as the length of the sequence increases, approach some definite limit
which is independent of the particular sequence. Sources that have this property are
called ergodic sources.

The source of Example 1.19 is not an ergodic source. The sequences generated by
that source fall into two classes, one of which is generated by sequences of states
that end in the first state, the other of which is generated by sequences that end in the
fourth state. The fact that there are two distinct stationary distributions shows that
the source is not ergodic.

1.15 The Entropy of Markov Sources

There are various ways of defining the entropy of an information source. The fol-
lowing is a simple approach which applies to a restricted class of Markov sources.

DEFINITION 1.18 Entropy of the ith State of a Markov Source The entropy of the ith state of a Markov source is the entropy of the probability distribution on the set of transitions from that state.

If we denote the probability distribution on the set of transitions from the ith state by Pi, then the entropy of the ith state is given by

H(Pi) = -Σ_{j=1}^{N} Pi(σj) log(Pi(σj))     (1.51)
DEFINITION 1.19 Unifilar Markov Source A unifilar Markov source is one with
the property that the labels on the transitions from any given state are all distinct.

We need this property in order to be able to define the entropy of a Markov source.
We assume that the source has a stationary distribution.

DEFINITION 1.20 Entropy of a Unifilar Markov Source The entropy of a unifilar Markov source S, whose stationary distribution is given by W(σ1), W(σ2), ..., W(σN), and whose transition probabilities are Pi(σj) for i = 1, 2, ..., N and j = 1, 2, ..., N, is

H(S) = Σ_{i=1}^{N} W(σi) H(Pi) = -Σ_{i=1}^{N} Σ_{j=1}^{N} W(σi) Pi(σj) log(Pi(σj))     (1.52)

It can be shown that this definition is consistent with more general definitions of the
entropy of an information source.
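As an illustration of Definition 1.20, the sketch below computes equation (1.52) from a transition matrix and a stationary distribution. The two-state source used is a made-up placeholder, not one of the book's examples.

from math import log2

def markov_entropy(pi, w):
    """Equation (1.52): H = -sum_i W(sigma_i) sum_j P_i(j) log2 P_i(j),
    where pi[j][i] = P(sigma_j | sigma_i) (columns are the per-state distributions)
    and w[i] is the stationary probability of state sigma_i."""
    n = len(w)
    total = 0.0
    for i in range(n):
        h_i = -sum(pi[j][i] * log2(pi[j][i]) for j in range(n) if pi[j][i] > 0)
        total += w[i] * h_i
    return total

pi = [[0.9, 0.3],
      [0.1, 0.7]]
w = [0.75, 0.25]                 # its stationary distribution, since Pi W = W
print(markov_entropy(pi, w))     # about 0.57 bits per symbol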

EXAMPLE 1.20
For the Markov source of Example 1.14, there are three states, σ1, σ2 and σ3.

P1, the distribution on the set of transitions from σ1, assigns probability 1 to the single transition σ1 → σ2 and probability 0 to the other states. Its entropy is

H(P1) = -1 log(1) - 0 log(0) = 0,

using the usual convention that 0 log(0) = 0.

P2 assigns probability 1 to the single transition σ2 → σ3, so its entropy is likewise H(P2) = 0.

P3 assigns the probabilities P(σ1|σ3) and P(σ2|σ3) to the two transitions from σ3. Its entropy is

H(P3) = -P(σ1|σ3) log(P(σ1|σ3)) - P(σ2|σ3) log(P(σ2|σ3)).

The stationary distribution of the source, W(σ1), W(σ2), W(σ3), is found by solving ΠW = W as in the previous section.

The entropy of the source is

H(S) = W(σ1) H(P1) + W(σ2) H(P2) + W(σ3) H(P3) = W(σ3) H(P3).

EXAMPLE 1.21
For the source of Example 1.16, the states are 00, 01, 10 and 11.

P00 is given by P(0|00) = 0.8 and P(1|00) = 0.2. Its entropy is

H(P00) = -0.8 log(0.8) - 0.2 log(0.2) ≈ 0.722.

P01 is given by P(0|01) = 0.5 and P(1|01) = 0.5. Its entropy is

H(P01) = -0.5 log(0.5) - 0.5 log(0.5) = 1.0.

P10 is given by P(0|10) = 0.8 and P(1|10) = 0.2. Its entropy is

H(P10) = -0.8 log(0.8) - 0.2 log(0.2) ≈ 0.722.

P11 is given by P(0|11) = 0.6 and P(1|11) = 0.4. Its entropy is

H(P11) = -0.6 log(0.6) - 0.4 log(0.4) ≈ 0.971.

The stationary distribution of the source is given by

W(00) = 24/41 ≈ 0.585, W(01) = 6/41 ≈ 0.146, W(10) = 6/41 ≈ 0.146, W(11) = 5/41 ≈ 0.122.

The entropy of the source is

H(S) = (24/41)(0.722) + (6/41)(1.0) + (6/41)(0.722) + (5/41)(0.971) ≈ 0.793.

1.16 Sequences of Symbols


It is possible to estimate the entropy of a Markov source using information about
the probabilities of occurrence of sequences of symbols. The following results apply
to ergodic Markov sources and are stated without proof. In a sense, they justify
the use of the conditional probabilities of emission of symbols instead of transition
probabilities between states in nth-order Markov models.

THEOREM 1.6
Given any δ > 0 and any ε > 0, we can find a positive integer N such that all sequences of length n ≥ N fall into two classes: a set of sequences whose total probability is less than ε; and the remainder, for which the following inequality holds:

| log(1/p)/n − H | < δ     (1.53)

where p is the probability of the sequence and H is the entropy of the source.

PROOF See [6], Appendix 3.

THEOREM 1.7
Let S be a Markov source with alphabet A = {a1, a2, ..., aq} and entropy H. Let Sn denote the set of all sequences of length n of symbols from A. For s ∈ Sn, let P(s) be the probability of the sequence s being emitted by the source. Define

Gn = -(1/n) Σ_{s ∈ Sn} P(s) log(P(s))     (1.54)

which is the entropy per symbol of the sequences of n symbols. Then Gn is a monotonic decreasing function of n and

lim_{n→∞} Gn = H.     (1.55)

PROOF See [6], Appendix 3.

THEOREM 1.8
Let S be a Markov source with alphabet A = {a1, a2, ..., aq} and entropy H. Let Sn-1 denote the set of all sequences of length n − 1 of symbols from A. For s ∈ Sn-1 and aj ∈ A, let P(s, aj) be the probability of the source emitting the sequence s followed by the symbol aj, and let P(aj|s) be the conditional probability of the symbol aj being emitted immediately after the sequence s. Define

Fn = -Σ_{s ∈ Sn-1} Σ_{j=1}^{q} P(s, aj) log(P(aj|s))     (1.56)

which is the conditional entropy of the next symbol when the n − 1 preceding symbols are known. Then Fn is a monotonic decreasing function of n and

lim_{n→∞} Fn = H.     (1.57)

PROOF See [6], Appendix 3.

THEOREM 1.9
If Gn and Fn are defined as in the previous theorems, then

Fn = n Gn − (n − 1) Gn-1,     (1.58)

Gn = (1/n) Σ_{k=1}^{n} Fk,     (1.59)

and

Fn ≤ Gn.     (1.60)

PROOF See [6], Appendix 3.

These results show that a series of approximations to the entropy of a source can
be obtained by considering only the statistical behaviour of sequences of symbols

of increasing length. The sequence of estimates Fn is a better approximation than the sequence Gn. If the dependencies in a source extend over no more than n symbols, so that the conditional probability of the next symbol knowing the preceding n − 1 symbols is the same as the conditional probability of the next symbol knowing all the preceding symbols, then Fn = H.
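In practice such estimates can be formed from observed frequencies. The sketch below computes an empirical version of Gn (equation (1.54)) from the n-symbol blocks of a sample sequence; the sample string is made up for illustration, and for a long enough sample from an ergodic source the estimates decrease with n towards the source entropy.

from collections import Counter
from math import log2

def block_entropy_per_symbol(sequence, n):
    """Empirical version of G_n: the entropy per symbol of the n-symbol blocks
    of the sequence, estimated from their relative frequencies."""
    blocks = [sequence[i:i + n] for i in range(len(sequence) - n + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum((c / total) * log2(c / total) for c in counts.values()) / n

sample = '1100001100110000001100' * 50    # made-up sample output
for n in (1, 2, 3, 4):
    print(n, round(block_entropy_per_symbol(sample, n), 3))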

1.17 The Adjoint Source of a Markov Source


It is possible to approximate the behaviour of a Markov source by a memoryless
source.

DEFINITION 1.21 Adjoint Source of a Markov Source The adjoint source


of a Markov source is the memoryless source with the same alphabet which emits
symbols independently of each other with the same probabilities as the Markov
source.

If we have an nth-order Markov source S with alphabet A = {a1, a2, ..., aq}, the probabilities of emission of the symbols are

P(aj) = Σ_s W(s) P(aj|s)     (1.61)

where s = ai1 ai2 ... ain represents a sequence of n symbols from the alphabet of the source, W(s) is the probability of this sequence in the stationary distribution of the Markov source and the summation over s indicates that all such sequences are included in the summation. The adjoint source of this Markov source, denoted S̄, is the memoryless source that emits these symbols with the same probabilities.

EXAMPLE 1.22
For the 3-gram model of Example 1.15, we have transition probabilities

P(0|00) = 0.8, P(1|00) = 0.2, P(0|01) = 0.5, P(1|01) = 0.5,
P(0|10) = 0.8, P(1|10) = 0.2, P(0|11) = 0.6, P(1|11) = 0.4,

which give us the transition matrix (with the states in the order 00, 01, 10, 11)

Π = [ 0.8  0.0  0.8  0.0 ]
    [ 0.2  0.0  0.2  0.0 ]
    [ 0.0  0.5  0.0  0.6 ]
    [ 0.0  0.5  0.0  0.4 ]

We need to find the stationary distribution of the source. The equation ΠW = W gives

0.8 W(00) + 0.8 W(10) = W(00)
0.2 W(00) + 0.2 W(10) = W(01)
0.5 W(01) + 0.6 W(11) = W(10)
0.5 W(01) + 0.4 W(11) = W(11).

Solving these equations together with the constraint

W(00) + W(01) + W(10) + W(11) = 1,

we get the stationary distribution

W(00) = 24/41, W(01) = 6/41, W(10) = 6/41, W(11) = 5/41.

The probabilities for the adjoint source of the 3-gram model are

P(0) = 0.8(24/41) + 0.5(6/41) + 0.8(6/41) + 0.6(5/41) = 30/41 ≈ 0.73

and

P(1) = 0.2(24/41) + 0.5(6/41) + 0.2(6/41) + 0.4(5/41) = 11/41 ≈ 0.27.


Although the probabilities of emission of single symbols are the same for both the Markov source and its adjoint source, the probabilities of emission of sequences of symbols may not be the same. For example, the probability of emission of the sequence 00 by the Markov source is W(00) = 24/41 ≈ 0.59, while for the adjoint source it is P(0)P(0) = (30/41)² ≈ 0.54 (by the assumption of independence).

Going from a Markov source to its adjoint reduces the number of constraints on the
output sequence and hence increases the entropy. This is formalised by the following
theorem.

THEOREM 1.10
If S̄ is the adjoint of the Markov source S, their entropies are related by

H(S̄) ≥ H(S).     (1.62)

PROOF If S is an nth-order source with alphabet {a1, a2, ..., aq}, we will denote the states, which are n-tuples of the ai, by sj, where j = 1, 2, ..., q^n. We assume that S has a stationary distribution.

The probabilities of emission of the symbols are

P(ai) = Σ_j W(sj) P(ai|sj)     (1.63)

where the summation is over all states and W(sj) is the probability of state sj in the stationary distribution of the source.

The entropy of the adjoint is

H(S̄) = -Σ_i P(ai) log(P(ai))
      = -Σ_i Σ_j W(sj) P(ai|sj) log(P(ai))     (1.64)

The entropy of the jth state of S is

H(sj) = -Σ_i P(ai|sj) log(P(ai|sj))     (1.65)

and the entropy of S is

H(S) = Σ_j W(sj) H(sj)
     = -Σ_j Σ_i W(sj) P(ai|sj) log(P(ai|sj))     (1.66)

If we apply the inequality of Lemma 1.1 to each summation over i, the result follows.
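The inequality can also be checked numerically. The sketch below computes H(S) and the entropy of its adjoint from a transition matrix and a stationary distribution; the two-state source is a made-up placeholder, not one of the book's examples, and a unifilar source is assumed.

from math import log2

def markov_and_adjoint_entropy(pi, w):
    """Compare H(S) (equation (1.66)) with the entropy of the adjoint source
    (equation (1.64)). Here pi[j][i] = P(a_j | state_i) and w[i] is the
    stationary probability of state i."""
    q, n = len(pi), len(w)
    h_markov = -sum(w[i] * pi[j][i] * log2(pi[j][i])
                    for i in range(n) for j in range(q) if pi[j][i] > 0)
    p = [sum(w[i] * pi[j][i] for i in range(n)) for j in range(q)]   # equation (1.63)
    h_adjoint = -sum(x * log2(x) for x in p if x > 0)
    return h_markov, h_adjoint

pi = [[0.9, 0.3],   # P(0 | state 0), P(0 | state 1)
      [0.1, 0.7]]   # P(1 | state 0), P(1 | state 1)
w = [0.75, 0.25]    # stationary distribution of this source
print(markov_and_adjoint_entropy(pi, w))   # about (0.57, 0.81): H(adjoint) >= H(S)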

1.18 Extensions of Sources


In situations where codes of various types are being developed, it is often useful to
consider sequences of symbols emitted by a source.

DEFINITION 1.22 Extension of a Stationary Memoryless Source The nth extension of a stationary memoryless source S is the stationary memoryless source whose alphabet consists of all sequences of n symbols from the alphabet of S, with the emission probabilities of the sequences being the same as the probabilities of occurrence of the sequences in the output of S.

The nth extension of S will be denoted by S^n. Because the emission of successive symbols by S is statistically independent, the emission probabilities in S^n can be computed by multiplying the appropriate emission probabilities in S.

EXAMPLE 1.23
Consider the memoryless source S with alphabet {0, 1} and emission probabilities P(0) and P(1).

The second extension of S has alphabet {00, 01, 10, 11} with emission probabilities

P(00) = P(0)P(0),  P(01) = P(0)P(1),  P(10) = P(1)P(0),  P(11) = P(1)P(1).

The third extension of S has alphabet {000, 001, 010, 011, 100, 101, 110, 111} with emission probabilities

P(000) = P(0)P(0)P(0),  P(001) = P(0)P(0)P(1),  P(010) = P(0)P(1)P(0),  P(011) = P(0)P(1)P(1),
P(100) = P(1)P(0)P(0),  P(101) = P(1)P(0)P(1),  P(110) = P(1)P(1)P(0),  P(111) = P(1)P(1)P(1).

There is a simple relationship between the entropy of a stationary memoryless source


and the entropies of its extensions.

THEOREM 1.11
If S^n is the nth extension of the stationary memoryless source S, their entropies are related by

H(S^n) = n H(S).     (1.67)

PROOF If the alphabet of S is {a1, a2, ..., aq}, and the emission probabilities of the symbols are P(ai) for i = 1, 2, ..., q, the entropy of S is

H(S) = -Σ_{i=1}^{q} P(ai) log(P(ai))     (1.68)

The alphabet of S^n consists of all sequences ai1 ai2 ... ain, where each ik = 1, 2, ..., q. The emission probability of ai1 ai2 ... ain is

P(ai1 ai2 ... ain) = P(ai1) P(ai2) ... P(ain)     (1.69)

The entropy of S^n is

H(S^n) = -Σ_{i1} ... Σ_{in} P(ai1 ... ain) log(P(ai1 ... ain))
       = -Σ_{i1} ... Σ_{in} P(ai1) ... P(ain) [ log(P(ai1)) + ... + log(P(ain)) ]     (1.70)

We can interchange the order of summation to get

H(S^n) = -Σ_{k=1}^{n} Σ_{i1} ... Σ_{in} P(ai1) ... P(ain) log(P(aik))     (1.71)

Breaking P(ai1) ... P(ain) into the product of probabilities, and rearranging, we get

H(S^n) = -Σ_{k=1}^{n} [ Σ_{i1} P(ai1) ] ... [ Σ_{ik} P(aik) log(P(aik)) ] ... [ Σ_{in} P(ain) ]     (1.72)

Since

Σ_{im=1}^{q} P(aim) = 1     (1.73)

for each m ≠ k, we are left with

H(S^n) = -Σ_{k=1}^{n} Σ_{ik=1}^{q} P(aik) log(P(aik)) = n H(S)     (1.74)

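The theorem is easy to verify numerically for small cases. The sketch below builds the nth extension of a memoryless source and compares its entropy with nH(S); the emission probabilities used are placeholders, not the values of Example 1.23.

from itertools import product
from math import log2

def entropy(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def extension(dist, n):
    """Build the nth extension of a stationary memoryless source: the emission
    probability of a sequence is the product of its symbol probabilities (equation (1.69))."""
    ext = {}
    for seq in product(dist, repeat=n):
        p = 1.0
        for symbol in seq:
            p *= dist[symbol]
        ext[''.join(seq)] = p
    return ext

s = {'0': 0.25, '1': 0.75}   # placeholder emission probabilities
for n in (1, 2, 3):
    print(n, round(entropy(extension(s, n)), 4), round(n * entropy(s), 4))   # the two columns agree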
We also have extensions of Markov sources.

DEFINITION 1.23 Extension of an mth-order Markov Source Let m and n be positive integers, and let k be the smallest integer that is greater than or equal to m/n. The nth extension of the mth-order Markov source S is the kth-order Markov source whose alphabet consists of all sequences of n symbols from the alphabet of S and for which the transition probabilities between states are equal to the probabilities of the corresponding n-fold transitions of the mth-order source.

We will use S^n to denote the nth extension of S.


EXAMPLE 1.24

Let S be the first-order Markov source with alphabet {0, 1} and transition probabilities P(0|0), P(1|0), P(0|1) and P(1|1).

The second extension of S has m = 1 and n = 2, so k = 1. It is a first-order source with alphabet {00, 01, 10, 11}. We can calculate the transition probabilities as follows. After emitting the pair ab the source is in the state b, so the probability of it next emitting the pair cd is P(c|b)P(d|c):

P(00|00) = P(0|0)P(0|0)      P(01|00) = P(0|0)P(1|0)
P(10|00) = P(1|0)P(0|1)      P(11|00) = P(1|0)P(1|1)
P(00|01) = P(0|1)P(0|0)      P(01|01) = P(0|1)P(1|0)
P(10|01) = P(1|1)P(0|1)      P(11|01) = P(1|1)P(1|1)

and similarly for the transitions from the states 10 and 11.

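The rule used in Example 1.24 can be written as a short routine. The sketch below computes the transition probabilities of the second extension of a first-order source; the first-order probabilities supplied are placeholders, not the example's values.

from itertools import product

def second_extension(p):
    """Transition probabilities of the second extension of a first-order source.
    p[(c, b)] = P(c|b); the extension's states are pairs, and
    P(cd | ab) = P(c|b) * P(d|c), since only the last symbol of the old state matters."""
    ext = {}
    for a, b, c, d in product('01', repeat=4):
        ext[(c + d, a + b)] = p[(c, b)] * p[(d, c)]
    return ext

p = {('0', '0'): 0.8, ('1', '0'): 0.2, ('0', '1'): 0.4, ('1', '1'): 0.6}   # placeholder values
ext = second_extension(p)
print(ext[('00', '00')])   # P(00|00) = P(0|0) * P(0|0) = 0.64
print(ext[('10', '01')])   # P(10|01) = P(1|1) * P(0|1) = 0.24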
EXAMPLE 1.25

Consider the second-order Markov source with alphabet {0, 1} and transition probabilities P(0|00), P(1|00), P(0|01), P(1|01), P(0|10), P(1|10), P(0|11) and P(1|11).

The second extension of this source has m = 2 and n = 2, so k = 1. It is a first-order source whose states are the pairs 00, 01, 10 and 11, and the transition probabilities of the second extension are products of the form P(cd|ab) = P(c|ab)P(d|bc).


Exploring the Variety of Random
Documents with Different Content
and rose to the rank of lieutenant. It is said that Daniel Defoe met
Selkirk at \Va|i 
The text on this page is estimated to be only 27.64%
accurate

458 SELKIRK SELLERS ping, and that his adventures


suggested " Robinson Crusoe." founded upon Selkirk's " Providence
Display'd" (London, 1712), an exceedingly rare pamphlet. Cowper's "
Lines on Solitude, supposed to lie written by Alexander Selkirk,"
beginning "I am monarch of all I survey,'' are well known. See " The
Life and Adventures of Alexander Selkirk," by John Howell
(Edinburgh, 1829). A bronze statue of Selkirk was recently unveiled
at Largo on the site of the cottage in which the mariner was born.
SELKIRK. Edward, clergyman, b. in Waterbury. Conn.. 13 Oct., 1809
: d. 14 Feb., 1891. He was graduated at Trinity in 1840, at the
General theological seminary in 1843. was ordained deacon in the
Protestant Episcopal church the same year, and became priest in
1844. He was then rector of Trinity church, Albany, X. Y., in which he
continued till 1884, when he became rector emeritus. He was an
honorary canon of the Albany cathedral, lie had published "An
Address on the Laying of the Corner-Stone of Trinity Church"
(Albany, 1844) and "History of Trinity Church " (1870). SELKIRK.
Thomas Donsrlas, Earl of, b. at the family-seat, St. Mary's isle,
Kirkcudbrightshire, Scotland, in June, 1771 ; d. in Pau, France, 8
April, 1820. He studied at Edinburgh university from 1786 till 1790,
early developed a taste for literary pursuits, and was an associate of
Sir Walter Scott. He succeeded his brother as Lord Dacre in 1797,
and his father as Earl of Selkirk in May, 1799. In 1803 he settled a
colony of 800 Scottish Highlanders upon waste land that was given
to him by the government in Prince Edward island, and soon
afterward he established a small colony in Kent county, Upper
Canada. While residing in Montreal he conceived the project of
planting a colony of evicted Highlanders from the estates of the
Duchess of Sutherland in the Red river country. To accomplish this
he purchased a large tract of laud on the Red river for colonization
from the Hudson bay company. His Highland colonists began to
arrive in 1811, and in 1812 the Red river colony was established.
Trouble ensued between the colony and the Northwest trading
company, and the emigrants were driven from their new homes. In
1816 Lord Selkirk went to Red river to aid his colonists against their
enemies, and, assisted by a small armed force, restored them to
their lands and reimbursed them for their losses. He became
financially embarrassed in consequence of his philanthropic
schemes, and persecution and slander so shattered his health that
he never recovered. Soon after his return to Scotland he went to the
south of France to recruit, but he died shortly afterward. He wrote "
Observations on the Present State of the Highlands of Scotland, with
a View of the Causes and Probable ConseOAiences of Emigration "
(London, 1805) ; " The Necessity of a more Effectual System of
National Defence" (1808); "Sketch of the British Fur Trade " (1816) :
" The Red River Settlement " (1817) ; and "Occurrences in the
Indian Countries of North America " (Montreal. 1818). SELLERS, < ul
I-IIM ii. dynamical engineer, b. in Philadelphia, Pa., 28 Jan., 1827. He
was educated at common schools and studied for five years with
Anthony Bolmar in West Chester, Pa. In 1S4U hr became
draughtsman in the Globe rolling-mill in Cincinnati, Ohio, and he
remained there for three vears. during part of the time as
superintendent. Mr. Sellers then engaged in the manufacture of
locomotives, and served for five years as foreman in the works of
Niles and Co. In 1S.~>() he moved to Philadelphia, where he
became chief engineer of William Sellers and Co. (the senior partner
of which firm was his second cousin), makers of machinists' tools,
and general millwrights. Since 1888 he has devoted himself chiefly
to consulting practice. Mr. Sellers has obtained more than thirty
letters-patent for inventions of his own, one of the first of which, a
coupling device for shafting (1857), is the essential factor in the
modern system of interchangeable shafting parts. His invention in
1866 of feed-disks for lathes or other machine tools was the first
practical solution of the problem of the infinite gradation of feeds.
His other patents relate chiefly to improved forms of tools or
modifications of existing machines. The use of absorbent cotton for
surgical operations was recommended by him as early as 1861, and
he proposed the employment of glycerine in order to keep
photographic plates wet. He was appointed professor of mechanics
in the Franklin institute in 1881, and non-resident professor of
engineering practice in Stevens institute of technology in 1888. both
of which chairs he still (1898) holds. The order of St. Olaf was
conferred on him by the king of Sweden in 1877, and the degree of
doctor of engineering by Stevens institute in 1888. He was president
of the Franklin institute during 1870-'o, and of the American society
of mechanical engineers in 1884, and he has also held that office in
the Pennsylvania society for the prevention of cruelty to animals and
the Photographic society of Philadelphia. He is a member of other
learned societies both at home and abroad. Mr. Sellers was chosen a
member of the Seybert commission to investigate the claims of
Spiritualists, owing to his knowledge of sleight-of-hand, having been
an expert in the practice of that art from his childhood. He was
American correspondent of the " British Journal of Photography " in
1861-'3, and, in addition, contributed many papers to technical
journals. SELLERS, William, mechanical engineer, b. in Upper Darby,
Pa., 19 Sept., 1824. He was educated at a private school, and at the
age of fourteen was apprenticed to his uncle, a machinist, with
whom he remained for seven years. In 1845 he was called to the
management of the shops of the Fairbanks and Bancroft machine-
works in Providence, R. I., and two years afterward he established
himself independently in Philadelphia. He was then joined by his
former employer, and in 1848 the firm of Bancroft and Sellers was
formed, which continued until 1855, when, on the death of the
senior member, the style became William Sellers and Co. Mr. Sellers
has been active in the improvement of existing forms of tools and
machines, as well as in the invention of new patterns, and from his
first patent, for an improvement on turning-lathes in 1854, until
1888 he has received seventy patents. His inventions have received
numerous medals, and at the World's fair in Vienna in 1873 he was
awarded a grand diploma of honor. In 1868 he established the
Edgemoor iron company, which now owns the largest plant in this
country for building iron bridges and other structures of iron and
steel. All of the iron-work for the buildings of the World's fair in
Philadelphia in 1876 were supplied by this company. He became
president of the Midvale steel-works in 1873, and reorganized that
concern, which is now one of the largest establishments in the
vicinity of Philadelphia. Mr. Sellers was elected president of the
Franklin institute in 1SU4. and while holding that office proposed the
first formula that wa- ever «tVere.l I'm- a -ystem of screws, threads,
and nuts, which subsequently became the standard fur thr Tinted
States. He is a member i 'f -eirntific societies both in this country and
The text on this page is estimated to be only 28.14%
accurate

SKLLSTEDT SEMMES 459 abroad, was elected to the


American philosophical society in 1804, to the National academy of
sciences in 1873, and correspondent of the Societe d'eneouragement
pour 1'industrie Rationale in 1875. At the formation of the Fail-
mount park commission in 1867 he was appointed a commissioner
for five years, during which time all of the land now comprised in
this great park was purchased by the commission. He was active in
the organization of the World's fair in Philadelphia in 1876, and was
at the beginning vice-president of the management. In 1868 he was
elected a trustee of the University of Pennsylvania, and he is a
director of several railroads. His publications include short papers
and discussions on technical subjects. SELLSTEUT, Lars tiustaf, artist,
b. in Sundsvall, Sweden, 30 April, 1819. For several years he
followed the life of a sailor, but came to the United States in 1834.
and in 1842 settled in Buffalo, N. Y., where he still (1888) resides.
Soon after his arrival in that city he began to paint, and during his
studies profited much by association with Thomas Le Clear and
William H. Beard. He has devoted himself chiefly to portraiture, his
works in that line including Solomon G. Haven (1856): George W.
Clinton (1862); Millard Fillmore(1869); a portrait of himself in his
studio, one of his best works (1871); Sherman S. Rogers (1873);
William G. Fargo and Isaac Verplanck (1874) ; Benjamin Fitch (1883)
; and Grover Cleveland (1884). He has also painted a few marine
and genre pictures. Since 1858 he has exhibited frequently at the
National academy, where he was elected an associate in 1871, and
an academician in 1874. In Buffalo he has held office in the Fine arts
academy since 1863. SELWYN, Alfred Richard Cecil, Canadian
geologist, b. in Somersetshire, England, in 1824. He was educated
privately, and continued his studies in Switzerland, and in 1845 was
appointed assistant on the geological survey of Great Britain. In
1852 he was made director of the geological survey of the colony of
Victoria, Australia, in 1854 and 1859 he examined and reported upon
coalfields and gold-fields in Tasmania and South Australia, and he
acted in other important capacities until he left Australia in 1869,
when he went to Canada and succeeded Sir William E. Logan as
director of the geological survey of that country. He has contributed
to and edited twenty volumes of annual reports of the geological and
natural history survey. He was pensioned in 1895. SELYNS, Henricus,
clergyman, b. in Amsterdam, Holland, in 1636 ; d. in New York city
in July, 1701. His ancestors were clergymen in the Reformed church
in Holland for a century previous to his birth. He was educated for
the ministry, and in 1660 was sent to this country by the classis of
Amsterdam to become pastor of the Reformed Dutch church of
Breukelen (Brooklyn). To supplement his salary, he was also
permitted to officiate on Sunday afternoons at Peter Stuyvesant's
farm, Bouwerie (now Bowery). New York, where he taught negroes
and the poor whites. He returned to Holland in 1664, but in 1682
accepted a call from the 1st Reformed Dutch church of New York
city, of which he was pastor until his death. He was on intimate
terms with the most eminent men of his day, and was the chief of
the early ministers to enlarge the usefulness of his church, and to
secure for it an independent and permanent foundation under the
English government. He and his consistory obtained, in May, 1696,
the first church charter that was issued in the colony. Although his
original work that has been preserved is scanty, he wrote much, and
Cotton Mather says of his poetical powers that "he had so nimble a
fancy for putting his devout thoughts into verse that upon this, as
well as upon greater accounts, he was a David unto the flocks in the
wilderss." He collected all the records of the New York Reformed
Dutch church to the date of his own ministry, and transcribed them
with his'own pen. This volume is still extant and in good preservation
in the records of the Reformed Dutch church of New York city. His
only publications are " Poems," translated from the Dutch into
English by Henry C. Murphy, and printed in his "Anthology of the
New Netherlands" in the collections of New York historical society,
and a Latin poem (1687) prefixed to some editions of Cotton
Mather's " Magnalia." SEMMES, Alexander Aldebaran, naval officer, b.
in Washington, D. C., 8 June, 1825 ; d. in Hamilton, Va., 22 Sept.,
1885. He entered the navy as a midshipman, 22 Oct., 1841,
attended the naval academy at Annapolis, and became a passed
midshipman, 10 Aug., 1847. He was promoted to master, 11 Aug.,
1855, and to lieutenant, 15 Sept., 1855. During the civil war he
rendered creditable service in command of the steamer " Rhode
Island " on the Atlantic coast blockade in 1861, and in the steamer "
Wamsutta" on the South Atlantic blockade, during which he
conducted numerous engagements with forts and batteries on the
coasts of Georgia and Florida, where he captured several blockade-
runners in 1862-'3. He commanded the monitor " Lehigh " in the
bombardment of Fort Pringle, and participated in the operations at
Charleston until that city surrendered. He cooperated with Grant's
army, fought the Howlett house batteries, and was present at the fall
of Richmond in 1865. He was commissioned a commander, 25 July,
1866, promoted to captain, 24 Aug., 1873, and stationed at the
Pensacola navy-yard in 1873-'5. In 1880 he was president of the
board of inspection, after which he was commandant of the navy-
yard at Washington. He was commissioned com'modore, 10 March,
1882, and was in command of the navy-yard at the time of his
death, but had left the city on account of his health. SEMMES,
Raphael, naval officer, b. in Charles county, Md., 27 Sept., 1809 ; d.
in Mobile, Ala., 30 Aug., 1877. President John Quincy Adams
appointed him a midshipman in the U. S. navy in 1826, but he did
not enter upon active service until 1832, the intermediate years
being spent in study. In 1834, after returning from his first cruise, he
was admitted to the bar, but decided to remain a seaman. In 1837
he was promoted lieutenant, and in 1842 he removed to Alabama.
At the beginning of the war with Mexico he was made flag-lieutenant
under Com. Conner, commanding the squadron in the Gulf, and in
the siege of Vera Cruz he was in charge of one of the naval batteries
on shore. He was in command of the U. S. brig " Somers " on the
blockade
The text on this page is estimated to be only 27.05%
accurate

460 SEMMES SEMPLE of the Mexican coast, when the brig


foundered in a §ale, and most of her crew were drowned. Lieut,
emmes served for several years as inspector of light-houses on the
Gulf coast, in 1855 was promoted commander, and in 1858 became
secretary of the light-house board at Washington. On the secession
of Alabama, 15 Feb., 1861, he resigned his commission in the U. S.
navy and reported to Jefferson Davis at Montgomery, who instructed
him to return to the north and endeavor to procure mechanics
skilled in the manufacture and use of ordnance and rifle machinery
and the preparation of fixed ammunition and percussion-caps. He
was also to buy war material. In Washington he examined the
machinery of the arsenal, and conferred with mechanics whom he
desired to go south. Within the next three weeks he made a tour
through the principal workshops of New York, Connecticut, and
Massachusetts, purchased large quantities of percussion-caps in New
York, which were sent to Montgomery without any disguise, made
contracts for light artillery, powder, and other munitions of war, and
shipped thousands of pounds of powder to the south. He returned to
Montgomery on 4 April, to find that he had been commissioned
commander in the Confederate navy, and placed in charge of the
light-house bureau, which he relinquished within two weeks to go to
New Orleans and fit out the " Sumter," with which he captured
eighteen merchantmen. After the blockade of that ship at Tangiers
by two U. S. men-of-war, he sold her and went to England, having
been promoted meantime to the rank of captain. There the fast
steamer "Alabama" was built for him, and in August, 1863, he took
command of her at the Azores islands, put to sea, and captured
sixty-two American merchantmen, most of which he burned at sea.
Upon her loss in the battle with the " Kearsarge," on 19 June. 1864
(see WINSLOW, JOHX A.), he returned to England, and in London
was presented by officers of the British army and navy with a sword
to replace that which he had cast into the sea from the deck of his
sinking ship. On 3 Oct., 1864, he sailed for Havana, whence he
reached Bagdad, a Mexican port on the Gulf, and passed through
Texas and Louisiana. He was appointed rear-admiral, and ordered to
the James river squadron, with which he guarded the water
approaches to Richmond until the city was evacuated. At
Greensboro', N. C., on 1 May, 1865, he participated in the
capitulation of Gen. Johnston's army. He returned to Mobile and
opened a law office. There, on 15 Dec., 1865, he was arrested by
order of Sec. Welles and was imprisoned. The reason, as given by
the attorney-general of the United States was his liability to trial as a
traitor, which he had evaded by his escape after the destruction of
the " Alabama." From his prison he wrote to President Johnson a
letter claiming immunity for all past deeds under the military
convention, to which he was a party at Greensboro', and the
subsequent quarrel between Mr. Johnson and the Republican
majority of congress interrupted any proceedings looking to his trial.
He was released under the third of the president's amnesty
proclamations, and in May, 1866, was elected judge of the probate
court of Mobile county, but an order from President Johnson forbade
him to exercise the functions of the office. He then became editor of
a daily paper in Mobile, which he gave up to accept a professor's
chair in the Louisiana military institute. He afterward returned to
Mobile and resumed the practice of law, in whirl: lit1 was occupied
till his death. He published "Service Afloat and Ashore during the
Mexican War " (( 'inrinnati. 1851) : " The Campaign of Gen. Scott in
the Valley of Mexico" (1852); '• The Cruise of the Alabama and
Sumter" (New York. isi;4i ; and " Memoirs of Service Afloat during
the War between the Slat >•, " (Baltimore, 1869). The action of the
British government in permitting the "Alabama" and other similar
cruisers to be fitted out in its ports gave rise to the so-called
"Alabama claims" on the part of the United States, settled by
arbitration in 1873. (See GRANT, ULYSSES S.)— His cousin,
Alexander Jenkins, surgeon, b. in Georgetown, D. C., 17 Dec., 1828,
was educated at Georgetown college, and graduated at the National
medical coll. C., in 1854. He subsequently studied in Paris and
London, and on his return >ettled in Georgetown, D. C., but
removed to New Orleans, La. He was commissioned a surgeon in the
Confederate army in 1861, served in that capacity in Gen. Thomas J.
Jackson's corps in the Army of Northern Virginia, was surgeon in
charge in the Jackson military hospital, Richmond, Va., became
medical inspector of the Department of Northern Virginia in 1862,
inspector of hospitals in the Department of Virginia in 1863, and
president of the examining boards of the Louisiana, Jackson, Stuart,
and Winder hospitals, Richmond, Va., in 1865. He was visiting
physician to the Charity hospital, New Orleans, La., in 1866-'7,
removed to Savannah, Ga.. and in 1870-'6 was professor of
physiology in the Savannah medical college. Subsequently he took
orders in the Roman Catholic church, and in 1886 he became
president of Pio Nono college, Macon, Ga. He was a secretary of the
American medical association in 1858-'9, a member of several
professional societies, and the author of medical and other papers.
His publications include "Medical Sketches of Paris" (New York,
1852) : " Gunshot Wounds " (1864) ; " Notes from a Surgical Diary "
(1866) ; " Surgical Notes of the Late War" (1867): " The Fluid"
Extracts " (1869) ; " Evolution the Origin of Life " (1873) : and the "
Influence of Yellow Fever on Pregnancy and Parturition" (1875).
SEMPLE, James, senator, b. in Green county, Ky., 5 Jan., 1798 ; d. in
Elsah Landing. 111., 20 Dec., 1866. His educational advantages were
limited to the common schools of Greensburg and the law-school at
Louisville, Ky. After his graduation at the latter he removed at once
to Edwardsville, 111., and practised his profession. At the beginning
of the Black Hawk war he was commissioned brigadier-general. He
represented Madison county several times in the legislature, and was
twice speaker of the house. From 1837 till 1842 he was minister at
Bogota. Colombia. In 1843 he was elected judge of the superior
court, but he soon resigned to enter the U. S. senate, where he
served from 4 Dec.. 1N43. till 3 March, 1847, filling the unexpired
term of Samuel McRoberts, deceased. He became an active advocate
of the 54° 40' line in the Oregon question. Returning to his home in
1847. he declined to accept any political nflin'. He expended < ' '11-
it Irrable time and money during the last years of his life in
experimenting on a steam road-wagon which he had made, but it
proved a failure. SEMl'LE. Uobert. British author, b. in Boston, Mass.,
in 176(i ; d. in Fort Douglas. British America, 19 June, 1816. He was
nominated chief governor of all the factories anil territories of the
MinNnn bay company in ISl.i. and. sailing from England, reached
York factory, British America, in August of the same year. He made a
tour of inspect ion of all the posts'of the company immediately upon
his arrival, and did not reach his headquarters at Fort Douglas (now
part of Winnipeg) until the spring
The text on this page is estimated to be only 27.55%
accurate

SEMPLE SEPTENVILLE 461 of 1816. For some time previous


to the arrival of Gov. Semple there had been a conflict of authority
between the Hudson bay company and the Northwest trading
company,.which resulted in bloodshed on several occasions. On 19
June, 1816, Cuthbert Grant, a half-breed, representing the
Northwest company, in command of a band of Indians and others,
marched against Foil Douglas, alt.-ii'kcd Guv. Semple while he was
parleying- with them, and killed him and twenty-seven others. He is
represented as a mild, just, and honorable man. Among other works
he wrote " Walks and Sketches at the Cape of Good Hope" (London,
1803); " Charles Ellis, or the Friends," a novel (1806) ; " A Journey
through Spain and Italy " (2 vols., 1807) ; "Spanish Post-Guide"
(1808); "Second Journey in Spain " (1810) ; " State of Caraccas "
(1812) ; and " Tour from Hamburgh " (1814). SEMPLE, Robert
Baylor, clergyman, b. in King and Queen county, Vva., 20 Jan., 1769;
d. in Fredericksburg, Va., 25 Dec., 1831. After receiving a good
education he taught in a private family and then began to study law,
but abandoned it and devoted himself to the ministry. In 1790 he
was chosen pastor of the Bruington Baptist church, and he
continued in this relation until his death. He soon became one of the
most useful and popular men in Virginia, performed frequent and
extensive preaching tours, and with equal vigor and wisdom
promoted the new enterprises of benevolence that were beginning
to attract the attention of his denomination. The interests of
missions and education found in him a powerful friend. He received
many testimonies of public confidence and esteem. He was for some
time financial agent of Columbian college, and president of its board
of trustees, declined an invitation to the presidency of Transylvania
university in 1805, and in 1820 was elected president of the Baptist
triennial convention, continuing to hold this office until his death. He
received the honorary degree of D. D. from Brown in 1816. Dr.
Semple was the author of a "Catechism" (1809); a "History of
Virginia Baptists" (1810); "Memoir of Elder Straughan"; "Letters to
Alexander Campbell," etc. SENECAL, Louis Adelard, Canadian
senator, b. in Varennes, Lower Canada. 10 July, 1829 ; d. in
Montreal. 11 Oct., 1887. He was educated in his native place and in
Burlington, Vt., and afterward engaged in business. He was a
member of the Quebec assembly for Drummond and Arthabaska
from 1867 till 1871. and of the Dominion parliament for Yamaska
from 1807 till 1872. and became a member of the Dominion senate,
12 March. 1887. In 1857 he opened to navigation the Yamaska river
between Sorel and St. Airne. and the St. Francis river between Sorel
and St. Francis. He has constructed numerous railways, including the
ice railway on the St. Lawrence from Montreal to Longueuil, which
he worked for two winters. Under his management the Richelieu line
was extended from Hamilton and Toronto to Chicoutimi, a distance
of about 1,000 miles. He was a general superintendent of the
government railways of the province oi Quebec, president of the
North Shore railway, the Montreal City Passenger railway, and the
Richelieu and Ontario navigation company. He was a commander of
the French Legion of honor. SENER, James Beverly, lawyer, b. in
Fredericksburg, Va., 18 May, 1837. He received an academic
preparation, attended lectures at the University of Virginia as a state
student, and was graduated in several of the schools of the
university. He then studied law at Lexington, Va., was admitted to
the bar in March, 1860, and served as sergeant (or sheriff) of the
city of Fredericksburg, Va., in 1863-'5. He was army correspondent
of ,he Southern associated press, with Gen. Lee's Army }f Northern
Virginia in 1862-'5, and from 1865 till 1875 was editor of the
Fredericksburg " Ledger." VIr. Sener was a delegate from Virginia to
the National Republican conventions of 1872 and 1876 and served
on the National Republican committee 'rom 187ii till 18SO. He was a
member of congress in 1873-'5, and was the chairman of the
committee on expenditures in the department of justice, being the
first chairman of such a committee. He was chief justice of Wyoming
territory from 18 Dec., 1879, till 10 March, 1884. SENEY, Joshua,
member of the Continental congress, b. on the eastern shore of
Maryland in 1750; d. there in 1799. He was educated by private
tutors, engaged in planting, and supported the patriot cause during
the Revolution. He was a member of the Continental congress in
1787-'8, and of the 1st congress in 1789, and served by reelection
till 1 May, 1792, when he resigned. He was a presidential elector in
that year, supporting Washington and Adams. He married Frances,
daughter of Com. James Nicholson. — His grandson, George
Ingraliam, banker, b. in Astoria. L. I., 12 May, 1826 ; d. in New York,
7 April, 1893. was the son of a clergyman of the Methodist Episcopal
church. Georg'e was a student in 1845 at Wesleyan, from which he
received the degree of A. M. in 1866, was graduated at the
University of the city of New York in 1847. entered the banking
business, and rose from the post of paying-teller in the Metropolitan
bank, New York city, to the presidency of that institution, holding the
latter office in 1877-'84, when the bank was suspended and Mr.
Seney lost a fortune of several million dollars, a large part of which
he afterward regained. His contributions to charitable and
educational institutions include $410,000 to the Methodist general
hospital of Brooklyn, $100,000 to the Long Island historical society,
.|250,000 to Emory college and Wesleyan female college, Macon,
Ga., and $100,000 to benevolent objects in Brooklyn. He founded
the Seney scholarships and largely endowed Wesleyan university,
and had contributed to miscellaneous charities more than $400.000.
His gallery of pictures, numbering 285 specimens, was sold in 1885,
and he also presented several valuable paintings to the Metropolitan
museum of art, New York city. SENTER, Isaac, physician, b. in New
Hampshire in 1755 ; d. in Newport, R. L. 20 Dec., 1799. He went to
Newport. R. L, early in life, studied medicine with Dr. Thomas
MofEat, was a surgeon in the Revolutionary army, and accompanied
Benedict Arnold's expedition to Quebec, an interesting account of
which he published in the " Bulletin of the Historical S_oeiety of
Pennsylvania." He afterward practised in Pawtucket, but finally
settled in Newport, and became one of the most eminent surgeons
and practitioners in the state. He was an honorary member of the
medical societies of London, Edinburgh, and Massachusetts, and for
many years was president of the Society of the Cincinnati of Rhode
Island. He contributed to the medical journals, and published "
Remarks on Phthisis Pulmonalis " in the " Transactions of the College
of Physicians of Philadelphia" (1795). SEPTENVILLE, Charles Edourd
Langlois (say-tong-veal), Baron de, French author, b. in Paris. 17
Nov., 1835. He inherited a fortune, and devoted himself to historical
researches, especially upon the early history of South America. In
March, 1876, he was elected a deputy by the city of Amiens, and he
is member of various learned
The text on this page is estimated to be only 27.39%
accurate

462 SERCEY SERGEANT societies, including the Antiquaires


de France, the Historical institute of Rio Janeiro, and the
Archieological society of Madrid. Septenville's works include, besides
numerous valuable articles in historical magazines, " Victoires et
conquetes de i'Kspagne depuis 1'occupation des Maures jusqu'a nos
jours " (3 vols., Paris, 1862) ; " Decouvertes et conquetes du
Portugal dans les deux mondes " (2 vols., 1863); " Le Bresil sous la
domination Portugaise " (1872) ; and " Pastes inilitaires et maritimes
du Portugal " (2 vols., 1879). SERCEY, Pierre Cesar Charles
Guillaiime, Marquis de. French naval officer, b. near Autun, 26 April,
1753; d. in Paris, 10 Aug., 1836. He entered the navy in 1766, was
commissioned ensign in May, 1779, and served under the Count de
Guiehen. For his participation in several dangerous enterprises
during the siege of Pensacola, Fla., he was made lieutenant and
given the cross of St. Louis. On his return to France he was ordered
to the command of " La Surveillante " in 1790, and sailed for
Martinique. He was promoted captain in 1792, and in January, 1793,
was ordered to convoy to France all the merchant vessels in those
waters. He had collected more than fifty ships laden with valuable
cargoes, when the rising of the negroes in Santo Domingo occurred.
He rescued 6,000 of the colonists. As his scanty supply of provisions
and the feebleness of his naval force did not permit of his attempting
to cross the Atlantic, he set sail for the coast of New England, where
he arrived in safety. On his return to France in December he was
imprisoned for six months for being of noble birth. In December,
1795, he was given command of the naval force that was detailed to
accompany the two civil commissioners that were charged with the
execution of the decree giving liberty to the blacks in Mauritius and
Reunion. Sercey, fearing that scenes similar to those he had
witnessed at Santo Domingo might be enacted there, warned the
colonists of the nature of the commissioners' errand, and they were
in consequence not allowed to land. In 1804, at his earnest request,
he was placed on the retired list, and sailed for the Mauritius, which
he gallantly defended against the English in 1810. On the declaration
of peace in 1814 he was appointed president of the commission to
negotiate in England for the exchange of French prisoners. On his
return to France he was promoted vice-admiral, again placed on the
retired list in April, 1832, and became a member of the house of
peers. SERGEANT, John, missionary, b. in Newark, N. J., in 1710; d.
in Stockbridge," Mass., 27 July, 1749. His grandfather, Jonathan, was
a founder of Newark in 1667. John was graduated at Yale in 1729,
and served as tutor there in 1731-'5. He began to preach to the
Indians at Housatonic, in western Massachusetts, in 1734, and the
next year permanently settled among them and taught them in their
own language. In 1736, when the genrral court purchased of the
Indians all the land at Skatehook, and in return granted them the
to\vnsliip which is now called Stockbridge, he was made owner of
one sixtieth part, and ordained " settled missionary to the Indians"
there and at Kaunaumeek. A short time before his death he
establi>ln'.l a manual-labor school at Stockbridge that was in
successful operation several years. He translated into the Indian
language parts of the Old Testament and all of the New except the
book of Revelation, and published a "Letter on the Indians " (1743)
and " A' Sermon " (1743). — His son, Erastns, physician, b. in
Stockbridge, Mass., 7 Aug., 1742; d. there, 14 Nov., I'M I, IM— i-d
two years at Princeton, and studied medicine with his uncle. Dr.
Thomas Williams, in Decrfield, Mass. He then settled in Stockbridge,
and was the first practitioner in that town. He was a skilful surgeon,
and the principal operator within a circle of thirty miles radius. He
entered the Revolutionary army in 1775 as major of the 7th
Massachusetts regiment, and served with it on Lake Champlain from
December, 1776, till April, 1777, and subsequently till Burgoyne's
surrender. — Another son of John, John, missionary, b. in
Stockbridge. Mass., in 1747; d. there, 8 Sept.. 1824, studied at
Princeton two years, was ordained to the ministry of the
Congregational church, and in 1775 took charge of the Indian part of
the Stockbridge congregation. When they removed to New
Stockbridge, N. Y., he followed them and labored among them until
his death. One of his daughters established a temperance society for
Indian women. Mr. Sergeant possessed little worldly wisdom, and
was better known for his useful and blameless life than for his
intellectual gifts, but he exercised great influence among the Indian
tribes, and, on hearing of his expected death, one of the chiefs said
: " We feel as if our sun was setting, and we do not know what
darkness will succeed." — The first John's nephew, Jonathan
Dickinson, lawyer, b. in Newark. N. J., in 1746 ; d. in Philadelphia,
Pa.. 8 Oct., 1793, was the grandson of Jonathan Dickinson, the first
president of Princeton. He was graduated there in 1762, studied law,
and began practice in his native state. He took his seat in the
Continental congress a few days after the signing of the Declaration
of Independence, served in 1776-'7, and in July, 1777, became
attorney - general of Pennsylvania. In 1778, congress having
ordered a court-martial for the trial of Gen. Arthur St. Clair and other
officers in relation to the evacuation of Ticonderoga. he was
appointed by that body, with William Patterson, of New Jersey, to
assist the judge-advocate in the conduct of the trial. He resigned the
office of attorney-general in 1780, settled in his profession in
Philadelphia, was counsel for the state of Pennsylvania in the
controversy with Connecticut concerning the Wyoming lands in 1782,
and was conspicuous in the management of many other important
cases. When the yellow fever visited Philadelphia in 1793 he was
appointed one of the health committee, and in consequence refused
to leave the city. He distributed large sums among the poor, nursed
the sick, and was active in sanitary measures, but fell a victim to the
epidemic.— Jonathan Dickinson's son. John, lawyer, b. in
Philadelphia, 5 Dec., 1779; d. there, 25 Nov., 1852, was graduated at
Princeton in 1795. and, abandoning his intention to become a
merchant, studied law, and was admitted to the Philadelphia bar in
17!M. For more than half a century he was known throughout the
country as one of the most honorable and learned members of his
|>ro|V»ion and its acknowledged leader in Philadelphia. He
The text on this page is estimated to be only 27.95%
accurate

SERNA SERRA 463 entered public life in 1801, when he was


appointed (•Miiiml>-i'>ii(jr df bankruptcy by Thomas Jefferson, was
a member of the legislature in 1808-'10, and of congress in 1815-
'23, 1827-9, and 1837-'4','. In 1820 he was active in securing the
passage of the Missouri compromise. He was appointed one of the
two envoys in 1826 to the Panama congress, was president of the
Pennsylvania constitutional convention in 1830, and Whig candidate
for the vicepresidency on the ticket with Henry Clay in 1832. He
declined the mission to England in 1841, and his last public service
was that of arbitrator to determine a long-pending controversy. The
question at issue concerned the title to Pea Patch island as derived
by the United States from the state of Delaware, and by James
Humphrey claiming through Henry Gale from the state of New
Jersey. This involved the question of the boundary between the two
states, or, in other words, the claim to Delaware river, and the
decision in favor of the United States incidentally decided the
boundary dispute in favor of Delaware. — Another son of Jonathan
Dickinson, Thomas, jurist, b. in Philadelphia, Pa., 14 Jan., 17S2; d.
there, 8 May. 1860, was graduated at Princeton in 1798, studied law
under Jared Ingersoll, and was admitted to the bar of Philadelphia in
1802. He was in the legislature in 1812-'14, in the latter year was
appointed associate justice of the district court of Philadelphia, and
was secretary of the commonwealth in 1817-'19. While holding that
office he began the formation of the state law library at Harrisburg.
He was attorney-general in 1819-'20, postmaster of Philadelphia
in"l828-'32, and in February, 1834, became associate-justice of the
state supreme court, which office he held till his resignation in 1846.
His judicial decisions were esteemed for their brevity, clearness, and
accuracy, and it is said that he was the only judge that e'rer sat on
the Pennsylvania bench not one of whose decisions was reversed.
He was the chief expounder of the limited equity jurisdiction of the
court, and was of service in bringing this into an intelligible and
convenient shape. He returned to the bar in 1847, and successfully
practised until the failure of his health compelled his gradual
abandonment of professional labor. He was provost of the
lawacademy of Philadelphia in 1844-'5o. for many years president of
the Pennsylvania historical society, a member of the American
philosophical society, and a trustee of the University of Pennsylvania.
He married, on 14 Sept., 1812, Sarah Bache, a granddaughter of
Benjamin Franklin. His publications include " Treatise upon the Law
of Pennsylvania relative to the Proceedings by Foreign Attachment "
(Philadelphia, 1811); "Report of Cases adjudged in the Supreme
Court of Pennsylvania," with William Rawle, Jr. (17 vols., 1814-'29);
"Constitutional Law " (1822) ; "Sketch of the National Judiciary
Powers exercised in the United States Prior to the Adoption of the
Present Federal Constitution " (1824) ; and " View of the Land Laws
of Pennsylvania " (1838). SERNA, Jos6 de la (sair-nah). last viceroy
of Peru, b. in Jerez de la Frontera, Spain, in 1770; d. in Cadiz in
1832. At an early age he entered the army, seeing his first service as
a cadet in the defence of Ceuta against the Moors in 1784. He
served afterward against the French in Catalonia in 1795, under
Admiral Mazarredo against the British in 1797, and in the second
siege of Saragossa in 1809, where he was captured and carried to
France as a prisoner. Soon he escaped, and, after travelling for some
time in Switzerland and the Orient, returned in 1811 to Spain, and
served under Wellington till the expulsion of the French in 1813. In
1816 he held the rank of major-general and was appointed to take
command in Peru. He arrived on 22 Sept. in Callao, and, proceeding
at once to upper Peru, took charge of the army in Cotagaita on 12
Nov. The viceroy urged Serna to begin offensive operations against
the province of Tucuman, which was occupied by the Argentine
patriots. Serna objected to the' insufficiency of his forces, but
Pezuela insisted, when suddenly they were surprised by the
victorious march of San Martin across the Andes and the reconquest
of Chili. The army of upper Peru was henceforth reduced to a
defensive warfare against the insurrectionary movements in several
parts of the country. Serna's opposition to the viceroy increased, and
at last he asked for permission to retire to Spain. His leave of
absence arrived in May, 1819, and in September he resigned the
command of the army to Gen. Canterac. On his arrival in Lima in
December, his partisans made a demonstration in favor of not
allowing Serna to leave Peru on the eve of a threatened invasion
from Chili, and the viceroy, to avoid disagreement, promoted him
lieutenant-general and appointed him president of a consulting
council of war. After the landing of San Martin in Pisco, 8 Sept.,
1820, Serna, through secret machinations, obtained an appointment
as Commander-in-chief of the army that was gathered at
Aznapuquio, to protect the capital against the advance of San
Martin, and was ordered by the viceroy to march to Chancay. On 29
Jan., "1821. the principal officers of the camp, partisans of Serna,
presented a petition to the viceroy, requesting him to resign in favor
of the latter.' Pezuela refused, and ordered Serna to subdue the
mutiny ; but the latter pretended to be unable to do so, and. after
vain resistance, the viceroy delivered to him the executive on the
evening of the same day. When San Martin threatened the capital, a
Spanish commissioner, Capt. Manuel Abreu, arrived from Europe
with orders to negotiate for a pacific arrangement, and Serna sent
him to make proposals to San Martin. The negotiations lasted from
•} May till 24 June, but produced no result, and on the next day
hostilities began again. As the situation became daily more
dangerous, Serna abandoned the capital on 6 July, 1821, and retired
to Jauja, where he reorganized his army, sending Gen. Canterac on
24 Aug. with a force of 4.000 men to relieve Callao. Afterward Serna
established his headquarters at Cuzco, but after a campaign of
variable success there were dissensions in the army, and Gen.
Olafieta refused obedience and maintained an independent position
in upper Peru. Canterac was defeated on 6 Aug., 1824, by Bolivar, at
Junin. The viceroy now resolved to crush the patriot army by a
supreme effort, and left Cuzco in October with a well-disciplined
army of 10.000 infantry and 1,600 cavalry. He met the patriot army
in the mountain plain of Ayacucho on 8 Dec., and on the next day
was totally defeated by Gen. Sucre and wounded and taken prisoner.
The Spanish army lost 2,000 wounded and dead and 3,000
prisoners, and as the rest was entirely dispersed, Gen. Canterac, the
second in command, signed an honorable capitulation the next day,
and the viceroy, who on the date of the battle had been created by
the king Count de los Andes, was soon afterward permitted to sail
for Europe. He was honorably received at court, his administration
was approved, and he was appointed captain-general of several
provinces. SERRA, Angel (sair-rah), Mexican linguist, b. in Zitacuaro,
Michoacan, about 1640; d. in Queretaro about 1700. He entered the
Franciscan order
in Mexico, and
became guardian of the Convent of San Pedro y San Pablo, where he
studied the Tarasco language, in which he soon became the
recognized authority in Mexico. Wishing to utilize his knowledge, he
was sent to the Sierra Gorda as missionary to the Indians, and was
appointed parish priest of Charapan, and afterward of Queretaro. He
wrote "Manual Trilingue, Latino, Castellano y Tarasco, para
administrar los Sacramentos a los Espanoles y a los Indios" (Mexico,
1697); "El Catecismo del P. Bartolome Castano, traducido al
Tarasco" (Queretaro, 1699); and "Arte, Diccionario y Confesionario
en Tarasco," which was ready for publication at the author's death.
SERRANO Y DOMINGUEZ, Francisco, Duke de la Torre, Spanish
soldier, b. at San Fernando, near Cadiz, 17 Oct., 1810; d. in Madrid,
26 Nov., 1885. He was the son of a Spanish general, entered the
military college as a cadet in 1822, and in 1825 became ensign. He
served till 1833 in the coastguard, but after the death of Ferdinand
VII. he espoused the cause of the child-queen. Isabella II. He was
promoted in 1840 major-general and second chief of the captaincy-
general of Valencia, and in 1843 elected to the cortes, of which he
became vice-president. He joined in the overthrow of the regency of
Espartero on 24 July, and the declaration that Queen Isabella was of
age. In November of the same year he was for ten days minister of
war, in 1845 he became lieutenant-general and senator, and after
the young queen's marriage in 1846 he obtained such influence over
her that a public scandal followed, and he was appointed
captain-general of Granada. In order to bring him to Madrid again,
the queen appointed him inspector-general of cavalry and captain-
general of New Castile; he took part in several short-lived ministries
and many military pronunciamientos, and in February, 1854, was
exiled for participation in the insurrection of Saragossa. In June he
returned to take part in the successful revolution under Espartero
and O'Donnell, and in July, 1856, he joined the latter in his
successful i-nii/i il'ttat, and was sent in 1857 as ambassador to Paris.
In 1860 he went as captain-general to Cuba, and during his
administration the annexation of Santo Domingo to the Spanish
crown was brought about. For this, although it cost the nation
millions of money and thousands of lives, he was created Duke de la
Torre on his return to Spain, and made captain-general of the army.
In 1866 he was imprisoned in Alicante for his protest, as president of
the senate, against the illegal dissolution of the cortes, and in July,
1868, was exiled to the Canary islands, but in September he landed at
Cadiz, and aided in overthrowing the government of Queen Isabella,
vanquishing the royal troops at Alcolea on 28 Sept. On 8 Oct. he
became chief of the provisional government, and on 16 June, 1869,
he was elected regent of the kingdom, which place he occupied till
the acceptation of the crown by Prince Amadeo, who in January,
1871, made him prime minister. In 1872 he took the field as
commander-in-chief against the Carlists, and, after the proclamation
of the republic in 1873, he retired to France. He returned to Spain
toward the end of the year, and after the coup d'etat of Gen. Pavia
was made chief of the executive, 4 Jan., 1874, negotiating privately,
it is thought, with Martinez Campos the restoration of the monarchy
under Alfonso XII. on 9 Jan., 1875. He continued to take an active
part in politics as chief of the right centre, and in 1883 was
appointed ambassador of Spain to France. He married a Cuban lady
of great beauty, and left a son and two daughters. SERRELL, Edward
Wellman, civil engineer, b. in New York city, 5 Nov., 1826. He was
educated at schools in his native city, and then studied surveying
and civil engineering under the direction of an elder brother. In 1845
he became assistant engineer in charge of the Central railroad of
New Jersey, and he subsequently served in a similar capacity on the
construction of other roads. He accompanied the expedition that in
1848 located the route of the railroad between Aspinwall and
Panama, and on his return, a year later, was engaged in building the
suspension-bridge across the Niagara river at Lewiston; also that at
St. Johns, New Brunswick. Mr. Serrell was in charge of the Hoosac
tunnel in 1858, and was concerned in the construction of the Bristol
bridge over Avon river, in England, which had the largest span of any
bridge in that country at the time it was built. At the beginning of