Probability and Computing

Randomization and probabilistic techniques play an important role in modern computer
science, with applications ranging from combinatorial optimization and machine learning
to communication networks and secure protocols.
This textbook provides an indispensable teaching tool to accompany a one- or two-
semester course for advanced undergraduate or beginning graduate students in computer
science and applied mathematics. It offers a comprehensive introduction to the role of ran-
domization and probabilistic techniques in modern computer science, in particular to tech-
niques and paradigms used in the development and probabilistic analysis of algorithms
and for data analysis. It assumes only an elementary background in discrete mathematics
and gives a rigorous yet accessible treatment of the material, with numerous examples and
applications.
The first half of the book covers core material, including random sampling, expecta-
tions, Markov’s inequality, Chebyshev’s inequality, Chernoff bounds, balls-and-bins mod-
els, the probabilistic method, and Markov chains. In the second half, the authors delve
into more advanced topics such as continuous probability, applications of limited indepen-
dence, entropy, Markov chain Monte Carlo methods, coupling, martingales, and balanced
allocations.
This greatly expanded new edition includes several newly added chapters and sec-
tions, covering topics including normal distributions, sample complexity, VC dimension,
Rademacher complexity, power laws and related distributions, cuckoo hashing, and appli-
cations of the Lovász Local Lemma. New material relevant to machine learning and big
data analysis enables students to learn up-to-date techniques and applications. Among the
many new exercises and examples are programming-related exercises that provide students
with practical experience and training related to the theoretical concepts covered in the text.

Michael Mitzenmacher is a Professor of Computer Science in the School of Engineering
and Applied Sciences at Harvard University, where he was also the Area Dean for Com-
puter Science from 2010 to 2013. Michael has authored or co-authored over 200 confer-
ence and journal publications on a variety of topics, including algorithms for the Internet,
efficient hash-based data structures, erasure and error-correcting codes, power laws, and
compression. His work on low-density parity-check codes shared the 2002 IEEE Informa-
tion Theory Society Best Paper Award and won the 2009 ACM SIGCOMM Test of Time
Award. He is an ACM Fellow, and was elected as the Chair of the ACM Special Interest
Group on Algorithms and Computation Theory in 2015.
Eli Upfal is a Professor of Computer Science at Brown University, where he was also the
department chair from 2002 to 2007. Prior to joining Brown in 1998, he was a researcher and
project manager at the IBM Almaden Research Center, and a Professor of Applied Math-
ematics and Computer Science at the Weizmann Institute of Science. His main research
interests are randomized algorithms, probabilistic analysis of algorithms, and computa-
tional statistics, with applications ranging from combinatorial and stochastic optimization
to computational biology and computational finance. He is a Fellow of both the IEEE and the
ACM.
Probability and Computing
Randomization and Probabilistic
Techniques in Algorithms and
Data Analysis

Second Edition

Michael Mitzenmacher Eli Upfal


University Printing House, Cambridge CB2 8BS, United Kingdom
One Liberty Plaza, 20th Floor, New York, NY 10006, USA
477 Williamstown Road, Port Melbourne, VIC 3207, Australia
4843/24, 2nd Floor, Ansari Road, Daryaganj, Delhi - 110002, India
79 Anson Road, #06-04/06, Singapore 079906

Cambridge University Press is part of the University of Cambridge.


It furthers the University’s mission by disseminating knowledge in the pursuit of
education, learning, and research at the highest international levels of excellence.

www.cambridge.org
Information on this title: www.cambridge.org/9781107154889
10.1017/9781316651124
© Michael Mitzenmacher and Eli Upfal 2017
This publication is in copyright. Subject to statutory exception
and to the provisions of relevant collective licensing agreements,
no reproduction of any part may take place without the written
permission of Cambridge University Press.
First published 2017
Printed in the United States of America by Sheridan Books, Inc.
A catalogue record for this publication is available from the British Library.
Library of Congress Cataloging in Publication Data
Names: Mitzenmacher, Michael, 1969– author. | Upfal, Eli, 1954– author.
Title: Probability and computing / Michael Mitzenmacher, Eli Upfal.
Description: Second edition. | Cambridge, United Kingdom ;
New York, NY, USA : Cambridge University Press, [2017] |
Includes bibliographical references and index.
Identifiers: LCCN 2016041654 | ISBN 9781107154889
Subjects: LCSH: Algorithms. | Probabilities. | Stochastic analysis.
Classification: LCC QA274.M574 2017 | DDC 518/.1 – dc23
LC record available at https://lccn.loc.gov/2016041654
ISBN 978-1-107-15488-9 Hardback
Additional resources for this publication at www.cambridge.org/Mitzenmacher.
Cambridge University Press has no responsibility for the persistence or accuracy
of URLs for external or third-party Internet Web sites referred to in this publication
and does not guarantee that any content on such Web sites is, or will remain,
accurate or appropriate.
To

Stephanie, Michaela, Jacqueline, and Chloe


M.M.

Liane, Tamara, and Ilan


E.U.
Contents

Preface to the Second Edition page xv


Preface to the First Edition xvii

1 Events and Probability 1


1.1 Application: Verifying Polynomial Identities 1
1.2 Axioms of Probability 3
1.3 Application: Verifying Matrix Multiplication 8
1.4 Application: Naïve Bayesian Classifier 12
1.5 Application: A Randomized Min-Cut Algorithm 15
1.6 Exercises 17

2 Discrete Random Variables and Expectation 23


2.1 Random Variables and Expectation 23
2.1.1 Linearity of Expectations 25
2.1.2 Jensen’s Inequality 26
2.2 The Bernoulli and Binomial Random Variables 27
2.3 Conditional Expectation 29
2.4 The Geometric Distribution 33
2.4.1 Example: Coupon Collector’s Problem 35
2.5 Application: The Expected Run-Time of Quicksort 37
2.6 Exercises 40

3 Moments and Deviations 47


3.1 Markov’s Inequality 47
3.2 Variance and Moments of a Random Variable 48
3.2.1 Example: Variance of a Binomial Random Variable 51


3.3 Chebyshev’s Inequality 51


3.3.1 Example: Coupon Collector’s Problem 53
3.4 Median and Mean 55
3.5 Application: A Randomized Algorithm for Computing the Median 57
3.5.1 The Algorithm 58
3.5.2 Analysis of the Algorithm 59
3.6 Exercises 62

4 Chernoff and Hoeffding Bounds 66


4.1 Moment Generating Functions 66
4.2 Deriving and Applying Chernoff Bounds 68
4.2.1 Chernoff Bounds for the Sum of Poisson Trials 68
4.2.2 Example: Coin Flips 72
4.2.3 Application: Estimating a Parameter 72
4.3 Better Bounds for Some Special Cases 73
4.4 Application: Set Balancing 76
4.5 The Hoeffding Bound 77
4.6∗ Application: Packet Routing in Sparse Networks 79
4.6.1 Permutation Routing on the Hypercube 80
4.6.2 Permutation Routing on the Butterfly 85
4.7 Exercises 90

5 Balls, Bins, and Random Graphs 97


5.1 Example: The Birthday Paradox 97
5.2 Balls into Bins 99
5.2.1 The Balls-and-Bins Model 99
5.2.2 Application: Bucket Sort 101
5.3 The Poisson Distribution 101
5.3.1 Limit of the Binomial Distribution 105
5.4 The Poisson Approximation 107
5.4.1∗ Example: Coupon Collector’s Problem, Revisited 111
5.5 Application: Hashing 113
5.5.1 Chain Hashing 113
5.5.2 Hashing: Bit Strings 114
5.5.3 Bloom Filters 116
5.5.4 Breaking Symmetry 118
5.6 Random Graphs 119
5.6.1 Random Graph Models 119
5.6.2 Application: Hamiltonian Cycles in Random Graphs 121
5.7 Exercises 127
5.8 An Exploratory Assignment 133

6 The Probabilistic Method 135


6.1 The Basic Counting Argument 135


6.2 The Expectation Argument 137


6.2.1 Application: Finding a Large Cut 138
6.2.2 Application: Maximum Satisfiability 139
6.3 Derandomization Using Conditional Expectations 140
6.4 Sample and Modify 142
6.4.1 Application: Independent Sets 142
6.4.2 Application: Graphs with Large Girth 143
6.5 The Second Moment Method 143
6.5.1 Application: Threshold Behavior in Random Graphs 144
6.6 The Conditional Expectation Inequality 145
6.7 The Lovász Local Lemma 147
6.7.1 Application: Edge-Disjoint Paths 150
6.7.2 Application: Satisfiability 151
6.8∗ Explicit Constructions Using the Local Lemma 152
6.8.1 Application: A Satisfiability Algorithm 152
6.9 Lovász Local Lemma: The General Case 155
6.10∗ The Algorithmic Lovász Local Lemma 158
6.11 Exercises 162

7 Markov Chains and Random Walks 168


7.1 Markov Chains: Definitions and Representations 168
7.1.1 Application: A Randomized Algorithm for 2-Satisfiability 171
7.1.2 Application: A Randomized Algorithm for 3-Satisfiability 174
7.2 Classification of States 178
7.2.1 Example: The Gambler’s Ruin 181
7.3 Stationary Distributions 182
7.3.1 Example: A Simple Queue 188
7.4 Random Walks on Undirected Graphs 189
7.4.1 Application: An s–t Connectivity Algorithm 192
7.5 Parrondo’s Paradox 193
7.6 Exercises 198

8 Continuous Distributions and the Poisson Process 205


8.1 Continuous Random Variables 205
8.1.1 Probability Distributions in R 205
8.1.2 Joint Distributions and Conditional Probability 208
8.2 The Uniform Distribution 210
8.2.1 Additional Properties of the Uniform Distribution 211
8.3 The Exponential Distribution 213
8.3.1 Additional Properties of the Exponential Distribution 214
8.3.2∗ Example: Balls and Bins with Feedback 216
8.4 The Poisson Process 218
8.4.1 Interarrival Distribution 221

8.4.2 Combining and Splitting Poisson Processes 222


8.4.3 Conditional Arrival Time Distribution 224
8.5 Continuous Time Markov Processes 226
8.6 Example: Markovian Queues 229
8.6.1 M/M/1 Queue in Equilibrium 230
8.6.2 M/M/1/K Queue in Equilibrium 233
8.6.3 The Number of Customers in an M/M/∞ Queue 233
8.7 Exercises 236

9 The Normal Distribution 242


9.1 The Normal Distribution 242
9.1.1 The Standard Normal Distribution 242
9.1.2 The General Univariate Normal Distribution 243
9.1.3 The Moment Generating Function 246
9.2∗ Limit of the Binomial Distribution 247
9.3 The Central Limit Theorem 249
9.4∗ Multivariate Normal Distributions 252
9.4.1 Properties of the Multivariate Normal Distribution 255
9.5 Application: Generating Normally Distributed Random Values 256
9.6 Maximum Likelihood Point Estimates 258
9.7 Application: EM Algorithm For a Mixture of Gaussians 261
9.8 Exercises 265

10 Entropy, Randomness, and Information 269


10.1 The Entropy Function 269
10.2 Entropy and Binomial Coefficients 272
10.3 Entropy: A Measure of Randomness 274
10.4 Compression 278
10.5∗ Coding: Shannon’s Theorem 281
10.6 Exercises 290

11 The Monte Carlo Method 297


11.1 The Monte Carlo Method 297
11.2 Application: The DNF Counting Problem 300
11.2.1 The Naïve Approach 300
11.2.2 A Fully Polynomial Randomized Scheme for DNF Counting 302
11.3 From Approximate Sampling to Approximate Counting 304
11.4 The Markov Chain Monte Carlo Method 308
11.4.1 The Metropolis Algorithm 310
11.5 Exercises 312
11.6 An Exploratory Assignment on Minimum Spanning Trees 315


12 Coupling of Markov Chains 317


12.1 Variation Distance and Mixing Time 317
12.2 Coupling 320
12.2.1 Example: Shuffling Cards 321
12.2.2 Example: Random Walks on the Hypercube 322
12.2.3 Example: Independent Sets of Fixed Size 323
12.3 Application: Variation Distance Is Nonincreasing 324
12.4 Geometric Convergence 327
12.5 Application: Approximately Sampling Proper Colorings 328
12.6 Path Coupling 332
12.7 Exercises 336

13 Martingales 341
13.1 Martingales 341
13.2 Stopping Times 343
13.2.1 Example: A Ballot Theorem 345
13.3 Wald’s Equation 346
13.4 Tail Inequalities for Martingales 349
13.5 Applications of the Azuma–Hoeffding Inequality 351
13.5.1 General Formalization 351
13.5.2 Application: Pattern Matching 353
13.5.3 Application: Balls and Bins 354
13.5.4 Application: Chromatic Number 355
13.6 Exercises 355

14 Sample Complexity, VC Dimension, and Rademacher Complexity 361
14.1 The Learning Setting 362
14.2 VC Dimension 363
14.2.1 Additional Examples of VC Dimension 365
14.2.2 Growth Function 366
14.2.3 VC dimension component bounds 368
14.2.4 ε-nets and ε-samples 369
14.3 The ε-net Theorem 370
14.4 Application: PAC Learning 374
14.5 The ε-sample Theorem 377
14.5.1 Application: Agnostic Learning 379
14.5.2 Application: Data Mining 380
14.6 Rademacher Complexity 382
14.6.1 Rademacher Complexity and Sample Error 385


14.6.2 Estimating the Rademacher Complexity 387


14.6.3 Application: Agnostic Learning of a Binary Classification 388
14.7 Exercises 389

15 Pairwise Independence and Universal Hash Functions 392


15.1 Pairwise Independence 392
15.1.1 Example: A Construction of Pairwise Independent Bits 393
15.1.2 Application: Derandomizing an Algorithm for Large Cuts 394
15.1.3 Example: Constructing Pairwise Independent Values Modulo a Prime 395
15.2 Chebyshev’s Inequality for Pairwise Independent Variables 396
15.2.1 Application: Sampling Using Fewer Random Bits 397
15.3 Universal Families of Hash Functions 399
15.3.1 Example: A 2-Universal Family of Hash Functions 401
15.3.2 Example: A Strongly 2-Universal Family of Hash Functions 402
15.3.3 Application: Perfect Hashing 404
15.4 Application: Finding Heavy Hitters in Data Streams 407
15.5 Exercises 411

16 Power Laws and Related Distributions 415


16.1 Power Law Distributions: Basic Definitions and Properties 416
16.2 Power Laws in Language 418
16.2.1 Zipf’s Law and Other Examples 418
16.2.2 Languages via Optimization 419
16.2.3 Monkeys Typing Randomly 419
16.3 Preferential Attachment 420
16.3.1 A Formal Version 422
16.4 Using the Power Law in Algorithm Analysis 425
16.5 Other Related Distributions 427
16.5.1 Lognormal Distributions 427
16.5.2 Power Law with Exponential Cutoff 428
16.6 Exercises 429

17 Balanced Allocations and Cuckoo Hashing 433


17.1 The Power of Two Choices 433
17.1.1 The Upper Bound 433
17.2 Two Choices: The Lower Bound 438
17.3 Applications of the Power of Two Choices 441
17.3.1 Hashing 441
17.3.2 Dynamic Resource Allocation 442
17.4 Cuckoo Hashing 442
17.5 Extending Cuckoo Hashing 452
17.5.1 Cuckoo Hashing with Deletions 452


17.5.2 Handling Failures 453


17.5.3 More Choices and Bigger Bins 454
17.6 Exercises 456

Further Reading 463


Index 464

Note: Asterisks indicate advanced material for this chapter.

Preface to the Second Edition

In the ten years since the publication of the first edition of this book, probabilistic
methods have become even more central to computer science, rising with the growing
importance of massive data analysis, machine learning, and data mining. Many of the
successful applications of these areas rely on algorithms and heuristics that build on
sophisticated probabilistic and statistical insights. Judicious use of these tools requires
a thorough understanding of the underlying mathematical concepts. Most of the new
material in this second edition focuses on these concepts.
The ability in recent years to create, collect, and store massive data sets, such as
the World Wide Web, social networks, and genome data, has led to new challenges in
modeling and analyzing such structures. A good foundation for models and analysis
comes from understanding some standard distributions. Our new chapter on the nor-
mal distribution (also known as the Gaussian distribution) covers the most common
statistical distribution, as usual with an emphasis on how it is used in settings in com-
puter science, such as for tail bounds. However, an interesting phenomenon is that in
many modern data sets, including social networks and the World Wide Web, we do not
see normal distributions, but instead we see distributions with very different proper-
ties, most notably unusually heavy tails. For example, some pages in the World Wide
Web have an unusually large number of pages that link to them, orders of magnitude
larger than the average. The new chapter on power laws and related distributions covers
specific distributions that are important for modeling and understanding these kinds of
modern data sets.
Machine learning is one of the great successes of computer science in recent years,
providing efficient tools for modeling, understanding, and making predictions based on
large data sets. A question that is often overlooked in practical applications of machine
learning is the accuracy of the predictions, and in particular the relation between accu-
racy and the sample size. A rigorous introduction to approaches to these important
questions is presented in a new chapter on sample complexity, VC dimension, and
Rademacher averages.


We have also used the new edition to enhance some of our previous material. For
example, we present some of the recent advances on algorithmic variations of the pow-
erful Lovász local lemma, and we have a new section covering the wonderfully named
and increasingly useful hashing approach known as cuckoo hashing. Finally, in addi-
tion to all of this new material, the new edition includes updates and corrections, and
many new exercises.
We thank the many readers who sent us corrections over the years – unfortunately,
too many to list here!

Preface to the First Edition

Why Randomness?

Why should computer scientists study and use randomness? Computers appear to
behave far too unpredictably as it is! Adding randomness would seemingly be a dis-
advantage, adding further complications to the already challenging task of efficiently
utilizing computers.
Science has learned in the last century to accept randomness as an essential com-
ponent in modeling and analyzing nature. In physics, for example, Newton’s laws led
people to believe that the universe was a deterministic place; given a big enough calcu-
lator and the appropriate initial conditions, one could determine the location of planets
years from now. The development of quantum theory suggests a rather different view;
the universe still behaves according to laws, but the backbone of these laws is proba-
bilistic. “God does not play dice with the universe” was Einstein’s anecdotal objection
to modern quantum mechanics. Nevertheless, the prevailing theory today for subparti-
cle physics is based on random behavior and statistical laws, and randomness plays a
significant role in almost every other field of science ranging from genetics and evolu-
tion in biology to modeling price fluctuations in a free-market economy.
Computer science is no exception. From the highly theoretical notion of probabilis-
tic theorem proving to the very practical design of PC Ethernet cards, randomness
and probabilistic methods play a key role in modern computer science. The last two
decades have witnessed a tremendous growth in the use of probability theory in comput-
ing. Increasingly more advanced and sophisticated probabilistic techniques have been
developed for use within broader and more challenging computer science applications.
In this book, we study the fundamental ways in which randomness comes to bear on
computer science: randomized algorithms and the probabilistic analysis of algorithms.
Randomized algorithms: Randomized algorithms are algorithms that make random
choices during their execution. In practice, a randomized program would use values
generated by a random number generator to decide the next step at several branches
of its execution. For example, the protocol implemented in an Ethernet card uses ran-
dom numbers to decide when it next tries to access the shared Ethernet communication
medium. The randomness is useful for breaking symmetry, preventing different cards
from repeatedly accessing the medium at the same time. Other commonly used applica-
tions of randomized algorithms include Monte Carlo simulations and primality testing
in cryptography. In these and many other important applications, randomized algo-
rithms are significantly more efficient than the best known deterministic solutions.
Furthermore, in most cases the randomized algorithms are also simpler and easier to
program.
These gains come at a price; the answer may have some probability of being incor-
rect, or the efficiency is guaranteed only with some probability. Although it may seem
unusual to design an algorithm that may be incorrect, if the probability of error is suf-
ficiently small then the improvement in speed or memory requirements may well be
worthwhile.
Probabilistic analysis of algorithms: Complexity theory tries to classify computa-
tion problems according to their computational complexity, in particular distinguishing
between easy and hard problems. For example, complexity theory shows that the Trav-
eling Salesman problem is NP-hard. It is therefore very unlikely that we will ever know
an algorithm that can solve any instance of the Traveling Salesman problem in time that
is subexponential in the number of cities. An embarrassing phenomenon for the clas-
sical worst-case complexity theory is that the problems it classifies as hard to compute
are often easy to solve in practice. Probabilistic analysis gives a theoretical explanation
for this phenomenon. Although these problems may be hard to solve on some set of
pathological inputs, on most inputs (in particular, those that occur in real-life applica-
tions) the problem is actually easy to solve. More precisely, if we think of the input as
being randomly selected according to some probability distribution on the collection of
all possible inputs, we are very likely to obtain a problem instance that is easy to solve,
and instances that are hard to solve appear with relatively small probability. Probabilis-
tic analysis of algorithms is the method of studying how algorithms perform when the
input is taken from a well-defined probabilistic space. As we will see, even NP-hard
problems might have algorithms that are extremely efficient on almost all inputs.

The Book

This textbook is designed to accompany one- or two-semester courses for advanced
undergraduate or beginning graduate students in computer science and applied math-
ematics. The study of randomized and probabilistic techniques in most leading uni-
versities has moved from being the subject of an advanced graduate seminar meant
for theoreticians to being a regular course geared generally to advanced undergraduate
and beginning graduate students. There are a number of excellent advanced, research-
oriented books on this subject, but there is a clear need for an introductory textbook.
We hope that our book satisfies this need.
The textbook has developed from courses on probabilistic methods in computer sci-
ence taught at Brown (CS 155) and Harvard (CS 223) in recent years. The emphasis in
these courses and in this textbook is on the probabilistic techniques and paradigms, not
on particular applications. Each chapter of the book is devoted to one such method or
technique. Techniques are clarified through examples based on analyzing randomized
algorithms or developing probabilistic analysis of algorithms on random inputs. Many
of these examples are derived from problems in networking, reflecting a prominent
trend in the networking field (and the taste of the authors).
The book contains seventeen chapters. We may view the book as being divided into
two parts, where the first part (Chapters 1–7) comprises what we believe is core mate-
rial. The book assumes only a basic familiarity with probability theory, equivalent to
what is covered in a standard course on discrete mathematics for computer scientists.
Chapters 1–3 review this elementary probability theory while introducing some inter-
esting applications. Topics covered include random sampling, expectation, Markov’s
inequality, variance, and Chebyshev’s inequality. If the class has sufficient background
in probability, then these chapters can be taught quickly. We do not suggest skipping
them, however, because they introduce the concepts of randomized algorithms and
probabilistic analysis of algorithms and also contain several examples that are used
throughout the text.
Chapters 4–7 cover more advanced topics, including Chernoff bounds, balls-and-
bins models, the probabilistic method, and Markov chains. The material in these chap-
ters is more challenging than in the initial chapters. Sections that are particularly chal-
lenging (and hence that the instructor may want to consider skipping) are marked with
an asterisk. The core material in the first seven chapters may constitute the bulk of a
quarter- or semester-long course, depending on the pace.
The second part of the book (Chapters 8–17) covers additional advanced material
that can be used either to fill out the basic course as necessary or for a more advanced
second course. These chapters are largely self-contained, so the instructor can choose
the topics best suited to the class. The chapters on continuous probability and entropy
are perhaps the most appropriate for incorporating into the basic course. Our intro-
duction to continuous probability (Chapter 8) focuses on uniform and exponential
distributions, including examples from queueing theory. Our examination of entropy
(Chapter 10) shows how randomness can be measured and how entropy arises naturally
in the context of randomness extraction, compression, and coding.
Chapters 11 and 12 cover the Monte Carlo method and coupling, respectively; these
chapters are closely related and are best taught together. Chapter 13, on martingales,
covers important issues on dealing with dependent random variables, a theme that con-
tinues in a different vein in Chapter 15 with the development of pairwise independence
and derandomization. Finally, the chapter on balanced allocations (Chapter 17) covers
a topic close to the authors’ hearts and ties in nicely with Chapter 5 concerning analysis
of balls-and-bins problems.
The order of the subjects, especially in the first part of the book, corresponds to
their relative importance in the algorithmic literature. Thus, for example, the study
of Chernoff bounds precedes more fundamental probability concepts such as Markov
chains. However, instructors may choose to teach the chapters in a different order. A
course with more emphasis on general stochastic processes, for example, may teach
Markov chains (Chapter 7) immediately after Chapters 1–3, following with the chapter
on balls, bins, and random graphs (Chapter 5, omitting the Hamiltonian cycle exam-
ple). Chapter 6 on the probabilistic method could then be skipped, following instead
with continuous probability and the Poisson process (Chapter 8). The material from
Chapter 4 on Chernoff bounds, however, is needed for most of the remaining material.
Most of the exercises in the book are theoretical, but we have included some pro-
gramming exercises – including two more extensive exploratory assignments that
require some programming. We have found that occasional programming exercises are
often helpful in reinforcing the book’s ideas and in adding some variety to the course.
We have decided to restrict the material in this book to methods and techniques based
on rigorous mathematical analysis; with few exceptions, all claims in this book are fol-
lowed by full proofs. Obviously, many extremely useful probabilistic methods do not
fall within this strict category. For example, in the important area of Monte Carlo meth-
ods, most practical solutions are heuristics that have been demonstrated to be effective
and efficient by experimental evaluation rather than by rigorous mathematical analy-
sis. We have taken the view that, in order to best apply and understand the strengths
and weaknesses of heuristic methods, a firm grasp of underlying probability theory and
rigorous techniques – as we present in this book – is necessary. We hope that students
will appreciate this point of view by the end of the course.

Acknowledgments

Our first thanks go to the many probabilists and computer scientists who developed
the beautiful material covered in this book. We chose not to overload the textbook
with numerous references to the original papers. Instead, we provide a reference list
that includes a number of excellent books giving background material as well as more
advanced discussion of the topics covered here.
The book owes a great deal to the comments and feedback of students and teaching
assistants who took the courses CS 155 at Brown and CS 223 at Harvard. In particular
we wish to thank Aris Anagnostopoulos, Eden Hochbaum, Rob Hunter, and Adam
Kirsch, all of whom read and commented on early drafts of the book.
Special thanks to Dick Karp, who used a draft of the book in teaching CS 174 at
Berkeley during fall 2003. His early comments and corrections were most valuable in
improving the manuscript. Peter Bartlett taught CS 174 at Berkeley in spring 2004, also
providing many corrections and useful comments.
We thank our colleagues who carefully read parts of the manuscript, pointed out
many errors, and suggested important improvements in content and presentation: Artur
Czumaj, Alan Frieze, Claire Kenyon, Joe Marks, Salil Vadhan, Eric Vigoda, and the
anonymous reviewers who read the manuscript for the publisher.
We also thank Rajeev Motwani and Prabhakar Raghavan for allowing us to use some
of the exercises in their excellent book Randomized Algorithms.
We are grateful to Lauren Cowles of Cambridge University Press for her editorial
help and advice in preparing and organizing the manuscript.
Writing of this book was supported in part by NSF ITR Grant no. CCR-0121154.

chapter one
Events and Probability

This chapter introduces the notion of randomized algorithms and reviews some basic
concepts of probability theory in the context of analyzing the performance of simple
randomized algorithms for verifying algebraic identities and finding a minimum cut-set
in a graph.

1.1. Application: Verifying Polynomial Identities

Computers can sometimes make mistakes, due for example to incorrect programming
or hardware failure. It would be useful to have simple ways to double-check the results
of computations. For some problems, we can use randomness to efficiently verify the
correctness of an output.
Suppose we have a program that multiplies together monomials. Consider the prob-
lem of verifying the following identity, which might be output by our program:
(x + 1)(x − 2)(x + 3)(x − 4)(x + 5)(x − 6) ≟ x^6 − 7x^3 + 25.
There is an easy way to verify whether the identity is correct: multiply together the
terms on the left-hand side and see if the resulting polynomial matches the right-hand
side. In this example, when we multiply all the constant terms on the left, the result
does not match the constant term on the right, so the identity cannot be valid. More
generally, given two polynomials F (x) and G(x), we can verify the identity
F(x) ≟ G(x)
by converting the two polynomials to their canonical forms ∑_{i=0}^{d} c_i x^i; two polynomi-
als are equivalent if and only if all the coefficients in their canonical forms are equal.
From this point on let us assume that, as in our example, F(x) is given as a product
F(x) = ∏_{i=1}^{d} (x − a_i) and G(x) is given in its canonical form. Transforming F(x) to
its canonical form by consecutively multiplying the ith monomial with the product of
the first i − 1 monomials requires Θ(d^2) multiplications of coefficients. We assume in
what follows that each multiplication can be performed in constant time, although if
the products of the coefficients grow large then it could conceivably require more than
constant time to add and multiply numbers together.
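
To make the quadratic cost concrete, here is a minimal Python sketch (our own illustration; the function name and list representation are not from the text) that expands a product of monomials into canonical form by multiplying them in one at a time, so that the ith step costs O(i) coefficient multiplications and the total is Θ(d^2):

    def expand_product(roots):
        # Expand F(x) = (x - a_1)...(x - a_d) into canonical coefficients
        # [c_0, c_1, ..., c_d], listed from lowest degree to highest.
        coeffs = [1]  # the empty product is the constant polynomial 1
        for a in roots:
            # Multiply the current polynomial by (x - a): the x factor shifts
            # every coefficient up one degree; the -a factor scales them.
            shifted = [0] + coeffs
            scaled = [-a * c for c in coeffs] + [0]
            coeffs = [s + t for s, t in zip(shifted, scaled)]
        return coeffs

For the identity above, expand_product([-1, 2, -3, 4, -5, 6]) expands the left-hand side, and comparing the result against the right-hand side's coefficients is exactly the deterministic check just described.
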
So far, we have not said anything particularly interesting. To check whether the
computer program has multiplied monomials together correctly, we have suggested
multiplying the monomials together again to check the result. Our approach for check-
ing the program is to write another program that does essentially the same thing we
expect the first program to do. This is certainly one way to double-check a program:
write a second program that does the same thing, and make sure they agree. There
are at least two problems with this approach, both stemming from the idea that there
should be a difference between checking a given answer and recomputing it. First, if
there is a bug in the program that multiplies monomials, the same bug may occur in
the checking program. (Suppose that the checking program was written by the same
person who wrote the original program!) Second, it stands to reason that we would like
to check the answer in less time than it takes to try to solve the original problem all over
again.
Let us instead utilize randomness to obtain a faster method to verify the identity. We
informally explain the algorithm and then set up the formal mathematical framework
for analyzing the algorithm.
Assume that the maximum degree, or the largest exponent of x, in F (x) and G(x) is
d. The algorithm chooses an integer r uniformly at random in the range {1, . . . , 100d},
where by “uniformly at random” we mean that all integers are equally likely to be
chosen. The algorithm then computes the values F(r) and G(r). If F(r) ≠ G(r) the
algorithm decides that the two polynomials are not equivalent, and if F(r) = G(r) the
algorithm decides that the two polynomials are equivalent.
Suppose that in one computation step the algorithm can generate an integer chosen
uniformly at random in the range {1, . . . , 100d}. Computing the values of F (r) and
G(r) can be done in O(d) time, which is faster than computing the canonical form of
F(x). The randomized algorithm, however, may give a wrong answer.
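
As a minimal sketch of one run of this test (our own code with hypothetical names, assuming F is given by its roots and G by its canonical coefficients), both evaluations take O(d) time, the canonical one via Horner's rule:

    import random

    def eval_product(roots, r):
        # Evaluate F(r) = (r - a_1)...(r - a_d) directly from the product form.
        value = 1
        for a in roots:
            value *= r - a
        return value

    def eval_canonical(coeffs, r):
        # Evaluate G(r) from [c_0, ..., c_d] by Horner's rule.
        value = 0
        for c in reversed(coeffs):
            value = value * r + c
        return value

    def looks_equivalent(roots, coeffs):
        # One run of the randomized test; when F and G are not equivalent,
        # it wrongly returns True with probability at most 1/100.
        d = len(roots)
        r = random.randint(1, 100 * d)  # uniform over {1, ..., 100d}
        return eval_product(roots, r) == eval_canonical(coeffs, r)

For the opening example, the call would be looks_equivalent([-1, 2, -3, 4, -5, 6], [25, 0, 0, -7, 0, 0, 1]).
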
How can the algorithm give the wrong answer?
If F (x) ≡ G(x), then the algorithm gives the correct answer, since it will find that
F (r) = G(r) for any value of r.
If F(x) ≢ G(x) and F(r) ≠ G(r), then the algorithm gives the correct answer since
it has found a case where F(x) and G(x) disagree. Thus, when the algorithm decides
that the two polynomials are not the same, the answer is always correct.
If F(x) ≢ G(x) and F(r) = G(r), the algorithm gives the wrong answer. In other
words, it is possible that the algorithm decides that the two polynomials are the
same when they are not. For this error to occur, r must be a root of the equation
F(x) − G(x) = 0. The degree of the polynomial F(x) − G(x) is no larger than d and,
by the fundamental theorem of algebra, a polynomial of degree up to d has no more
than d roots. Thus, if F(x) ≢ G(x), then there are no more than d values in the
range {1, . . . , 100d} for which F(r) = G(r). Since there are 100d values in the range
{1, . . . , 100d}, the chance that the algorithm chooses such a value and returns a wrong
answer is no more than 1/100.

1.2. Axioms of Probability

We turn now to a formal mathematical setting for analyzing the randomized algorithm.
Any probabilistic statement must refer to the underlying probability space.
Definition 1.1: A probability space has three components:
1. a sample space Ω, which is the set of all possible outcomes of the random process
modeled by the probability space;
2. a family of sets F representing the allowable events, where each set in F is a subset¹
of the sample space Ω; and
3. a probability function Pr : F → R satisfying Definition 1.2.
An element of Ω is called a simple or elementary event.
In the randomized algorithm for verifying polynomial identities, the sample space
is the set of integers {1, . . . , 100d}. Each choice of an integer r in this range is a simple
event.
Definition 1.2: A probability function is any function Pr : F → R that satisfies the
following conditions:
1. for any event E, 0 ≤ Pr(E) ≤ 1;
2. Pr(Ω) = 1; and
3. for any finite or countably infinite sequence of pairwise mutually disjoint events
E1, E2, E3, . . . ,

Pr(⋃_{i≥1} E_i) = ∑_{i≥1} Pr(E_i).

In most of this book we will use discrete probability spaces. In a discrete probability
space the sample space Ω is finite or countably infinite, and the family F of allow-
able events consists of all subsets of Ω. In a discrete probability space, the probability
function is uniquely defined by the probabilities of the simple events.
Again, in the randomized algorithm for verifying polynomial identities, each choice
of an integer r is a simple event. Since the algorithm chooses the integer uniformly at
random, all simple events have equal probability. The sample space has 100d simple
events, and the sum of the probabilities of all simple events must be 1. Therefore each
simple event has probability 1/100d.
Because events are sets, we use standard set theory notation to express combinations
of events. We write E1 ∩ E2 for the occurrence of both E1 and E2 and write E1 ∪ E2 for
the occurrence of either E1 or E2 (or both). For example, suppose we roll two dice. If
E1 is the event that the first die is a 1 and E2 is the event that the second die is a 1, then
E1 ∩ E2 denotes the event that both dice are 1 while E1 ∪ E2 denotes the event that at
least one of the two dice lands on 1. Similarly, we write E1 − E2 for the occurrence

¹ In a discrete probability space F = 2^Ω. Otherwise, and introductory readers may skip this point, since the events
need to be measurable, F must include the empty set and be closed under complement and union and intersection
of countably many sets (a σ-algebra).


of an event that is in E1 but not in E2 . With the same dice example, E1 − E2 consists
of the event where the first die is a 1 and the second die is not. We use the notation Ē
as shorthand for Ω − E; for example, if E is the event that we obtain an even number
when rolling a die, then Ē is the event that we obtain an odd number.
Definition 1.2 yields the following obvious lemma.

Lemma 1.1: For any two events E1 and E2 ,

Pr(E1 ∪ E2 ) = Pr(E1 ) + Pr(E2 ) − Pr(E1 ∩ E2 ).

Proof: From the definition,

Pr(E1) = Pr(E1 − (E1 ∩ E2)) + Pr(E1 ∩ E2),
Pr(E2) = Pr(E2 − (E1 ∩ E2)) + Pr(E1 ∩ E2),
Pr(E1 ∪ E2) = Pr(E1 − (E1 ∩ E2)) + Pr(E2 − (E1 ∩ E2)) + Pr(E1 ∩ E2).

The lemma easily follows. 

A consequence of Definition 1.2 is known as the union bound. Although it is very
simple, it is tremendously useful.

Lemma 1.2: For any finite or countably infinite sequence of events E1, E2, . . . ,

Pr(⋃_{i≥1} E_i) ≤ ∑_{i≥1} Pr(E_i).

Notice that Lemma 1.2 differs from the third part of Definition 1.2 in that Definition
1.2 is an equality and requires the events to be pairwise mutually disjoint.
Lemma 1.1 can be generalized to the following equality, often referred to as the
inclusion–exclusion principle.

Lemma 1.3: Let E1, . . . , En be any n events. Then

Pr(⋃_{i=1}^{n} E_i) = ∑_{i=1}^{n} Pr(E_i) − ∑_{i<j} Pr(E_i ∩ E_j) + ∑_{i<j<k} Pr(E_i ∩ E_j ∩ E_k)
− · · · + (−1)^{ℓ+1} ∑_{i_1<i_2<···<i_ℓ} Pr(⋂_{r=1}^{ℓ} E_{i_r}) + · · · .

The proof of the inclusion–exclusion principle is left as Exercise 1.7.
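
Since the proof is deferred, a quick numerical sanity check may be helpful. The following sketch (our own illustration) verifies Lemma 1.3 for three concrete events over the two-dice sample space used earlier in this section:

    from itertools import combinations
    from fractions import Fraction

    # Sample space: ordered rolls of two fair dice; each outcome has probability 1/36.
    omega = [(a, b) for a in range(1, 7) for b in range(1, 7)]

    def pr(event):
        return Fraction(len(event), len(omega))

    e1 = {w for w in omega if w[0] == 1}         # first die is 1
    e2 = {w for w in omega if w[1] == 1}         # second die is 1
    e3 = {w for w in omega if w[0] + w[1] == 7}  # dice sum to 7

    events = [e1, e2, e3]
    # Right-hand side of the inclusion-exclusion formula for n = 3.
    rhs = sum(
        (-1) ** (size + 1) * pr(set.intersection(*subset))
        for size in range(1, 4)
        for subset in combinations(events, size)
    )
    assert rhs == pr(e1 | e2 | e3)
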


We showed before that the only case in which the algorithm may fail to give the
correct answer is when the two input polynomials F (x) and G(x) are not equivalent;
the algorithm then gives an incorrect answer if the random number it chooses is a root
of the polynomial F (x) − G(x). Let E represent the event that the algorithm failed to
give the correct answer. The elements of the set corresponding to E are the roots of
the polynomial F (x) − G(x) that are in the set of integers {1, . . . , 100d}. Since the
polynomial has no more than d roots it follows that the event E includes no more than
d simple events, and therefore

Pr(algorithm fails) = Pr(E) ≤ d/100d = 1/100.
It may seem unusual to have an algorithm that can return the wrong answer. It may
help to think of the correctness of an algorithm as a goal that we seek to optimize in
conjunction with other goals. In designing an algorithm, we generally seek to minimize
the number of computational steps and the memory required. Sometimes there is a
trade-off; there may be a faster algorithm that uses more memory or a slower algorithm
that uses less memory. The randomized algorithm we have presented gives a trade-off
between correctness and speed. Allowing algorithms that may give an incorrect answer
(but in a systematic way) expands the trade-off space available in designing algorithms.
Rest assured, however, that not all randomized algorithms give incorrect answers, as
we shall see.
For the algorithm just described, the algorithm gives the correct answer 99% of
the time even when the polynomials are not equivalent. Can we improve this prob-
ability? One way is to choose the random number r from a larger range of integers.
If our sample space is the set of integers {1, . . . , 1000d}, then the probability of a
wrong answer is at most 1/1000. At some point, however, the range of values we
can use is limited by the precision available on the machine on which we run the
algorithm.
Another approach is to repeat the algorithm multiple times, using different random
values to test the identity. The property we use here is that the algorithm has a one-sided
error. The algorithm may be wrong only when it outputs that the two polynomials are
equivalent. If any run yields a number r such that F (r) = G(r), then the polynomials are
not equivalent. Thus, if we repeat the algorithm a number of times and find F (r) = G(r)
in at least one round of the algorithm, we know that F (x) and G(x) are not equivalent.
The algorithm outputs that the two polynomials are equivalent only if there is equality
for all runs.
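
A sketch of this repetition, reusing the hypothetical looks_equivalent from above: because the error is one-sided, a single disagreement settles the question, while a wrong "equivalent" verdict survives only if every independent run errs.

    def repeated_test(roots, coeffs, k):
        # Run the one-sided test k times, choosing r with replacement each time.
        for _ in range(k):
            if not looks_equivalent(roots, coeffs):
                # A witness r with F(r) != G(r) was found; this answer is
                # always correct, so we can stop early.
                return False
        # Equality held in all k runs; if the polynomials actually differ,
        # this happens with probability at most (1/100)^k, as shown below.
        return True
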
In repeating the algorithm we repeatedly choose a random number in the range
{1, . . . , 100d}. Repeatedly choosing random numbers according to a given distribution
is generally referred to as sampling. In this case, we can repeatedly choose random
numbers in the range {1, . . . , 100d} in two ways: we can sample either with replace-
ment or without replacement. Sampling with replacement means that we do not remem-
ber which numbers we have already tested; each time we run the algorithm, we choose
a number uniformly at random from the range {1, . . . , 100d} regardless of previous
choices, so there is some chance we will choose an r that we have chosen on a previous
run. Sampling without replacement means that, once we have chosen a number r, we
do not allow the number to be chosen on subsequent runs; the number chosen at a given
iteration is uniform over all previously unselected numbers.
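
In Python, the two schemes can be contrasted directly (a small illustration of the definitions, not code from the text): random.choices draws with replacement, so repeats are possible, while random.sample draws without replacement, so all values are distinct.

    import random

    d, k = 5, 10
    population = range(1, 100 * d + 1)  # the set {1, ..., 100d}
    with_replacement = random.choices(population, k=k)  # repeats possible
    without_replacement = random.sample(population, k)  # k distinct values
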
Let us first consider the case where sampling is done with replacement. Assume
that we repeat the algorithm k times, and that the input polynomials are not equiva-
lent. What is the probability that in all k iterations our random sampling from the set
{1, . . . , 100d} yields roots of the polynomial F (x) − G(x), resulting in a wrong output
by the algorithm? If k = 1, we know that this probability is at most d/100d = 1/100.
If k = 2, it seems that the probability that the first iteration finds a root is at most 1/100
and the probability that the second iteration finds a root is at most 1/100, so the prob-
ability that both iterations find a root is at most (1/100)^2. Generalizing, for any k, the
probability of choosing roots for k iterations would be at most (1/100)^k.
To formalize this, we introduce the notion of independence.
Definition 1.3: Two events E and F are independent if and only if
Pr(E ∩ F ) = Pr(E ) · Pr(F ).
More generally, events E1 , E2 , . . . , Ek are mutually independent if and only if, for any
subset I ⊆ [1, k],

Pr(⋂_{i∈I} E_i) = ∏_{i∈I} Pr(E_i).

If our algorithm samples with replacement then in each iteration the algorithm chooses
a random number uniformly at random from the set {1, . . . , 100d}, and thus the choice
in one iteration is independent of the choices in previous iterations. For the case where
the polynomials are not equivalent, let Ei be the event that, on the ith run of the algo-
rithm, we choose a root ri such that F (ri ) − G(ri ) = 0. The probability that the algo-
rithm returns the wrong answer is given by
Pr(E1 ∩ E2 ∩ · · · ∩ Ek ).
Since Pr(Ei ) is at most d/100d and since the events E1 , E2 , . . . , Ek are independent,
the probability that the algorithm gives the wrong answer after k iterations is
Pr(E1 ∩ E2 ∩ · · · ∩ Ek) = ∏_{i=1}^{k} Pr(E_i) ≤ (d/100d)^k = (1/100)^k.

The probability of making an error is therefore at most exponentially small in the num-
ber of trials.
Now let us consider the case where sampling is done without replacement. In this
case the probability of choosing a given number is conditioned on the events of the
previous iterations.
Definition 1.4: The conditional probability that event E occurs given that event F
occurs is
Pr(E | F) = Pr(E ∩ F) / Pr(F).
The conditional probability is well-defined only if Pr(F ) > 0.
Intuitively, we are looking for the probability of E ∩ F within the set of events defined
by F. Because F defines our restricted sample space, we normalize the probabilities
by dividing by Pr(F ), so that the sum of the probabilities of all events is 1. When
Pr(F ) > 0, the definition can also be written in the useful form
Pr(E | F ) Pr(F ) = Pr(E ∩ F ).


Notice that, when E and F are independent and Pr(F) ≠ 0, we have

Pr(E | F) = Pr(E ∩ F)/Pr(F) = Pr(E) Pr(F)/Pr(F) = Pr(E).
This is a property that conditional probability should have; intuitively, if two events are
independent, then information about one event should not affect the probability of the
second event.
Again assume that we repeat the algorithm k times and that the input polynomials are
not equivalent. What is the probability that in all the k iterations our random sampling
from the set {1, . . . , 100d} yields roots of the polynomial F (x) − G(x), resulting in a
wrong output by the algorithm?
As in the analysis with replacement, we let Ei be the event that the random num-
ber ri chosen in the ith iteration of the algorithm is a root of F (x) − G(x); again, the
probability that the algorithm returns the wrong answer is given by
Pr(E1 ∩ E2 ∩ · · · ∩ Ek ).
Applying the definition of conditional probability, we obtain
Pr(E1 ∩ E2 ∩ · · · ∩ Ek ) = Pr(Ek | E1 ∩ E2 ∩ · · · ∩ Ek−1 ) · Pr(E1 ∩ E2 ∩ · · · ∩ Ek−1 ),
and repeating this argument gives
Pr(E1 ∩ E2 ∩ · · · ∩ Ek )
= Pr(E1 ) · Pr(E2 | E1 ) · Pr(E3 | E1 ∩ E2 ) · · · Pr(Ek | E1 ∩ E2 ∩ · · · ∩ Ek−1 ).
Can we bound Pr(E j | E1 ∩ E2 ∩ · · · ∩ E j−1 )? Recall that there are at most d values
r for which F (r) − G(r) = 0; if trials 1 through j − 1 < d have found j − 1 of them,
then when sampling without replacement there are only d − ( j − 1) values out of the
100d − ( j − 1) remaining choices for which F (r) − G(r) = 0. Hence
Pr(E_j | E1 ∩ E2 ∩ · · · ∩ E_{j−1}) ≤ (d − (j − 1)) / (100d − (j − 1)),
and the probability that the algorithm gives the wrong answer after k ≤ d iterations is
bounded by
Pr(E1 ∩ E2 ∩ · · · ∩ Ek) ≤ ∏_{j=1}^{k} (d − (j − 1)) / (100d − (j − 1)) ≤ (1/100)^k.

Because (d − ( j − 1))/(100d − ( j − 1)) < d/100d when j > 1, our bounds on the
probability of making an error are actually slightly better without replacement. You
may also notice that, if we take d + 1 samples without replacement and the two poly-
nomials are not equivalent, then we are guaranteed to find an r such that F(r) − G(r) ≠ 0.
Thus, in d + 1 iterations we are guaranteed to output the correct answer. However,
computing the value of the polynomial at d + 1 points takes Θ(d^2) time using the stan-
dard approach, which is no faster than finding the canonical form deterministically.
Since sampling without replacement appears to give better bounds on the probability
of error, why would we ever want to consider sampling with replacement? In some
cases, sampling with replacement is significantly easier to analyze, so it may be worth
considering.
Discovering Diverse Content Through
Random Scribd Documents
would be going on a century and a half later.
4
THE PIRATES OF NEW ORLEANS
On November 24, 1813, most citizens of New Orleans were
chuckling over a new proclamation, bearing the signature of Gov.
W. C. C. Claiborne, which had been posted on bulletin boards
throughout the city. They were not so much amused because the
Governor had accused a pirate of attacking a U.S. Customs officer
(although this was amusing enough to many), but because Claiborne
actually expected someone to take seriously his offer of a $500
reward for the capture of the pirate Jean Laffite.
Invade the pirate hideout in the swamps and capture Jean Laffite
—or even his brother Pierre—for a mere $500? And Claiborne was
naive if he thought that most of Louisiana’s politicians and merchants
had any desire to halt the smuggling of pirated merchandise while a
war was being fought against Great Britain. Any kind of merchandise
was hard to obtain.
Two days after the posting of the proclamation, a wave of
raucous laughter sounded in the coffee houses, taverns and drawing
rooms of New Orleans. The laughter exploded over a proclamation
posted throughout the city which was a parody of the Claiborne
document. It offered a reward of $1,500 for the arrest of Governor
Claiborne and his delivery to the pirate hideout at Grande Terre in
the bayou country south of New Orleans. The proclamation was
signed by Jean Laffite.
Laffite’s arrogance was no laughing matter to government
officials in Louisiana and Washington. Not only were the pirates
openly defying Federal and state authority, but a legitimate merchant
had little chance to compete against those who purchased their
goods at the pirates’ auctions. The auctions were held regularly on
islands in the swamps near New Orleans. Hundreds of thousands of
dollars’ worth of merchandise—captured on the high seas—could be
bought cheaply and with no payment of Customs duties.
The enemies of the Laffites and their cutthroat crew were in the
minority. Everyone knew—including Claiborne—that a majority of the
people were sympathetic to the Laffites. The general view was that
the pirates actually were performing a patriotic service when they
attacked ships of the enemy countries, England and Spain, and then
made their booty available to Louisiana citizens at ridiculously
reduced prices.
Before Claiborne issued his proclamation, the general attitude of
the citizenry was fairly summed up in a letter received by the
Louisiana Gazette and signed “The Agent of the Freebooters.” There
had been a complaint against piracy and smuggling in the
newspaper, and the freebooter (perhaps it was Jean Laffite) wrote a
reply saying:

Gentlemen:
Your paper of Wednesday contained a letter written by some
idiot ... (who) makes a great outcry against a few honest fellows
of us, who are using extraordinary exertions to punish the
common enemy, the British and their allies, the Spaniards....
Does he wish to discourage our profession and put an end to
trade altogether?...
Cannot the booby perceive that without us there would not
be a bale of goods at market; and does he not see, by the open
manner in which our business is done, that the government of
the United States has no objection either to the fitting out of our
prizes and the sale of their cargoes, without troubling ourselves
about the payment of duties; which I assure you we would find
extremely inconvenient when we sell so low for real cash in
these hard times....

The legislature paid little attention to Governor Claiborne’s


appeals for help in suppressing piracy and smuggling because too
many of its members were profiting from the operations of the
smugglers or were close friends of Jean and Pierre Laffite.
Honest merchants, competing at great disadvantage against
those who bought their goods from the pirates’ stores, were the first
to raise a clamor for Claiborne to do something to halt the
smuggling. As a result of these demands, Customs Officer Walker
Gilbert invaded the pirate country with a company of armed men. As
they moved through the marshes south of New Orleans, they
encountered Laffite convoying a shipload of contraband goods
toward New Orleans. Gilbert and his men attacked the pirates and
there was a brief, savage skirmish. Laffite and his group fled from
the ship, leaving Gilbert in possession. But before Gilbert could
reorganize his forces the pirates counterattacked. One of Gilbert’s
men was badly wounded. The Federal officers were driven off. The
pirates took over the ship and resumed their journey to New Orleans
to sell their booty. It was this episode which caused Claiborne to
issue his proclamation that stirred so much amusement.
Piracy and smuggling, after the turn of the century, had become
profitable for two prime reasons. The first of these was the
worsening relations between the United States and Great Britain,
which brought about the embargo on shipping in 1807 and later the
War of 1812. Any kind of merchandise was hard to come by and
could be sold for a handsome profit. The second reason was the
U.S. government’s efforts to outlaw the traffic in slaves.
With the passage of the embargo on slave trade in 1808, the
price of Negro slaves in the United States skyrocketed. Slaves could
be bought in Cuba for about $300 each and sold for three or four
times that amount on the illegal markets in the United States. But
then the pirates found it more profitable simply to bypass Cuba and
waylay the slave ships at sea. They found a ready market for the
slaves in the lower Mississippi Valley, where huge cotton and sugar
plantations were being developed and where plantation owners were
willing to bid against each other for this cheap labor.
The pirates found Barataria Bay near New Orleans an ideal
place from which to operate. In this maze of canals, marshes,
bayous and meandering waterways at the mouth of the Mississippi
were many hiding places. In addition, the location furnished easy
access to the markets of New Orleans.
The Baratarians had a loose organization headed by an Italian
named Gamby. The outlaws were forever quarrelling among
themselves and Gamby was not strong enough to control them, an
internal weakness which was more serious for the pirates than the
opposition of Governor Claiborne.
At this time (1810) Jean and Pierre Laffite were living in New
Orleans with their younger brother, Antoine, whose name was never
to figure in any of their piratical exploits of later years. The elder
brothers, all reports agree, were striking-looking men with great
personal charm and wit.
Pierre was of medium height, well-built, and handsome, although
an illness had affected the muscles in the left side of his face and
one of his eyes was slightly crossed. Jean stood 6 feet 2 inches tall and had blue eyes and black hair. Like his brother, he was shrewd and
fearless.
The brothers operated a blacksmith shop in the very heart of the
city on St. Philips Street, not far from Bourbon Street. Even on the
sultriest, hottest days of summer they kept the bellows blowing on
the red-hot coals of the smithy fire. The anvil rang with the blows of
their hammers beating lengths of iron into light, strong chains—
chains to which slaves would be manacled before being brought to
the auctions in the city.
Almost always there were groups of Baratarians lounging in the
smithy, rough men with cutlasses at their waists and pistols stuck
into their belts. They advised the Laffites on how they wished the
chains to be made. And they also talked of their raids on English and
Spanish ships, the booty they had brought to their hiding places in
the swamps, and the wild parties they had after seizing a ship’s
stores of rum, wines and liquors.
For years, the Laffites acted as the “fences” for the pirates,
handling slaves as well as other merchandise. It can only be
assumed they listened to the tales of adventure, excitement and
stolen riches—and became envious of the wealth acquired with such
ease by men who were not as intelligent as they. At any rate, Jean
Laffite left the smithy regularly to make trips to the pirates’ hideout at
Barataria, where he studied their operation.
After a time, Jean won the confidence of the ruffians to such a
degree that he was invited to become their leader—despite the
grumbling of Gamby. But this gentleman proved to be no problem
whatever. He abdicated without a fight after he saw Laffite coldly
shoot down one man who questioned his authority.
Laffite had exceptional executive abilities, along with a bold
courage. Under his command the pirates of Barataria reached new
heights in prosperity and arrogance. Men flocked from New Orleans
to his pirate hideout to offer their services. The stores of stolen
goods sometimes were worth hundreds of thousands of dollars.
Regular auctions were held on islands near New Orleans.
While he brought piracy to a new height in the Gulf, Laffite
perhaps organized it too well. He made it so efficient that some
historians believe he brought about its downfall sooner than it otherwise would have come. This may well be true, because the
pirates achieved such great power under Laffite that the government
of the United States could not forever condone their brazen
disregard of revenue laws, the slave trade embargo, and the
authority of government.
At one period, Laffite had from 800 to 1,000 men under his
command—an outlaw army which was equipped with the best
artillery and huge stores of powder and ammunition. His artillerymen
had no peers in any army in the world. His storehouses were filled
with stolen merchandise. On one day 400 Negroes were sold during
an auction—which meant a gross business of approximately
$500,000 in slaves alone.
Laffite had his choice of the best wines of the Old World. He
dined on silverplate. In New Orleans, he and Pierre were seen on
the streets, in the coffee houses, and in the taverns in the company
of many of the city’s leading businessmen, merchants and lawyers.
Jean Laffite did not like the name of pirate. He called himself a
privateer. But in the New Orleans of that day his occupation did not
bar him from the society of the leading citizens of the city.
In the summer of 1812, Capt. Andrew Hunter Holmes was sworn
in as a Customs officer and led an expedition against Laffite and his
men. Captain Holmes took a group of thirty or forty men in small
boats and proceeded toward Barataria. But Jean Laffite was
forewarned of the expedition by his brother Pierre. According to the
records of the times Jean Laffite laughed uproariously when he
learned that Holmes was heading for Barataria with such a small
company. Laffite took his boats loaded with merchandise through
devious waterways and avoided Holmes and his men rather than get
involved in a skirmish which could end only in disaster for Holmes.
But Holmes was not to be outwitted so easily. In the fall, Holmes
returned with a larger force, surprised Jean and Pierre Laffite with
contraband merchandise and took them to New Orleans as
prisoners. The brothers were released on bail and neither showed up
for trial. Six writs of arrest were issued for Jean and Pierre but all
were returned with the notation “Not found in New Orleans.” It
seemed reasonable to assume that no one seriously wanted to find
them.
The acceptance of piracy and smuggling as elements of
legitimate trade in New Orleans was not so astonishing as it might
seem because Louisiana commerce for many years prior to this time
had been built on such a foundation. Smuggling to avoid tariffs and
then selling at cheap prices, as one historian said, “had become a
part of the habits of life there.” The people were satisfied for the most
part because the smuggled goods were cheaper than they would
have been had duties been paid. Merchants were satisfied because
they were able to obtain scarce merchandise and make a good
profit. Men such as Laffite were regarded as performing a necessary
function for the community at no little personal risk—and only a
minority attached any moral stigma to the trade.
The confidential and intimate relationship between the pirates
and their customers reached its peak on New Year’s Day, 1814.
Handbills were boldly scattered in public places and prominently
displayed throughout New Orleans announcing that the brothers
Laffite, on January 20, would offer at auction at “The Temple” a
quantity of slaves and merchandise.
The Temple was a favorite market place for the Laffites. It was
an ancient Indian mound of white shells where, legend had it,
Indians of the area had gathered to give human sacrifices to
appease their gods. The pirates had built a platform at the edge of
the water, onto which they could unload their boats. There they
spread their merchandise for all who came from New Orleans to see.
Since it was easily accessible, there was never any lack of buyers
when the auctioneer went to work.
Claiborne was infuriated by the distribution of handbills
announcing the auction. The pirates were offering for sale 415
slaves in addition to a supply of “fine foreign merchandise.”
Claiborne called in the U.S. Collector of Customs to discuss what
could be done about this outrageous disregard of the laws of the
United States. Both must have known that there was not much that
could be done. Nevertheless the Collector ordered a small force to
proceed to The Temple to “defeat the purpose of the law infractors.”
Jean Laffite and his companions attacked the Customs men,
killing one man and fatally wounding two others. Nine of the officers
were held as prisoners by the pirates while they proceeded with their
auction as planned. It was reported that buyers came from many
parts of Louisiana. They bought all the slaves that were put on the
block. And the Laffites considered the auction a great success.
The Collector wrote to Governor Claiborne: “It is high time that
the contrabandists, dispersed throughout the State, should be taught
to respect our laws, and I hold it my duty to call on your excellency
for a force adequate to the exigency of the case.”
Claiborne could have dispatched a state militia against the pirate
gang, but he hesitated to do this because he already had had one
embarrassing experience with the militia. He had ordered a company
of troops to move against the Baratarians, but Jean Laffite had met
this problem by the simple expedient of bribing the entire
expeditionary force. It has been recorded that “the brave leaders of
the Baratarians had spared their lives, loaded them with costly
presents and had allowed them to return safely to New Orleans.”
Again Claiborne pleaded in a letter to the legislature for action,
saying: “The evil requires a strong corrective. Force must be
resorted to. These lawless men can alone be operated upon by their
fears and the certainty of punishment. I have not been able to
ascertain their numbers ... but they are represented to be from 300
to 500, perhaps more.... So numerous and bold are the followers of
Laffite, and, I grieve to say it, such is the countenance afforded him
by some of our citizens, to me unknown, that all efforts to apprehend
this high offender have hitherto been baffled.” As is the history of
many complaints to legislatures, the matter was referred to a
committee, where it died quietly.
Failing again to get any action from the legislature, Claiborne
resorted to another course. He arranged to have a grand jury,
friendly to his views, chosen from the city’s merchants and bankers.
Witnesses were called who, in the strictest secrecy, swore to
knowledge of piratical acts by the Laffites and their men. Before the
news of this grand jury’s meeting spread through the city, Pierre
Laffite was arrested. He was taken to jail and bail was denied. Jean
Laffite, when he heard of Pierre’s arrest, hurried secretly to the city to
talk with friends about releasing his brother. But this time nothing
could be done.
It was while Pierre was in jail, on September 3, 1814, that His
Britannic Majesty’s brig Sophie sailed into the narrow strait off the
island of Grande Terre and dropped anchor. A small boat was
lowered and two officers were rowed ashore by sailors. The British
were met on the beach by a tall, dark-haired man whom they asked
to lead them to Monsieur Laffite. Their guide led them across the
beach onto the porch of a breeze-swept house. And then the guide
turned and said, “Messieurs, I myself am Laffite.” The visitors were
Captain Lockyer from the Sophie and Captain McWilliams of the
Royal Colonial Marines. They had brought a most unusual offer.
Laffite refused to discuss business until lunch had been served.
They ate their lunch from silverplate. It was an excellent meal, the
officers recalled, with fine wines and good conversation. When the
meal was finished, the men lit cigars and then Laffite was ready to
hear what they had to say. Captain Lockyer disclosed that he
brought an offer from Admiral Sir William H. Percy, commanding the
British squadron at Pensacola. In brief, the British were offering
Laffite $30,000 if he would bring his ships, guns and men to the side
of Britain in the war against the United States. Laffite asked them to
leave the Admiral’s letters with him and to give him fifteen days in
which to study the proposition. Then, he said, “I will be entirely at
your disposal.”
But Jean Laffite, pirate and cutthroat though he was, had no
intention of betraying the United States. He hastily wrote a letter to
his friend Jean Blanque, in New Orleans. He enclosed the letters
handed to him by the British, together with a personal message for
Governor Claiborne. Laffite said in his letter to Blanque: “Our
enemies exerted on my integrity a motive which few men would have
resisted. They have represented to me a brother in irons, a brother
who to me is very dear! of whom I can become the deliverer ... from
your enlightenment will you aid me in a circumstance so grave.”
And then he enclosed this letter to be delivered to the Governor:

Monsieur:
... I offer to return to this State many citizens who perhaps
have lost to your eyes that sacred title. I offer ... their efforts for
the defense of the country.
This point of Louisiana that occupies great importance in the
present situation, I offer myself to defend it ... I am the lost sheep
who desires to return to the flock ... for you to see through my
faults such as they are....
In case, Monsieur Le Gouverneur, your reply should not be
favorable in my ardent wishes I declare to you that I leave
immediately so not to be held to have cooperated with an
invasion.... This cannot fail to take place, and puts me entirely on
the judgment of my conscience.
I have the honor to be, Monsieur Le Gouverneur,
Laffite.

Jean Laffite had put loyalty to the United States before profit in
this time of peril. But even as Captain Lockyer conferred with Laffite,
Claiborne was going forward with plans for a land-sea operation
against the pirate stronghold.
After receiving Laffite’s letter, Claiborne called his military
advisers into conference. These were Major General Jacques Villere,
Commodore Patterson of the U.S. Navy, and Colonel Ross of the
regular U.S. Army. The questions they discussed were whether the
documents sent by Laffite were genuine and whether the governor
should enter into correspondence with Laffite. Villere thought the
documents genuine and that the Governor should reply immediately
to the Laffite letter. However, Ross and Patterson voted against
Villere. The majority favored an attack on the pirate stronghold.
Peculiarly enough, the morning after this meeting at the
Governor’s mansion, newspapers carried notices that Pierre Laffite
had mysteriously escaped from jail. A notice was posted offering
$1,000 reward for his capture.
Eight days after Laffite wrote to Claiborne, the Ross-Patterson
expedition set out for Barataria. Three barges were loaded with men
and ammunition. The flotilla left the levee at New Orleans before dawn and
drifted silently downstream with the current. Near the mouth of the
river, the barges joined forces with Colonel Ross’ fleet of six
gunboats and the schooner Carolina. The pirate hideout on the
islands of Grande Terre and Grande Isle was sighted on the early
morning of September 16.
There were indications at first that the Baratarians were going to
resist. They began placing cannon into position and arming
themselves. But then apparently they saw the American flag on the
approaching ships. They broke ranks and ran. Without firing a shot,
the expeditionary force captured the pirate fleet, guns, and stores of
merchandise valued at more than $500,000.
When news of the attack on Barataria was received in New
Orleans, there was much criticism of the expedition. And there was
even more indignation when it was learned that the expedition was
launched after Laffite had offered his services and those of his
companions to the government in the defense of New Orleans
against the expected attack by the British.
The British bribe offer to Laffite came as General Andrew
Jackson arrived in New Orleans to arrange for the defense of the
city. Claiborne sent copies of Laffite’s documents to Jackson on the
chance that they were genuine and contained military information
which would be important. Jackson made it quite clear he wanted no
traffic with “this hellish banditti” and he rebuked Claiborne for having
permitted Laffite and his men in the past to visit the city.
At last Jean Laffite made a secret trip to see Jackson himself.
There is no record of what went on between the two men and what
was said in that conference. It became known that Laffite offered to
put in Jackson’s hands a supply of 750,000 pistol flints and some of
the most skilled artillerymen in the world, including Laffite’s
lieutenants, Dominique You and Beluche.
Jackson relented and accepted Laffite’s offer of help. The
General made You and Beluche captains and they were given
command of batteries on the right side of the American line.
On January 8, 1815, the decisive battle of New Orleans was
fought. As dawn was breaking, Jackson visited the troops along the
front lines and stopped at a battery where the Baratarians were
making coffee in an old iron pot.
“That smells good,” Jackson said. “It’s better coffee than we get.
Where did it come from? Did you smuggle it in?”
Dominique You grinned. “That may be,” he said, and he ordered
a cup filled for the General. As Jackson sat on his horse sipping the
strong, black coffee, he remarked to an aide, “I wish I had fifty such
guns on this line, with five hundred devils such as those fellows
behind them.”
The British advanced on the American positions to be mowed
down by withering fire. At the height of the battle, Jackson again
visited Dominique You’s battery to see how things were going.
“Ah, we do not make much damage,” You said.
“Why is that?” Jackson demanded.
“The powder!” You replied. “It is not good. The cannon balls, they
fall short.”
Jackson said, “I’ll remedy that!” He ordered an aide to see to it
that the Baratarians received the best ammunition possible. The
General was heard to say later, “Were I ordered to storm the very
gates of hell with Dominique You as my lieutenant, I would have no
misgivings as to the outcome.”
Jean and Pierre Laffite acquitted themselves with honor in the
battle for New Orleans. They and their men were given Presidential
pardons, clearing the slate of past crimes, and for some time after
the city was saved, they were great heroes. They were wined, dined,
and cheered wherever they went.
Perhaps the brothers became bored with respectability. At any
rate they drifted back into piracy. They pulled out of their old haunts
and moved to Galveston to set up operations, and for several years
they carried on business in the slave trade. In 1820, Jean Laffite
boarded his favorite boat, The Pride, and sailed away into legend.
There were stories that he died in Yucatan. Some claimed that he
carried on his piracy in the Mediterranean. Still others said that he
settled in France and lived to be an old man. All recorded history of
Jean Laffite ended when he sailed from Galveston. Pierre Laffite was
said to have lived in Louisiana to an old age and to have died a poor
man.
The quality and the quantity of piracy subsided along with the
fortunes of the brothers Laffite. But not smuggling. Smuggling was to
continue to plague the Customs Service, and some of the smuggling
would make the Laffites look like amateurs.
5
THE DARK YEARS
Slavery was the issue which exploded into the Civil War in 1861, but
twenty-eight years before the first shot was fired on Fort Sumter, the
nation was on the edge of open war over a dispute involving the
Federal government’s right to force the collection of customs duties.
The spirit of revolt flamed high in South Carolina in 1832–1833.
It was fed, too, by sympathy in Virginia and Georgia and other
agricultural states which bitterly opposed the system of protective
tariffs as being oppressive to the farm states.
Did the central government have the Constitutional right to force
the collection of duties in a state which opposed such collections?
No! said the “Nullifiers,” who favored striking down the Federal tariff
laws. They insisted that any state had the right to withdraw from the
Union if it so desired.
The Nullifiers gained control of the government of South Carolina
and a call was issued for a convention to meet and abolish by formal
state action the collection of duties. There also were loud demands
from some state leaders for mobilization of South Carolina troops to
oppose any Federal intervention. Customs officers, more
sympathetic to the state of South Carolina than to the Union, refused
to collect duties. This convention call was the aftermath of a previous
convention at which a grim resolution was adopted saying in part:
“The state looks to her sons to defend her in whatever form she may
proclaim to Resist.”
The tariff collection issue became so divisive that reports
reached President Jackson in August and September, 1832, that the
loyalty of army officers in command of Federal troops at Charleston
was suspect.
It was reported to Jackson that in the event of Federal “aggression”
against South Carolina to enforce tariff collection and oppose
secession, these officers were ready to surrender their troops to the
state rather than fight to protect the Charleston forts. These same
reports said overtures had been made “perhaps not without success”
to switch the allegiance of the naval officer in command at
Charleston, in order to prevent a Federal blockade of the port.
Jackson advised his Secretary of State, Edward Livingston, that
“the Union must be preserved, without blood if this be possible, but it
must be preserved at all hazards and at any price.” He changed the
garrison at Charleston and sent Maj. Gen. Winfield Scott to take over
the command. He warned Secretary of War Lewis Cass that a
surprise attack would be made on the Charleston forts by South
Carolina militia and directed that such an attack must be “repelled
with prompt and exemplary punishment.”
While taking these precautions, Jackson argued that if the
doctrine of nullification of customs duties by the states were ever
established, then every Federal law for raising revenue could be
annulled by the states. He denied the right of secession, declaring
that “to say that any state may at pleasure secede from the Union is
to say that the United States is not a nation.”
The controversial tariff laws had been a national issue long
before Jackson entered the White House in 1829. The major issue in
the Presidential campaign of 1823–1824 revolved around tariffs and
the use of Federal funds for such internal improvements as roads,
harbors, and like projects. The leading candidates for the Presidency
that year were John C. Calhoun of South Carolina, Henry Clay of
Kentucky, John Q. Adams of Massachusetts, William H. Crawford of
Georgia, and finally, Andrew Jackson—with Jackson and Adams
emerging as the showdown antagonists.
Adams won the election when Henry Clay threw his support to
the New Englander. Under the leadership of Clay and Daniel
Webster a high-duty system of tariffs was adopted, which was
termed the “Tariff of Abominations” by its opponents.
When Jackson entered the White House in 1829, the tariff issue
was still the most important and also the most divisive issue of the
day. In January, 1830, Senator Robert Y. Hayne of South Carolina
launched a strong attack in the Senate against the excessively high
tariffs. The young Senator sought a coalition between the West and
the South, the agricultural areas, to oppose the duties favored by the
industrial states.
Hayne’s argument rested on the states’ rights questions which
were to plague the nation for many years to come. Hayne contended
that “no evil was more to be deprecated than the consolidation of this
government.” He argued for the right of any state to set aside
“oppressive” Federal legislation, including tariffs.
Daniel Webster picked up the argument against Hayne. He
contended that “the Constitution is not the creature of the state
government. The very chief end, the main design, for which the
whole constitution was framed and adopted was to establish a
government that should not ... depend on the state opinion and state
discretion.” He said it was folly to support a doctrine of “liberty first
and union afterwards,” and he spoke the famous line: “Liberty and
union, now and forever, one and inseparable.”
By 1832, the South Carolina Nullifiers were openly led by Vice
President Calhoun. The extremists were in control in South Carolina
to push events toward the crisis which forced Jackson to rush back
to Washington from Nashville. The state’s legislature proclaimed that
any effort by Federal authorities to collect the duties after February
1, 1833, would cause South Carolina to secede from the Union.
When news of this proclamation reached Jackson, he ordered
seven revenue cutters and a warship dispatched to Charleston. Maj.
Gen. Winfield Scott set his men to work preparing harbor defenses
against attack from the land. The situation was at the stage where
only recklessness was needed to set off a conflict. At this time
Jackson wrote a friend that “no state or states has the right to
secede ... nullification therefore means insurrection and war; and
other states have a right to put it down....”
Jackson issued a proclamation warning the citizens of South
Carolina not to follow the Nullifiers, whose “object is disunion.” He
warned that “disunion by armed force is treason” and that those who
followed this path must suffer the “dreadful consequences.” The
proclamation spread excitement throughout the country. Many men
volunteered for military duty in case of a conflict. Several state
legislatures met to denounce nullification. But in South Carolina
Robert Y. Hayne—who had resigned from his Senate seat to
become governor of the state—issued his own proclamation in which
he vowed to maintain South Carolina’s sovereignty or else to perish
“beneath its ruins.” Hayne called for the organization of “Mounted
Minute Men,” which, he said, would permit him to place “2,500 of the
elite of the whole state upon a given point in three or four days....”
The only concession held forth by Jackson in this cold war was
his approval of a bill for introduction in the House which would call
for a reduction of tariff rates. His willingness to go along with this
measure did not eliminate the threat of a shooting conflict.
While holding an olive branch of compromise in one hand, Jackson
held a sword in the other. He sent a request to Congress asking
authority to use Federal troops if necessary to collect the customs.
Even as the cheers and curses sounded over this move, Jackson
sent a letter to Joel R. Poinsett, a Unionist leader in South
Carolina, outlining his plans to use strong measures to enforce
Federal authority. No doubt he intended his letter to reach the hands
of the Nullificationists. He said should Congress fail to act on his
request for authority to use military force, and should South Carolina
oppose with armed force the collection of the customs duties, then “I
stand prepared to issue my proclamation warning them to disperse.
Should they fail to comply I will ... in ten or fifteen days at fartherest
have in Charleston ten to fifteen thousand well organized troops well
equipped for the field, and twenty or thirty thousand more in their
interior. I have a tender of volunteers from every state in the Union. I
can if need be, which God forbid, march 200,000 men in forty days
to quell any and every insurrection that might arise....”
Not only would he take these measures against South Carolina,
Jackson added, but if the governor of Virginia should make any
move to prevent Federal troops from moving through the state
against South Carolina then “I would arrest him....” The President
also was prepared to call on Pennsylvania, New York, Virginia, North
Carolina, Ohio, Tennessee, Alabama, Georgia and South Carolina to
furnish 35,000 troops to carry out his orders.
Jackson’s request of Congress for authority to use troops in
forcing the collection of customs was immediately called the “Force
Bill.” To the extremists it was known as the “Bloody Bill.” Vice
President John Calhoun said darkly that if the bill should pass then
“it will be resisted at every hazard, even that of death.”
It was at this point that the Great Compromiser, Henry Clay,
moved to seek the solution which would avoid bloodshed and
perhaps civil war. He introduced in the House his own bill to lower
tariffs by 20 per cent over a period of ten years. The Clay bill was
pushed through Congress along with the Jackson Force Bill and both
were sent to the President for his signature. Jackson won his
demand for authority to send troops to South Carolina to put down
any move toward secession or nullification of the tariff laws, and he
signed the compromise tariff bill even though it was, in a measure,
appeasement of the Nullificationists. Clay’s tariff bill was a face-
saving measure for South Carolina. The head-on conflict between
Federal and state forces was averted—at least for the time being.
In Jackson’s administration there was one man whose name
appears mostly in the footnotes of that turbulent period, but it is a
name that deserves special mention in this chronicle. The man was
Samuel Swartwout, Collector of Customs in New York City during
Jackson’s two terms in the White House. He rates special mention
and a shadowy niche in American history because he was the first
and only man to steal a million dollars from the Treasury of the
United States. In fact, he stole $1,250,000.
Swartwout was a young man when he plunged into New York
politics. He was a dark-haired, personable man who made himself
useful by running errands for the political bosses until he reached a
position of backroom fixer and schemer with no small amount of
influence. He was the bluff, hearty type who made friends easily. And
while he never was a central figure in the making of history, he was
one of those men whose names continually cropped up in the affairs
of the men who did make history in his time.
Swartwout was a protégé and confidant of Aaron Burr during
the period of Burr’s shady adventure in the West when he was
accused of treason in an alleged plot to establish an empire in the
southwestern United States. And when Jackson’s star began to rise
as a Presidential candidate, Swartwout attached himself to the cause
of the Tennessean.
Many of Jackson’s friends and followers resented Swartwout’s
close association with Jackson because they regarded him as a
doubtful character smeared by the tar of the Burr affair. But when
Jackson entered the White House in 1829, Swartwout was among
the honored guests at the celebrations.
Jackson’s friends were concerned when it became known that
the New Yorker had easy access to the office of the President and
was seen coming and going as though he were one of Jackson’s
intimate advisers—which he wasn’t. The concern became dismay
when rumors spread that Swartwout had come to town seeking from
Jackson the nomination as Collector of Customs for New York City, a
post of no little prestige and political influence in those days.
Jackson’s Secretary of State, Martin Van Buren, was so upset by the
reports that he refused to admit Swartwout to his office or to enter
into correspondence with him.
Jackson must have felt he owed a political debt to Swartwout
because on April 25, 1829, during a recess of Congress, he handed
the New Yorker the political plum he had been seeking. Swartwout
continued in the office until March 29, 1838, with never any public
suspicion that he was involved in thefts of money collected by the
Customs House in New York. Only when the records were checked
by his successor was the discovery made that his accounts were
short by $1,250,000.
The scandal which followed broke like a storm over the young
Customs Service. Demands were made in Congress for safeguards
to prevent any such future looting of the Treasury. Enemies of
Jackson attacked the “spoils system” of appointments and centered
much of their assault on Customs.
As for Swartwout, he had foreseen the storm that was to come.
He had bade his friends farewell and boarded a ship for Europe
several weeks before the shortages in his accounts were discovered.
He was in France, safely out of reach of the law, when the scandal
broke—and he didn’t bother to return.
By 1849, the Customs Service spanned the continent. It reached
the coast of California in the person of James Collier, who was
appointed as the first Collector of Customs for San Francisco just as
the state was clearing the way for entry into the Union. Collier
reached San Francisco on November 13, 1849, after a perilous trip
across the country. He arrived at the beginning of the gold rush to
find the city and the customs situation in a state of disorganization
and confusion.
Collier was overwhelmed by the amount of business being
carried on in San Francisco, by the number of vessels arriving and
leaving the harbor, by the smuggling which was going on, and by the
high prices he found in the city. He advised Secretary of the Treasury
W. M. Meredith in a long, rambling letter: “I am perfectly astounded
at the amount of business in this office.... The amount of tonnage ...
on the 10th instant in port, was 120,317 tons; of which 87,494 were
American, and 32,823 were foreign. Number of vessels in the harbor
on that day, 10th instant, 312, and the whole number of arrivals
since the first of April, 697; of which 401 were American, and 296
foreign. This state of things, so unexpected, has greatly surprised
me....”
He found that Customs clerks were being paid from $1,800 to
$3,000 per annum but that the salaries were not particularly attractive
in a city gripped by the get-rich-quick fever. Flour was selling for $40
per barrel and pork for $60. Board was $5 a day and a room with a
single bed was $150 a month. Wood was $40 a cord and prices for