
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 9, SEPTEMBER 2011

Book Review
The Information: A History, A Theory, A Flood—James Gleick (New York: Pantheon Books, 2011, 526 pp.). Reviewed by Sergio Verdú
“Sexily theoretical” trumpets The New York Times¹ review of The Information—a book whose prologue regales us with some music to our ears:
    “It may have far-reaching significance in electronics and electrical communications,” Bell Labs declared in a press release, and for once the reality surpassed the hype. The transistor sparked the revolution in electronics, setting the technology on its path of miniaturization and ubiquity, and soon won the Nobel Prize for its three chief inventors. For the laboratory it was the jewel in the crown. But it was only the second most significant development of that year. The transistor was only hardware. An invention even more profound and more fundamental came in a monograph spread across seventy-nine pages of The Bell System Technical Journal in July and October. No one bothered with a press release.
Author of bestsellers such as Chaos and Genius: The Life and Science of Richard Feynman, James Gleick is, according to the book jacket, “our leading chronicler of science and modern technology.” The Information is, by far, Gleick’s most ambitious project to date, and the one where his gift for the craft of writing shines most vividly.
The reader is in for a sprawling historical ride with many heroes and a towering super-hero. If the bit is the main plot, we are treated to fascinating excursions far afield and recurring sub-plots, which stray, wisely, from a chronological account. This renders chapter boundaries fairly arbitrary, but makes The Information a very lively read, powered by Gleick’s wide-eyed enthusiasm for the alluring notion of information and its many facets.
Thirty-five hundred years ago, the first alphabets containing about two dozen speech-coding letters were introduced by the Semitic tribes of the Eastern Mediterranean. Literature, law, and religion, up to then residing in volatile mortal memories, emerged as the killer applications of the new digital recording medium. Put a checkmark on communication across time; but how about communication across space beyond the fundamental limits imposed by the speed of bipeds or quadrupeds?
Sub-Saharan Africans developed a curious drum language that conveys the tones of the vowels and then appends a sequence of sounds tailored to enable the recreation of the information lost in the missing consonants. Think of a systematic joint source/channel encoder for a deterministic channel. A few months after those immortal seventy-nine pages appeared in BSTJ, an English missionary published The Talking Drums of Africa, which is the subject of the first chapter.
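The analogy can be made concrete in a few lines of Python; the two-word vocabulary and the stock phrases below are invented for illustration, not taken from the book. The channel deterministically erases consonants, as the drums drop everything but the vowel tones, and the systematic encoder appends a stock phrase whose surviving vowels single out the intended word:

```python
# Toy model of the drum-language scheme: a deterministic channel that keeps
# only vowels (the drums convey only the vowel tones), and a systematic
# encoder that appends a stock phrase to disambiguate. The two-word
# vocabulary and its phrases are invented for illustration.
VOCAB = {
    "man": "the one with two legs",
    "bat": "the one that flies at night",
}

def channel(text):
    """Deterministic channel: consonants are erased, vowels and spaces survive."""
    return "".join(c for c in text if c in "aeiou ")

def encode(word):
    """Systematic encoder: the word itself, followed by its stock phrase."""
    return word + " " + VOCAB[word]

def decode(received):
    """Invert the channel: find the unique codeword consistent with `received`."""
    matches = [w for w in VOCAB if channel(encode(w)) == received]
    assert len(matches) == 1
    return matches[0]

print(channel("man") == channel("bat"))  # True: the bare words collide
print(decode(channel(encode("man"))))    # 'man': the appended phrase resolves it
```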
The engineers enter the story in the chapter entitled A Nervous System for the Earth. A France-wide-web was developed during the time of Fourier and Laplace based on a mechanical semaphore whose inventor, Claude Chappe, called le télégraphe. Soon, hill-top telegraph towers were sprouting in places like Cairo, Calcutta, and San Francisco. But they suffered from two nasty problems: error accumulation and weather outages. Capitalizing on the recent invention of the relay by Princeton professor Joseph Henry, Cooke and Wheatstone in England and Morse and Vail in New Jersey developed the first commercially successful electrical telegraphs in the 1830s. Speed (letters per minute) was the name of the game and Morse and Vail’s motivation for their invention of a non-prefix variable-length source code. Earlier, a variable-length source code that did not take into account letter frequencies had been invented by none other than Gauss for his (and Weber’s) short-distance relayless telegraph.
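Non-prefix means exactly what it threatens: the Morse codeword for E (a single dot) is a prefix of those for I and S, so a bare run of dots admits several parses, and the code is decodable only thanks to the pauses between letters. A minimal sketch, using six genuine Morse codewords:

```python
# Morse code is variable-length but not prefix-free: '.' (E) is a prefix of
# '..' (I) and '...' (S), so dots and dashes alone parse ambiguously; the
# pauses between letters do the real work.
MORSE = {"E": ".", "I": "..", "S": "...", "T": "-", "M": "--", "O": "---"}

def parses(seq, sofar=""):
    """Yield every letter string whose concatenated codewords equal `seq`."""
    if not seq:
        yield sofar
    for letter, code in MORSE.items():
        if seq.startswith(code):
            yield from parses(seq[len(code):], sofar + letter)

print(sorted(parses("...")))  # ['EEE', 'EI', 'IE', 'S'] -- four valid readings
```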
One of the heroes of The Information is the English mathematician Charles Babbage, to whom Gleick devotes a whole chapter chronicling his valiant, ultimately unsuccessful, efforts to design a computing machine in the 19th century. By then, the logarithm, introduced two and a half centuries earlier by Napier, was already well-established as the indispensable tool for convenient multiplication. Manually computed logarithm tables were rife with errors, and Babbage set out to design an automated computing machine. He even foresaw programmable machines not driven by manual cranks but by steam. Babbage’s young collaborator and programmer was not eligible for university admission on account of her gender. She was Lord Byron’s daughter, Ada Lovelace, who had the brilliant insight that the automated computer should go beyond churning out numbers: It should be able to perform any sequence of logical operations. Before her untimely death, Ada came up with the software, but the machine tolerances of the day failed to deliver a working version of Babbage’s hardware, funding dried up, and that was the end of that.
Although not in the way Babbage had envisioned, steam and logarithms would soon meet. Thermodynamics, “the theoretical study of the steam engine,” is the subject of Entropy and its Demons. It is one of the least successful chapters in the book, perhaps not surprisingly in light of the pronouncement by the famous physicist Arnold Sommerfeld:

    Thermodynamics is a funny subject. The first time you go through it, you don’t understand it at all. The second time you go through it, you think you understand it, except for one or two small points. The third time you go through it, you know you don’t understand it, but by that time you are so used to it, it doesn’t bother you anymore.

Entropy, Gleick writes, not quite accurately, “turned out to be a quantity as measurable as temperature, volume, or pressure.” Sticking to the standard physics narrative, Gleick perpetuates the myth that Shannon’s crowning achievement was the rediscovery of a well-known quantity:

    Shannon reinvented the mathematics of entropy. . . To the physicist, entropy is a measure of uncertainty about the state of a physical system: one state among all the possible states it can be in. These microstates may not be equally likely, so the physicist writes

        $$S = -\sum_i p_i \log p_i \qquad (1)$$

    To the information theorist, entropy is a measure of uncertainty about a message: one message among all the possible messages that a communications source can produce. The possible messages may not be equally likely, so Shannon wrote

        $$H = -\sum_i p_i \log p_i \qquad (2)$$
Actually, both Boltzmann and Gibbs were in the business of writing integrals, not sums. However, in 1877 Boltzmann did come very close to writing (1) when, to add intuition, he imagined molecules that could take only a finite number of positions/velocities. The advent of quantum mechanics lent a firmer justification for such an approximation and, by 1948, Shannon was well aware of (1) as the expression for the statistical mechanical entropy in the quantum setting.
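Readers who want to see (2) in action can verify the familiar behavior in a few lines of Python (the distributions are arbitrary examples): a fair coin is worth one bit, a biased coin less, and a certainty nothing.

```python
from math import log2

def entropy(p):
    """Shannon entropy H = -sum(p_i * log2(p_i)) in bits; 0 log 0 taken as 0."""
    return -sum(x * log2(x) for x in p if x > 0)

print(entropy([0.5, 0.5]))  # 1.0    : a fair coin is worth one full bit
print(entropy([0.9, 0.1]))  # ~0.469 : a biased coin is more predictable
print(entropy([1.0]))       # 0.0    : a certain outcome carries no information
```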




The description of the laws of physics has become increasingly probabilistic, and consequently more dependent on the role of information, or as Wheeler put it, with a hefty dose of poetic license: it from bit. Conversely, the imperialistic instincts latent in the physics community coupled with Shannon’s “original sin” of borrowing a quantity from statistical mechanics made it inevitable that some would decree that information is physical. How our community has remained aloof from those slogans and debates, and, in fact, has developed a culture of steering clear of ventures beyond our original and natural habitat, would have made for a good side story.
Thankfully, Gleick avoids promoting Norbert Wiener to the exaggerated and distorted role that he occupies in all too many information theory histories, and, in particular, the balderdash that Wiener was to analog information what Shannon was to digital information. In Cybernetics (published contemporaneously with A Mathematical Theory of Communication), Wiener credits von Neumann with suggesting that (the negative of) the differential entropy of a density function f1 is “a reasonable measure of the amount of information associated with the curve f1(x),” maximizes (minimizes) it under a variance constraint, and considers the difference between unconditional and conditional differential entropies. In his autobiography I Am a Mathematician, Wiener went on to write “the Shannon–Wiener definition of quantity of information (for it belongs to the two of us equally).” Unfortunately, both Shannon and Wiener referred to differential entropy as “entropy,” so it is not surprising that Gleick fails to make the crucial point that differential entropy, unlike (2), is not a measure of information, nor does it have an operational meaning (i.e., it is not the answer to an optimization problem of interest to the engineer). In fact, there is no evidence that Wiener ever grasped the notion, at the heart of information theory, of operational meaning lent by a coding theorem.
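The point is easy to check numerically. The differential entropy of a uniform density on [0, a] is log2(a) bits, negative whenever a < 1, and no honest count of bits can be negative; the sketch below (values purely illustrative) also evaluates the Gaussian maximum under a variance constraint that Wiener noted:

```python
from math import e, log2, pi

# Differential entropy of Uniform[0, a] is log2(a) bits: negative for a < 1,
# so unlike (2) it cannot be read as a count of bits.
for a in (2.0, 1.0, 0.5):
    print(f"Uniform[0, {a}]: h = {log2(a):+.2f} bits")

# Among densities with variance sigma^2, the Gaussian maximizes differential
# entropy, at h = 0.5 * log2(2 * pi * e * sigma^2); unit variance shown here.
print(f"Gaussian, unit variance: h = {0.5 * log2(2 * pi * e):+.3f} bits")
```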
Among all the heroes in The Information, one actually had been certified as such (as Hero of Socialist Labor): Andrei Kolmogorov, who takes center stage in The Sense of Randomness, the excellent brief chapter devoted to algorithmic complexity. Gleick relates some of the major nuggets that tie the great Russian academician to information theory: his enthusiastic embrace (contrasting with his cold disdain for cybernetics) of Shannon’s paper, initially viewed as suspect by hard-core mathematicians on both sides of the Atlantic; his foundation of the first mathematical school of information theory; his import of entropy into ergodic theory; and his definition of ε-entropy. A notable omission is Kolmogorov’s pioneering introduction of the notion of universal data compression.
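Kolmogorov complexity is uncomputable, but an off-the-shelf compressor offers a crude, computable stand-in for the chapter’s central idea: a string with a short description compresses enormously, while a typical random string of the same length barely compresses at all. A minimal sketch, with zlib as the stand-in:

```python
import random
import zlib

# Compressed length is a rough, computable proxy for Kolmogorov complexity,
# which is itself uncomputable.
regular = b"01" * 50_000  # fully described by a tiny program
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(100_000))  # no pattern

print(len(zlib.compress(regular)))  # a few hundred bytes
print(len(zlib.compress(noisy)))    # close to the original 100,000 bytes
```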
A subject whose founding paper states in its second paragraph:

    [The] semantic aspects of communication are irrelevant to the engineering problem

was bound to meet with less than a universal welcome. With admirable scholarship and keen sense of humor, Gleick captures the dismal reception accorded to information theory by the social scientists: no semantics, meaningless beep beeps, the horror! The social science take on information/communication receives plenty of attention, with a liberal sprinkling of quotes by Marshall McLuhan, and whole chapters devoted to the data deluge and to memes. Genes are to living organisms what memes are to thoughts, and therefore, subject to the evolutionary dictum of the survival of the fittest—the Journal of Memetics was an early victim.
There are many other fascinating side histories to be found in The Information, such as those of writing, the dictionary, Wikipedia and its paper forefathers, the DNA code, cryptography, and quantum information/computing. But let us finally get on with the super-hero of the story. The Information makes a courageous effort to elevate Claude Shannon from his underrated status in the mainstream popular perception of the history of science. Superbly researched, Gleick chronicles Shannon’s early years in Michigan; his graduate studies at MIT, including the most important Master’s thesis of the century, his work on the differential analyzer, and his unpublished doctoral thesis on an algebra for genetics; his sojourn at the Institute for Advanced Study at Princeton; his work on cryptography; and the gestation of A Mathematical Theory of Communication at Bell Labs.² Claude Shannon’s marriages to Norma Levor and Betty Moore, as well as his passion for gadgets, unicycling, and juggling, get their dues. Also mentioned are his contributions to the design of computers with unreliable components, to the max-flow min-cut theorem, and to programming a machine to play chess. But Shannon disappears from the story abruptly. The reader hopes, in vain, to find out what ever happened to the genius who, in his early thirties, revolutionized the world.
If anything, Gleick reinforces the lay misconception that Shannon’s salient contribution is his introduction, or rediscovery, of entropy. Nothing is said on the key role of measuring with bits not just likelihood but dependence. Moreover, Gleick fails to make the point that information theory is about theorems, rather than definitions; that it is about posing, and answering, questions on the best performance of algorithms for information compression and transmission and beyond. The reader is told about redundancy and the Shannon limits, but why is channel capacity indeed more relevant to Qualcomm than the speed of light is to Ferrari?
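The answer is operational: Shannon’s noisy-channel coding theorem guarantees reliable communication at any rate below capacity and at none above it. For the additive white Gaussian noise channel the formula is C = W log2(1 + SNR), and a back-of-the-envelope sketch (bandwidth and SNR values are illustrative) shows why a stronger signal, i.e., more bars, means a faster link:

```python
from math import log2

def awgn_capacity(bandwidth_hz, snr):
    """Shannon capacity of the AWGN channel, C = W * log2(1 + SNR), in bit/s."""
    return bandwidth_hz * log2(1 + snr)

# One megahertz of spectrum at a range of signal-to-noise ratios (illustrative):
for snr_db in (0, 10, 20, 30):
    snr = 10 ** (snr_db / 10)
    print(f"SNR {snr_db:2d} dB: C = {awgn_capacity(1e6, snr) / 1e6:5.2f} Mb/s")
```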
In view of his obvious gifts, I wish Gleick had attempted a lay narrative of what we have learned since 1948 about the Shannon limits themselves (or, in the language of this Transactions, Shannon theory), and of those problems which remain stubbornly elusive. Along the way, he may have uncovered the cult of the pretty formula that we, the People of the Article, profess, and how those formulas have a way of inspiring engineers. Most unfortunately, particularly in view of Gleick’s engaging account of the various 19th century information technologies and his repute as a leading popularizer of modern science and engineering, The Information makes no effort to explain how technology has evolved in the last 63 years in pursuit of the Shannon limits. No attempt is made to elucidate the algorithms that convert images and sounds into bits. Or the captivating race to convert bits into sounds at faster and faster speeds to send data through the telephone wire. Nothing on the fascinating story that starts with Hamming and culminates with sparse-graph codes. Ending with Huffman, the lossless data compression narrative misses one of the crown jewels of information theory: the magical Lempel–Ziv algorithm, implemented everywhere and conqueror of the Shannon source coding limit. Nothing on the renaissance of information theory brought about by the wireless revolution. And how about some elementary iTheory for the iPhone? The reader would have loved to find out that Shannon is the answer to the question: Why do more bars result in faster downloads?
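That crown jewel is compact enough to sketch in a dozen lines. The variant below follows the 1978 flavor of the Lempel–Ziv family (the 1977 paper differs in detail): it parses the input into a growing dictionary of phrases and emits (dictionary index, next symbol) pairs, so repetitive sources yield ever-longer phrases and hence compression:

```python
def lz78_encode(text):
    """Minimal LZ78 sketch: parse into phrases, emit (dictionary index, symbol)."""
    dictionary = {"": 0}      # phrase -> index; index 0 is the empty phrase
    phrase, output = "", []
    for symbol in text:
        if phrase + symbol in dictionary:
            phrase += symbol  # keep extending the current match
        else:
            output.append((dictionary[phrase], symbol))
            dictionary[phrase + symbol] = len(dictionary)
            phrase = ""
    if phrase:                # flush a trailing match already in the dictionary
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

# Repetition breeds ever-longer phrases -- the source of the compression:
# [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a'), (2, 'a'), (5, 'b')]
print(lz78_encode("abababababab"))
```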
Although we are told that

    Shannon’s theory [. . .] led to compact discs and fax machines, computers and cyberspace, Moore’s law and all the world’s Silicon Alleys

the reader leaves with the impression that information theory is very much a thing of 1948. We are going to need a bigger book.

The reviewer is with the School of Engineering and Applied Science, Princeton University, Princeton, NJ 08544 USA (e-mail: [email protected]). Communicated by James L. Massey, Associate Editor for Book Reviews. Digital Object Identifier 10.1109/TIT.2011.2162990

¹Janet Maslin, “Drumbeat to E-Mail: The Medium and the Message,” The New York Times, March 6, 2011.

²Likely based on his interview of Betty Shannon, Gleick reveals that Claude Shannon was among the last researchers to move from the Bell Labs New York City building in the West Village to the gigantic new facility in Murray Hill, NJ. So, in fact, information theory did not come into existence in the state that gave us the lightbulb, the transistor, the Morse code, FM, and color TV, but in the city that gave us abstract expressionism and hip hop.
