Quantum Steampunk: The Physics of Yesterday's Tomorrow. Nicole Yunger Halpern. Johns Hopkins University Press, 2022. ISBN 9782021028553.
CHAPTER 0
PROLOGUE
ONCE UPON A TIME IN PHYSICS
CHAPTER 1
INFORMATION THEORY
OF PASSWORDS AND PROBABILITIES
CHAPTER 2
QUANTUM PHYSICS
EVERYTHING AT ONCE, OR, ONE THING AT A TIME?
CHAPTER 3
QUANTUM COMPUTATION
EVERYTHING AT ONCE
CHAPTER 4
THERMODYNAMICS
“MAY I DRIVE?”
CHAPTER 5
A FINE MERGER
THERMODYNAMICS, INFORMATION THEORY, AND QUANTUM PHYSICS
CHAPTER 6
THE PHYSICS OF YESTERDAY'S TOMORROW
THE LANDSCAPE OF QUANTUM STEAMPUNK
CHAPTER 7
PEDAL TO THE METAL
QUANTUM THERMAL MACHINES
CHAPTER 8
TICK TOCK
QUANTUM CLOCKS
CHAPTER 9
UNSTEADY AS SHE GOES
FLUCTUATION RELATIONS
CHAPTER 10
ENTROPY, ENERGY, AND A TINY POSSIBILITY
ONE-SHOT THERMODYNAMICS
CHAPTER 11
RESOURCE THEORIES
A HA’PENNY OF A QUANTUM STATE
CHAPTER 12
THE UNSEEN KINGDOM
WHEN QUANTUM OBSERVABLES DON’T COOPERATE
CHAPTER 13
ALL OVER THE MAP
ROUNDING OUT OUR TOUR
CHAPTER 14
STEPPING OFF THE MAP
QUANTUM STEAMPUNK CROSSES BORDERS
EPILOGUE
WHERE TO NEXT?
THE FUTURE OF QUANTUM STEAMPUNK
ACKNOWLEDGMENTS
GLOSSARY
REFERENCES
INDEX
CHAPTER 0
PROLOGUE
ONCE UPON A TIME IN PHYSICS
“Forgive me for arriving late, Audrey.” A shadow detached itself from the doorway’s other shadows
and stepped into the drawing room. The firelight played over a man in his mid-twenties, looking like a
raven who’s tumbled into a ditch. One wouldn’t believe that the cloak he’d handed the butler had borne
the brunt of the storm. “Our friend Ewart—ah, arranged for me to be delayed.”
“ ‘Our friend,’ indeed.” A girl, about ten years younger than Caspian, shut her book and beckoned
him into the warmth. “You have never flattered a soul in your life, Caspian, so pray do not begin with
Ewart. Daisy insisted on bringing out the tea things before you arrived,” she added as Caspian
squelched his way across the red-and-gold carpet. “Would you care for a cup?”
“Please.” Caspian stretched his hands out toward the fire as a raindrop trickled into his right ear. “I
trust that your brother is well?”
“If any unwellness were ill-informed enough to approach Baxter, he would turn a cartwheel, and it
would sail straight past.” Audrey set down the teapot. “Sugar?”
“If you would be so kind.” Caspian turned from the fire to settle himself into the high-backed chair
beside the table, as only a long-time family friend or a cat can in another’s house. “You said that you
had a discovery to report.” He leaned closer. “A discovery about time, and about rewriting time on the
quantum scale.”
Looking up to pass Caspian his cup and saucer, Audrey saw a spot flicker by his right cheek—and
not because of the firelight. Frowning, she set the china down.
“Indeed,” she said, feeling around the table without taking her eyes off the spot. Her hand met a
miniature crystal vase, which she emptied of its violets and water. “Precisely why I invited you.” The
flicker hovered closer to Caspian’s cheek. “In fact—”
Thwack! Audrey clapped the vase’s mouth to Caspian’s right cheek and clapped a hand to his left
cheek, immobilizing his head. Caspian flinched, and his eyes grew wide, but he didn’t move otherwise.
“Keep still,” Audrey hissed, “and hold this.” Caspian took hold of the vase while she fetched a sheet
of paper from a nearby table. She slid it under the vase, then lifted the vase away from his face,
pressing the paper against the vase’s mouth. The two of them stared at the vase in the firelight as a
small, copper body buzzed against the glass.
“That is no insect,” Caspian murmured.
Audrey shook her head.
“Whilst flattered by Ewart’s eagerness to overhear my discovery,” she said, “I could do without the
compliment.” Audrey set the vase upside-down atop the table and peered more closely. “I could never
have caught this spy-fly of his if it had not been moving so slowly, but it must be nearly drained of
energy.”
“Poor thing,” Caspian murmured. Audrey smiled.
“Baxter is developing a superior version, which can extract energy from a certain type of light quite
efficiently. Not the type here,” she added, waving at the fireplace, “but a type that Ewart would have in
his laboratory.”
The two of them watched the mechanical buzzing insect awhile longer, the fireplace crackling behind
them, till a raindrop trickled down Caspian’s nose.
“I neglect my duties as hostess,” Audrey said, reaching for his cup and saucer. “Tea?”
and harness their correlations (we’ll see how in chapter 3). Such a quantum
computer could solve, in minutes, certain problems that would cost classical
computers many years. Potential applications include chemistry, materials
science, and drug design. Quantum computers could also break part of the
encryption that protects web transactions from hackers. (Don’t worry too
much; postquantum cryptographers are developing codes that quantum
computers can’t break.) Quantum computers couldn’t help with all our
problems; for instance, I wouldn’t recommend preparing your taxes on one.
Still, commercial giants such as Google, IBM, Honeywell, and Microsoft are
building quantum computers. So are startups, such as IonQ and Rigetti, and
the governments of multiple countries. Tech giant Amazon offers an online
portal through which consumers can use startups’ early-generation quantum
computers.
Don’t expect web encryption to break in the next few years, though.
Controlling entanglement among tens of particles has cost generations of
graduate students. Controlling entanglement among tens of thousands of atoms
will cost loads more. A few skeptics believe that we’ll never be able to
control entanglement among many particles, although most quantum-
computing scientists disagree. Time will tell, provided that funding for
quantum computing doesn’t dry up first.
To exhibit quantum phenomena such as entanglement, most quantum
systems need to operate at low temperatures. Cooling—the expulsion of heat,
or random energy—falls in the purview of thermodynamics. But how to
measure heat, in quantum contexts, requires thought: measuring a quantum
system changes, or disturbs, it. In contrast, sticking a thermometer in your
(classical) mouth won’t influence your fever. But measuring how much heat a
few atoms emit can influence the amount of heat emitted. Nineteenth-century
thermodynamics needs reenvisioning for twenty-first-century quantum
science: we must replace gears, pulleys, and levers in the theory of
thermodynamics.
What mathematical, conceptual, and experimental toolkit should we use?
Quantum information science. The promise of quantum technologies
galvanized the development of quantum information science as steam engines
galvanized the development of thermodynamics. Since the Industrial
Revolution, scientists have applied thermodynamics to understand everything
from the stars to the origins of life. Over the past three decades, scientists
have been applying quantum information science to understand anew
computer science, mathematics, chemistry, materials, and more. Quantum
information science offers a toolkit for revolutionizing thermodynamics to
describe small, quantum, and information-processing systems.
I am participating in the revolution. I’m not the first in the field; nor am I
the only revolutionary. My cohorts span the globe, and augurs of our mission
whispered as early as the 1930s. Many call our field quantum
thermodynamics or quantum-information thermodynamics. But
thermodynamics developed as science was emerging from natural
philosophy, which has an aesthetic. Natural philosophers understood
aesthetics, as they studied philosophy, literature, and history, in addition to
geometry and astronomy. Today’s physicists invoke aesthetics, too—when
favoring a simple equation that describes much of the world over a
complicated equation, or over an equation that describes little. But aesthetics
arguably used to play a broader role in science and natural philosophy. For
instance, scientific instruments shared the elegance of musical instruments
during the Victorian era: brass gleamed against mahogany; a curve
occasionally arched without necessity because it pleased the eye. Honoring
aesthetics ties one to the grandness, richness, and inspiration of the tradition
inherited by today’s scientists. Quantum thermodynamics has the aesthetic of
steampunk, I realized while pursuing my PhD: I marry thermodynamics with
quantum information science—the Victorian era with futuristic technology.
Hence the term quantum steampunk.
* Octopodes feature in Twenty Thousand Leagues under the Sea, and they exhibit intelligence and
engineering prowess. For example, an octopus escaped from the National Aquarium of New Zealand,
apparently by slithering through a gap in his tank, propelling himself across the floor, and clambering into
a drainpipe that fed into a nearby bay.1
CHAPTER 1
INFORMATION THEORY
OF PASSWORDS AND PROBABILITIES
A slat of wood rattled as it slid across the grating in the oaken door.
“Password,” rasped a voice through the grating.
“You know me perfectly well, Baxter,” Audrey snapped. “And I refuse to recite that ludicrous—”
“T’aint Baxter,” said the voice, sounding less raspy and more put-upon. “Baxter left for th’ water
closet ’alf an hour ago.”
“Oh. Very well.” Audrey drew a breath and dashed through the password as though it were a
scandal sheet that she didn’t want to be caught reading. “Lord-Buntiford-drank-two-bottles-of-wine-
and-then-danced-a-jig-without-clothes-on.”
The slat of wood slid back into place with a rattle, and the door creaked open.
TEASPOONS OF INFORMATION
We measure things in terms of units—time in seconds, sugar in teaspoons,
and length in meters or inches or (if you’re webcomic artist Randall Munroe)
giraffes.2 How do we measure information, and what is the unit of
information? We’ll approach these questions like physicists: start with
examples, form a guess, and then check the guess against more examples and
against principles, modifying our guess when necessary.
In Audrey’s story, the gatekeeper must distinguish whether Audrey belongs
to his cabal. Suppose that, before hearing the password, he hasn’t the foggiest
idea whether she does. From Audrey’s communication of the password, he
learns a substantial amount of information. Now, suppose that the gatekeeper
recognizes Audrey’s voice the first time she speaks. He ascribes a high
probability—say, 75%—to the possibility that Audrey belongs to his cabal.
Hearing the password doesn’t surprise him; the password conveys little
information.
Events—the hearing of a password, the reading of a book, a lover’s
getting down on one knee—convey information. The more expected the
event, or the higher the event’s probability, the less information the event
conveys. As an event’s probability increases, the probability’s inverse
decreases. As a reminder, the inverse of two is one-half, the inverse of three
is one-third, and so on. Let’s guess that, if an event with some probability of
happening occurs, the probability’s inverse measures the information
conveyed.
Let’s check how reasonable this guess is. Our rule for measuring
information should reflect how amounts of information add up. Suppose that,
after Audrey responds, the gatekeeper asks whether the pub next door has
closed. The conversation will give the gatekeeper two pieces of information:
Audrey belongs to the cabal (rather than not belonging), and the pub is open
(rather than closed). The total amount of information learned should be the
first amount plus the second.
We can measure the total amount of information another way. The
gatekeeper’s conversation with Audrey produces one of four possible
outcomes: (1) Audrey belongs to the cabal, and the pub remains open; (2)
Audrey belongs to the cabal, and the pub is closed; (3) Audrey doesn’t
belong to the cabal, and the pub remains open; or (4) Audrey doesn’t belong
to the cabal, and the pub is closed. Each joint event consists of two
constituent events. Each joint event’s probability is the first constituent
event’s probability times the second constituent event’s probability. For
example, say that the gatekeeper recognizes Audrey’s voice but has no idea
whether the pub is open. He ascribes a three-quarters probability to Audrey’s
belonging, a one-quarter probability to Audrey’s not belonging, a one-half
probability to the pub’s remaining open, and a one-half probability to the
pub’s being closed. The joint event “Audrey belongs, and the pub remains
open” has a probability of three-quarters times one-half, or three-eighths
(figure 1.1).
FIGURE 1.1
How much information does the gatekeeper gain from the joint event?
According to our earlier stab at quantifying information, the amount of
information is the inverse of the joint event’s probability—the inverse of
three-eighths, or eight-thirds. But we also concluded that the total amount of
information should be the sum of the constituent amounts. Drat, as Audrey
would say. We have two expressions for the total amount of information
learned by the gatekeeper. One expression is a product, and the other is a
sum. We’ll have to tweak our rule for quantifying information—to turn the
product into a sum, so that both expressions are the same.
The Hungarian mathematician Alfréd Rényi said, “A mathematician is a
machine for turning coffee into theorems,” or proven mathematical facts.3 A
logarithm is a mathematical machine for turning products into sums. By
“mathematical machine,” I mean that the logarithm takes in numbers and
outputs possibly different numbers. For our purposes, turning products into
sums is the logarithm’s crown-jewel property.
The logarithm will improve our rule for measuring the information
conveyed by an event: Let the amount of information be the logarithm of the
inverse of the event’s probability. This number is called the event’s surprisal
because it measures how much the event surprises you. The more an event
surprises you, the more information you learn. According to our revised rule,
amounts of information add together, even though event probabilities
multiply.
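To see the revised rule in action, here is a short Python sketch (my own illustration, not the book's; the function name `surprisal` and the probabilities are taken from the gatekeeper example). It checks that when probabilities multiply, surprisals add.

```python
import math

def surprisal(probability: float) -> float:
    """Information, in bits, conveyed by an event with the given probability."""
    return math.log2(1 / probability)

# The gatekeeper's two constituent events:
p_audrey_belongs = 3 / 4   # he recognizes her voice
p_pub_open = 1 / 2         # he has no idea about the pub

# Joint probabilities multiply...
p_joint = p_audrey_belongs * p_pub_open   # 3/8

# ...but surprisals (amounts of information) add.
total = surprisal(p_audrey_belongs) + surprisal(p_pub_open)
print(math.isclose(surprisal(p_joint), total))  # True
```

The logarithm turns the product 3/4 × 1/2 into the sum log(4/3) + log(2), which is exactly the property the text asks for.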
We’d like to break an amount of information down into units, as we can
break a bowl of sugar down into teaspoons. What should we designate as the
teaspoon of information?
Information, we established, is an ingredient needed to distinguish
between alternatives. The least possible number of alternatives is two:
Audrey belongs to the cabal or doesn’t. Without prior information (lacking
familiarity with Audrey’s voice), the gatekeeper assigns both possibilities
equal likelihoods, of one-half. According to our rule for measuring
information, the amount of information he learns is the logarithm of two. This
amount is the unit of information, called the bit. You learn a bit of
information upon flipping a fair coin and (after crawling under the table
where it rolled) finding that it landed heads-up.
Imagine an unfair coin that has a three-quarters probability of landing
heads-up and a one-quarter probability of landing heads-down.
(Equivalently, suppose that the guard recognizes Audrey’s voice.) Seeing the
coin land heads-up, you learn log(4/3) bits of information. Seeing the coin
land heads-down, you learn log(4) bits. Any time a random event happens to
you—the weather is sunny rather than snowy or cloudy, or your favorite pub
closes early, or your steam-powered time machine breaks down—you can
measure in bits the information you’ve learned.
Bit refers to the unit of information, but information scientists also use the
word in other ways. Bit can refer to an event that can play out in two ways.
In one of the least romantic examples imaginable, the response to a marriage
proposal constitutes a bit. Also, we sometimes designate as a bit a physical
system able to be in one of two possible states. Examples include a candle
that’s lit or unlit. The two options—for the physical system or for the event—
may be “lit” or “unlit,” or “yes” or “no,” or “shepherd’s pie” or
“ploughman’s lunch,” or any other dichotomy. Information scientists
represent the options with zero and one (0 and 1). This convention simplifies
our work, much as I’d enjoy reading papers about shepherd’s pies and
ploughman’s lunches. Alas …
FIGURE 1.2
But not all those possible strings are likely. Probably, about a fraction pG
of Audrey’s letters are G’s, and about a fraction ps are S’s. Say that, on any
given day, Audrey has a decent probability of catching Baxter in either
activity; that is, neither probability lies close to 0 or 1. Also, Baxter’s habits
aren’t completely random; he’s more likely to stand guard than to sleep.
Audrey might find Baxter standing guard every night for 30 years. But her
likelihood of recording 11,000 G’s is so low that we can call that string
basically impossible. Most strings are basically impossible, mathematics
shows. Audrey has a decent probability of writing only strings of a certain
type—the strings in which about a fraction pG of the letters are G’s and about
a fraction ps of the letters are S’s. This fact parallels how, if we flip a coin
11,000 times, we expect the coin to land heads-up about half the time and
tails-up about half the time.
How does Audrey compress her string? She labels the first not-virtually-
impossible string 1, the second not-virtually-impossible string 2, and so on
(figure 1.2). Her journal probably contains one of those strings, and she
records that string’s label. That label is the result of her data compression:
Audrey has squeezed 11,000 letters into one label. At least, Audrey has
replaced 11,000 letters with one label. I haven’t shown that the label
contains fewer than 11,000 letters. Let’s figure out how small the label is—
how many bits Audrey needs to specify the label.
Every time she checks on Baxter, Audrey receives information. How much
information—how many bits—on average? The answer is the Shannon
entropy of pG and ps, which measures her average uncertainty about what
she’ll find Baxter up to. Across 11,000 days, Audrey receives a number of
bits that about equals the Shannon entropy times 11,000. That’s how many
bits form Audrey’s label. Say that Baxter is four times more likely to stand
guard than to sleep: pG is 4/5, and ps is 1/5. The Shannon entropy turns out to
be about 0.7 bits, so the label requires about 0.7 bits for each of the 11,000
days, or about 8,000 bits total. Data compression thus saves Audrey 11,000 minus
8,000 bits, or about 3,000 bits, a savings quantified by the Shannon entropy.
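As a quick check on the arithmetic, here is a Python sketch (my own, not the book's) that computes the Shannon entropy for pG = 4/5 and pS = 1/5 and the resulting label size.

```python
import math

def shannon_entropy(probabilities):
    """Average surprisal, in bits, over a probability distribution."""
    return sum(p * math.log2(1 / p) for p in probabilities if p > 0)

p_guard, p_sleep = 4 / 5, 1 / 5
entropy = shannon_entropy([p_guard, p_sleep])
print(round(entropy, 3))           # 0.722 bits per day

days = 11_000
compressed = round(entropy * days)
print(compressed)                  # 7941, i.e., about 8,000 bits
print(days - compressed)           # 3059, i.e., about 3,000 bits saved
```

The uncompressed journal costs one bit per day (G or S), so the entropy of about 0.72 bits per day is where the roughly 3,000-bit savings comes from.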
CHAPTER 2
QUANTUM PHYSICS
EVERYTHING AT ONCE, OR, ONE THING AT A TIME?
A painting beckoned to Audrey from across the hallway, from an alcove tucked away from the hum and
bustle of the natural philosophers. She approached without hearing her boots tapping on the marble
floor; she wove between pairs of murmuring gentlemen without seeing them. The painting depicted a
wood-paneled study, the likes of which Audrey yearned to curl up in with a pencil, a notebook, and
silence. Against the study’s far wall stood a desk formed from three stacked shelves, the top of which
formed the writing surface. The desk supported a globe, two mechanical contraptions, and what could
have been a mathematical compass or a nutcracker. A broad sheaf of paper rested on the desk, trapped
beneath a pile of books, and partially hanging off the wood.
Audrey stepped closer. Was she looking at a desk, or was it a platform onto which the scholar could
climb? The platform hypothesis gained support from a mahogany-colored door, behind the globe, that led
out of the study. Above the door, drawings had been inked on the wall; Audrey peered at them more
closely. Was that a tree inked on the right-hand side? Was light entering a lens inked on the left?
Leftward of the drawings stood a column, plainer than its Ionic cousins but evoking the classical world
of Euclid and Pythagoras. Leftward of the column, a gold-latticed window afforded a view of the sky.
A sigh escaped Audrey as she peered through the painting’s window: in the bottom panes, a cloud
drifted above gleaming water that wove through a golden-brown landscape. The study must have been
floating in the heavens—and what more fitting place for a heavenly study? Hands clasped behind her
back, Audrey bent forward and read the brass label beneath the painting.
“Everything at once, or, one thing at a time?”
FIGURE 2.1
For instance, the atom in the figure begins on a high-up rung, with an
amount E4 of energy. The atom can descend to the next-highest rung while
emitting a packet of energy—a photon, or particle of light. The photon
carries the difference between the two rungs’ energies. Then, the atom can
repeat this process—descend to the next rung while emitting another photon.
Each photon carries off an amount of energy calculable with quantum theory.
Imagine the atom descending, step by step, to the ladder’s bottom rung. The
final photon carries roughly 10⁻¹⁹ times the gravitational energy of a
pumpkin sitting one yard off the ground.
We call the packets quanta of energy because they contain fixed amounts.
Likewise, the atom’s electric energy is said to be quantized because the
amounts accessible are fixed. “Quantum physics” means “physics of fixed-
size packets.” Which makes my work sound like the science of airline
snacks; but I’ve heard graver insults.
In middle school science class, we envision electrons as particles zipping
around the nucleus. This vision doesn’t capture the whole truth—we’ll see
how later—but the picture captures some important features of the truth; so
it’s worth using sometimes. The electron follows a path that curves around
the nucleus. The electron’s velocity consists of the electron’s speed and
direction; and as the electron curves, its direction changes. Its velocity
therefore changes, and so the electron accelerates.
Also as we learn in middle school science class, the electron carries
negative charge. Charged particles usually radiate, or emit, photons as they
accelerate. This radiation enables us to play Billie Holiday, one of my
favorite jazz musicians, over the radio. The radio DJ forces electrons in an
antenna to vibrate up and down. Every time the electrons switch direction—
for instance, from up to down—they accelerate and emit photons. The
photons travel to your receiver, which helps transform their energy into
sound.
Curving around the nucleus should cause electrons to radiate photons,
which would carry away energy. The more energy the electrons radiated, the
less these negatively charged particles could resist the attraction of the
positively charged nucleus. The electrons would spiral into the nucleus, and
the atom would implode. Matter couldn’t exist; radios and DJs and ears
couldn’t exist; and we’d never hear Billie Holiday’s music. Since I have
heard Billie Holiday’s music, this story, based on classical physics, must be
wrong.
Quantization rushes to the rescue by allowing the atom to have only
certain amounts of electronic energy. Namely, the atom’s energy ladder ends
with a lowest rung. When the atom occupies that rung, it can’t drop any
lower. On that rung, furthermore, the electron orbits the nucleus one ten-
millionth of a millimeter away. The electron can’t spiral into the nucleus
because it can’t emit photons, because the atom lacks any lower energy rung
to which to drop.
Why does the ladder have a lowest rung? Quantum theory offers little
insight about such questions. Physics elucidates what and how questions. It
also sheds some light on why questions, such as “Why does the sky look
blue?” or “Why don’t atoms implode?” or “Why does a ponytail have the
shape it has?” (Yes, really.)1 But why should the laws of physics be such that
atoms don’t implode? Attempts to answer this question veer into philosophy,
circularity, or theology. I endorse the questions but lack the expertise to
answer. At least atoms don’t implode; quantum theory reveals what prevents
them from imploding; and matter’s stability allows us to keep asking
questions.
Quantization isn’t as nonclassical as one might think; a classical system’s
energy can act almost as though it were quantized. For example, consider a
jar of apricot preserves owned by Audrey’s family, the Stoqhardts, since her
father was a boy. The jar has occupied one or another of the pantry’s ten
shelves for decades. A shelf’s height determines the gravitational potential
energy that the jar has when on that shelf. So, the jar’s gravitational potential
energy has been one of ten fixed numbers for most of several decades.* That
is, the energy behaves almost as though it were quantized.† How nonclassical
a behavior is will concern us later, when we evaluate which parts of quantum
steampunk we could mimic classically.
For now, we’ll just survey behaviors exhibited by quantum systems. We’ll
begin with spin.
FIGURE 2.3
Why care about spin? We’ll encounter a reason in the next chapter: spins
store information in a simple way. While we can often imagine spins as
teensy arrows clutched by electrons that resemble teensy balls, this cartoon
sometimes spits on the truth too forcefully. Our next quantum phenomenon
explains why.
MAKING WAVES
I prepared for my career in physics, during childhood, by jumping rope. At
recess, friends and I took turns turning the rope tied to a metal fence.
Occasionally, after the school day ended, I wandered by and experimented
with shaking the rope.
Shake one end of a rope, and a wave propagates down it. The distance
between successive crests is the wavelength. Shaking slowly produces a
long wavelength, and shaking quickly produces a short wavelength (figure
2.3).
Waves don’t only propagate down ropes, or in the ocean. Quantum theory
ascribes wave properties to matter. This ascription partially explains why I
don’t fully endorse the middle school vision of the electron. Viewing the
electron as a minuscule ball zipping around the nucleus, we can understand
some properties of the electron. For example, imagine measuring an
electron’s position. Our detector will flash the coordinates of some location,
akin to the coordinates that flash on a GPS screen. But the electron isn’t truly
a miniature ball sitting at that point. How does the electron differ?
We can ascribe a location to the electron only at the moment when we
measure the particle’s position. Before the measurement, and shortly after the
measurement, the electron resembles a wave more than a ball. A wave
doesn’t occupy just one point. It extends across a distance, consisting of
crests, troughs, and the spaces between. Quantum theory ascribes to the
electron a wave that extends throughout space. The wave peaks near an
atom’s nucleus. The higher the wave rises, at any point, the more likely is our
detector to flash that point’s coordinates if we measure the electron’s
position. We can’t predict the measurement’s outcome, usually, because the
wave stretches throughout space. We can predict only the measurement’s
probability of pointing to this position or that position.
So the mathematics that describes the waves in a jump rope also
describes the electron. According to quantum theory, every chunk of matter
and light resembles a wave. We call this property wave-particle duality, and
it extends even to you. Your wavelength depends on your speed, as when
you’re strolling down the street. While strolling, you have a wavelength 10²⁶
times smaller than a hydrogen atom: properties of you undulate across that
distance as a wave undulates across a jump rope. Don’t worry that the
undulation will fill your life with tremors, as though you lived in San
Francisco and quantum theory were an earthquake. No one can observe such
short lengths, so you’ll never notice your wavelike properties.
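The wavelength claim follows from de Broglie's relation, λ = h/(mv). Here is a rough Python check; the 70-kilogram mass, 1.4 m/s strolling speed, and hydrogen-atom size are my own assumed round numbers (not the book's), so the exact power of ten shifts with the inputs.

```python
# de Broglie wavelength of a strolling person: lambda = h / (m * v)
PLANCK_H = 6.626e-34   # Planck's constant, in joule-seconds
mass = 70.0            # assumed stroller's mass, in kilograms
speed = 1.4            # assumed strolling speed, in meters per second

wavelength = PLANCK_H / (mass * speed)   # about 6.8e-36 meters

hydrogen_diameter = 1.1e-10              # rough hydrogen-atom size, in meters
ratio = hydrogen_diameter / wavelength
print(f"{ratio:.1e}")                    # with these inputs, about 1.6e+25
```

With these inputs the atom comes out some twenty-five orders of magnitude larger than your wavelength, the order of magnitude the text describes.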
FIGURE 2.4
A HUNDRED INDECISIONS
We’ve explored four phenomena critical to quantum theory: quantization,
spin, wave-particle duality, and superpositions. The next phenomenon
features in a 1915 poem by the American-British poet T. S. Eliot, “The Love
Song of J. Alfred Prufrock.” The speaker, a middle-aged extra in the play of
life, wrestles with himself:
And indeed there will be time
To wonder, “Do I dare?” and, “Do I dare?”
Time to turn back and descend the stair.
FIGURE 2.5b
The foregoing story may give us pause, but entanglement challenges our
intuitions further. Suppose that, after entangling the spins, the siblings don’t
measure their particles individually. Instead, the siblings perform a special
joint measurement of both the spins together (figure 2.5c). Audrey and Baxter
can predict the outcome with certainty before performing the measurement.
FIGURE 2.5c
This conclusion should surprise us. Audrey can predict nothing about her
particle; no matter how she measures it, she has no idea what her detector
will report. That is, Audrey has no information about the outcome of any
measurement of her particle. Neither does Baxter have information about the
outcome of any measurement of his particle. But the siblings have complete
information about the outcome of a measurement of both particles; the
siblings can predict the outcome perfectly. So, the siblings have complete
information about the pair of particles, yet no information about either
individual particle—complete information about the whole, yet no
information about the parts.
In classical physics, if we know everything about the whole, we know
everything about the parts. For example, before my wedding, I knew that a
Michelle and a Miles would attend; I knew everything relevant about the
whole pair of guests. Therefore, by classical-physics logic, I knew
everything relevant about the parts: I knew that Michelle (one part) would
attend, and I knew that Miles (the other part) would attend. So, Audrey and
Baxter’s situation, in which they know everything about the whole but nothing
about the parts, sounds like codswallop. But a whole entangled system is
greater than the sum of its parts.
I think of entanglement as something shared between particles. The
entanglement isn’t in one particle, and it isn’t in the other. It isn’t in the sum
of particles measured individually. It’s in the collection of particles.
This collectiveness enables entanglement to produce correlations stronger
than any producible by classical particles. We’ve imagined Audrey and
Baxter measuring whether their particles’ spins point upward. In figure 2.5b,
Baxter’s detector reads “yes” if and only if Audrey’s does; the results are
correlated perfectly. This correlation sounds strong, but classical particles
can mimic it. We can imagine that, when Audrey and Baxter brought their
particles together, the particles flipped a coin and agreed to respond “yes” if
the coin landed heads-up and to respond “no” otherwise. (Not that particles
can flip coins or speak. But they can behave in ways that have the same
effect.)
So perfect correlations in such simple experiments shouldn’t impress us.
But we can design more-devious experiments, as the physicist John Stewart
Bell did in 1964.5 Bell requires Audrey to choose her measurement (to
choose which property she measures) randomly in each trial and Baxter to
choose his measurement randomly in each trial. The details of Bell’s
experiment lie outside the scope of this book; you can find them in John
Gribbin’s book Schrödinger’s Kittens and the Search for Reality.6 We need
to know only the following: Suppose that Audrey and Baxter run many trials
of Bell’s experiment. Audrey obtains many measurement outcomes, as does
Baxter. The siblings can calculate the correlations between their
measurement outcomes—how much the changes in Baxter’s outcomes track
the changes in Audrey’s outcomes. The correlations can be stronger than any
producible with classical particles.
Hence my colleagues’ wedding wishes. “May you stay forever entangled”
meant, “May you share a strong partnership.” My husband and I weigh far
more than electrons, take up far more space, and consist of far more
particles; therefore, classical physics describes us, and we can’t entangle.
But isn’t the thought sweet?
LET US GO THEN
We’ve overviewed seven quantum phenomena: Quantization limits an atom
to having only certain amounts of energy. Spin obeys the same mathematics as
angular momentum but doesn’t stem from rotations. We can picture an
electron’s spin as an arrow pointing in some direction. Wave-particle duality
likens quantum systems to waves spread across space. As you can superpose
waves on a jump rope, so can a quantum system occupy a superposition of
locations (or momenta, or spin directions, and so on). If you measure the
system’s location, you can’t predict which spot your detector will report. The
better you can predict a position measurement’s outcome, the less you can
predict a momentum measurement’s outcome, and vice versa, by the
uncertainty principle. Measuring a quantum system disturbs it, forcing a
superposition across locations into just one location.
Particles can share entanglement, information encoded not in one particle
or in the sum of particles addressed individually, but in the collection of
particles. Entanglement is monogamous. The more entanglement Baxter’s
particle shares with Audrey’s, the less entanglement Baxter’s particle can
share with the rest of the world. The latter entanglement decoheres particles,
preventing them from producing strong correlations.
Let us go then, as J. Alfred Prufrock says, you and I. Let us go from
learning about quantum physics to putting it to work in information
processing.
* Granted, the jar’s gravitational potential energy changes fluidly whenever the chef moves the jar to a
higher or lower shelf. But the jar occupies a shelf throughout most of the years.
† Granted, every object in our everyday world consists of quantum particles and so has quantized
energies. But the energy-ladder rungs are separated by a teensy amount because the object is large. So,
the distance between the energy rungs—the quantization—is basically impossible to observe. We can
regard the ladder rungs as basically touching each other, and we call the object classical. In contrast, the
jar’s approximate energy quantization is observable and is imposed by the jar’s classical environment,
not by quantum physics.
* The electron doesn’t technically have a length, for a reason explained later. But we can, loosely
speaking, imagine the electron as filling a certain volume. One coarse estimate of this volume’s length is
10⁻¹³ inches.2
* Granted, light harms wood in extreme situations. For example, by concentrating light on wood through
a magnifying glass, you can start a fire. But driftwood can withstand many hours of sunlight without
altering much.
* In figure 2.5b, I’m assuming that Audrey and Baxter share an entangled state of a particular type, in
addition to assuming that Audrey obtains the “yes” outcome. The nature of this entangled state ensures
that if Audrey obtains a “yes,” then Baxter obtains the same outcome, rather than the opposite.
* Why can’t Audrey tell Baxter her choice of measurement before the siblings separate? Warning
Baxter would amount to cheating: Audrey is hoping to communicate information (her measurement’s
outcome) to Baxter ultra-quickly, using entanglement. If she communicates her choice of measurement
before the game begins, Baxter starts with some of the information needed to infer Audrey’s
measurement outcome. Figuring out Audrey’s outcome is a little like figuring out a crossword puzzle.
Knowing Audrey’s measurement at the outset is like figuring out a crossword puzzle that’s already filled
in partially—it’s cheating.
CHAPTER 3
QUANTUM COMPUTATION
EVERYTHING AT ONCE
Audrey whispered the painting’s name to herself again: “Everything at once, or, one thing at a time?”
She remained tilted forward for a moment, head inclined toward the painting’s brass label, hands
clenched behind her back, before straightening up suddenly, like a compressed spring uncoiling.
“Everything at once,” she declared.
FIGURE 3.1
Suppose that, every Sunday, our Jumble contains one more letter than the
previous Jumble. The amount of paper we’ll need to brute-force-solve the
puzzle will grow each week. The growth would be huge—comparable to
exponential growth. I’d welcome a quantum alternative to the growing
mountain of squares.
Let’s quit representing each letter with an arrangement of pencil lead on
paper. Instead, let’s represent each letter with a rung in an atom’s energy
ladder. Occupying its lowest rung, an atom encodes an A; occupying its
second-lowest rung, a B; and so on for the rest of the 26 lowest rungs. (We
can typically prevent the atom from climbing higher—from acquiring more
energy—by keeping the atom cool.) We can take eight atoms and put them in
a superposition of the relevant energies—a superposition of the possible
letter orderings. We can even superpose all the orderings of any eight letters,
as shown in figure 3.1.
We don’t need hordes of squares to encode all the letter orderings; eight
atoms suffice, thanks to superpositions. Quantum systems can, in a sense,
store information more compactly than classical systems can. Furthermore,
preparing a superposition can take less than a second, whereas writing out
40,000 letter orderings would take days. More generally, quantum systems
can help us solve certain problems much more quickly than classical
computers can.
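For readers who like to check the arithmetic: eight distinct letters can be ordered in 8 factorial ways, which is where the figure of roughly 40,000 orderings comes from. A quick Python check (mine, not the book's):

```python
import math

# Number of orderings of 8 distinct letters: 8! permutations.
orderings = math.factorial(8)
print(orderings)  # 40320, the "40,000 letter orderings" in round numbers

# Writing each ordering classically takes its own row of 8 squares;
# a quantum superposition over all of them still needs only 8 atoms.
```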
Despite these advantages, we can’t solve the Jumble puzzle just by putting
the atoms in a superposition. Only one component of the superposition—only
one of the rows in figure 3.1—represents the puzzle’s solution. We have to
extract the solution from the atoms, by running some algorithm—by
following some recipe terminated with a measurement. The simplest
algorithm would require no steps before the measurement. Immediately after
preparing the superposition, we’d measure every atom’s energy, obtaining
some permutation of letters. But that permutation would be a random
selection from all the possible permutations—and likely not the solution to
our puzzle. Extracting solutions from superpositions requires cunning.
Broadly speaking, we must prune the wrong components from the
superposition as much as possible.
Quantum computer scientists specialize in such pruning, like gardeners
armed with quantum physics instead of with shears. Quantum computer
science centers on a task that generalizes the solution of a Jumble puzzle:
using quantum computers to solve computational problems more quickly than
classical computers can. Quantum computers consist of atoms or other
quantum objects, rather than today’s transistors.
How do we typically solve a computational problem using a quantum
computer? We begin similarly to how we’d begin solving a problem in high
school math class. There, we’d pull out a blank sheet of paper, which we’d
later fill with scratch work. Like our high school selves, a computer needs
blank paper, at least metaphorically. In a classical computer, “blank paper”
consists of transistors that encode bits set to 0. The classical computer’s
computation flips some bits to 1, then flips other bits, then maybe flips some
bits back to 0, and so on. The bits’ final configuration records the answer to
the problem being solved. A quantum computer has, rather than transistors,
electron spins—or atoms or other quantum objects—that encode qubits. The
qubits start out in the quantum analog of 0—pointing upward. How do we
induce qubits—say, electron spins—to point upward? We can stick a
magnet’s south pole above the electrons and stick a magnet’s north pole
below them (figure 3.2). The magnetic field points from north to south, or
bottom to top. Then, we cool the spins, lowering their energies. A spin has
the least energy when aligned with the magnetic field, so the spins end
pointing upward.
FIGURE 3.2
LAB TOUR
What do quantum computers consist of? Most of today’s best computers are
classical and consist of transistors.* Granted, quantum theory describes how
electrons flow through transistors. But transistors serve the same purpose as
the vacuum tubes used in computers during the mid-1900s. Transistors
represent bits as vacuum tubes do—just more compactly. The particles in my
laptop can’t entangle with each other (much, or usefully). Entanglement
underlies quantum speedups, so quantum computers need quantum
alternatives to transistors. Experimentalists and engineers are building
quantum computers from many platforms, or types of hardware. I’ll
overview three here, and we’ll encounter others later in this book.
One type of hardware, I first encountered at the IBM research facility an
hour’s drive from New York City. Boasting sweeping architecture frosted
with glass and stone, the facility reminded me of Fred Astaire: decades-old,
yet classy. The technology inside is anything but old. IBM is building a
quantum computer from superconducting qubits, tiny circuits cooled to low
temperatures. Superconductivity is a property that graces certain quantum
materials: current can flow through the material forever, without dissipating.
Imagine current flowing counterclockwise in a superconducting circuit. The
current plays the role of an upward-pointing electron spin, acting as the
quantum analog of a 0 bit. A clockwise current plays the role of a
downward-pointing spin, or a quantum 1. The current can also be in a
superposition of flowing in both directions.
When I visited IBM’s quantum-computing lab, it contained seven canisters
the size of linen closets. Experimentalist Nick Bronn gave me a lab tour.
Upon finding a canister that wasn’t running, he climbed half-inside. Gold-
and silver-colored wires, trays, and tubes surrounded him. A
cinematographer couldn’t have conjured a more steampunk scene in
Hollywood.*
“This is the fridge,” Nick said.
A romantic would hope that quantum computers, a technology of the
future, would look futuristic. A dreamer would envision the canister, the
silver, and the gold as the quantum computer. But the canister does to the
quantum computer what a Frigidaire does to a salmon filet. The wires help
the experimentalists wrangle the qubits into computing. And the quantum
computer? It consists of a chip that fits in your palm. Granted, the chip is
shiny. And designing it required the innovation and toil of a steampunk
invention.
I jest about the fridge, but it deserves as much applause as the chip does.
And no one should applaud louder than a quantum thermodynamicist. Cooling
—expelling heat—is a thermodynamic process. Nick’s fridge cools
superconducting qubits to near absolute zero, the lowest temperature
attainable. Upon learning how far such fridges cool their qubit filets, my
husband expressed astonishment at their name.
“Fridges?” he said. “They cool to the lowest temperatures in the world,
and physicists couldn’t at least call them freezers?” You can call them
dilution refrigerators, if you want to sound more scientific.
The second quantum-computing architecture nearly impacted my
entanglement—I mean, my wedding plans. A year after I moved to Harvard,
my then-boyfriend asked what type of engagement ring I’d like. I joked that
I’d prefer a diamond riddled with defects because I could run quantum
algorithms on it. Diamond consists of carbon atoms arranged in a repeating
pattern. Imagine expelling two neighboring carbon atoms and replacing one
with a nitrogen atom. The resulting structure has electrons that can encode
qubits. A jeweler would call such areas defects: they discolor the diamond,
lowering its desirability as a decoration. Moreover, quantum-computing
diamonds are either nanoscale—too small to see—or rectangular plates
mounted on a special material. My boyfriend gave me an heirloom stone
instead.
The third quantum-computing platform consists of atomic nuclei. Nuclei,
like electrons, have spins that can serve as qubits. Many nuclei—and
therefore many qubits—cluster together in a molecule. We can control these
qubits with a magnetic field, to perform quantum logic gates. This control,
combined with measurements of the spins, forms an experimental toolkit
called nuclear magnetic resonance (NMR). Medical doctors use NMR to
image people’s brains in magnetic resonance imaging (MRI) scanners. Not
that your doctor runs quantum computations on your brain. But MRI uses
magnetic fields to identify nuclear spins in your brain and take a picture of
them. Like your brain, a quantum computer run on NMR can operate at room
temperature. Not needing fridges the size of linen closets offers an advantage.
But scaling up an NMR quantum computer—stuffing a molecule with nuclei
—is difficult chemically.
We’ve glimpsed three platforms for quantum computers: superconducting
circuits, defects in diamond, and NMR. Many more platforms exist, and
different companies and universities are betting on different favorites. Each
platform, like a breed of horse, offers pros and cons. Comparing the
contenders would require another book, so I won’t do so here. Which
platform will win the race to quantum computing? No one knows. And
hybridizing platforms—leveraging each to accomplish what it excels at—
might triumph. I’m cheering for the hybrids, as an interdisciplinarian less
partial to the c-word competition than to the c-word collaboration.
* Why not necessarily exponentially more quickly? Because, loosely speaking, of the difficulty of
pruning undesirable components from a superposition.
* According to a joke among mathematicians, three experts in different fields have to identify a pattern
in the prime numbers. The physicist says, “Three is prime, five is prime, and seven is prime. So, the odd
numbers must be prime.” The biologist says, “Three is prime, five is prime, seven is prime, and eleven is
prime. The odd numbers must be prime, while nine must have a genetic anomaly.” The engineer says,
“Three is prime, five is prime, seven is prime, nine is prime, eleven is prime …”
* Computers can avoid wasting energy, scientists concluded—but only by operating infinitely slowly.4 I
prefer my laptop, regardless of overheating, to a computer that would put me off till the universe ends to
tot up my monthly grocery expenses.
* Albert Einstein called entanglement “spooky action at a distance.” My husband calls the deodorant
“spooky olfaction at a distance.”
* A minority dissents.
* Until a couple of years ago, I’d say, “Classical physics describes today’s computers.” But quantum
computers exist now. They’re small and riddled with defects, like the last apple left at the farmer’s
stand at the end of market day. But quantum computers exist.
* Others had recognized the technology’s steampunk air, I learned later. Three-and-a-half years after
visiting IBM, I presented about quantum steampunk at the Yale Quantum Institute. Faculty member Rob
Schoelkopf and others are building another quantum computer from superconducting qubits. Rob had
learned the term steampunk, he told me, from a journalist who’d visited his lab.
CHAPTER 4
THERMODYNAMICS
“MAY I DRIVE?”
Baxter asked without intending to; without thinking; without noticing anything except the thrumming, the
smoke, the scent of oil, and the sense of purpose in the locomotive.
“Nae, lad.” The driver spoke in a Cumbrian accent, barely parting his lips beneath his ruddy
mustache.
Baxter slumped down in his seat and picked at a thread in his jacket. But, two minutes later, he was
up and devouring the controls with his eyes again.
“Please, sir, may I try driving for a bit?”
“Nae, lad.” The response came as softly as before.
“Just a moment?”
“Nae, lad.”
“Just—”
“Baxter!” Audrey, having staggered up the carriage, snatched the back of her brother’s jacket. “Stop
harassing that poor fellow! How shall he concentrate on his work with you flitting about him like a
mayfly? How shall—oh!” She took in the knobs, the wheels, the ratchets, the buttons. She remained
silent for a moment, absorbed by the machinery, until the question rose unbidden: “Please, sir, may I
have a go?”
FIGURE 4.2
During act 1, the piston is unclamped so that it can slide. The gas,
imbibing heat from the bath as a toddler imbibes milkshakes from a diner,
can’t stand to stay confined. The gas pushes the piston upward, expanding
across most of the cylinder.
The curtain falls. A black-shirted technician carts the hot bath off the
stage, while another techie places a Wedgwood teapot atop the piston. The
curtain rises, and the gas expands further. Expanding—pushing the teapot
upward—requires the engine to perform work against gravity. The hot bath
no longer replenishes that energy, so the gas’s temperature falls.
Act 2 ends. Exit teapot at its final height—carrying newly acquired
gravitational potential energy. The teapot donates some of that energy toward
some worthy cause, offstage. For example, the teapot may be dropped a short
distance onto a ball of dough, to help flatten the dough into a biscuit.
The curtain rises on act 3 with the cylinder immersed in the cold bath.
Heat leaks into the bath from the gas, whose particles slow down. They
cease to beat against the piston, which begins to compress the gas into the
bottom of the cylinder. The curtain descends, and the techies cart away the
cold bath.
The gas’s final dialogue takes place during act 4. The Wedgwood teapot
has returned to the top of the piston. The teapot’s weight forces the piston
farther downward, performing work on the gas. The work heats up the gas,
which returns to the hot bath’s temperature. Once the gas returns to its initial
condition, the cycle closes.
What have we gained from the engine cycle? The gas performs more work
on the teapot, during act 2, than the gas receives from the teapot during act 4.
Giving more than it gets, the engine extracts work from the baths; the
audience applauds.
This work extraction comes at a cost: The engine absorbed lots of energy
from the hot bath. Only a fraction of that energy turned into useful work. The
engine dissipates the rest of the energy into the cold bath. Imagine running the
engine cycle again and again and again. After enough runs, the cold bath will
receive enough heat that its temperature will rise substantially; the cold bath
will grow less cold. It will be less able to facilitate work extraction in the
future. Therefore, extracting work comes at the cost of energy dissipation,
which hampers future work extraction.
Work extraction is a thermodynamic task just as data compression is an
information-processing task. The Shannon entropy measures the efficiency
with which we can compress classical data, and the von Neumann entropy
measures the efficiency with which we can compress quantum data.
Thermodynamic tasks, too, have efficiencies. The heat engine’s efficiency
depends on two quantities: the work performed by the engine and the heat
absorbed from the hot bath. Divide the first by the second, and you’ve
defined the bang you get for your buck.
The heat engine’s efficiency can’t exceed a certain bound, according to
Carnot, the French engineer. He proved a limit on the efficiency of any engine
that exchanges heat with exactly two baths. We call that limit the Carnot
efficiency, you might not be surprised to hear. The hotter the hot bath, and the
colder the cold bath, the greater the Carnot efficiency. Given an infinite-
temperature bath and a zero-temperature bath, you can extract work with an
efficiency of one. The smaller the gap between the temperatures, the closer
the best efficiency lies to zero. An engine running on the Carnot cycle
operates at the Carnot efficiency, as one might hope.
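Carnot's bound, as described above, depends only on the two bath temperatures. A minimal sketch of the standard formula (temperatures on the absolute scale, in kelvins; the code is illustrative, not from the book):

```python
def carnot_efficiency(t_hot, t_cold):
    """Upper bound on the efficiency of any engine exchanging heat
    with exactly two baths, temperatures in kelvins."""
    return 1.0 - t_cold / t_hot

# The hotter the hot bath and the colder the cold bath, the closer to 1:
print(carnot_efficiency(600.0, 300.0))  # 0.5
print(carnot_efficiency(1e9, 300.0))    # nearly 1 (huge temperature gap)
print(carnot_efficiency(310.0, 300.0))  # small gap, efficiency near 0
```

An infinite-temperature hot bath or a zero-temperature cold bath drives the bound to one, matching the limiting cases in the text.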
You’d be twiddling your thumbs before the cycle closed, though. The
Carnot engine operates infinitely slowly; the play that I described takes
forever. Speeding up dissipates more energy than necessary, lowering the
efficiency below its greatest possible value. So, no real engine is a Carnot
engine, and no real engine operates at the Carnot efficiency. Then why did
Carnot dream up his engine? To pinpoint a fundamental limitation on what an
engine can achieve when pushed to an extreme. Thermodynamics
encompasses not only practicality, but also ideality—not only the machines
that power factories but also the limits on what’s possible. Although
thermodynamics sprouted from engineering, it’s rooted in physics and
chemistry.
This decree has hit home more than I expected: I keep encountering
graveyards on trips I’ve taken for the sake of thermodynamics. The trend
began early in graduate school, at a conference at the University of
Cambridge. Entropy starred in most of the talks. One afternoon, the
conference organizers walked us participants to the Ascension Parish Burial
Ground, on the city’s outskirts. We wended our way among Nobel laureates’
graves, searching for Sir Arthur Eddington. Eddington was the astronomer
who transformed Albert Einstein into a household name. His 1919
observation of a solar eclipse supported Einstein’s theory of general
relativity. Nine years later, Eddington wrote in his book The Nature
of the Physical World:
The law that entropy always increases—the second law of thermodynamics—holds, I think, the
supreme position among the laws of Nature. If someone points out to you that your pet theory of
the universe is in disagreement with Maxwell’s equations [of electrodynamics]—then so much
the worse for Maxwell’s equations. If it is found to be contradicted by observations—well, these
experimentalists do bungle things sometimes. But if your theory is found to be against the second
law of thermodynamics, I can give you no hope; there is nothing for it but to collapse in deepest
humiliation.6
* A steam engine might not have powered the car witnessed by Toad and friends: In a steam-powered
car, fuel combusts outside the engine. Fuel combusts inside the engine in today’s cars and in models sold
before Grahame published his book. The car in the book more likely runs on internal combustion than on
steam. Still, the book captures the early motorcar’s spirit and seductiveness.
* As can many other physics concepts. Physicists adore models that encapsulate a concept’s essence
without the distraction of bells and whistles. The gas in a cylinder, or box, encapsulates many concepts.
Despite its humbleness, it’ll star in many more examples in this book.
† Wouldn’t the temperature change if Audrey lit a fire in the fireplace? The fire would serve as another
heat bath. Two baths form rivals worthy of each other, unlike a bath and a much smaller human. The air
will heat up; and the fireplace will cool, unless stoked.
* A scientific law is a statement believed to be true because it’s withstood many experimental tests.
Some future experiment might break the law, in which case we’ll search for a more accurate law.
* If the steam is out of equilibrium, some thermodynamicists disagree about whether we should call the
product a thermodynamic entropy. But many thermodynamicists agree, leveraging mathematical and
conceptual arguments.
CHAPTER 5
A FINE MERGER
THERMODYNAMICS, INFORMATION THEORY, AND QUANTUM
PHYSICS
Other people had friendships; Audrey had a trade agreement with Lillian Quincy. Lillian kept Audrey
abreast of trends in politics and philosophy, and Audrey explained the latest experiments and inventions.
Today, Lillian served her guest cakes topped with lemon curd, a breath of sunshine on a rainy afternoon.
“I know you never attend the Royal Society’s public lectures,” Lillian said, setting her plate down, “but
they are marvelous this season. Mr. Raja is lecturing, and his explanation of the quantum vacuum was
exquisite. I could almost see how the vacuum contains nothing and yet has energy nonetheless.”
Audrey had gleaned that Lillian’s admiration for Mr. Raja didn’t stop at the quantum vacuum. Lillian
had admired his innovation, analogies, and manner before; and when Audrey had explained a discovery
that he’d made, Lillian’s eyes lit up. Nor would Mr. Raja remain indifferent to Lillian, Audrey expected.
He’d praised a drawing of hers that hung in the Stoqhardt manor; his father had come from the same
neighborhood in Madras as Lillian’s mother; and Mr. Raja fancied German Romanticism as much as
Lillian did. Audrey bit into her lemon cake. If he found his way to the salon hosted by Lillian’s mother,
Audrey would soon be visiting not Miss Quincy but Mrs. Raja.
HAVE YOU EVER had two friends who haven’t met but should; who
share a host of qualities, interests, and values; and who seem destined to ride
into the sunset together? Such are thermodynamics and quantum computation.
They share an operationalist philosophy, highlighting which tasks an agent
can accomplish if given limited resources. Quantum-information-processing
tasks include information compression, while thermodynamic tasks include
gas compression. Additionally, both theories reach across multiple
disciplines. Quantum computation draws physicists, engineers, chemists,
computer scientists, and mathematicians, while thermodynamics governs
physics, engineering, chemistry, astronomy, and biology. Entropies star in
each field, helping to determine the optimal efficiency with which an agent
can perform tasks. So, thermodynamics and quantum computation merit one
of those heart-shaped picture frames. Made from steel. Decorated with
brass-colored decals shaped like gears.
But a marriage—a relationship that thrives and lasts—requires more than
similarities. It requires each partner to enhance and enrich the other.
Thermodynamics and (quantum) computation meet this standard: information
can serve as a sort of thermodynamic fuel, and thermodynamic work can
reset information. This chapter highlights how thermodynamics intertwines
with information processing and how quantum physics transforms both. We’ll
start with an engine that turns useless heat into useful work, with help from
information. Running the engine backward, we can pay thermodynamic work
to reset information. The engine benefits from quantum phenomena including
entanglement. By invoking information, the engine shows, we can resolve one
of the oldest paradoxes in thermodynamics.
Suppose that Audrey has a gas in a box, as in figure 5.1. I warned that
physicists adore gases in boxes, which can capture a problem’s essence
without complications. Szilard pushed this simplification to its extreme,
envisioning a gas of one particle. The particle is classical, as Szilard was
illustrating the relationship between classical information and
thermodynamics. We’ll see later how his ideas change if the particle is
quantum. The box sits in a heat bath that has a temperature T. The bath
exchanges heat with the gas through the box’s walls.
Audrey slides a thin partition into the box’s center, then measures which
side of the partition the particle occupies. Never mind how Audrey measures
the particle; Szilard’s story is powerful because it doesn’t depend on such
minutiae, which depend on Audrey’s personal preferences and which
engineers call implementation details. I’ll omit unimportant implementation
details, focusing on crucial design elements, throughout this story. For
concreteness, suppose that the particle occupies the box’s right-hand side.
Audrey has acquired one bit of information: right, rather than left.
Audrey ties a rope to the top of the partition, passes the rope over the
box’s right-hand side, and runs the rope through a pulley. Then, she ties a
Wedgwood teapot to the dangling end of the rope.* Audrey unfixes the
partition, so that it can slide leftward or rightward within the box. The gas
expands, like the steam that emanates from an apple tartlet and fills a kitchen.
Audrey’s gas particle hits the partition, punching it leftward again and
again.† Nothing punches the partition from the left, so the partition eventually
reaches the box’s left-hand side. Any energy lost by the gas, in punching the
partition, is replenished through heat from the bath. Audrey removes the
partition from the box.
Two features of this outcome draw our attention. First, the particle can
now be anywhere in the box. Audrey has no idea where it is; she’s lost her
bit of information. Second, as the partition shifted, the teapot rose: the gas
performed work against gravity. Why did the teapot rise? Because Audrey
ran the rope over the box’s right-hand side, because she knew that the
particle initially occupied the right-hand side. Suppose that Audrey hadn’t
acquired her bit of information about the particle’s location. She’d have had
to guess where the particle was, right or left. She’d have had a 50% chance
of running the rope over the box’s left-hand side. If she’d done so, the gas’s
expansion would have lowered the teapot, depriving the teapot of
gravitational potential energy (figure 5.2). So, as Audrey actually ran the
rope over the box’s right-hand side, she leveraged information to perform
work on the teapot.
FIGURE 5.2
Whence came the work? Not from the particle, which has the same
temperature—and so the same amount of kinetic energy—as it had originally.
Heat from the bath transformed into work, with help from Audrey’s bit of
information. She traded information for work—a computational resource for
a thermodynamic resource.
How much work can the gas perform? The amount depends on two
important quantities—first, the bath’s temperature. The hotter the bath, the
more energy it can give the teapot, so the more work Szilard’s engine can
perform on the teapot. Second, the work performable by the gas depends on
how much the particle’s Shannon entropy grows—how much Audrey’s
uncertainty about the particle’s location grows. At the end of the experiment,
the particle can be anywhere in the box; the particle has a 50% probability of
occupying the box’s left-hand side and a 50% probability of occupying the
right-hand side. Audrey ends the experiment with the greatest possible
uncertainty about the particle’s position. The probabilities have a Shannon
entropy of one bit, according to chapter 1. Immediately after Audrey’s
measurement, the particle had a 0% chance of occupying the left-hand side
and a 100% chance of occupying the right-hand side. This earlier distribution
had a Shannon entropy of zero, as Audrey knew the particle’s position with
certainty. So, as the gas expands, the entropy grows by one bit. The entropy’s
growth—Audrey’s loss of knowledge—counterbalances the teapot’s gain in
gravitational potential energy. So, the amount of work performable by the gas
is proportional to the entropy’s growth.
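The entropy bookkeeping in this paragraph is easy to verify directly. A short sketch (mine, not the book's) computing the Shannon entropy, in bits, of the particle's position distribution before and after the expansion:

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# After the expansion: particle equally likely on either side.
after = shannon_entropy_bits([0.5, 0.5])   # 1 bit: maximal uncertainty

# Just after Audrey's measurement: position known with certainty.
before = shannon_entropy_bits([0.0, 1.0])  # 0 bits

print(after - before)  # 1.0: the entropy grows by one bit
```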
I’ll call this amount of work a szilard. If the gas is at room temperature, a
szilard is an eensy-weensy amount of energy: an incandescent light bulb
radiates 10²² times more energy per second. Suppose that Audrey wanted to
lift the teapot about a yard, using a szilard of work. The teapot would have to
weigh as little as 500 of the molecules that power our body’s cells
(adenosine triphosphate, or ATP). If you requested such a teapot at a
department store, the employee behind the counter would raise an eyebrow.
An artisan in a quantum-steampunk novel, though, would fill the order in a
week.
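The "szilard" defined above is conventionally written as Boltzmann's constant times the bath temperature times the natural logarithm of 2. A back-of-the-envelope sketch, assuming room temperature near 300 kelvins and a 60-watt bulb for the comparison (both numbers are my assumptions, chosen to match the text's estimate):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin

def szilard(temperature_kelvin):
    """Work extractable from one bit of information at a given
    bath temperature: k_B * T * ln(2) joules."""
    return K_B * temperature_kelvin * math.log(2)

room = szilard(300.0)     # roughly 2.9e-21 joules
bulb_watts = 60.0         # an incandescent bulb's energy output per second
print(bulb_watts / room)  # about 10^22, as the text estimates
```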
Another engine lifted a Wedgwood teapot in chapter 4—a Carnot engine
that contacts a hot bath and a cold bath at different times. How does Carnot’s
engine compare to Szilard’s engine? Szilard’s engine involves one bath, but
we can regard Audrey’s information loosely as a cold bath. When Audrey
measured the particle’s location, obtaining a bit of information, the particle
occupied the right-hand side. Let’s label the box’s right-hand side as 0 and
the left-hand side as 1. Audrey’s bit was 0. A refrigerator can cool a qubit to
the quantum analog of a 0 bit, as we saw in chapter 3. Just as a 0 qubit is
cold, we can think of a 0 bit as cold. Therefore, Szilard’s engine resembles
Carnot’s engine, with information replacing the cold bath.
FIGURE 5.4
Whence comes the work surplus? Audrey’s and Baxter’s qubits begin
entangled; each qubit has quantum information about the other. At the end of
the protocol, neither qubit has information about the other; the two lack
entanglement. The protocol “burns” the entanglement, using quantum
information (with heat from the heat bath) as a sort of thermodynamic fuel.
Third, a quantum Szilard engine that contains many gas particles can differ
from a classical Szilard engine that does.6 We can illustrate with a Szilard
engine that contains two particles, one contributed by Audrey and one
contributed by Baxter.
Suppose, initially, that the particles are classical. The siblings measure
which side of the partition each particle occupies. One of four possible
outcomes results (figure 5.4a): both particles occupy the left-hand side; both
particles occupy the right-hand side; Audrey’s particle occupies the left-hand
side, while Baxter’s occupies the right-hand side; or Audrey’s occupies the
right-hand side, while Baxter’s occupies the left-hand side. In two of the four
cases, the particles occupy different sides. The left-hand gas exerts the same
pressure on the piston as the right-hand gas. The pressures balance each
other, so the piston can’t move; the engine can’t perform work. In half the
cases, the particles occupy the same side and so can perform work.
Now, suppose that the particles are quantum. Physicists have discovered
two classes of quantum particles: fermions and bosons. We, and all other
matter on Earth, consist of fermions. Example fermions include electrons, as
well as the particles that make up protons and neutrons. Bosons transmit
fundamental forces between chunks of matter. For instance, photons are
bosons that transmit the electromagnetic force that attracts negatively charged
electrons to positively charged protons.
Bosons tend to clump together, and fermions tend to separate. Wolfgang
Pauli, a cofounder of quantum theory, identified the rule behind fermion
separation. We call the rule Pauli’s exclusion principle, which you might
have learned in chemistry class. The exclusion principle explains how
electrons arrange themselves in atoms. According to Pauli, no two fermions
can be in the same quantum state. Say that we put two fermions in a box,
slide a partition down the box’s center, and measure which box halves the
fermions occupy. The measurement disturbs the particles, forcing them to
choose sides. Say that your measurement device detects one particle in the
right-hand side. The other particle must occupy the left-hand side, so that the
particles are in different quantum states.*
Knowing how fermions behave, we can apply our knowledge to Szilard’s
engine. Imagine Audrey and Baxter loading a Szilard engine with two
fermions, then running many trials. In each trial, the siblings measure the
fermions’ positions. The fermions always occupy opposite sides of the
barrier, according to Pauli’s principle (figure 5.4b). No pressure imbalance
ever moves the barrier, so the engine never performs work.
Now, imagine the siblings running a Szilard engine with two bosons. The
bosons don’t always occupy opposite sides of the barrier, because the
bosons don’t obey Pauli’s exclusion principle. But the bosons don’t behave
like classical particles, either. The siblings’ classical particles were
distinguishable; Audrey could always tell which particle was hers—for
instance, by tracking her particle’s movements. Likewise, Baxter could
always identify his particle. But the bosons are indistinguishable; once in
close quarters, they can’t be told apart. So, the siblings can’t identify either
particle as Audrey’s or Baxter’s. For instance, Audrey can’t distinguish a
boson as hers by tracking its trajectory, because a trajectory is a sequence of
positions and quantum particles lack definite positions.
Indistinguishability limits the possible outcomes of the siblings’
measurement. In some trials, the siblings find both particles in the box’s left-
hand side; and, in other trials, in the right-hand side. Both types of trials also
transpired when the siblings used classical particles. But the classical
particles underwent two other types of trials, too: Audrey’s particle could
occupy the box’s left-hand side, while Baxter’s particle occupied the right-
hand side, and vice versa. If the particles are bosons, neither can be labeled
as Audrey’s or Baxter’s. The siblings can only find one boson in the box’s
left-hand side and one boson in the right-hand side. So, measuring the bosons
yields one of only three possible outcomes, shown in figure 5.4c. In two-
thirds of the possibilities, the bosons occupy the same side of the box. So, the
engine performs work in two-thirds of the trials.
If a crazed engineer ever threatens to vandalize your workshop unless you
extract more work than he does with a two-particle Szilard engine, request a
bosonic engine. It’ll perform work in two-thirds of your trials (on average),
while a classical engine can perform work in half the trials (on average), and
a fermionic engine never performs work. So, the bosonic engine outperforms
the classical engine, which outperforms the fermionic engine. Quantum
particles can beat classical particles at a thermodynamic task, just as
quantum computers can beat classical computers at certain computations.
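The three tallies above can be verified by brute-force counting. A sketch that follows the text's equal-weight counting of measurement outcomes (the equal weighting of the three bosonic outcomes is taken from the text, not derived here):

```python
from itertools import product
from fractions import Fraction

sides = ("L", "R")

# Classical, distinguishable particles: four equally likely ordered outcomes.
classical = list(product(sides, repeat=2))
p_work_classical = Fraction(sum(a == b for a, b in classical), len(classical))

# Bosons: indistinguishable, so only the three unordered outcomes
# {LL, RR, one-on-each-side} occur, weighted equally per the text.
bosonic = [("L", "L"), ("R", "R"), ("L", "R")]
p_work_bosonic = Fraction(sum(a == b for a, b in bosonic), len(bosonic))

# Spinless fermions: Pauli exclusion forbids same-side outcomes entirely.
fermionic = [("L", "R")]
p_work_fermionic = Fraction(sum(a == b for a, b in fermionic), len(fermionic))

# The engine performs work only when both particles share a side.
print(p_work_classical, p_work_bosonic, p_work_fermionic)  # 1/2 2/3 0
```

Exact fractions make the ranking explicit: bosonic (2/3) beats classical (1/2) beats fermionic (0).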
The demon watches the particles whizzing around the box. If a particle
approaches the partition from the left at high speed, he lets the particle
through the trapdoor. He opens the door also for slow particles that approach
from the right. After a while, the right-hand side contains only quick
particles. The quicker a gas’s particles, the greater the gas’s kinetic energy,
and the higher the gas’s temperature; so the box’s right-hand side contains a
hot gas. The left-hand side contains only slow particles, which form a cold
gas.
The demon has taken a mixed-up system and unmixed it. He might as well
take a bowl of scone batter and remove the cream that the cook stirred in.
The demon has reduced the gas’s entropy. Gadzooks! But the tale darkens
further.
A Carnot engine performs work, we saw in chapter 4, given a hot bath and
a cold bath. The demon has created such baths, so his box can drive an
engine and charge a battery. By the time the engine completes its cycle, the
gas has returned to its initial state: particles whiz around randomly, at
various speeds. The demon can again separate hot particles from cold, then
extract work again, and repeat these two steps as many times as his black
heart pleases. The demon can charge infinitely many batteries, run every
power plant out of business, and lift all the teapots in the world, without
paying any cost. The gas always returns to its initial state, without losing any
value. The demon will run a perpetuum mobile, or perpetual-motion
machine. Those can’t exist, according to the second law of thermodynamics.
Welcome to the game that I call Why Doesn’t the Second Law of
Thermodynamics Break? As a television show, the game would collapse in
flames; but it can spice up a lunch-table conversation if played with friends.
One person dreams up a perpetuum mobile, and everyone else figures out
why it can’t exist. Maxwell, having initiated round one, is eating a peanut-
butter sandwich as we turn to poking holes in his machine.
The second law of thermodynamics states that every closed, isolated
system’s entropy increases or remains constant. The gas can’t interact with
the outside world, so it seems closed and isolated. Or does it? The gas
interacts with the demon. Does the demon’s entropy grow, offsetting the
decrease in the gas’s entropy? It needn’t, alas. We can replace the demon
with an automatic mechanism that accomplishes the same task. The
mechanism can avoid radiating the heat that an organism would emit. The
mechanism can also avoid dissipating energy as friction or sound, if
engineered perfectly.
The demon measures particles’ speeds. Do the measurements increase the
system’s energy? Szilard postulated so in 1929, as quantum theory was
crystallizing.7 IBM researcher Charles Bennett proved otherwise decades
later; he detailed a means of measuring particles without dissipating energy.
He also resolved Maxwell’s paradox, drawing on Landauer’s principle, in
1982.8
Bennett argued as follows. The gas and the demon, together, form a
closed, isolated system. The demon contains a memory, which changes every
time the demon measures a particle’s speed: the demon must remember the
speed long enough to open the trapdoor or to block the particle. The demon
operates a perpetuum mobile only if his memory returns to its initial state
after each cycle, without dissipating much energy.
We omitted a step from our earlier description of the cycle: the demon
must erase his memory at the end. According to Landauer, erasure costs
work. The erasure’s work cost nixes the work performed by the Carnot
engine. So, the demon nets no work from the supposed perpetuum mobile, on
balance. An information-theoretic task—erasure—resolves a thermodynamic
paradox.
Another way to phrase the resolution is: erasure dissipates energy. That
dissipation raises the gas-and-demon system’s entropy at least as much as the
demon lowers the gas’s entropy. So, the whole system’s entropy doesn’t
decrease, and the second law reigns.
At the lunch table of my mind, Maxwell finishes his peanut-butter
sandwich and zips up his lunchbox. It’s your turn to invent a perpetuum
mobile.
We can now understand why one cartoon demon brandishes an eraser:
Landauer’s principle of erasure resolves Maxwell’s demon paradox. At
least, many physicists believe that Landauer’s principle has. Detractors
remain, but the proposal has gained widespread acceptance.
Experimentalists have checked Landauer’s bound by manipulating single
molecules and nanoscale magnets; and theorists have added bells and
whistles to Maxwell’s story.
At the beginning of the chapter, Audrey traded scientific expertise for
political and philosophical news from Lillian Quincy. Likewise, information
theorists can trade with thermodynamicists. Information can transform
useless heat into useful work; one can pay work to obtain information; and
information erasure resolves a thermodynamic paradox. We’ll knot the
threads that bind information theory and thermodynamics, as well as quantum
physics, in the next chapter.
* The teapot needs to be far smaller than the Wedgwoods you can purchase at a department store, as
we’ll see.
† The partition must not weigh much, as it gives way to one particle. Never mind how we'd craft such a
light partition; in a thought experiment, the implementation details matter less than the overall concept.
Furthermore, experimentalists can realize Szilard’s idea, as we’ll see below.
* Irreversibility contrasts with reversibility, exhibited by operations such as the addition of 1. Add 1 to
any real number, and you can infer the original number. If adding 1 yields 267, the process began with
266.
* I’m supposing that the fermions lack spins. If the fermions have spins, then both particles can occupy
the box’s right-hand side: one particle can have an upward-pointing spin, while the other particle has a
downward-pointing spin. The spins will distinguish the particles’ quantum states. Natural fermions have
spins, but we can engineer systems that act like spinless fermions. And ignoring the spins simplifies the
explanation.
CHAPTER 6
THE PHYSICS OF YESTERDAY'S TOMORROW
THE LANDSCAPE OF QUANTUM STEAMPUNK
IN THE BEGINNING
Detailing the field’s history would require another book, but I’ll sketch the
development of quantum thermodynamics here. The field’s roots stretch back
to the 1930s, the childhood of quantum theory. Physicists hoped to use
quantum uncertainty—the impossibility of knowing a quantum particle’s
position and momentum perfectly precisely—to prove the second law of
thermodynamics1 and resolve Maxwell’s demon paradox.2 Neither hope
prevailed,3 but both illustrated how thermodynamics sparks curiosity when
superimposed on quantum theory.
In 1956, Harvard physicist Norman Ramsey showed that qubits could
have temperatures below absolute zero.4 We’ll see in chapter 7 how such
temperatures are possible. Not that Ramsey thought of qubits as qubits, since
the quantum-information state of mind hadn’t crystallized by his time. Rather,
he described two rungs in an atom’s energy ladder. A bunch of such atoms
could serve as a quantum engine, Erich Schulz-DuBois and Henry Scovil
realized three years later.5 They detailed their vision with Joseph Geusic,6 as
we’ll see in chapter 7. All three physicists worked at Bell Labs—as had
Claude Shannon, the founder of information theory.
During the 1970s, quantum thermodynamics benefited from a crossbreed
between mathematicians and physicists. Mathematical physicists developed
equations that model how quantum systems settle down into equilibrium.7–11
The 1980s advanced the study of quantum engines—theoretical tools for
analyzing them and experimental tools for devising them. Ronnie Kosloff of
the Hebrew University in Jerusalem and Robert Alicki of Gdańsk University
in Poland adopted an abstract, mathematical approach.12,13 For instance,
Ronnie identified properties of a quantum system that enable it to serve as an
engine, be the system an atom, a spin, or anything else.
Adopting a more concrete approach, Marlan Scully explored the
thermodynamics of quantum systems such as lasers. I found Marlan’s epithet,
the quantum cowboy, apt when he visited the institute I belong to as a
postdoc. Raised in Wyoming, Marlan is a professor and cattle rancher in
Texas. He takes a tell-it-like-it-is approach to science and communication.*
Also during the 1980s, MIT theorists sought to construct a theory of
quantum thermodynamics. The cohort included Elias Gyftopoulos and George
Hatsopoulos, who recruited Gian Paolo Beretta. They reevaluated entropy,
equilibrium, and tenets of thermodynamics, with small and out-of-
equilibrium systems in mind. Some scientists dismissed the cohort’s work:
Thermodynamics had always focused on large systems. How could a
thermodynamics of small systems not be an oxymoron? Four decades later,
it’s a subfield of science.
A few years later, MIT hired another cofounder of quantum
thermodynamics: Seth Lloyd is the quantum-computing scientist (chapter 3)
who quips about the second law of thermodynamics (chapter 4). As I
warned, if you dream up any idea about quantum computation, Seth likely
wrote about it a few decades ago. The warning extends to quantum
thermodynamics. Seth’s 1988 PhD dissertation refers to Maxwell’s demon in
its title.15 The thesis explores how one can and can’t use information to lower
thermodynamic entropy. The thesis also shows how, under special
conditions, classical thermodynamics nearly predicts a quantum system’s
behavior—even though the system can be entangled.* Examples include
black holes, the densest regions in the universe.18
Quantum physics, information theory, and thermodynamics combine in the
study of black holes. Physicists including Jacob Bekenstein, Stephen
Hawking, Bill Unruh, and Paul Davies explored this setting during the 1970s
and 1980s. But the black-hole community has worked mostly separately from
the quantum-thermodynamics community. Exceptions exist and are growing;
examples include Seth Lloyd’s thesis and a story we’ll encounter in chapter
14.
We’ve seen how else the 1980s benefited quantum thermodynamics. First,
Paul Benioff and others explored the limitations on how little heat a
computer can dissipate. Their focus on fundamental limitations led to the
development of quantum computing. Second, Charles Bennett resolved
Maxwell’s demon paradox in 1982 (chapter 5). What a decade.
The embers of quantum thermodynamics burned lower during the
subsequent two decades. Among those blowing on the coals was Ilya
Prigogine, the archaeology-studying, piano-playing Nobel laureate who
explored nonequilibrium thermodynamics. Prigogine and collaborators
reformulated quantum theory around the notions of irreversibility and time’s
arrow.19
Quantum thermodynamics—dare I say—picked up steam during the early
2010s. Quantum information theory had matured as a mathematical and
conceptual toolkit, and it had begun shedding light on other fields. Scientists
were understanding chemistry and materials anew, in terms of the information
that quantum systems share and convey. Thermodynamics beckoned to
quantum information theory, calling for reexamination. The combination
roared into life over the past decade.
For much of the decade, quantum thermodynamics thrived mostly outside
the United States. Europe, Canada, and Japan caught on early. A few hot
spots burned beyond; for instance, Ronnie Kosloff had been pondering
quantum engines in Israel for decades. Hot spots then emerged in Singapore,
which had invested in quantum computing, and elsewhere. Only during the
past few years has quantum thermodynamics gained traction in the United
States. A handful of us, scattered across the country, waved the quantum-
thermodynamics banner. The first quantum-thermodynamics conference to
take place on American soil, to my knowledge, transpired in 2017. But
quantum thermodynamics has been gaining popularity in the United States at
last. Colleagues in other fields are borrowing our tools and inviting us to
apply for funding together. Students and postdocs email me inquiries about
opportunities to undertake research in quantum steampunk.
Why has the United States taken so long to catch on to quantum
thermodynamics? The question confounds especially because roots of
quantum thermodynamics grew in American soil during the 1980s. I haven’t
pursued the question with data and rigor, so I can’t answer with authority. But
a finding by a historian of science has given me pause. Much of quantum
thermodynamics, especially early quantum thermodynamics, is foundational
and theoretical. Quantum thermodynamicists have proved lemmata and
theorems, scrutinized subtleties in probability theory, and reformulated the
laws of thermodynamics. Even many proposers of quantum engines have had
little intention of fabricating their ideas. The field’s early foundational bent,
the historian pointed out, resonates with the European philosophical
tradition, which predates Socrates. In contrast, the United States has
developed a tradition of innovation and practicality. Consequently, much of
the country’s science centers on experiments, technology, and applications.
Little wonder that the United States began to welcome quantum
thermodynamics only after the field grew beyond abstract theory.
I adore the abstract theory, to which I contribute. But I also work with
experimentalists on ushering the theory into the real physical world. I also
usher quantum thermodynamics outside its neighborhood into other
disciplines, such as condensed matter, atomic and laser physics, chemistry,
and black-hole physics. As quantum information theory began transforming
other fields—including thermodynamics—during the early 2000s, so quantum
thermodynamics is illuminating other fields anew. I feel fortunate to belong to
a lively community of colleagues with whom I collaborate on all these aims.
QUANTUM IS DIFFERENT
The twentieth-century physicist Philip W. Anderson coined the slogan “More
is different.”20 He earned a Nobel Prize for elucidating statistical mechanics
and condensed matter. His slogan encapsulates why we should bother
studying statistical mechanics, the study of large, many-particle systems.
Granted, every particle in a large system obeys Newton’s laws or quantum
theory. So, to describe large systems, we seem to need no theory beyond
classical and quantum mechanics. But we’d apply Newton’s laws to a cloud
of steam by calculating every steam particle’s trajectory. The steam cloud
contains about 10²⁴ particles; so, we'd spend oodles of time, energy, and
computation. Worse, we’d gain little insight. Many-particle systems exhibit
collective behaviors off-limits to individual particles. If you only ever watch
one raven, you can’t fathom a flock—its undulations and roilings, its fabric-
like qualities. More is different.
When it comes to heat and work, quantum is different. We characterized
heat as the unharnessed energy of random motion. In contrast, work is
organized energy available for performing tasks. We saw one example of
quantum work in chapter 5, when quantum particles lifted a tiny teapot via
Szilard’s engine. But we might want general rules, beyond one example, for
defining and measuring quantum work and heat. Furthermore, more subtleties
than I let on obscure the work and heat exchanged by Szilard’s engine.
To see why, imagine a variation on Szilard’s engine, another quantum gas
in a box. The gas exchanges heat with a heat bath through the box’s walls. A
piston compresses the gas, performing work on it. How can we define and
measure the heat absorbed by the gas and the work performed on the gas?
The gas is in some quantum state. In most quantum states, the gas lacks a
well-defined energy, due to quantum uncertainty. If the gas lacks a well-
defined energy now, and it lacks a well-defined energy after the piston and
bath act, how do we put a number on the change in the gas’s energy, let alone
split that change into heat and work?
We could measure the gas’s energy before and after the protocol. Each
measurement would force the gas into a state with a well-defined energy.
Viewed from another angle, however, this antidote serves as poison: Each
measurement disturbs the gas’s energy, changing the energy. Instead of having
to tease apart only heat and work, we now have to tease apart heat, work,
and the energy injected by the measurement. So how well can we port over,
into quantum thermodynamics, the classical definitions of heat and work?
About as well as we can port over, into statistical mechanics, the mindset
behind Newton’s laws.
What’s a scientific community to do when it can’t distinguish work from
heat as it’s used to? Everybody and their uncle proposes a definition of
quantum work and heat. The definitions could fill a menagerie in the
metropolis of ideas, as a testament to humans’ imagination and
argumentativeness. Many proposers believe that they’ve resolved the
problem, so not everyone agrees on any one resolution.
I haven’t proposed a definition. I believe that different definitions suit
different contexts, so no one definition need rule them all. Physicists are
famous for trying to unify theories—for instance, quantum theory, which
governs small objects, with general relativity, which governs enormous
objects. But the same approach—unification—doesn’t necessarily suit every
problem. Whenever I find a paper that introduces a new definition of
quantum work and heat, I save it in a folder labeled “Menagerie—definitions
of quantum heat and work.” Images of Victorian menageries depict cages
scarcely larger than their inhabitants. Lions and tigers lash their tails
alongside monkeys and elephants; none occupies an environment that
resembles its home. Therefore, I eschew the design of the Victorian
menagerie. I imagine my specimens—the different definitions of quantum
heat and work—as padding, climbing, and flitting around a Victorian
botanical conservatory. Silver beams curve upward into elegant shapes
through which you can glimpse the sky; the high, glass roof lends an airy
atmosphere. Let’s meet some of the denizens.
One definition of quantum work and heat, I envision as a
Yorkicockasheepapoo—a mix of Yorkshire terrier, cocker spaniel, English
sheepdog, and poodle. Nature wouldn’t concoct such a combination; only a
breeder would. Likewise, nature wouldn’t suggest this definition of quantum
work and heat; only a scientist—likely a theorist—would. We can use this
definition if two conditions are met. First, only one thing ever happens at a
time. Second, after anything happens, we measure our quantum system’s
energy.
For instance, consider the quantum gas in a box. The gas exchanges energy
with a heat bath through the box’s walls, and a piston compresses the gas. To
define heat and work in Yorkicockasheepapoo fashion, we break the process
into steps: we measure the gas’s energy, then let the gas interact with the bath,
then measure the gas’s energy, then move the piston inward, and then repeat
those steps. The energy reported by our detector after a heat-bath coupling,
minus the energy reported before, we define as heat. The energy reported
after a piston movement, minus the energy reported before, we define as
work.
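The stepwise bookkeeping can be sketched as a toy simulation. Everything here is an illustrative assumption, not a prescription from the text: natural units, a two-rung qubit, full rethermalization on each bath contact, and a fixed gap-widening "piston" step.

```python
import numpy as np

rng = np.random.default_rng(42)
kT = 1.0                       # bath temperature in natural units (assumption)

def gibbs(levels):
    """Thermal (Gibbs) occupation probabilities for the given energy rungs."""
    w = np.exp(-np.array(levels) / kT)
    return w / w.sum()

levels = [0.0, 1.0]            # the qubit's two energy rungs (assumed units)
occ, E = 0, levels[0]          # start measured on the bottom rung
heat = work = 0.0

for _ in range(1000):
    # Heat stroke: couple to the bath (toy model: full rethermalization),
    # then measure the energy; the change counts as heat.
    occ = rng.choice(2, p=gibbs(levels))
    heat += levels[occ] - E
    E = levels[occ]
    # Work stroke: "move the piston" by widening the energy gap,
    # then measure again; the change counts as work.
    levels = [0.0, levels[1] + 0.01]
    work += levels[occ] - E
    E = levels[occ]

print(f"heat ≈ {heat:.3f}, work ≈ {work:.3f}")
```

Over many strokes, the accumulated heat and work sum to the system's total energy change, exactly as the stepwise definition demands.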
This definition of work and heat offers two advantages. First, we’ve
defined the work and heat exchanged in every experimental trial—not only
on average over trials, as in a later definition. Second, this definition is
operational—expressed in terms of experimental steps. How to measure
work and heat, using this definition, is relatively clear.
On the downside, each measurement disturbs the system’s energy, as
we’ve discussed. Also, the definition suffers from artificiality: Systems often
exchange heat while absorbing work. Separating the steps is unnatural, as is
measuring the energy frequently. So, this definition looks to me like a
creature designed by breeders—a Yorkicockasheepapoo.*
Definition number two of quantum work and heat, I envision as an
elephant—an animal that has a solid physical presence. This definition calls
for a battery, an auxiliary system separate from the system of interest. We
should be able to reliably deposit energy in, and extract energy from, the
battery. For example, suppose that the system of interest is a tiny quantum
spring owned by Audrey. The battery can be an atom supplied by Baxter
(figure 6.1). The simplest quantum battery has only two energy levels. Baxter
can keep his atom at a low energy, so that it occupies the lowest or second-
lowest rung of its energy ladder. Let’s assume, for now, that the battery
always has a well-defined energy, E0 or E1.
FIGURE 6.1
Suppose that the siblings wish to extract work from a compressed spring.
Baxter prepares his battery in its lowest level, with an amount E0 of energy.
The siblings couple the spring to the battery, enabling energy to slosh
between the systems. The spring ends in a low-energy, relaxed state. The
atom ends on its upper ladder rung, with an amount E1 of energy. We define
the work performed by the spring as the energy gained by the atom, E1 − E0.
Now, suppose that the siblings want to perform work on the system of
interest—to compress the spring. Baxter prepares his battery on its upper
ladder rung, with an amount E1 of energy. The siblings use energy from the
battery to exert a force on the spring. Providing the force depletes the
battery’s energy to E0; we can say the spring has absorbed an amount E1 − E0
of work.
This definition offers three advantages. First, it prescribes a physical
protocol that an experimentalist can perform to measure the work. The ability
to measure a quantity, such as work, lends weight to its meaningfulness.
Second, the protocol doesn’t require us to measure the spring’s energy
directly. We therefore disturb the spring less than in the
Yorkicockasheepapoo protocol. We do disturb the spring, however, by
coupling it to the atom. But the Yorkicockasheepapoo protocol hits the
spring’s energy more like a lightning bolt.
Third, we can generalize the protocol: Suppose that you don’t know
precisely how much work you’ll perform to compress the spring; you can
only estimate. The work required might not equal the battery’s energy gap, E1
− E0. If not, even if the work required is less than E1 − E0, you can’t use the
battery reliably. The atom typically can’t pause between energy-ladder rungs,
offering less work than you need. But we can devise a battery that has many
closely spaced rungs. If you underestimate or overestimate the work
required, no problem. The battery can end a little higher or lower than
anticipated.
Two drawbacks mar the elephant definition. First, it prescribes a means
of defining and measuring just work, not heat. Second, we can’t always
assume that the battery has a well-defined energy: The battery is a quantum
system that can be in a superposition of energies.21 If the battery is in such a
superposition, how much work was performed remains unclear.
Definition three of quantum work and heat resembles a wildebeest that’s
neither fast nor slow, neither adventurous nor a straggler—the middle of the
pack, or the average. Quantum physicists invoke averages all the time.
Imagine running an experiment on some quantum particles in each of many
trials. At the same time in each trial, we measure the particles’ energy. Then,
we average the outcomes. A theorist can predict this average, knowing the
particles’ quantum state and knowing properties of the particles’
environment. As the state and the environment change across a trial, so does
the particles’ average energy. State-sourced changes to the average energy
constitute heat, and environment-sourced changes to the average energy
constitute work, according to the wildebeest definition.
Let’s see why this definition makes sense, starting with the environment.
By the particles’ environment, I mean, are the particles in an electric field, in
open space, or in a box? Is the box being pushed? Can the particles move
only on a two-dimensional surface, like a tabletop? Does the tabletop slope
like a hill? Experimentalists change the system’s environment by turning
knobs in the lab—by strengthening a magnetic field, turning on a laser, and so
on. Experimentalists control these changes, just as a thermodynamic agent
controls energy that’s serving as work. So, we define as work any changes
caused to the average energy by environmental changes.
The average energy depends, as well as on the environment, on the
particles’ quantum state. The state, we established, is the quantum analog of a
probability distribution. Probabilities dictate how accurately we can predict
how events will unfold. If the possible unfoldings have equal probabilities,
the event is completely random, and we can’t predict much about it. Heat, the
energy of random motion, randomizes events. So, heat homogenizes
probabilities and, likewise, homogenizes quantum states. We therefore define
as heat any changes caused to the average energy by changes in the quantum
state.
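In symbols, this is the textbook decomposition of the average energy, ⟨E⟩ = Tr(ρH), where ρ is the quantum state and H is the Hamiltonian set by the environment (a sketch of the standard formula, in notation not used by the text):

```latex
% Average energy: <E> = Tr(rho H). Its change splits into two pieces:
d\langle E\rangle
  = \underbrace{\operatorname{Tr}(\rho\, dH)}_{\text{work: environment (Hamiltonian) changes}}
  + \underbrace{\operatorname{Tr}(H\, d\rho)}_{\text{heat: quantum-state changes}}
```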
This wildebeest definition offers the advantage of according with
intuition: work is controlled, and heat randomizes probabilities. But a
quantum average involves many trials. The wildebeest definition doesn’t
define, or show how to measure, the work and heat exchanged in one trial.
Also, the wildebeest definition contradicts a definition favored by
condensed-matter physicists. Condensed-matter physicists study matter. A
chunk of matter has an energy ladder that contains many, many rungs. Imagine
thwacking the chunk periodically for a long time: Wack! Wack! Wack! Wack!
Condensed-matter physicists dignify this thwacking with the name Floquet
driving, after the nineteenth-century French mathematician Gaston Floquet.
Imagine measuring the matter’s energy afterward. Our detector will report
the number associated with some energy-ladder rung. The detector’s
probability of reporting this rung number equals the detector’s probability of
reporting that rung number, and so on. In other words, the matter’s quantum
state is spread evenly across all the energy rungs. The quantum state is
spread the same way if the matter has equilibrated with an infinite-
temperature bath.* So the thwacking—pardon me; the Floquet driving—heats
the qubits, according to condensed-matter physicists. According to the
wildebeest definition, the thwacking provides not heat but work. After all,
the thwacking results from a change in the matter’s environment, not from a
heat bath. So condensed-matter physicists wrangle with quantum
thermodynamicists about the wildebeest definition.
Let’s train our binoculars on one more specimen in the menagerie of
definitions for quantum heat and work. This definition reminds me of a
hummingbird, which scarcely disturbs the twig it alights on. This definition
stipulates that we measure the quantum system’s energy weakly. To
understand what a weak measurement is, we have to understand how quantum
measurements operate.
Imagine the Stoqhardt siblings measuring a system of Audrey’s, such as an
atom. They couple the atom to some system of Baxter’s; for instance, they lob
a photon at the atom. The photon bounces off the atom, exchanging energy,
momentum, and spin with the atom. The exchange correlates the photon’s
quantum state with the atom’s quantum state. † Then, Baxter observes a
property of his photon. For example, the photon hits a photodetector, a
camera that collects light. The camera registers the photon’s energy,
disturbing the photon. The camera shifts a needle across a dial, till the needle
points at a particular number. This process transduces information about the
photon from the quantum scale to the human scale. Baxter reads the number
off the dial, gaining information about the photon. From that information, and
from the correlation between his and Audrey’s systems, the siblings infer
information about Audrey’s atom.
Physicists often assume that Baxter’s system entangles maximally with
Audrey’s—that the systems correlate as strongly as possible. In this case, the
siblings can infer about Audrey’s atom as much information as any
measurement can reveal about a quantum state. We call such a measurement
of Audrey’s atom strong. But the correlation can be weak: the photon can
interact with the atom for a short time, without exchanging much energy,
momentum, or spin. Baxter’s photon will provide little information about
Audrey’s atom. Baxter will perform a weak measurement.
Why would the siblings forfeit information about the atom? To avoid
disturbing the atom much. If the systems entangle maximally, the camera jolts
the atom similarly to how it jolts the photon. If the systems entangle only a
little, the camera disturbs the atom as a hummingbird disturbs you by
hovering behind your left ear: a shiver might run down your spine, but you
suffer no violence. Furthermore, the siblings can run many trials of their
experiment. They’ll accumulate many measurement outcomes. From those,
the siblings can reconstruct information that they could have obtained from
strong measurements, without disturbing the system’s energy as much.
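The trade-off can be caricatured in a few lines of Python. This is a toy statistical model, not a quantum simulation: I assume each weak measurement returns the atom’s true property buried in large noise (the hypothetical names TRUE_VALUE and WEAK_NOISE are illustrative), so one outcome reveals little, while the average over many trials reconstructs what a strong measurement would have revealed.

```python
import random
import statistics

# Toy model (not a quantum simulation): a weak measurement returns
# the true value plus large noise, so a single outcome carries
# little information about the system.
TRUE_VALUE = 0.3   # hypothetical property of Audrey's atom
WEAK_NOISE = 5.0   # weak coupling -> very noisy single outcomes

def weak_measurement(rng):
    return TRUE_VALUE + rng.gauss(0.0, WEAK_NOISE)

rng = random.Random(42)
outcomes = [weak_measurement(rng) for _ in range(100_000)]

# One outcome says almost nothing...
print(f"single outcome:  {outcomes[0]: .3f}")
# ...but the average over many trials reconstructs the value,
# while each trial barely disturbed the atom.
print(f"average of many: {statistics.mean(outcomes): .3f}")
```

The averaging error shrinks like one over the square root of the number of trials, which is why the siblings must run many trials to make up for each trial’s gentleness.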
The hummingbird definition of quantum work and heat begins with the
Yorkicockasheepapoo definition: measure the system’s energy. But we
replace the strong energy measurements with weak measurements. This
definition offers the advantage of scarcely disturbing the system’s energy. But
drawing conclusions tends to require many trials, so this definition says little
about the work and heat exchanged in one run.
Here ends our visit to the menagerie of definitions of quantum work and
heat. We’ve met the Yorkicockasheepapoo (measure the energy frequently,
and separate work exchanges from heat exchanges temporally), the elephant
(define work in terms of a battery), the wildebeest (define work and heat in
terms of an average—the middle of the pack), and the hummingbird (measure
the energy weakly). Other species roam beyond the camellias and palms. But
let us slip out, the glass door shutting behind us. Turning around for one more
glimpse, we can judge the glinting edifice from outside. The menagerie
signifies, to me, the trickiness of translating even basic thermodynamics into
quantum theory. Quantum is different.
Now, let us confront the terrain in front of us. I envision quantum
steampunk as a landscape. A map of it would consist of a parchment like the
one examined by Audrey at the beginning of this chapter. Many city-states,
principalities, and villages dot the map, as quantum steampunk encompasses
many communities. Different communities address different corners of
quantum thermodynamics and approach the subject from different angles.
We’ll meet the communities one by one, traversing the landscape. Audrey,
Baxter, and Caspian will traverse their map at the same time. We won’t visit
every grotto and ruined castle, and the ones we visit will be ones that your
humble tour guide happens to know. But we’ll develop a sense of the scenery
through stories. So, pack your trunk, fetch your binoculars, and secret a
penknife in an inner pocket. As Caspian said, there be dragons all over the
map.
* Marlan and one of his sons have coauthored a quantum-thermodynamics book, The Demon and the
Quantum: From the Pythagorean Mystics to Maxwell’s Demon and Quantum Mystery.14
* Related explanations emerged during the early 2000s.16,17
* Disclaimer: I have used this definition, as a theorist.
* An infinite-temperature bath sounds extraordinarily hot. So, you might expect an infinite-temperature
bath to boost the matter to its top energy level. The bath doesn’t, although we’ll see a bath that does in
chapter 7.
† This use of the term correlate differs slightly from the use in chapter 3. According to chapter 3, if
Audrey’s particle entangles with Baxter’s, measuring Audrey’s particle can yield an outcome correlated
with a measurement of Baxter’s particle. Here, I’m saying that entanglement correlates Audrey’s
particle with Baxter’s. Consider this second usage as shorthand for the first usage.
CHAPTER 7
Put-put-put … put PONK. The train juddered several times, and then slowed to a stop. Baxter took
advantage of the confusion to swap one of his cards for a fresh one from the stack. Audrey slapped his
hand, as a young conductor stumbled down the aisle, his navy-and-gold cap askew.
“Excuse me, sir,” she called out. “What has happened?”
The conductor grabbed the brass rail across from her seat and steadied himself on it while
straightening his cap.
“Sounds like one of the engines 'as cut out, miss,” said the conductor, wiping his glistening forehead.
Audrey exchanged a glance with Caspian, who raised an eyebrow from above his newspaper.
Baxter swapped another card for the top of the stack.
“Not to worry, miss,” the conductor added, wiping off another bead of sweat. “We’ve reached the
outskirts of Manchester, which 'as more quantum engineers than you could spit at—beggin’ your
pardon, miss. We’ll be back on our way by evening, mark my words.”
VROOOOOM!
The quantum engine to be imagined consists of a variation on a laser. Perhaps
you’ve used a laser pointer during a PowerPoint presentation; or admonished
someone to “be careful with that; you almost shone it in my eyes!”; or teased
your cat by bouncing a laser light around so that it resembled a small,
catchable creature. We can form a laser by putting a bunch of atoms in—you
guessed it—a box. Each atom has the same energy ladder as every other.
Lasing—a laser’s action—involves three rungs, whose energies we’ll call
E0, E1, and E2. The atoms climb and descend the rungs while absorbing and
emitting photons. The emitted photons form the light that drives cats mad.
A maser resembles a laser but emits microwaves, rather than visible light.
The maser predated the laser, but no one needs microwaves during a
PowerPoint presentation (except any audience members wanting popcorn
during the show). Guess where the maser and laser were developed? At Bell
Labs—home of Claude Shannon, Schulz-DuBois, Scovil, and Geusic—and
at the Lebedev Physical Institute in Moscow. The first quantum heat
engine we’ll see consists of atoms that can form a maser, although the atoms
will behave a little differently than when acting as a maser.
The engine interacts with three heat baths, all at different temperatures.
Two baths exchange photons with the engine. One bath can’t necessarily
exchange photons, for technical reasons. But we can still envision all the
baths as exchanging with the engine packets of fixed amounts of energy.
FIGURE 7.1a
FIGURE 7.1b
One bath emits photons of energy E1 − E0, which can boost the engine
from rung 0 to rung 1 (figure 7.1a). Alternatively, the engine can fall from
rung 1 to rung 0 while emitting photons of energy E1 − E0 into that bath. The
second bath exchanges packets of energy E2 − E1 with the engine, and the
final bath exchanges photons of energy E2 − E0 (figure 7.1b). So, each bath
facilitates atomic transitions between two energy rungs. The atom “looks” to
each bath as though it has only two rungs. Those two rungs serve similarly to
the spin-up and spin-down states of chapter 3: each pair of energy rungs
forms a qubit.
What happens when a qubit, formed from two energy rungs, equilibrates
with a bath? The bath decoheres the qubit, robbing it of quantum properties
other than energy quantization. For instance, the qubit can no longer be in a
superposition, so the qubit has some well-defined amount of energy. But we
don’t know how much energy if we only know that the qubit is at thermal
equilibrium with the bath. The qubit has some probability 1 − p of occupying
its lower energy rung and a probability p of occupying its upper rung. The
probability p depends on the bath’s temperature: the hotter the bath, the
greater the qubit’s probability of occupying the upper rung.
How does the probability p depend on the bath’s temperature? We’ll find
out by considering a low temperature, checking how the probability behaves,
imagining raising the temperature, seeing how the probability changes, and
repeating. Let’s start with the lowest possible temperature. According to the
third law of thermodynamics, no system’s temperature can reach absolute
zero. But a temperature can almost reach absolute zero; so, let’s pretend that
the bath’s temperature does reach absolute zero. The qubit will have no
energy; the upper energy rung will lie out of reach. Therefore, the qubit’s
probability p of occupying that rung will be zero.
Imagine raising the temperature slightly above zero. The qubit acquires a
tiny probability of occupying its upper rung. As the temperature grows, the
probability grows, but much more slowly than the temperature. I think of the
temperature as an extrovert and the probability as an introvert: A speech by
the extrovert coaxes a sentence from the introvert; laughter by the extrovert
secures a smile from the introvert. Likewise, a leap in the temperature
accompanies a baby step by the upper rung’s probability.
This behavior climaxes when the temperature reaches infinity: the upper
rung’s probability reaches one-half. Not even the dizzying heights of infinite
temperature can guarantee that the qubit occupies its upper energy rung. This
result may surprise us, but it’s how the mathematics works out.
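The introvert’s behavior follows the Boltzmann distribution: p = 1/(1 + e^(ΔE/T)), where ΔE is the gap between the rungs. Here is a minimal Python sketch, under the illustrative assumption that we measure energy and temperature in units where Boltzmann’s constant equals one:

```python
import math

# Boltzmann-distribution sketch: the probability p that a qubit
# with energy gap ΔE occupies its upper rung at temperature T.
# Units chosen so Boltzmann's constant k_B = 1 (an assumption
# made for illustration).
def upper_rung_probability(gap, temperature):
    return 1.0 / (1.0 + math.exp(gap / temperature))

GAP = 1.0
for T in (0.01, 0.5, 1.0, 10.0, 1e9):
    print(f"T = {T:>12}: p = {upper_rung_probability(GAP, T):.4f}")
# As T climbs from near zero toward infinity, p creeps from 0
# toward (but never past) one-half -- the introvert trailing
# the extrovert.
```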
Yet temperatures of another type can push a qubit onto its top energy rung:
temperatures below absolute zero. Before we address the question, “What in
tarnation are temperatures below absolute zero?” allow me to conclude my
conceit. I am the qubit’s probability of occupying its upper level; I am an
introvert. I delight in quiet contemplation, in the company of a few, cherished
friends. Extroverts try to draw me out, but their efforts can counteract their
aims: The more they talk, the less silence I have in which to reflect, form
opinions, and speak. The higher they climb the staircase of gregariousness,
the more they leave me behind; I remain at one-half while they reach infinity.
To those kind extroverts seeking to engage us introverts, I suggest a strategy
gleaned from thermodynamics: Descend from the heights. Return to the finite
realm, and replace your effusiveness with its negative—listening. It will
boost us introverts to a probability p of one.
Now, what in tarnation are temperatures below absolute zero? I’m
measuring temperature in units of Kelvin, introduced in chapter 4, rather than
in degrees Fahrenheit or degrees Celsius. Zero Kelvin, or absolute zero, is
the lowest temperature toward which one can hope to cool any system. So,
temperatures below zero—negative temperatures—seem impossible. But a
negative-temperature qubit isn’t colder than a zero-temperature qubit; it’s
hotter than an infinite-temperature qubit. At infinite temperature, we said, a
qubit has a 50% chance of occupying its upper energy rung. Imagine setting
the temperature to infinity, then pumping more energy into the qubit. We can
pump energy into an atom by shining a laser at it. The qubit will acquire a
greater-than-50% probability of occupying its upper rung. The qubit’s
temperature can’t lie between zero and infinity, we’ve seen. The temperature
turns out to be negative, according to the mathematics.
So a negative-temperature qubit has more energy—is hotter—than an
infinite-temperature qubit. That’s why temperature can dip below zero, the
temperature of the coldest system conceivable. Negative temperatures make
systems much hotter than zero temperature.
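The mathematics behind that claim can be made concrete by running the Boltzmann formula backward: given the qubit’s probability p of occupying its upper rung, solve for the temperature. A Python sketch, again under the illustrative assumption that Boltzmann’s constant equals one:

```python
import math

# Inverting the Boltzmann formula p = 1/(1 + exp(gap/T)) gives the
# temperature implied by an upper-rung probability p.
# (k_B = 1, an illustrative unit choice.)
def temperature_from_probability(gap, p):
    return gap / math.log((1.0 - p) / p)

GAP = 1.0
for p in (0.1, 0.4, 0.499, 0.7, 0.9):
    T = temperature_from_probability(GAP, p)
    print(f"p = {p}: T = {T:+.3f}")
# p below one-half yields an ordinary positive temperature. Pumping
# the probability above one-half -- more weight on the upper rung
# than on the lower -- forces T negative: hotter than any
# infinite-temperature bath.
```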
You and I can’t have negative temperatures. Neither can the air outside;
nor your oven; nor the water from that tap, in the first-floor bathroom in your
office building, that always comes out scalding. Each of these systems has an
energy ladder with infinitely many rungs. A qubit has an energy ladder with
only two rungs. The finite number of rungs allows for negative temperatures.
The reason why lies beyond the scope of this book. Suffice it to say that
quantization allows quantum systems to have negative temperatures, that
negative-temperature systems are hotter than infinite-temperature systems,
and that technologies used today—including lasers—leverage negative
temperatures.
One of the heat baths interacting with our engine has a negative
temperature. We might as well think of this bath as consisting of negative-
temperature qubits. It gains or loses energy as the atom transitions between
the top two energy rungs, 1 and 2 (figure 7.1a). The atom more likely resides
on the top rung, due to the negative temperature. The second bath—the cold
bath—facilitates transitions between the bottom two rungs, 0 and 1. This
bath has a temperature slightly above zero. So, the bath gives the atom a
decent probability of occupying its lowest rung.
The third heat bath helps the atom transition between its top and bottom
rungs, 2 and 0. This bath has an infinite temperature, and it exchanges work
with the engine. How can it exchange work if heat baths exchange heat and I
fussed about distinguishing heat from work in chapter 6? You can say about
some people that “their word is as good as gold.” You can say about infinite-
temperature heat baths that “heat exchanged with them is as good as work.”
The reason is, exchanging heat doesn’t raise such a bath’s entropy. Imagine
funneling heat from an infinite-temperature bath to a Carnot engine. The
engine will transform all the heat into work, without dissipating any. We
might as well regard any heat exchanged with the bath as work.
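We can check the “as good as work” claim against Carnot’s efficiency formula, 1 − T_cold/T_hot, from chapter 4. A short Python sketch, with illustrative temperatures in arbitrary units:

```python
# Carnot efficiency: 1 - T_cold / T_hot. As the hot bath's
# temperature grows without bound, the efficiency approaches 1,
# so heat drawn from an infinite-temperature bath converts
# entirely into work.
def carnot_efficiency(t_cold, t_hot):
    return 1.0 - t_cold / t_hot

T_COLD = 1.0  # illustrative unit choice
for t_hot in (2.0, 10.0, 1_000.0, 1e12):
    print(f"T_hot = {t_hot:>14}: efficiency = {carnot_efficiency(T_COLD, t_hot):.6f}")
```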
Figure 7.1 depicts how the quantum engine functions. Each energy rung, in
the figure, carries a little ball. The ball represents the atom’s probability of
occupying that rung in a given trial, the rung’s probability weight. The
greater the probability weight, the larger the ball.
The cold bath drops some of the probability weight from rung 1 to rung 0.
Meanwhile, the negative-temperature bath boosts some of the probability
weight from rung 1 to rung 2. The negative-temperature bath is so hot that it
boosts more probability weight than the cold bath drops. Therefore, much
probability weight occupies the ladder’s top rung, some occupies the bottom
rung, and hardly any occupies the middle rung.
Now, the atom equilibrates with the infinite-temperature bath (figure
7.1b). So, the infinite-temperature bath coaxes the atom into having half its
probability weight on rung 2 and half its probability weight on rung 0. So,
some probability weight drops from the top rung to the bottom. In other
words, the atom drops from rung 2 to rung 0 in some trials. During the drop,
the atom emits a packet of energy into the infinite-temperature bath. So, the
quantum engine performs work.
Geusic, Schulz-DuBois, and Scovil calculated the engine’s efficiency—
the bang that the engine gets for a buck, or the work performed per packet of
heat absorbed from the hot bath. The negative-temperature bath gives the
atom an amount E2 − E1 of heat. Afterward, the engine gives the infinite-
temperature bath an amount E2 − E0 of work. The work exceeds the heat, as
figure 7.1 illustrates. The quantum engine operates at an efficiency greater
than one; the atom gets, per buck, more than a buck’s worth of bang. The
engine might as well have visited the store with enough money for a can of
oil (not that engines buy their own oil) and left with two cans, due to a buy-
one-get-one-free deal. The hot bath’s negative temperature, with the cold
bath’s low temperature, subsidizes the deal.
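The buy-one-get-one-free bookkeeping can be written out in a few lines. This Python sketch uses illustrative rung energies, not values from Geusic, Schulz-DuBois, and Scovil’s analysis:

```python
# Bookkeeping sketch for the maser engine's cycle, using
# illustrative rung energies (arbitrary units).
E0, E1, E2 = 0.0, 0.5, 1.0

heat_in = E2 - E1    # packet absorbed from the negative-temperature bath
work_out = E2 - E0   # packet delivered to the infinite-temperature bath
efficiency = work_out / heat_in

print(f"heat in:    {heat_in}")
print(f"work out:   {work_out}")
print(f"efficiency: {efficiency}")  # exceeds one whenever E1 > E0
```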
How quantum is this engine? The atom has quantized energies, so it’s
quantum literally. But Geusic and colleagues never leverage entanglement,
nor even any superpositions. Classical systems can approximate the
behaviors reported by Geusic and colleagues. But the atom, endowed by
quantum physics with few energy levels, forms a natural platform for such an
engine. Furthermore, Geusic and colleagues didn’t explore all the behaviors
that the engine can possibly exhibit. Some, such as the engine’s reliability,
rely on wavelike properties of quantum systems.3
LADIES AND GENTLEMEN, START YOUR
QUANTUM ENGINES
Geusic, Schulz-DuBois, and Scovil did to the field of quantum engines what
a green flag does to a NASCAR race. The field of quantum engines didn’t
rev up as quickly as a race car, but results have piled up by now. Many
results center on the Carnot efficiency, the greatest efficiency of any engine
that contacts only two heat baths. An engine can achieve the Carnot efficiency
only by operating infinitely slowly, for a reason that brings to my mind the
1954 film On the Waterfront. In the movie, a former boxer laments his lost
shot at greatness with the famous line, “I coulda been a contender.” Similarly,
at the end of a quick engine cycle, part of the heat dissipated can lament, “I
coulda been work”—albeit without actor Marlon Brando’s sulk or Brooklyn
accent. To avoid dissipating energy, you have to reduce the engine’s power to
zero. The power is the rate at which the engine performs work. Power trades
off with efficiency in classical thermodynamics.
A quantum engine may be able to have its power and eat its efficiency,
too. Italian physicists Michele Campisi and Rosario Fazio envisioned an
engine formed from particles that can undergo a special phase transition.4
You’ve witnessed a phase transition if you’ve boiled water for tea,
transforming water into steam. Quantum systems can undergo phase
transitions more exotic than the ones in our everyday lives. If you drive a
certain quantum system across a certain phase transition, interactions
between the particles may boost the work performed by the engine. The
power will grow without denting the efficiency.
One might hope that a many-particle quantum engine would resemble a car
engine, miniaturized. Tiny pistons would pump, turning tiny gears. Alas, I can
offer no such romantic vision. The engine might consist of a glob of particles,
a miniature cloud hovering above a tabletop as though threatening snow. Few
systems are known to undergo the necessary phase transition, which is exotic
even to quantum physicists. Campisi and Fazio found evidence that at least
one known system can undergo the transition, however. I hope that their
proposal spurs experimentalists to identify more such systems—that a
quantum-thermodynamic technology motivates fundamental discoveries.
Audrey alluded to another quantum engine in this book’s prologue: “
‘Baxter is developing a superior [spy fly], which can extract energy from a
certain type of light quite efficiently. Not the type here,’ she added, waving at
the fireplace, ‘but a type that Ewart would have in his lab.’ ” Audrey is
referring to squeezed light, which we can understand through the uncertainty
principle. The uncertainty principle limits the extent to which a quantum
system can have a well-defined position and a well-defined momentum.
Light has two properties, analogous to position and momentum, that also
obey the uncertainty principle. You can squeeze most of the uncertainty out of
one property, leaving the other property monumentally uncertain.* Squeezed
states facilitate applications of quantum information science, including
metrology and cryptography.
We’ve seen engine cycles that involve a cold bath and a hot bath. Imagine
replacing the hot bath with squeezed light. The light isn’t at equilibrium, but
we can attribute a temperature to it anyway.5 Imagine running a quantum
engine between the baths at maximum power: the engine performs the
greatest possible amount of work per cycle. Running at maximum power
erodes an engine’s efficiency, we’ve seen. We’ve established also that, if an
engine contacts just two thermal baths, the efficiency can’t exceed the Carnot
efficiency. But the squeezed-light engine runs at a higher efficiency than
Carnot’s, according to a 2014 analysis.6
Such a snub of Carnot shouldn’t discomfit us: The squeezed bath isn’t at
equilibrium. It violates an assumption behind Carnot’s bound on efficiency,
so the engine has no obligation to obey the bound. But, before encountering
the engine in the scientific literature, I wouldn’t necessarily have predicted
that squeezed light would break Carnot’s bound. Edison might as well have
predicted, upon beginning his experiments, which materials would support
practical light bulbs.
Not everyone agrees that the squeezed-bath engine breaks Carnot’s
bound.7,8 Theorists have argued that the squeezed bath transfers not only heat,
but also work to the engine. We should deduct this work from the work
performed by the engine, the theorists say. Quantum work and heat stir up
controversy again.
Regardless of the debate, Zürich experimentalists fabricated an engine
that exchanges energy with a squeezed-light bath.9 Their engine consists of a
tiny metal bridge that connects two vertical posts. The bridge has the
thickness of about a hundred DNA strands. Like a plucked violin string, the
bridge can vibrate up and down. The greater the vibration frequency, the
more energy the bridge has. The energy is quantized (one of just a few
numbers, with gaps between them), so the frequency is quantized. I wonder
how Brahms’s violin concerto would sound if Stradivarius violins had
similar gaps in their frequency ranges.
The bridge engine performs work—it outputs useful energy—when
dropping from a high energy-ladder rung to a lower one. The work is
performed on the posts that support the bridge. The experimentalists didn’t
bother channeling the work to perform any useful task, like raising a tiny
teapot. But they could have, by clamping a to-be-worked-on device to a post.
How much work does the bridge perform per cycle, on average over
experimental trials? The amount of energy carried by an infrared photon. An
incandescent light bulb radiates about 10²⁰ photons per second—and many of
those carry more energy, apiece, than an infrared photon. Don’t expect the
squeezed-bath engine to revolutionize the energy industry tomorrow. Baxter
must have advanced beyond today’s experimentalists, to power a spy fly with
a squeezed-bath engine. Still, I couldn’t cool a nanobridge till it exhibited
quantum behaviors, control the bridge, and measure the work it performed.
So, today’s experimentalists deserve applause, no less than a violin concerto.
FIGURE 7.2
* Did we discuss squeezing in chapter 2 under a different name? According to chapter 2, measuring one
property eliminates its uncertainty while boosting the other property’s uncertainty. This measurement
differs from squeezing: The measurement changes a quantum state abruptly and violently, as a crash
reduces a moving car’s speed. Squeezing changes the state as the acceleration pedal changes a car’s
speed: more gently, over an extended time. You can accelerate a car, using a pedal, to 30 miles per hour,
or 45, or 50. Similarly, squeezing lets you choose how much uncertainty to cram into one property of
light. Finally, measuring a particle can give its position a well-defined value for only a short time. Light
stays squeezed for longer.
* The atoms’ staying put enables many-body-localized systems to store information as quantum
memories, loosely speaking. So, the property of many-body localization that facilitates information
storage enhances the engine, although the engine doesn’t rely directly on information processing.
† Experimentalists had coaxed such a system not into many-body localization, by the time we wrote our
paper, but into a similar phase.
CHAPTER 8
TICK TOCK
QUANTUM CLOCKS
“We shall be late,” Audrey said. She was pacing up and down the corner of Market Square, not
registering the cobblestones, the barred doors of the shops that lined the square, or the predawn chill
that suffused the air. Her attention remained fixed on the silver pocket watch cupped in her hands.
“I told Captain Okoli that we would meet him at the docks at five ten,” she continued.
“Audrey,” said Baxter.
“The hour is already eighteen till five, and we still have not the faintest idea in which direction the
docks lie,” she said, as a ragged-looking mutt sporting a torn ear padded up to the far end of the path
she was pacing. The dog sat down on the cobblestones and whined, but her pacing ended half a yard in
front of him, and she turned on her heel without noticing him.
“Audrey.”
“If we cannot find our way soon—”
“Audrey.”
“—then Captain Okoli will—”
“Audrey.” The softer interjection came from Caspian.
“What is it!” she snapped, looking up from her pocket watch.
Caspian stepped forward, placed his hands on her shoulders, and gently swiveled them toward the
city hall.
An enormous clock protruded from the white stones, two-thirds of the way up the wall. The clock
resembled the silver pocket watch that now lay forgotten in Audrey’s hands as a chandelier resembles a
single candle flame. Audrey couldn’t quite pick out the clock’s hands, as though she were gazing at
them through a mist or through tears, but she could detect erratic motions, as though the hands were
jigging back and forth a little. The second hand mostly circumnavigated the clock face steadily but
sometimes sped up, slowed down, or jumped backward.
Audrey gazed silently for a moment.
“On the other hand,” she murmured, her eyes never leaving the clock face, “someone who knows
the way to the docks is bound to show up and point us in the right direction. I have no doubt that we can
afford to wait here another few minutes.”
FIGURE 8.1
TIMELESS TIMEKEEPING
So much for the autonomous quantum Rolex. But a quantum clock can
approximate the ideal quantum clock, similarly to how a real engine can
approximate a Carnot engine. Three colleagues of mine detailed the
approximation, and I visited two—Jonathan Oppenheim and Mischa Woods
—in London one spring. Jonathan is a professor of physics and astronomy at
University College London. Mischa was a postdoc in a superposition
between London and Delft, Holland. I’d spend all day working with them,
and then walk to the British Museum in the evening. Although I’d arrive after
the museum closed, I couldn’t stay away. Like Audrey’s parents, I’ve
relished studying the ancient Near East and ancient Egypt—although the
Stoqhardts have more expertise than I. The British Museum boasts a treasure
trove of artifacts, which I intended to pore over during the weekend. What
better way to ready oneself for millennia-old artifacts than by learning the
latest science about time?
Mischa, Jonathan, and their collaborator Ralph Silva designed an
approximation to the ideal time state, the superposition spread evenly across
all energies.7 My colleagues conceived of a different superposition of
energies—a quantum state in which the energy is not maximally uncertain, but
only highly uncertain.
Imagine measuring a quantum clock to learn the time, or controlling a
quantum robot with the clock. You, or the robot, would interact with the
clock. The interaction would disturb the clock, changing the clock’s quantum
state. The disturbance wouldn’t interfere with timekeeping if the clock were
ideal. But an imperfect clock would degrade, reducing our ability to
distinguish instants. You might as well gaze at a grandfather clock through
increasingly blurry glasses: six o’clock will blend into 5:59 and 6:01, then
into 5:58 and 6:02. Audrey noticed such blurriness when watching the
quantum clock in Market Square at the beginning of this chapter.*
Disturbances also hinder the clock’s ability to initiate processes, such as
logic gates in a computation, at desired times.
How well could Mischa, Ralph, and Jonathan’s clock withstand such
disturbances? Not too poorly, a Brit might say in an understatement. Imagine
growing the clock—adding particles to it, although not so many particles that
the clock loses its quantum nature. The bigger the clock, the greater its
resilience. And giving a little gets you a lot: as the clock grows, its resilience
grows exponentially.
Resilience also characterizes the British Museum’s artifacts. I have a soft
spot for lamassu, remnants of ancient Assyria. Ten-foot-tall statues of these
winged bull-men guarded the entrances to palaces. Time has degraded the
lamassu, but only a little: an observer can distinguish feathers in their wings
and strands in their beards. Such artifacts are portrayed as having “withstood
the flow of time,” or “evaded the flow of time,” or “resisted.” Such
portrayals don’t appeal to me, although the lamassu’s longevity does. I prefer
to regard them as surviving not because they clash with time, but because
they harmonize with it in some way. From this perspective, lamassu lie only
a few steps from the second law of thermodynamics and clocks.
On the other hand, the ancient Egyptians sculpted granite, when they could
afford it. Gudea, king of the ancient city-state of Lagash, immortalized
himself in diorite. Mischa, Jonathan, and I fashion ideas, which lack
substance. Imagine playing, rather than rock-paper-scissors, granite-diorite-
idea. The idea wouldn’t stand a chance.
Or would it? Because an idea lacks substance, it can manifest in many
forms. Plato’s cave allegory has manifested as a story, as classroom lectures,
on handwritten pages, on word processors and websites, in cartloads of
novels, in the film The Matrix, and in one of the four most memorable
advertisements I received from colleges as a high school junior. Plato’s
allegory—an idea—has survived since the fourth century BCE. King
Ashurbanipal’s lion-hunt reliefs, carved in alabaster, have survived for only
200–300 years longer.
The lion-hunt reliefs—and the lamassu—exude a grandness, a majesty as
alluring as their longevity. The nature of time and the ideal clock have as
much grandness, I believe. After leaving the British Museum’s Assyrian
gallery one Saturday, I boarded a train for Oxford. A quantum-
thermodynamics conference was to take place there starting on Monday. I
couldn’t have asked for a more fitting follow-up to the museum.
SWITCHING GEARS
I accidentally discovered that an autonomous quantum clock appears to exist
in the wild. The discovery owes its origin to David Limmer, a chemist at the
University of California, Berkeley. David has a southwestern twang; the
energy of a young scientist bent on changing his field; and, to me, the aspect
of an academic older brother. He studies a molecule—found in nature and
leveraged in technologies—depicted in figure 8.2.
The molecule contains two clusters of nuclei, represented by the spheres
in the figure. The rods represent chemical bonds, or electrons shared by the
clusters. You’ll often find such a molecule in the “closed” configuration at
the top of the figure. If you shine light on the molecule, one cluster may rotate
around the other. The resulting “open” configuration appears at the bottom of
the figure. The ability to switch configurations lends the structure one of its
names, molecular switch.
FIGURE 8.2
These switches exist in our retinas. When light enters our eyes, a
molecular switch can absorb a photon. Changing configuration, the molecule
hits a protein, as you might hit a bedside lamp when stretching after a night’s
sleep. The knock sets off a sequence of chemical reactions, which can end in
the impression of sight. Therefore, these molecular switches matter.
David wanted to model the switch using quantum thermodynamics, for a
reason I’ll explain in chapter 11. He wrote down what he knew about the
molecule; I wrote down mathematics from quantum thermodynamics; and we
kneaded the two together.8 The mathematics that represents David’s
molecule, I realized, contains mathematics that represents quantum clocks.
Think of the molecular switch as an autonomous machine that contains a
quantum clock. The clock’s hand consists of the rotating clump of nuclei. As
a hand rotates down a clock face, so do the nuclei rotate downward. The
right hand effectively points to 2 when the molecule occupies the upper
configuration in figure 8.2. When the molecule occupies the bottom
configuration, the hand effectively points to 4.
No human uses this clock to tell time. Rather, the clock tells the rest of the
machine how to behave at each instant. The rest of the machine, in the
molecular switch’s case, consists of electrons. Nuclei account for most of the
molecule’s weight; electrons account for little. They flit about the landscape
shaped by the atomic clumps. That landscape helps determine how the
electrons behave. So, the electrons form the rest of the machine controlled by
the nuclear clock. Earlier, we imagined a quantum clock that determines
when a quantum computer performs some logic gate. You can replace a
quantum computer with the electrons and replace performs a logic gate
with flit around a particular landscape. So, a quantum clock determines
when the electrons flit around a particular landscape.
A good clock’s hands have well-defined positions: they report one time at
each instant. Yet the hands also have well-defined momenta, traversing the
clock’s face at a constant speed. A quantum system can’t have a well-defined
position and a well-defined momentum simultaneously. So, how can a
quantum system serve as a clock hand?
I learned an answer from the Londoner I visited between conversations
with Mischa Woods and Jonathan Oppenheim: David Jennings is a quantum
thermodynamicist who, at the time, worked at Imperial College London.*
David reminded me that as a quantum system grows, it approaches
classicality. Imagine starting with one quantum particle and adding other
particles to it, or growing the particles’ masses. Repeat this process many,
many, many times, and the system will wind up described by classical
physics. A classical system can have a well-defined position and a well-
defined momentum.
So, a quantum clock hand should consist of many quantum particles, or of
massive quantum particles. They’ll occupy the midlands between one
quantum particle and a classical clock hand. Such a midland clock hand can
occupy a quantum state in which it has a fairly well-defined position—if not
a completely well-defined position—and a fairly well-defined momentum.
Imagine preparing the clock hand in this quantum state, measuring the clock
hand’s momentum, and dividing the outcome by the clock hand’s size. Repeat
this process in each of many trials. Most of the resulting numbers will lie
close to each other, so the hand has a fairly well-defined momentum. A
similar story features the clock hand’s position.
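For readers who like to tinker, this argument can be sketched numerically. Below is a toy model, not anything from the physics literature: the clock hand is a minimum-uncertainty wavepacket, and the masses, speed, and position spread are invented purely for illustration. The point survives the crudeness: the heavier the hand, the more tightly the momentum measurements cluster around their mean.

```python
import numpy as np

HBAR = 1.055e-34  # reduced Planck constant, in joule-seconds

def fractional_momentum_spread(mass, speed, sigma_x, n_trials=100_000, seed=0):
    """Simulate repeated momentum measurements on a minimum-uncertainty
    wavepacket of the given mass, moving at `speed` and localized to a
    position spread `sigma_x`. Return the measurement spread divided by
    the mean momentum -- small means "fairly well-defined momentum"."""
    rng = np.random.default_rng(seed)
    sigma_p = HBAR / (2 * sigma_x)  # Heisenberg-limited momentum spread
    outcomes = rng.normal(mass * speed, sigma_p, n_trials)
    return np.std(outcomes) / (mass * speed)

# One light particle: momentum outcomes scatter wildly from trial to trial.
light = fractional_momentum_spread(mass=1e-27, speed=1e-3, sigma_x=1e-9)
# A clump a million times more massive: outcomes cluster tightly.
heavy = fractional_momentum_spread(mass=1e-21, speed=1e-3, sigma_x=1e-9)
assert heavy < 0.1 < light
```

The massive clump sits in the midlands: its momentum outcomes agree to within a few percent, while the single particle's scatter swamps the mean entirely.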
David Jennings’s insight explains how David Limmer’s molecule can
serve as a quantum clock: The clock hand manifests as a clump of nuclei. The
clump has a large mass relative to—for example—an electron. So, the clump
can rotate fairly steadily and report times fairly accurately.
David Limmer studies these molecular switches partially because
experimentalists can fabricate and control them easily. For instance,
experimentalists can set the atomic clump moving—can “wind up” the clock
—with lasers. All the other (autonomous) quantum clocks I know of live in
our imaginations. But those clocks have been thriving: Mischa, Jonathan,
Ralph, David Jennings, and many other theorists have been proving a flurry
of mathematical results about them. Can molecular switches bridge quantum
clocks from theory to experiment? Whether measured on a quantum clock or
a classical one, time will tell.
* We can say that an atomic clock contains an ion because an ion is like an atom, except it has a
different number of electrons.
* Such a colossal quantum clock exists only in the imaginary novel that Audrey inhabits.
* As University College London lies near the British Museum, so Imperial College London neighbors the Victoria and Albert Museum and London’s Natural History Museum. Accordingly, I must confess to
having arrived in the neighborhood hours before our conversation started.
CHAPTER 9
Audrey gripped the ship’s rail, shut her eyes, and tried to avoid remembering how many spoonfuls of
marmalade she’d eaten at breakfast. Salty water kept spraying her face and dress, as though the sea
were determined to reach out and befriend her. Audrey approved of cordiality generally, but she
determined to decline this calling card. A robust friendship required sharing, her mother insisted, and
Audrey preferred not to share her marmalade with the waves.
“All right there, miss?” The first mate, his face sunburned and wrinkled like an old peach, appeared
at her elbow.
Audrey opened her eyes for an instant, shut them, and swallowed. She liked the first mate, and she
usually liked peaches, but the thought of them did her stomach no favor now.
“Captain Okoli said to fetch anythin’ y’need if you’re unwell—and to ask you to come inside again.
The sea en’t rough to us as knows it, but ’tis different to a gentle lass such as yeself.”
Audrey inhaled deeply through her nose and managed a “thank you” before the ship pitched again.
“Is it—is it always like this?” Audrey asked. Her jaw then clamped shut of her own accord, and she
gave up on speaking more.
The first mate lifted his head to gaze at the green hills that were approaching. “Near Fluctuarian
Bay?” he said. “Aye, miss, ’tis always like this.”
Stretching the hairpin costs work. The amount of work varies from trial to
trial of our experiment: in one trial, a water molecule will kick the hairpin
here, in this direction; in the next trial, a water molecule will kick the hairpin
there, in that direction. So, the amount of work needed in a given trial is
random. Imagine running many trials and stretching the hairpin through the
same distance every time. We can measure the work required in every trial.
The more times we have to pay a given amount of work, the more likely we
are to pay that amount next time. So, from our measurements, we can infer the
next trial’s probability of costing a given amount of work.
The probabilities would obey a simple pattern in a special case: at the
end of the experiment, the DNA is unzipped. The pipette and laser hold the
DNA’s feet steady. If we held the feet there forever, the hairpin would come
to thermal equilibrium with the water—to share the water’s temperature. The
DNA would end up with some amount Ff of free energy, wherein the
subscript f stands for final. Ff is the energy we’d have to invest to pull the
stretched hairpin out of a hat and warm the hairpin to room temperature.
Imagine pulling a stretched hairpin out of a hat after annihilating a relaxed
hairpin. The annihilation gives us an amount Fi of energy, which we can
invest in creating the stretched hairpin from nothing. Since creating the
stretched hairpin costs an amount Ff of energy, the whole process—the
annihilation and creation—costs an amount Ff − Fi of energy. Let’s call this
amount our Boltzmann balance, after the early thermodynamicist Ludwig
Boltzmann.*
What relevance does the Boltzmann balance have to our DNA
experiment? Imagine unzipping the DNA infinitely slowly, so that the hairpin
always remains at equilibrium. The DNA wouldn’t churn up the water, so
we'd waste little energy. The randomness in our experiment would die down.
We’d have to pay the same amount of work in every trial—and that amount
would be the Boltzmann balance.
Chemists, biologists, and pharmacologists want to know how large this
Boltzmann balance is. It governs how proteins change shape, how molecules
bind together, how drugs diffuse across cell membranes, and more. But
measuring the Boltzmann balance is tricky. For instance, you could measure
the balance by stretching the DNA infinitely slowly, and measuring the work
required, in one trial. But infinitely slow pulling would take forever.
Fortunately, we can seek help from “FLUCTUATION RELATIONS FOR
ALL YOUR THERMODYNAMIC NEEDS!”
Fluctuation relations don’t meet all thermodynamic needs—I’d never
trust a Victorian advert—but they help resolve our problem: We can unzip the
DNA quickly, jolting it out of equilibrium and jostling water molecules. The
work spent to stretch the hairpin, we can measure. We’ll repeat this pulling
and measuring in each of many trials. From our measurements, we’ll infer the
next trial’s probability of requiring this amount of work, or this amount, or
that amount. We plug these probabilities into one side of an equation—a
fluctuation relation—and, on the other side, out pops an estimate of the
Boltzmann balance.
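The recipe above fits in a few lines of arithmetic. In Chris's notation, one averages e<sup>−W/kT</sup> over the measured work values W and takes −kT times the logarithm. Here is a minimal numerical sketch; the Gaussian work distribution and all its parameters are invented for illustration (for Gaussian work, the equality happens to give an exact answer we can check against).

```python
import numpy as np

kT = 4.1e-21  # thermal energy at room temperature, in joules (~ kB * 300 K)

def jarzynski_estimate(work_samples, kT):
    """Estimate the free-energy difference (the "Boltzmann balance") from
    work values measured in many fast, nonequilibrium pulling trials,
    via Jarzynski's equality: delta_F = -kT * ln< exp(-W / kT) >."""
    return -kT * np.log(np.mean(np.exp(-np.array(work_samples) / kT)))

# Toy work distribution: Gaussian with mean 20 kT and spread 2 kT.
# For Gaussian work, the equality yields exactly
# delta_F = mean - variance / (2 kT) = 20 kT - 2 kT = 18 kT.
rng = np.random.default_rng(1)
work = rng.normal(20 * kT, 2 * kT, 1_000_000)
estimate = jarzynski_estimate(work, kT)
assert abs(estimate / kT - 18) < 0.5
```

Notice that every trial costs more than 18 kT on average, yet the estimator recovers the Boltzmann balance anyway: the rare cheap trials, weighted exponentially, carry the information.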
Christopher Jarzynski formulated this strategy—and the corresponding
fluctuation relation—in 1997.4 He’s now a professor of chemistry at the
University of Maryland. Apart from Chris, everyone calls the equation
Jarzynski’s equality. Chris is so humble, though, he calls it the
nonequilibrium fluctuation relation.
Chris’s equation offers a new way to measure the Boltzmann balance, but
this way doesn’t eliminate all difficulties. For instance, his method requires
us to run many, many trials of the experiment. Fortunately, we can mitigate
that challenge using information theory, as we’ll see in chapter 10.
Jarzynski’s equality doesn’t merely help us estimate a Boltzmann balance
—it doesn’t only have technological applications. As a Victorian
advertisement might say:
FIGURE 9.2
At the end of a trial, we can measure how many more electrons occupy the
right-hand dot than the left-hand dot, using a certain meter. The imbalance
implies how many electrons hopped rightward. Pulling and pushing each
electron rightward cost the positive and negative charges work. So, we can
measure how much work the charges have performed in a given trial. That
amount of work satisfies fluctuation relations.12,13 More-elaborate versions of
this experiment have taken place in Finland, which I think a fitting locale for
studying thermodynamics: Where can one better experience cold and
appreciate heat?14
The quantum-dot experiments center on electrons, which seem quantum.
Furthermore, the Finnish group detected hops by single electrons. Yet
classical physics describes the experiments: The electrons share no
entanglement and are in no other interesting superpositions. We can more
fruitfully imagine the electrons as miniature kumquats than track the
electrons’ wavelike properties. Which is all well and good, as quantum
steampunk encompasses small classical systems neglected by traditional
thermodynamics, in addition to quantum systems. But fluctuation relations
should extend to quantum physics. How can they?
* Scientists call the Boltzmann balance the free-energy difference, but I think that that’s a horrid
mouthful.
CHAPTER 10
“That pickering-bus is your sole chance of reaching Singledon before the summer rains begin.” Captain
Okoli pointed into the sunrise.
The three travelers shaded their eyes to look. A vehicle was … Audrey struggled to find a verb that
encapsulated the machine’s action, other than approaching. The pickering-bus consisted of a green
dome atop four spindly, jointed legs like a praying mantis’s. The vehicle sprang from leg to leg, stalking
toward the figures huddling on the filling station’s rooftop.
“The pickering-bus shall dock directly below us,” Captain Okoli continued, never taking his eyes off
the vehicle. “Filling the tank takes nineteen minutes, during which time you shall drop three-and-a-half
feet onto the deck.” He pointed downward. “On the deck is a cushioned bench, with fifteen leather
straps. You must tie those to your waists, legs, and arms, and not untie yourselves until arriving in
Singledon.”
For the first time, Captain Okoli turned to the three travelers. Audrey and Baxter quailed under the
fierceness of his gaze, and even Caspian blinked twice.
“Remember,” Captain Okoli continued, “you must not untie yourselves, on pain of a most
uncomfortable death. When you arrive in Singledon, my sister shall lead you through a hatch unknown to
the crew.”
Audrey, Baxter, and Caspian turned their gazes back to the pickering-bus springing toward them like
a fairground ride that no one wishes to brave.
“On pain of death?” said Baxter.
“A most uncomfortable death?” echoed Audrey.
“How do you know all those details about the bus?” asked Caspian.
Captain Okoli turned back toward the pickering-bus and watched it for a moment. A smile spread
across his face, as though he were watching a kitten playing with a ribbon rather than a contraption that
resembled a giant predatory insect.
“My sister and I invented the pickering-bus,” said Captain Okoli.
“Oh,” said Baxter.
“An excellent reason,” added Audrey.
* I’m imagining unrealistic levels of equality: that all teams enjoy the same amount of funding, that all
players are of the same quality, etc.
CHAPTER 11
RESOURCE THEORIES
A HA’PENNY OF A QUANTUM STATE
“Miss?”
Audrey’s eyes fluttered open. Judging by the cold permeating her right cheek, she figured she’d
fallen asleep against the curved wall of the dirigible’s glass basket. She peeled her cheek off the glass,
intending to rub warmth back into it, but her hand stopped halfway to her face when she glimpsed the
airship’s surroundings through the window. Audrey drew a sharp breath.
A network of islands hung in the air, tethered to each other and to the ground with chains of massive
metal links. Woods filled one island, obsidian filled another, pillars of salt covered a third, and—Audrey
gasped—nuggets upon nuggets of gold gleamed from the dirt on another. Audrey’s vessel wasn’t alone
in navigating the islands. Other dirigibles floated past the islands, docked at them, and withdrew from the
docks. About half the vessels bore flags, painted on their oblong surfaces, that apparently depicted the
islands floating in the air. Nearly half the vessels bore flags recognizable from the city-state that the
three travelers had just left. The rest of the dirigibles wore colors that she recognized from the previous
week, or the week before, or colors that she didn’t recognize but that—she suspected—she would grow
acquainted with before her journey ended. The vessels, islands, and chains floated amidst wisps and
curls of clouds, which fluttered like flags themselves.
“Miss.”
Audrey tore her gaze away from the glass wall and turned to the freckled young conductor who was
bobbing on his toes beside her seat. When he saw that she’d woken, his navy-and-gold cap bobbed
farther toward her in a little bow (conductors appeared, Audrey noted, to wear navy-and-gold caps
across the map and across transportation methods). He was holding the ticket that she’d placed above
her seat, marking her territory and announcing her destination.
“Your stop is next, miss, and your friends farther down asked me to rouse you,” he said, the navy-
and-gold hat bobbing toward the end of the aisle. “We dock at the island in fifteen minutes.”
CROSSING BORDERS
We met David Limmer briefly in chapter 8, when discussing quantum clocks.
But David, a theoretical chemist at the University of California, Berkeley,
merits more of an introduction. Meeting him in the last year of graduate
school, while I was planning my postdoctoral years, returned me mentally to
elementary school. There, I’d looked up to two students who were three
grades above mine. They’d represented our school in science fairs,
participated in speech competitions, and enrolled in rigorous high school
programs. I look up to David as I looked up to those two students. He’d
completed his postdoctoral stint two years earlier and was building his
research group. He studies statistical mechanics far from equilibrium, using
information theory and other mathematical tools. Although a theorist ardent
about mathematics, he partners with experimentalists. He keeps an eye on
topics as far afield as black holes. And he’s even three years older than I,
like those kids in elementary school.
Keeping an eye on topics far afield had brought resource theories to
David’s attention. He asked me whether we could use a resource theory to
answer a question he’d posed about chemistry. The question concerns the
molecular switch introduced in chapter 8 and depicted in figure 8.2. The
molecular switch is called a photoisomer. Photoisomers appear across
nature and technologies: we have them in our eyes, as mentioned earlier, and
experimentalists have used the switches to improve the storage of solar fuel.
The switch has two collections of bonded-together atoms, both attached to an
axis.
Your average-Joe molecular switch spends much of its life in equilibrium,
exchanging heat with room-temperature surroundings. The molecule has the
shape at the top of figure 11.1, called the cis configuration. Imagine shining a
laser or sunlight on the switch. The molecule can absorb a photon, gaining
energy. The energized switch has the opportunity to, well, switch: one group
of atoms can rotate downward. The molecule will come to occupy its trans
configuration.
The molecule now has more energy than it had while in equilibrium,
albeit less energy than it had right after absorbing the photon. The molecule
can remain in this condition for a decent amount of time; that is, the molecule
can store the photon’s energy. For that reason, experimentalists at Harvard
and MIT attached molecular switches to nanoscale tubes designed to capture
and store solar fuel.17
FIGURE 11.1
The tent’s burlap flap rustled, and an object scraped across the sand. Before Audrey, Baxter, and
Caspian could register what had happened, the scraping ended, and footsteps pounded away into the
night. Baxter reached the tent flap first, and he stooped to pick up the package that had been thrust
inside.
“We have received a letter,” he announced, untying the string tied around the folded brown paper.
The three travelers returned to the little table where their lamp flickered and smoked. Baxter hunched
over the paper as Caspian bent over his left shoulder and Audrey pressed against his right.
“Runes,” said Caspian, who could see more than Audrey. “Give the letter to your sister, Baxter; she
will have the best chance at translating it.”
Baxter obeyed, and the huddle reorganized itself around Audrey. She held the paper up to her face,
squinted, and ran a finger over the symbols.
“ ’Tis difficult to see,” she said. “The writing is small at the end, but I believe I can nearly make it
out.” Her tongue poked between her lips as she concentrated. “To find the … unwitnessed …
unknown? No, ’tis more like ‘unwitnessed’ than ‘unknown.’ Maybe—ah, I believe the word is ‘unseen.’
Very well, then; to find the unseen kingdom, walk—no, come—come along after—follow. Yes, follow
the … arrow. Pointing. Sign? Ah, the signpost! To find the unseen kingdom, follow the signpost. Then
comes a smudge that …”
Audrey brought the paper closer to her face, and the others bent closer over her shoulders. Suddenly,
she flung her head back, nearly knocking skulls with both her companions.
“Baxter,” she said, holding out a hand, “give me the Collection, please.”
Baxter had fashioned the Collection himself, and he always kept it in a pocket. Since Baxter was
Baxter, the pocket might be in the jacket he’d worn last week, but the Collection presented itself after
he’d patted only three possible hiding places in today’s outfit. One might be inclined to call the Collection
a Swiss army knife; it contained a penknife, a corkscrew, a screwdriver, a magnifying glass, a ruler, a
compass, and scissors. But Baxter had added a laser, a quantum sensor, an antenna, and a refrigerator
large enough to house two qubits on a chip. He passed the Collection to Audrey, who teased out the
magnifying glass and held it up to the page for a moment.
“My,” she murmured. “This lies in your department, brother.”
Audrey handed the paper and the Collection to Baxter, who brought the magnifying glass to one eye
and then whistled.
“I have never seen a smaller refrigerator,” he said. “I should like to meet the engineer who built it.”
“What are we intended to do with a minuscule refrigerator affixed to the end of a mysterious letter?”
asked Audrey. Silence fell as all three travelers ruminated over the delivery.
“Arrow,” murmured Caspian. “Signpost.”
“Eh?” said Baxter.
“Let me see the Collection,” said Caspian.
He sorted through the tools that Audrey handed him until finding the quantum sensor, a defect-
containing diamond shielded from its surroundings.
“The letter could have fallen into anyone’s hands,” said Caspian, “and the writer wanted for the
signpost to be visible only to us. The signpost is likely small, judging by how the writing shrinks, and can
be represented as an arrow. The best way to detect a tiny arrow is with a tiny sensor; and no one would
be able to detect a tiny arrow, far from a laboratory, in the middle of the desert—apart from our
Baxter.”
Baxter didn’t bother hiding the grin that crept from one ear to the other until Caspian added, “And, of
course, apart from Ewart.” But a needle was shifting across a dial on the sensor, which Caspian held up
to the light after it froze.
“Down,” he announced, before turning to Audrey. “Audrey, what lies below us?”
Audrey had read all about the region as soon as her parents had planned a dig there.
“Workers’ huts from the fourteenth century BCE,” she said.
“What lies below the huts?”
Audrey shook her head. “Excavations have focused on the palaces built by the workers, so no one
ever bothered with the huts.” She looked hard at Caspian. “Do you think that the lost kingdom lies
below the huts?”
He shrugged.
“No one has seen the lost kingdom in millennia. It must be where nobody has bothered looking.”
* Such a matrix can contain not just real numbers, but two-dimensional numbers, or complex numbers.
How can a number be two-dimensional? Being on Earth, I’m on a two-dimensional surface. We can
specify my location by specifying a latitude and a longitude. Those two real numbers can be packaged
into one complex number. In general, a complex number contains two real numbers and can be
interpreted as a point on a two-dimensional surface.
* The energy is only fairly well-defined because quantum theory distinguishes energy from other
observables. We won’t worry, here, about why.
CHAPTER 13
Landscapes flashed past as the travelers walked, crawled, motored, rode, and flew, night and day. Single
images impressed themselves upon Audrey’s memory, like leaves snatched from a whirlwind: In a
tunnel cut from the rock below the Earth’s surface, a single torch illuminated the human figures carved
on the walls. Rows of huts stretched across a barren landscape broken only by withered shrubs
scattered here and there. Blood red, lemon yellow, emerald green, and sky blue clamored for attention in
dresses, linens, turbans, and woven baskets in a bazaar. A glass palace glittered atop a cliff under a
stormy sky, grey waves crashing against the rocks below.
Everywhere the travelers arrived, Ewart sought them out. In the bazaar, they were stalked by a thin,
reed-like man smoking a pipe—always more than a stall away but never far enough. A letter, awaiting
the trio at an inn they reached late one night, had been slitted open and resealed before their arrival.
Shadows clung to the walls, and clouds didn’t hang over only the glass palace.
Every time they arrived at a mail depot, the travelers teleported a few qubits to Audrey’s parents, as
the quantum information could reach the elder Stoqhardts more securely than classical information
could. Much as she reveled in exploring, and in the sense of purpose provided by their quest, and in how
her stomach plunged whenever their gyropter dipped, Audrey sometimes envied those qubits. On frigid
nights, huddled beneath a thin blanket, as Baxter snored in the neighboring cot, she wished she could
teleport like one of those qubits to the travelers’ final destination. Alas for Audrey, a classical lady could
not teleport—and, worse, none of the travelers could predict where their final destination lay.
COOL IT
Computers, we’ve established, need blank scrap paper. For instance, imagine
calculating the distance traversed by Audrey, Baxter, and Caspian. You’d
trace their route out on a map, measure each leg of the trip with a ruler,
convert the length on the map into a length on the ground (or in the air), and
sum the lengths. You’d write the numbers down, then write more while
carrying 1s and so on during the summation.*
Blank scrap paper manifests, to a classical computer, as bits set to 0. To a
quantum computer, blank paper manifests as qubits represented by upward-
pointing arrows. Imagine arranging magnets near an electron as in figure
13.1. The south pole lies above the north pole, so the magnetic field points
upward. The electron’s spin interacts with the magnetic field, which shapes
the spin’s energy ladder. The spin occupies its low-energy rung when
aligning with the magnetic field (when pointing upward) and occupies its
high-energy rung when anti-aligning (when pointing downward). So,
dropping the temperature, lowering the qubit’s energy, forces the spin
upward. Cooling erases the scribbles from quantum scrap paper.
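The "cooling forces the spin upward" claim is just Boltzmann statistics for a two-rung ladder, and we can put numbers on it. The energy gap below is an invented illustrative value, not that of any particular experiment.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def spin_up_probability(gap, temperature):
    """Equilibrium probability that a two-level spin occupies its
    low-energy rung (pointing up, aligned with the field), given the
    energy gap between rungs and the bath temperature."""
    return 1.0 / (1.0 + math.exp(-gap / (K_B * temperature)))

gap = 1e-24  # illustrative energy gap between the two rungs, in joules
# Dropping the temperature forces the spin toward "up" -- toward a blank 0.
warm = spin_up_probability(gap, 300.0)  # room temperature: nearly 50-50
cold = spin_up_probability(gap, 0.01)   # 10 millikelvin: almost surely up
assert cold > 0.99 > warm > 0.5
```

At room temperature the scrap paper stays scribbled on; only when the thermal energy drops well below the gap does the qubit reliably read 0.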
Erasing scribbles costs work, we learned from Landauer. In other words,
the information-processing task of erasure happens via a thermodynamic
process. Conversely, we can accomplish a thermodynamic process by
processing information: we can cool qubits by exploiting correlations
between them. We call this strategy algorithmic cooling.1,2
FIGURE 13.1
Imagine a clutch of qubits that we’ve used to factor 9,824,783 into prime
numbers. Individual qubits contain entropy, as a corner of a sheet of scrap
paper can contain scrawlings. Some of the qubits might share entanglement;
some might share classical correlations. We can design a circuit that shoves
the entropy into a few of the qubits. Think of those qubits as a rubbish bin (as
Audrey would say) where we dump the rinds and grease from dinner. The
rest of the qubits emerge reset to pointing upward, like dishes emerging from
a scrub.
This strategy can accommodate one-shot thermodynamics, à la chapter 10:
our next computation will employ the qubits that we’ll have cleaned. We
might not mind if the next computation has a 0.1% chance of failing, so the
cleaned qubits need only point mostly upward. We can clean more qubits
approximately than exactly; so, by sacrificing the next calculation’s
probability of succeeding, we can scrub more qubits.
A classical example illustrates algorithmic cooling, which can reset
classical bits as well as qubits. Suppose that Audrey holds one bit and
Baxter holds another. The bits have the same value, sharing a classical
correlation. The pair has a 50% chance of being 00 and a 50% chance of
being 11.
One logical operation, akin to addition, is the controlled-NOT. If
Audrey’s bit is a 0, the operation does nothing to Baxter’s bit. If Audrey’s bit
is a 1, the operation flips Baxter’s bit, interchanging the 0 and 1 values. We
call this flip a NOT operation because 0 often stands for yes, while 1 stands
for no.* Not yes means no, and not no means yes.
To understand the controlled-NOT, imagine Audrey and Baxter visiting
their great uncle, Lord Wyndham Whewell. The first morning, Lord
Whewell’s butler asks the siblings whether they’ll partake of the soup to be
served at dinner, as neither cares much for soup. Baxter, a younger brother
who looks up to his sister, decides to answer as she does. So, both their
answers—both bits—are 0s (yes, we’ll have the soup), or both are 1s (no,
thank you). Audrey waffles, assenting during half their visits and declining
during half. So, the bits have a 50% chance of being 00 and a 50% chance of
being 11. Ten minutes before dinner, the butler confirms their choices.
Audrey doesn’t change her mind. If she continues to consent to the soup (if
her bit remains a 0), Baxter continues to consent (his bit remains a 0).
However, if Audrey continues to decline the soup (if her bit remains a 1),
hearing her negative encourages Baxter’s caprice momentarily, and he
changes his order (he flips his bit from 1 to 0). The controlled-NOT negates
Baxter’s choice conditionally on Audrey’s.
Let’s forget the soup and focus on how the controlled-NOT leaves the
bits. First, suppose that the bits begin as 00. Since Audrey holds a 0, the
operation does nothing to Baxter’s bit. The siblings end with 00. Second,
suppose that the bits begin as 11. Since Audrey holds a 1, Baxter’s bit flips
from 1 to 0. The siblings end with 10.
In both cases, Baxter ends with a 0. His bit has been reset; the next time
the siblings run a computation, they can use his bit as scrap paper. But
Audrey’s bit has a 50% chance of being a 0 and a 50% chance of being a 1.
This probability distribution has the greatest entropy possible. So, Audrey
effectively holds a “scribbled-on” piece of scrap paper. She discards it,
writing her bit off as the price of erasing Baxter’s bit.
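Audrey and Baxter's soup negotiation is short enough to simulate directly. The sketch below runs many correlated trials and checks that the controlled-NOT always leaves Baxter's bit blank; the variable names are my own, chosen to match the story.

```python
import random

def controlled_not(audrey, baxter):
    """Flip Baxter's bit exactly when Audrey's bit is 1."""
    return audrey, baxter ^ audrey

random.seed(0)
for _ in range(1000):
    # Perfectly correlated pair: 50% chance of 00, 50% chance of 11.
    a = random.randint(0, 1)
    b = a
    a, b = controlled_not(a, b)
    assert b == 0  # Baxter's bit always resets to blank scrap paper.
    # Audrey's bit keeps its 50-50 randomness: she absorbs the entropy.
```

The correlation does all the work: were the bits independent, the controlled-NOT would scramble Baxter's bit rather than clean it.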
Baxter’s bit ends up clean in this example, but the bit doesn’t in all
examples. Why? The two bits could begin less correlated, or subject to a
probability distribution other than 50-50. The siblings might be able to scrub
Baxter’s bit only partially, like dishes in the absence of soap.
A heat bath can help.3 Suppose that Audrey and Baxter have many qubits
and a heat bath. The bath is cold but not as cold as their qubits need to be:
Thermalizing all the qubits with the bath would erase each qubit partially;
each would shed some of its energy and entropy into the bath. But no qubit
would shed enough to serve in the next computation; no qubit would arrive
close enough to pointing upward.
The siblings evade this pitfall, taking advantage of the bath’s low
temperature, by alternating steps in an algorithm.4 First, they perform
controlled-NOT-type logic gates on the qubits. This step shifts much of the
qubits’ energy and entropy into a few rubbish-bin qubits. The rest of the
qubits approach the quantum 0 state but don’t arrive as close as desired.
Second, Audrey and Baxter thermalize the rubbish-bin qubits with the bath.
The bath partially empties the rubbish, accepting energy and entropy from the
qubits. These qubits can now accept more energy and entropy from their
fellows, during the next logic gates.
The siblings perform logic gates, empty the rubbish bin, perform logic
gates, empty the rubbish bin, and so on. When they stop depends on how cold
they need their qubits and how cold the qubits can grow. If the siblings play
their cards right, the clean qubits can grow colder than the bath. How cold
depends on the number of qubits, on the bath’s temperature, and on which
logic gates the siblings perform.
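We can watch this alternation cool a bit in a deterministic toy model with classical bits. A majority vote, computed into the target bit, stands in here for the controlled-NOT-type compression gates, and the bath bias is an invented number; this is a cartoon of heat-bath algorithmic cooling, not any group's actual protocol.

```python
def majority_zero_probability(q, b):
    """Probability that the majority of three independent bits is 0,
    given the target bit is 0 with probability q and the two freshly
    rethermalized rubbish-bin bits are each 0 with probability b."""
    # Majority 0 means at least two of the three bits are 0.
    return q * b * b + 2 * q * b * (1 - b) + (1 - q) * b * b

BATH_BIAS = 0.6  # a bath-thermalized bit reads 0 with probability 0.6

# Alternate: compress (majority vote into the target bit), then
# rethermalize the two rubbish-bin bits with the bath. Repeat.
target = BATH_BIAS
for _ in range(20):
    target = majority_zero_probability(target, BATH_BIAS)

# The target bit ends up "colder" than a single bath contact allows:
# its bias converges to 0.36 / 0.52, about 0.69, above the bath's 0.6.
assert target > BATH_BIAS
```

Each round, the compression pushes entropy into the rubbish-bin bits and the bath carries some of it away; the fixed point of the iteration sits strictly beyond what thermalizing with the bath alone could achieve.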
Relying on logic gates earns this procedure its name, algorithmic cooling.
Algorithmic cooling has advanced from theory into experiments.3 Many
experiments feature molecules whose nuclei contain the spins that serve as
qubits. Experimentalists manipulate these qubits with magnets, applying
techniques used to take pictures of the brain in MRI—nuclear magnetic
resonance, according to chapter 3.
How quantum is algorithmic cooling? The answer resembles our answer
to the question, in chapter 7, “How quantum was the operation of the maser
heat engine?” Like the engine, the cooled qubits have quantized energies—
energy ladders with separated rungs. But the maser engine needed no
entanglement to run. Similarly, entanglement plays no role in most
algorithmic cooling. After all, we saw Audrey and Baxter algorithmically
cool a bit in a classical example. But algorithmic cooling can prepare qubits
to undergo quantum computations that involve entanglement.
FIGURE 13.2
TORNADO TOUR
The first time I visited England, my mother and I booked a day-long bus tour.
We departed from London in the morning; visited Windsor Castle,
Stonehenge, and the University of Oxford; and returned to London after dark.
I grew to love Oxford only years later, while lingering for days or months.
But we didn’t have months to spend on that first visit. So, my mother and I
made do with half of an afternoon. So shall we make do, dear reader, with a
few pages for the final city-states on the quantum-steampunk map. Our
whirlwind tour shall end as a tornado.
What distinguishes quantum physics from classical? Not superpositions,
which classical waves can be in. Not discreteness, which classical systems
can approximate. If you guessed liver—that is, entropy—you’re on the right
track: a quantum system has von Neumann entropy when entangled with
another system, and entanglement is nonclassical. But what about
entanglement enables it to speed up computations? A property called
contextuality, at least in certain cases.
Every experiment occurs in some context: certain posters adorn the
laboratory’s walls; a dropped pencil lies on the ground beside the door; the
experimentalist wears a gray woolen sweater knitted by her grandmother.
Much of this context seems irrelevant to the experiment’s outcome. But some
elements that we’d expect to be irrelevant—judging by our everyday,
classical experiences—are relevant, in quantum physics. Not dropped
pencils and woolen sweaters, but quantum analogs of them. So, we call
quantum physics contextual and classical physics noncontextual.
It’s difficult to prove that some physical phenomenon is nonclassical—
that no classical system could reproduce the phenomenon. But your work is
done if you can prove that the phenomenon is contextual. Which quantum-
thermodynamic behaviors are contextual—provably nonclassical—and
which aren’t? Physicists are answering this question by finding contextuality
—or a lack thereof—in protocols for measuring and extracting work.12
Time to get back on the tour bus—or, since we’re surveying quantum
steampunk, the steam-powered motorcar. We’re leaving Windsor Castle for
Stonehenge—which consists, in this book, of realistic heat baths. Across
conventional thermodynamics, we assume that baths have certain properties:
First, they’re infinitely large. This assumption permeates even resource
theories, which contribute to one-shot thermodynamics, the champion of
small systems: we assume that one’s environment contains baths of all sizes.
Second, we assume that heat baths have short memories. Consider a small
system—say, a few qubits—exchanging heat with a bath. As energy passes
between bath and qubits, information about the qubits’ quantum state enters
the bath, like a swimmer swept out to sea. The sea is so vast, the water has
almost no chance of returning the swimmer to their shore. Likewise, the bath
is so vast, the information has almost no chance of returning to the qubits.
The information gets lost in the bath, so we say that the bath lacks a memory.
Third, the bath interacts with the qubits weakly; exchanging a great deal of
energy takes a long time.
Not all baths satisfy these assumptions. What if a heat bath is small,
returns swimmers to their shores, or exchanges energy quickly? The bath can
serve as a resource,13 the qubits may avoid thermalizing, erasure can cost
less work than Landauer predicted,14 and more.
Now, let’s motor from Stonehenge to Oxford; that is, let’s move on to
quantum thermometry. The zeroth law of thermodynamics implies that
thermometers exist: If a spoon held by Baxter is in thermal equilibrium with
an almond pudding being eaten by Audrey and with a curry being eaten by
Caspian, then Audrey’s almond pudding is in thermal equilibrium with
Caspian’s curry. If the friends know the temperature of Caspian’s curry, they
can infer the temperature of Audrey’s almond pudding. Baxter’s spoon serves
as a thermometer. Thermometers detect and report temperatures, as I learned
when plastic rods were pushed deep under my tongue during childhood.
Determining whether a child should stay home from school requires
neither high precision nor small thermometers—nor quantum theory nor the
ability to discriminate between low temperatures. But imagine studying how
an embryo’s development depends on the temperatures at different points in
the embryo. You’d want to detect tiny temperature differences across tiny
distances. Or imagine measuring a quantum system’s temperature.
Quantum thermometrists study the effects of quantum phenomena on
temperature measurements—effects that lead to challenges and benefits.15
Challenges include a contrast with the zeroth-law story above. There,
Baxter’s spoon equilibrates with Audrey’s almond pudding and with
Caspian’s curry. Equilibrating—and even nearly equilibrating—takes a long
time. Given too long to sit and twiddle their thumbs, quantum systems
decohere. So, quantum thermometers may not have time to equilibrate with
quantum systems.16
Another challenge stems from how measurements disturb quantum
systems. The more information a measurement extracts, the more the
measurement disturbs the quantum system. How much can you learn about the
temperature without damaging the quantum state much?
The advantages of quantum thermometers include the ability to entangle.
Preparing a thermometer in an entangled state can enhance the measurement’s
precision.17 Or suppose that your thermometer consists of one qubit.16 It can
detect temperatures better if prepared in a superposition of energies.
Although classical waves can be in superpositions, the plastic rods pushed
under my tongue in childhood couldn’t.
Welcome back to London. We’ve completed our tornado tour of Windsor,
Stonehenge, and Oxford—and so our whirlwind tour of the quantum-
steampunk map. But Audrey, Baxter, and Caspian’s journey hasn’t ended, and
neither has ours. Centuries ago, explorers feared that roaming too far beyond
known boundaries would carry them off the planet’s edge. Let’s realize their
worst nightmares, by stepping off the map.
* Some people could perform the computations in their heads, but they’d use a neuronal equivalent of
scrap paper.
* At least, in quantum computation. In other disciplines, 0 means no.
* The British setting demands grey rather than gray.
CHAPTER 14
STEPPING OFF THE MAP
QUANTUM STEAMPUNK CROSSES BORDERS
A plume of grey smoke rose from the east, as though from the pipe of a giant who’s finished his dinner
of Englishman.
“Akram’s signal,” Caspian whispered. Although Audrey barely heard the words, the effort cost him,
and he struggled to suppress a grimace. “Ewart will arrive within the hour.”
Caspian was lying in the dirt beside the map, his chest wrapped with a bandage torn from Audrey’s
petticoat. Audrey and Baxter were kneeling beside the map, its corners weighed down by three stones
and a dusty leather water skin. For several minutes, Baxter had been staring at the map silently and
fiddling with the Collection. At Caspian’s words, Baxter wrenched the penknife out from amongst the
tools and stabbed it into the dirt.
“It can-not be here,” he said, stabbing again. “We have looked ev-erywhere and found noth-ing, so it
simply cannot ex-ist. We have been chasing a ghost.” A stab accompanied each emphasized syllable.
Audrey had been gazing at the narwhal cavorting on an edge of the map, but she glanced up as
Baxter flung the Collection to the ground. She was about to admonish him, when she noticed a valley
beyond the stabbed dirt. The valley bordered a collection of twisted, tough-looking shrubs, which she
saw over the pile of twigs scattered between Baxter and Caspian’s feet. Caspian’s head lay on a rolled-
up cloak beside a mound of sand, behind which, in the distance, rose a hill like a sated giant’s belly.
Audrey’s gaze drifted from the hill back to Caspian. His eyes, despite being bright with pain, were
trained on her.
“Go on,” he whispered. Caspian could always tell when she was about to do something.
Audrey held his gaze for a moment—fighting the urge to leap to her feet and flee, fighting the
knowledge that she could be home and safe, reading comfortably beside the window in her parents’
library—and then she reached over the Peninsula of Pettingroft on the map and picked up the
Collection.
“Move back,” she told Baxter, before drawing in the ground. She drew around the parchment,
incorporating the stabbings in the dirt, the pile of twigs, and the mound of sand.
“Indeed,” Audrey said quietly, closing the penknife. “It cannot be here—and neither shall we be here
by the time Ewart arrives. This parchment is not all that exists.” Putting one hand on the map, she
looked up at Baxter. “Hence, we shall leave it and invent the rest. We shall step off the map.”
Suppose that Audrey has a double pendulum—a pendulum that hangs from
the bottom of another pendulum that hangs from, say, a clock face (figure
14.1). She pulls the bottom pendulum far to her right and then releases it. The
double pendulum swings, bends, and loops-the-loop like a trapeze artist.
Audrey waits for a while, then photographs the double pendulum (using a
high-speed camera available during the Victorian era only in a steampunk
novel).
Imagine that Baxter pulls another double pendulum a hair’s breadth farther
than Audrey did. He lets the pendulum swing, waits for the same amount of
time, and then photographs his double pendulum. Let’s compare the two
photographs. Baxter’s double pendulum is probably in a different
configuration than Audrey’s. Also, Baxter’s was probably moving with a
different speed than Audrey’s, in a different direction, when the photographs
were taken. That is, a double pendulum’s motion changes loads if the initial
conditions change slightly. This sensitivity to initial conditions characterizes
chaos.
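Readers who like to tinker can watch this sensitivity on a computer. The following sketch, mine rather than the book's, integrates the standard equal-mass, equal-length double-pendulum equations of motion with a fourth-order Runge-Kutta stepper. Audrey's and Baxter's release angles differ by a millionth of a radian; for these illustrative parameters, the gap between the two trajectories grows by many orders of magnitude within a simulated fifteen seconds.

```python
import math

G = 9.81  # gravitational acceleration

def accel(th1, th2, w1, w2):
    """Angular accelerations of a double pendulum with unit masses
    and unit rod lengths (standard equal-mass equations of motion)."""
    d = th1 - th2
    den = 3.0 - math.cos(2.0 * d)
    a1 = (-3.0 * G * math.sin(th1) - G * math.sin(th1 - 2.0 * th2)
          - 2.0 * math.sin(d) * (w2 * w2 + w1 * w1 * math.cos(d))) / den
    a2 = (2.0 * math.sin(d)
          * (2.0 * w1 * w1 + 2.0 * G * math.cos(th1)
             + w2 * w2 * math.cos(d))) / den
    return a1, a2

def rk4_step(state, dt):
    """Advance (th1, th2, w1, w2) by one fourth-order Runge-Kutta step."""
    def deriv(s):
        a1, a2 = accel(*s)
        return (s[2], s[3], a1, a2)
    k1 = deriv(state)
    k2 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = deriv(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + e)
                 for s, a, b, c, e in zip(state, k1, k2, k3, k4))

# Audrey's pendulum, and Baxter's, pulled a hair's breadth farther.
audrey = (2.0, 2.0, 0.0, 0.0)
baxter = (2.0 + 1e-6, 2.0, 0.0, 0.0)
dt, steps = 0.001, 15_000  # fifteen simulated seconds
max_gap = 0.0
for _ in range(steps):
    audrey, baxter = rk4_step(audrey, dt), rk4_step(baxter, dt)
    max_gap = max(max_gap, abs(audrey[0] - baxter[0]))
print(f"largest angle gap over the run: {max_gap:.3f} rad")
```

A hair's-breadth change in the initial angle, and the photographs come out wildly different: that amplification of tiny differences is the chaos the chapter describes.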
Chaos manifests in systems classical and quantum. Classical chaotic
systems include the weather. Meteorologist Edward Lorenz encapsulated
sensitivity to initial conditions in the term the butterfly effect: “a butterfly
flapping its wings in Brazil could set off a tornado in Texas.”5 Quantum
systems exhibit sensitivity less straightforwardly than the weather, as they
obey different equations—quantum theory, rather than classical mechanics.
So, detecting quantum systems’ sensitivity to initial conditions is tricky. But
the scrambling signal provides such a detector.
I count myself fortunate to have had the story from the horse’s mouth.
Alexei Kitaev taught me about scrambling at the whiteboard in his office, as
we planned the final term of our quantum-computing course. He was debating
between teaching about black holes, about a mathematical problem, or about
another application of quantum computers. I voted for black holes, so he
treated the class to an introduction. Then, a piston slid into place in my mind:
scrambling belongs with quantum thermodynamics.
GO SCRAMBLE YOURSELF
I developed a sense that I should accomplish something with scrambling, but
I couldn’t figure out what. So, I dropped by my PhD advisor’s office, and
we kicked ideas around. About half an hour in, he dropped a comment that
transformed my research and my life: “Well, you’re interested in fluctuation
relations, right?” Another piston slid into place in my mind.
Fluctuation relations are kin to the second law of thermodynamics, which
governs how information spreads out and dissipates. Quantum information
spreads out and dissipates through entanglement during scrambling.
Furthermore, fluctuation relations compare forward processes to their
reverses: Crooks’ theorem governs the stretching of a DNA hairpin and the
collapse of the hairpin. The scrambling signal, too, encodes forward and
backward motions through time.
We can understand how by returning to Baxter’s double pendulum. Like
Audrey, he’s pulled the pendulum far to the right but a hair’s breadth more
than she has. Baxter releases the pendulum, lets it swing for a while, and
stops it after the agreed-upon time. Imagine that he then reverses time, as
though pressing rewind on a video. (Baxter can’t actually reverse time, of
course. But he can approximately simulate a reversal of time in an
experiment.* Besides, Baxter can write down mathematics that represents a
perfect time reversal. So, we’ll pretend that he can reverse time in his
experiment.) The pendulum ends where Baxter started it (left-hand side of
figure 14.2).
Now, let’s return to Audrey’s double pendulum. She’s pulled it far to the
right, let it swing for a while, and stopped it. Audrey now nudges the
pendulum a hair’s breadth rightward, just as Baxter nudged his pendulum
before letting it swing. Now, Audrey reverses time, making the pendulum
swing backward. You might expect the pendulum to end where Baxter’s
pendulum ended: Baxter nudged his pendulum, and Audrey nudged hers—
just, at different times. But the chaotic motion of Audrey’s pendulum
amplifies her nudge. Audrey’s pendulum likely ends wildly differently from
Baxter’s: in a different position, with a different momentum (right-hand side
of figure 14.2).
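The contrast between Baxter's experiment and Audrey's fits in a few lines of code. Reversing a simulated pendulum exactly is finicky, so this sketch of mine substitutes the Chirikov standard map, a textbook chaotic system whose steps can be undone exactly; the kick strength, nudge size, and step count are illustrative choices, not the book's. Baxter nudges before the forward evolution and rewinds home; Audrey nudges after it, and the rewind amplifies her nudge.

```python
import math

TWO_PI = 2 * math.pi
K = 5.0  # kick strength; K = 5 puts the map deep in the chaotic regime

def forward(theta, p):
    """One forward step of the standard map (angle, momentum on a torus)."""
    p = (p + K * math.sin(theta)) % TWO_PI
    theta = (theta + p) % TWO_PI
    return theta, p

def backward(theta, p):
    """Exact inverse of forward: pressing rewind on the video."""
    theta = (theta - p) % TWO_PI
    p = (p - K * math.sin(theta)) % TWO_PI
    return theta, p

start, nudge, steps = (1.0, 1.0), 1e-3, 12

# Baxter: nudge first, then evolve forward and rewind.
b = (start[0] + nudge, start[1])
for _ in range(steps):
    b = forward(*b)
for _ in range(steps):
    b = backward(*b)
baxter_gap = abs(b[0] - (start[0] + nudge))  # he ends where he began

# Audrey: evolve forward, nudge at the far end, then rewind.
a = start
for _ in range(steps):
    a = forward(*a)
a = (a[0] + nudge, a[1])
for _ in range(steps):
    a = backward(*a)
audrey_gap = abs(a[0] - start[0])  # chaos amplified her nudge

print(f"Baxter's gap: {baxter_gap:.2e}   Audrey's gap: {audrey_gap:.2e}")
```

Baxter returns to his starting point to within rounding error, while Audrey ends far away: the same nudge, applied at different times, yields wildly different endings, exactly the asymmetry the scrambling signal encodes.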
FIGURE 14.2
* Alexei found this number in a condensed-matter paper published in the Soviet Union during the
1960s.4 As we said in chapter 3, whatever you discover in certain subfields of physics, a Soviet journal
probably published a paper about it between the 1960s and 1980s.
* Imagine Baxter experimenting on an electron spin, rather than a pendulum, for simplicity. He can
control the spin’s evolution—how the spin’s quantum state changes in time—with a magnetic field.
Suppose that a magnet’s north pole lies on one side of the spin and another magnet’s south pole lies on
the opposite side of the spin. Baxter effectively reverses time’s arrow for the spin by reversing the
magnetic field—by swapping the north and south poles. No experimentalist can reverse time’s arrow for
the spin perfectly; the south pole will end up slightly to one side, or the spin will have decohered a tad,
and so on. But many a cosmetician would envy Baxter his ability to reverse time approximately.
EPILOGUE
WHERE TO NEXT?
THE FUTURE OF QUANTUM STEAMPUNK
“Go on.”
“Is it safe?” Audrey paused with one hand on the brass tube and one hand suspended above it.
Baxter flapped both his hands at his sister, like a duck shaking its wings at a duckling who inquires about
a pond’s temperature before waddling in.
“Of course, the mechanism is safe,” he said. “I checked it half a dozen times, and I promise that it
shall not blow our heads off—” Audrey began lowering the suspended hand—“with high probability.”
“Baxter!”
“Excuse me.” A low voice accompanied the appearance of the Stoqhardts’ butler in the doorway.
As neither sibling heard him, Caspian beckoned him over to the couch from which Audrey had
forbidden Caspian from rising. The doctor had concluded that Caspian would heal—probably, mostly—
and Audrey had sworn to fuss over him until he finished.
“Audrey—Ree-ree.” Lowering his voice, Baxter used the name he’d called his sister as a toddler.
“Of course, the mechanism is safe. I would never let anything happen to you, I promise—with a
probability very close to one indeed.”
Audrey rolled her eyes.
“The president of the Royal Society has arrived,” Caspian interrupted. The siblings turned toward
him, Audrey’s right hand still suspended in the air over the brass tube. “The rumored invitation to
present a lecture about our findings appears likely to become reality. No doubt, however, she expects an
informal lecture this morning.”
Baxter looked at Audrey, Audrey looked at Caspian, and Caspian raised his eyebrows at Audrey.
She looked down at the brass tube in front of her, then turned to the butler.
“Arnold, show Madame Bancroft into the drawing room. Tell her that we shall join her momentarily,
and have Daisy prepare almond cake.”
“Very good, miss.” Arnold bowed and left the library.
“Kind of the Royal Society to express an interest,” Audrey said. “Alas—” she looked down at the
tube—“the discoveries of today have little patience for those of yesterday.” The tube belonged to what
looked like a kaleidoscope, and her suspended hand held what appeared to be a plate of multicolored
glass. Audrey slid the plate into the kaleidoscope, bent down, and placed an eye to the eyepiece.
LIKE AUDREY, Baxter, and Caspian, we’ve traversed our map and left
it behind. We’ve visited quantum physics, information theory, and
thermodynamics to get our bearings. The three fields coalesced, drawn
together by probabilities, entropies, and the abilities of information and work
to serve as resources. Distinguishing between classical and quantum
thermodynamics, we visited the menagerie of definitions of quantum work
and heat. We then steamed along on a quantum engine, battled seasickness in
the bay of fluctuation relations, rose through the atmosphere to the
confederacy of resource theories, and unearthed a kingdom unseen for
decades. Leaving the security of charted land, we struck out from quantum
thermodynamics into other fields of science. Entropy has accompanied us
throughout the trip, like a Cavalier King Charles Spaniel trotting by our side.
I hope that you never look at liver the same way again.
Where will quantum steampunk journey next? I see four frontiers that
demand exploration. First, the city-states, towns, and principalities have
begun to unify. For example, fluctuation relations have connected to one-shot
thermodynamics. We can build more bridges, highways, and byways. They’ll
unify the subdisciplines within the field, enabling one community to import
tools and solutions from another.
Second, we can build more bridges to the world beyond quantum
thermodynamics. Chemistry; condensed matter; atomic, molecular, and
optical physics; biophysics; particle physics; and black holes involve
quantum physics, information theory, and energy. We’ve begun to interface
with them, as in the fluctuation relation for scrambling, and the implications
have redounded across quantum physics. We can reach out, find parallels
between fields, and trade more ideas, insights, and questions.
Third, much of quantum thermodynamics consists of theory. A tide of
experiments has grown over the past several years, and I expect it to surge.
We have quantum computing, among other influences, to thank: the hunger for
quantum computers has driven experimentalists to enhance their control over
quantum and other small systems. Atoms, ions, photons, superconducting
qubits, and nuclei offer platforms for testing quantum thermodynamics.
The opportunity should spur us theorists to propose experiments whose
outcomes we can’t predict—questions that require experiments. Some
quantum-thermodynamics experiments confirm predictions that don’t need
checking. These predictions rest on quantum theory that has survived tests for
decades. Some experiments on which I’ve collaborated fall into this
category. Such experiments advance science by forcing experimentalists to
hone techniques, by stimulating theoretical developments, and more. But
experiments that we can’t easily simulate on classical computers would do
more justice to the platforms at our disposal, as well as to our
thermodynamic and quantum forbears, some of whom leaned on experiments
to uncover theory.
Fourth, we can do justice to our thermodynamic forbears by inventing
technologies worth investing in. Like the early thermodynamicists, we’ve
uncovered fundamental physics: we’ve strengthened the second law of
thermodynamics, identified thermodynamic tasks performable with quantum
resources but not with classical, and more. But thermodynamics evolved
hand in hand with the steam engine, the driver of the Industrial Revolution.
Quantum thermodynamics has the opportunity to drive changes of its own.
Quantum thermodynamicists have proposed technologies: quantum
engines, refrigerators, ratchets, and batteries. Such a quantum technology may
achieve an efficiency, or a power, or some other metric, forbidden to
classical technologies. These discoveries have illuminated the distinction
between classical and quantum physics. But are the technologies practical?
Not yet. This observation applies foremost to research of mine: for now, I’d
rather ride in a classically powered car than in a car powered by the MBL-
mobile I coinvented.
Theory contacts technology through experimentation, and experiments
have begun. So far, they’re proof-of-principle experiments, demonstrating
that theory can meet reality after toil and sweat. Most quantum engines
require more work—to cool down particles, turn magnetic fields on and off
during the engine cycle, and so on—than the engines produce. Autonomous
quantum thermal machines may help resolve this problem but remain in their
infancy. Colleagues and I aim to nurture them into solutions. I hope that we
discover niches suited to quantum thermal machines and closed to classical.
In analogy, the strong sunlight in Pasadena, California, suits solar panels as it
doesn’t suit a power plant. We may identify settings, akin to Pasadena
sunlight, that call for quantum thermal machines as Southern California calls
for solar panels.
Quantum thermal machines are objects. Techniques developed in quantum
steampunk aren’t objects, but they can benefit technology, too. Such
techniques include shortcuts to adiabaticity, algorithmic cooling, and
thermometry. For instance, along with quantum thermodynamicists, other
scientists develop and use shortcuts to adiabaticity. Shortcuts discovered to
aid quantum engine cycles can also benefit quantum computing, metrology,
and communication.
Algorithmic cooling stems from the need for clean scrap paper in quantum
computing. Theorists have proposed cooling protocols, and experimentalists
have implemented some. To my knowledge, the experiments remain proof-of-
principle and showcase algorithmic cooling, rather than using algorithmic
cooling to aid experimentalists. But algorithmic cooling may become a
screwdriver—a tool that will so appeal to experimentalists that they’ll apply
it in every experiment. Until then, algorithmic cooling illuminates the
relationship between information and heat from a fundamental perspective.
Quantum thermometry has boomed over the past several years. Quantum
thermodynamicists have established general principles, illustrated with
specific models, of quantum thermometry. Other scientists have been
applying quantum phenomena to aid thermometry, too, outside of quantum
thermodynamics. For example, two experimental groups applied quantum
thermometry to a question in biology:1,2 Different cells in an embryo divide at
different times. Imagine reversing the order in which the cells divide. Would
the reversal harm the organism? You could find out by manipulating the
temperatures in different parts of the embryo because temperature controls
the rate at which cells divide.
Experimentalists injected nanoscale diamonds into a worm embryo. The
diamonds reported the temperature at various points in the worm. This
information guided experimentalists who heated the embryo with lasers. The
manipulated embryos grew into fairly normal adults. But their cells, and their
descendants’ cells, cycled through the stages of life slowly. Reversing the
order in which cells divide seemed not to harm the organisms, despite
slowing the organisms’ aging. Quantum thermometers illuminated a question
in biology.
Yet the embryo experimentalists belong to the field of quantum sensing, off
the quantum-steampunk map. The embryo experimentalists have little contact
with the town of quantum thermometry in quantum thermodynamics.
Similarly, atomic physicists have been cooling quantum gases for decades.
The physicists have been taking the gases’ temperatures since long before
quantum thermometry began booming, and quantum thermometry has not yet
infiltrated those experiments. However, quantum thermometry looks poised
to surpass standard atomic-physics techniques within a few years. Quantum-
thermometry models have grown detailed and platform-specific, and they
have the potential to change standards.
These frontiers mark opportunities for quantum steampunk. They beckon,
to my mind, like the seas and jungles faced by any adventurer in a steampunk
novel. Quantum steampunk has flourished over the past decade: We’ve
gained fundamental insights, such as how quantum resources can outperform
classical resources in thermodynamic tasks. We’ve translated theoretical
proposals into experiments, and we’ve partnered with other fields of
science. I expect quantum thermodynamics to continue thriving and evolving.
Where the past meets the future, as when thermodynamics meets quantum
computation, science can spin a today fit for a steampunk novel.
ACKNOWLEDGMENTS
I’m grateful to many people for their contributions to this book. Thanks to my
husband for baking the muffins that fueled my eight o’clock writing sessions
on weekend mornings, for all the care and consideration symbolized by those
muffins, and for understanding about my need to write at eight o’clock on
weekend mornings. Thanks to Sarah Siegel for remaining confident, since
middle school, that I’d publish a book.
Thanks to my editors—Tiffany Gasbarrini, Michael Zierler, and Susan
Matheson—for their patience and their enthusiasm about the manuscript.
Thanks to Todd Cahill for transforming my poor sketches into works of art.
This project was supported by grant number FQXi-MGB-2009 from the
Foundational Questions Institute and Fetzer Franklin Fund, a donor-advised
fund of Silicon Valley Community Foundation. The grant was secured with
the help and kindness of Jeffrey Bub. Thanks to Caltech’s Institute for
Quantum Information and Matter for further support.
Many colleagues and friends dedicated time and attention to reviewing
parts of the text and providing feedback: Chris Akers, David Arvidsson-
Shukur, Gian Paolo Beretta, Felix Binder, Sara Campbell, Chris Jarzynski,
David Jennings, Jay Lawrence, David Limmer, Fred McLean, Jonathan
Oppenheim, Jukka Pekola, Patrick Potts, John Preskill, Paul Skrzypczyk,
Aephraim Steinberg, and Albert Ye. I owe to Rob Spekkens my awareness of
how operationalism likens thermodynamics to information theory. I owe to
Jason Alicea the illustration, via a flock of birds, of how more is different.
Thanks to Raj Katti and Hengyun Zhou for further technical assistance.
Captain Okoli is named after the late physicist Dr. Chiamaka Okoli, who
is missed by her friends and colleagues.
GLOSSARY
ABSOLUTE ZERO. The lowest temperature conceivable. Temperature of zero kelvins.
AEOLIPILE. Ancient Greek steam engine.
ALGORITHMIC COOLING. Lowering bits’ or qubits’ temperatures by manipulating correlations
among the bits or qubits.
ANGULAR MOMENTUM. A quantity, like energy, possessed by every object that’s rotating about
an axis. How much angular momentum an object has depends on the object’s mass, its speed, and
how far each chunk of it lies from the axis.
AUTONOMOUS QUANTUM CLOCK. Autonomous quantum thermal machine that keeps time.
AUTONOMOUS QUANTUM THERMAL MACHINE. Thermal machine, described by quantum
theory and not by classical physics, that operates independently, without external control.
BIT. Basic unit of information. The information you gain upon learning the outcome of an event that
could have played out in two ways with equal probabilities. For example, the information you gain
upon learning how a fair coin landed after being flipped.
BLACK HOLE. Cosmological object so dense that not even light can escape its gravitational pull.
BOLTZMANN BALANCE. Difference between two free energies, such as the free energy of a
DNA hairpin long after it’s been stretched and the free energy that the hairpin had before it was
pulled. Useful number applied in biology, chemistry, and pharmacology. Can be estimated with help
from Jarzynski’s equality or Crooks’ theorem.
BOLTZMANN’S CONSTANT. Number hardwired into our universe, similarly to the electron’s mass.
Crops up throughout thermodynamics.
BOSON. Fundamental quantum particle that carries force, such as the electric force that attracts
electrons to protons. Bosons tend to clump together.
BRUTE FORCE. Simple but time-consuming strategy for solving a computational problem: formulate
every possible solution and check whether it’s correct, one after the other.
BUTTERFLY EFFECT. Encapsulates a chaotic system’s sensitivity to initial conditions. Term coined
by meteorologist Edward Lorenz: “A butterfly flapping its wings in Brazil could set off a tornado in
Texas.”
CARNOT EFFICIENCY. Greatest efficiency achievable by any heat engine that interacts with
exactly two different-temperature heat baths during its cycle.
CARNOT ENGINE CYCLE. Engine cycle devised by the nineteenth-century French engineer
Nicolas Léonard Sadi Carnot.
CHAOS. Extreme sensitivity to initial conditions. Exhibited by the weather, double pendulums, and
black holes.
CIS CONFIGURATION. “Closed” configuration of a photoisomer, or molecular switch.
CLASSICAL. Described accurately by classical mechanics, classical electrodynamics, or general
relativity. Quantum physics isn’t classical.
CLASSICAL MECHANICS. The physics of objects big enough to be seen with the naked eye, or
under a classroom microscope, and how they move. Established by Isaac Newton during the 1600s.
CLOSED, ISOLATED SYSTEM. Thermodynamic system that exchanges nothing (no heat or
particles or anything else) with the rest of the world.
COLD BATH. Low-temperature bath that interacts with a heat engine during part of an engine cycle.
COMMUTE. Numbers commute (multiplicatively) if the order in which you multiply them together
doesn’t matter. For example, two times three equals three times two; so two and three commute.
CONDENSED MATTER. Physics of solids and liquids.
CONTEXTUALITY. Property of entanglement that enables it to speed up computations, in at least
some cases. Repeating an experiment many times yields many outcomes that obey some type of
statistics. If an experiment’s statistics depend on the context in which the experiment was performed
—on anything that happened in parallel with the experiment—the experiment is contextual. Quantum
theory is contextual; classical physics isn’t.
CONTROLLED-NOT. Logical operation performable on two bits (or qubits). If the first bit is a 0,
nothing is done to the second bit. If the first bit is a 1, the second bit is flipped (the second bit is
changed to a 1 if it began as a 0, or it is changed to a 0 if it began as a 1).
CORRELATION. Property of two (or more) measurements that are performed in many trials. Two
measurements share a correlation if, when one measurement’s outcome changes, the other
measurement’s outcome changes.
CROOKS’ THEOREM. Fluctuation relation that enables us to predict the likelihood that a given trial
will cost (or yield) a given amount of work.
DATA COMPRESSION. The information-processing task of squeezing a message into the least
possible number of bits.
DECOHERENCE. Undesirable, uncontrolled entanglement of a quantum system with its environment.
DEFECT IN DIAMOND. Arrangement of atoms that colors a diamond black and enables the
diamond to store quantum information. Advisable in quantum computers, not in engagement rings.
DETERMINISTIC. Able to be predicted with certainty. Contrasted with probabilistic.
DILUTION REFRIGERATOR. Device used to cool superconducting qubits to near absolute-zero
temperature so that they exhibit quantum behaviors.
DISSIPATION. Seeping of energy from a controlled system into many uncontrollable systems. Waste.
DISTURB, DISTURBANCE. See measurement disturbance.
DNA HAIRPIN. Length of DNA that consists of two complementary chains connected at one end by
a loop. The unzipping and rezipping of a DNA hairpin obeys fluctuation relations.
ELECTRODYNAMICS. Theory of light and its interactions with matter. Developed by James Clerk
Maxwell and other scientists during the 1800s.
ENCRYPTION. Security protocol intended to encode information in a form indecipherable to
eavesdroppers.
ENERGY. Subject of thermodynamics.
CHEMICAL ENERGY. Energy stored in chemical bonds between atoms.
ELECTRICAL ENERGY. Energy of attraction between a positive electric charge and a negative
charge, as well as the energy of repulsion between two like charges.
GRAVITATIONAL POTENTIAL ENERGY. Energy accrued to a mass that resists another
mass’s gravitational pull.
KINETIC ENERGY. Energy of motion.
ENGINE CYCLE. Sequence of steps undergone by an engine to perform work. Returns the engine to
its initial conditions.
ENTANGLEMENT. Relationship shareable by quantum particles. Measurements of the particles can
be correlated more strongly than any correlations producible with just classical particles.
ENTROPY. Measure of uncertainty about how an event, such as a measurement, will unfold. Function
of surprisals. Many entropies have been defined.
ONE-SHOT ENTROPIES. Functions of probability distributions or quantum states. They measure
the best efficiencies with which we can perform information-processing tasks or thermodynamic
tasks in a few trials or with few pieces of information.
RÉNYI ENTROPIES. One-shot entropies defined by the twentieth-century Hungarian
mathematician Alfréd Rényi.
SHANNON ENTROPY. Measure of the randomness of an event whose possible outcomes are
described by a set of probabilities. It equals the best efficiency with which we can compress
classical information, on average, if we compress infinitely many messages.
VON NEUMANN ENTROPY. Measure of the randomness of a quantum state. It equals the best
efficiency with which we can compress quantum information, on average, if we compress
infinitely many messages.
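As an illustrative sketch (not from the book's text), the surprisal and the Shannon entropy of a classical probability distribution can be computed directly; the entropy is the average surprisal, measured in bits:

```python
import math

# Surprisal of an outcome with probability p: how much information
# (in bits) you gain upon learning that the outcome occurred.
def surprisal(p: float) -> float:
    return -math.log2(p)

# Shannon entropy: the surprisals averaged over the distribution.
def shannon_entropy(probs: list[float]) -> float:
    return sum(p * surprisal(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # a fair coin: exactly 1 bit
print(shannon_entropy([0.9, 0.1]))  # a biased coin: less than 1 bit
```

The fair coin's 1 bit per toss is the best average compression rate for a long string of fair-coin outcomes.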
EQUILIBRIUM. Situation in which a thermodynamic system’s large-scale properties (such as
temperature and volume) don’t change much and no net flows (for example, of particles) enter or
leave the system, for a long time.
FERMION. Fundamental quantum particle. Fermions form the matter in our everyday world and obey
Pauli’s exclusion principle.
FLOQUET DRIVING. Periodically whacking a system. Whether the energy imparted by the
whacking counts as heat or as work is debated by quantum thermodynamicists and condensed-
matter physicists.
FLUCTUATION RELATION. Equation that’s stronger and more detailed than the second law of
thermodynamics.
FLUCTUATION RELATION FOR SCRAMBLING. Equation, analogous to Jarzynski’s equality,
that describes quantum chaos.
FREE ENERGY. The work required to create a system from scratch, as by pulling a rabbit out of a
hat, and warming the system up to room temperature. Also, how much work could be extracted by
annihilating the system. Features in fluctuation relations.
FREE OPERATION. An operation that can be performed easily, without the payment of any cost, in
a resource theory.
FREE SYSTEM. An object that can be accessed easily, without the payment of any cost, in a
resource theory.
FUNCTION. Mathematical machine that takes in numbers and spits out numbers. Examples include
the logarithm.
GENERAL RELATIVITY. Physical theory that describes large, massive objects, such as the planets.
Developed by Albert Einstein during the early 1900s.
HEAT. Random, uncoordinated energy that’s transferred between systems.
HEAT BATH, HEAT RESERVOIR. An enormous system that’s in equilibrium, that has a fixed
temperature, and that may exchange heat with other systems. Soap and back scrubber not included.
HEAT CAPACITY. Amount of heat required to raise a system’s temperature by one degree.
HEAT ENGINE. Device that turns heat into a little work while dissipating more heat.
HOT BATH. High-temperature heat bath with which an engine interacts during an engine cycle.
INFORMATION. Ingredient required for one to distinguish between alternatives. Also, that which
catalyzes an event without losing its ability to catalyze that event.
INFORMATION THEORY. Study of how to measure information and of how efficiently we can
process information (solve computational problems, secure information, communicate information,
and store information).
JARZYNSKI’S EQUALITY. Fluctuation relation that interrelates a Boltzmann balance and the work
invested to jolt a system out of equilibrium.
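In symbols (a standard statement of the equality, not quoted from the book's text), with W the work invested in a given trial, ΔF the free-energy difference between the final and initial equilibrium states, and the angle brackets an average over many trials:

```latex
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F},
\qquad \beta = \frac{1}{k_{\mathrm{B}} T}
```

The exponential average over noisy, trial-by-trial work values pins down an equilibrium quantity exactly.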
LAMASSU. Winged bull-men who guarded the entrances to ancient Assyrian palaces.
LANDAUER ERASURE. The erasing, or resetting, of a bit of information. Costs at least a szilard of
work.
LANDAUER’S PRINCIPLE. Erasing, or resetting, a bit of information costs at least a szilard of
work.
LAWS OF THERMODYNAMICS. Backbones of the theory of thermodynamics.
ZEROTH LAW OF THERMODYNAMICS. Establishes the notion of a thermometer. If a spoon
of Baxter’s is in thermal equilibrium with an almond pudding of Audrey’s and with a curry of
Caspian’s, then Audrey’s almond pudding is in thermal equilibrium with Caspian’s curry.
FIRST LAW OF THERMODYNAMICS. The energy of every closed, isolated system is
conserved.
SECOND LAW OF THERMODYNAMICS. Star in the cast of thermodynamics. One
formulation is, the entropy of every closed, isolated system can only increase or remain constant.
THIRD LAW OF THERMODYNAMICS. No process (of finitely many steps) can cool a system
to absolute-zero temperature.
LINEAR ALGEBRA. Branch of mathematics that underlies quantum computing. Requires that one
solve loads of equations simultaneously. If you ever want to insult a quantum information theorist,
say, “Pfft. Isn’t quantum information just linear algebra?”
LIVER. See entropy.
LOCALIZATION. Confinement to a finite volume, in contrast with the spread-out nature of a wave
that represents a quantum system. For a slightly different usage, see many-body localization.
LOGARITHM. Function that converts products of numbers into sums of numbers.
MACROSTATE. Consists of the large-scale properties (such as the temperature and pressure) of a
collection of particles.
MANY-BODY LOCALIZATION. Phase of quantum matter in which particles mostly stay put and
entanglement spreads slowly. Contrasted with the thermal phase.
MASER. Like a laser, but emits microwave radiation instead of visible light.
MATRIX. Collection of numbers arranged in a grid. Mathematical representation of a quantum
observable. Not all matrices commute with each other. See übernumber.
MAXIMAL ENTANGLEMENT. Relationship that quantum particles can share. Measurements of
maximally entangled states can be correlated as strongly as anything in our world can be (as far as
we know).
MAXWELL’S DEMON. “Finite being” dreamed up by nineteenth-century physicist James Clerk
Maxwell. Star of a thermodynamic paradox. Appears, prima facie, to violate the second law of
thermodynamics.
MBL-MOBILE. Quantum engine whose operation involves transitioning quantum particles between a
many-body-localized phase and a thermal phase.
MEASUREMENT DISTURBANCE. Unavoidable alteration of a quantum system’s state by a
measurement of the system.
METROLOGY. Study of how, and how well, we can measure things.
MICROCANONICAL ENSEMBLE. What Napoleon would have called part of his army. Also, the
classical analog of the microcanonical quantum state.
MICROCANONICAL STATE. Equilibrium state of a closed, isolated quantum system.
MICROSTATE. Property of a collection of particles. Defined by a list of the particles’ positions and
momenta (and, depending on what the particles consist of, their angular momenta, their vibrations,
etc.).
MOLECULAR SWITCH. Molecule that can change shape from one configuration to another. Found
in natural systems, such as our eyes, and used in technologies, such as solar-fuel-storage devices.
MOMENTUM. Property that reflects the difficulty of stopping an object. The heavier the object, and
the more quickly it moves, the greater its momentum.
MONOGAMY OF ENTANGLEMENT. Limitation on the amount of entanglement that one particle
can share with others. The more entanglement that a particle of Baxter’s shares with a particle of
Audrey’s, the less entanglement Baxter’s particle can share with a particle of Caspian’s.
NEGATIVE TEMPERATURE. Temperature below absolute zero, achievable by systems whose
energies are quantized. A system is hotter at all negative temperatures than it would be at any
positive temperature.
NONCLASSICAL. Inaccurately described by classical mechanics, electrodynamics, and general
relativity. Quantum physics is nonclassical.
NONEQUILIBRIUM THERMODYNAMICS. Study of the energy of systems away from
equilibrium, or roiled up.
NUCLEAR MAGNETIC RESONANCE (NMR). Experimental toolkit used to control nuclei that
store quantum information in quantum computers, as well as to image brains in magnetic resonance
imaging (MRI).
OBSERVABLE. Measurable property. Examples include position and momentum.
ONE-SHOT INFORMATION THEORY. Study of how efficiently we can perform information-
processing tasks (solve computational problems, communicate information, secure information, and
store information) in a limited number of trials, or given a limited amount of information.
ONE-SHOT THERMODYNAMICS. Study of how adroitly we can perform thermodynamic tasks
(such as work extraction) in a limited number of trials, or given small systems.
OPEN SYSTEM. System that interacts with other systems, such as by exchanging energy with a heat
bath.
OPERATIONAL. Concerned with how efficiently agents can perform tasks with given resources—
for example, how efficiently one can transmit information, given a staticky telephone, or can extract
work, given heat baths at different temperatures. Information theory and thermodynamics are
operational.
PAULI’S EXCLUSION PRINCIPLE. No two fermions can be in the same quantum state. Explains
how electrons arrange themselves in atoms.
PERPETUUM MOBILE. Perpetual-motion machine. Forbidden by the second law of
thermodynamics.
PHASE. Form in which matter can exist. Everyday examples include solid, liquid, and gas. Quantum
examples include many-body localization.
PHOTODETECTOR. Camera that collects light, registering photons.
PHOTOISOMER. Molecular switch.
PHOTON. Particle of light.
PLATFORM. Type of hardware—for example, a material used to build a quantum computer.
POWER. The rate at which an object delivers energy, e.g., at which an engine performs work.
PRIME NUMBER. Number that’s divisible only by itself and by one.
PRINCIPLE OF NO SIGNALING. Information can’t travel more quickly than light. Originates in
Einstein’s theory of relativity and is obeyed by entangled systems.
PROBABILISTIC. Random. Contrasted with deterministic.
PROBABILITY DISTRIBUTION. Set of probabilities that describe how a random event might
unfold.
QUANTIZATION. Limitation to only a few possible numbers. Examples of quantization include the
energy that a hydrogen atom has due to its one electron’s motion and the electron’s attraction to the
nucleus.
QUANTUM. Indivisible unit, as of energy or light.
QUANTUM ADIABATICITY. Very slow adjustment of a quantum system’s energy ladder, as by
strengthening an electric field near an atom. The system remains on the same ladder rung throughout
the adjustment.
QUANTUM COMPUTER. Computer whose operation relies on quantum phenomena. Able to solve
certain computational problems, such as the factoring of large numbers into primes, far more
quickly than any classical computer.
QUANTUM DOT. Artificial atom—in some cases, a little patch of space on a semiconductor surface.
An electron is confined to the patch, by an electric field, similarly to how an electron in an atom is
confined to remain near the nucleus. Able to store one unit of quantum information.
QUANTUM INFORMATION. Information that can be stored in, and processed by, quantum
systems.
QUANTUM INFORMATION SCIENCE. Study of how we can use quantum resources (such as
entanglement) to process information in ways impossible with only classical resources.
QUANTUM-INFORMATION THERMODYNAMICS. Intersection of quantum computing and
thermodynamics.
QUANTUM SPEEDUP. Outperformance of a classical system by a quantum system on an
information-processing task.
QUANTUM STATE. Mathematical representation of a quantum system’s status. Usable to predict
the probability that a given measurement will yield a given outcome. Quantum analog of a probability
distribution.
QUANTUM STEAMPUNK. Reenvisioning of nineteenth-century thermodynamics for small,
quantum, far-from-equilibrium, and information-processing systems. Intersection of quantum
computing and thermodynamics, plus the use of this intersection as a new lens onto other disciplines.
Shares its aesthetic with steampunk in juxtaposing futuristic technology (quantum computing) with a
Victorian setting (thermodynamics).
QUANTUM THEORY. Study of small systems (such as electrons, protons, and photons) that fall
outside the purview of classical physics.
QUANTUM THERMODYNAMICS. Extension of conventional thermodynamics to quantum
systems.
QUANTUM THERMOMETRY. Study of the effects of quantum phenomena on temperature
measurements.
QUBIT. Basic unit of quantum information.
REAL NUMBER. Number of the sort that describes our everyday lives. A negative number, a
positive number, or zero.
RESOURCE. Something that’s scarce and that’s valuable because it’s useful.
RESOURCE THEORY. Simple model, developed in quantum information theory, for any situation in
which constraints restrict the systems that one can access and the operations that one can perform.
RESOURCE-THEORY FRAMEWORK. Mathematical and conceptual toolkit of resource theories.
ROBIN HOOD TRANSFER. Transformation of one probability distribution into another, akin to the
transformation of one distribution of wealth into another via a theft from the rich and a gift to the
poor.
SCRAMBLING. Dissemination of initially localized quantum information across a system through
many-particle entanglement.
SCRAMBLING SIGNAL. Number that indicates whether a quantum system is scrambled.
“SECOND LAWS” OF THERMODYNAMICS. Equations or inequalities that are stronger—that
provide more information—than the second law of thermodynamics.
SHORTCUT TO ADIABATICITY. Quick adjustment of a quantum system’s energy ladder. After
the adjustment, the system ideally ends on the same ladder rung on which it began.
SIMULATOR. Special-purpose computer that calculates how a certain system would behave under
certain conditions—for example, how a certain material would respond if heated.
SPIN. Property that quantum systems have and that classical systems lack. Described by the same
mathematics as angular momentum.
SQUEEZED LIGHT. Light whose quantum uncertainty is squeezed into one observable, leaving
another observable with a nearly well-defined value.
STATISTICAL MECHANICS. Study of many-particle systems. Less operational than
thermodynamics.
STEAMPUNK. Genre of literature, art, and film in which Victorian-era settings are juxtaposed with
futuristic technologies.
SUPERCONDUCTING QUBIT. Tiny circuit cooled to a low temperature and able to store a unit of
quantum information.
SUPERCONDUCTIVITY. Property that graces certain quantum materials. Current can flow through
the material forever, without dissipating.
SUPERPOSITION. Sum of waves that is a wave itself.
SURPRISAL. How much information you gain upon learning how a random event unfolded.
SZILARD. Maximum amount of energy obtainable from one run of Szilard’s engine. Minimum amount
of energy required to erase a bit of information. Named after a twentieth-century Hungarian-
American physicist.
SZILARD’S ENGINE. Engine usable to turn heat into work, with help from a bit of information.
THERMAL EQUILIBRIUM. One system is in thermal equilibrium with another if both have the
same temperature.
THERMAL MACHINE. Device that uses, produces, or stores heat or work. Examples include heat
engines, refrigerators, heat pumps, ratchets, batteries, and clocks.
THERMAL PHASE. Phase of quantum matter in which particles move, and entanglement spreads,
quickly. Contrasted with the many-body localized phase.
THERMODYNAMIC LIMIT. Idealization, focused on in traditional thermodynamics, in which
systems are infinitely large.
THERMODYNAMICS. Study of energy—the forms it can assume and its transformations among
those forms. Colored by operationalism.
THERMODYNAMIC UNCERTAINTY RELATION. Inequality that interrelates the entropy
produced when particles flow from one bath to another with the fluctuations in the current of
particles.
TRANS CONFIGURATION. “Open” configuration of a photoisomer, or molecular switch.
ÜBERNUMBER. Collection of numbers that, in some ways, acts like a number itself. See matrix.
UNCERTAINTY PRINCIPLE. Limitation on the extent to which one quantum observable (such as
position) has a well-defined value, given the extent to which another observable (such as momentum)
has a well-defined value.
UNCERTAINTY RELATION, QUANTUM. Inequality that encapsulates the uncertainty principle.
UNIVERSAL COMPUTER. Computer that can be programmed to solve any solvable computational
problem, then reprogrammed to solve any other.
VELOCITY. Speed at which, and direction in which, an object is moving.
WAVE FUNCTION. Certain mathematical representation of a quantum state. Emphasizes the state’s
wavelike properties.
WAVE-FUNCTION COLLAPSE. See measurement disturbance.
WAVELENGTH. Distance between two consecutive crests of a wave.
WAVE-PARTICLE DUALITY. Every chunk of matter and light resembles a wave in some ways and
resembles a particle in others.
WORK. Coordinated, organized energy that’s being transferred between systems and that can be
directly harnessed to perform a useful task, such as pushing a car up a hill.
WORK EXTRACTION. Acquisition of useful energy, as from two different-temperature heat baths
via an engine.
REFERENCES
CHAPTER 0 PROLOGUE
1. Malik, Wajeeha. “Inky’s Daring Escape Shows How Smart Octopuses
Are.” National Geographic, April 14, 2016.
https://www.nationalgeographic.com/animals/article/160414-inky-
octopus-escapes-intelligence.
CHAPTER 1 INFORMATION THEORY
1. Schumacher, Benjamin, and Michael Westmoreland. Quantum
Processes, Systems, and Information. New York: Cambridge University
Press, 2010.
2. Munroe, Randall. What If? Serious Scientific Answers to Absurd
Hypothetical Questions. International ed. Boston: Mariner Books, 2014.
3. Suzuki, Jeff. A History of Mathematics. Upper Saddle River, NJ:
Prentice Hall, 2002.
4. “Liver: Anatomy and Functions.” Johns Hopkins Medicine. Accessed
April 4, 2021. https://www.hopkinsmedicine.org/health/conditions-and-
diseases/liver-anatomy-and-functions.
5. Tribus, M., and E. C. McIrvine. “Energy and Information.” Scientific
American 225, no. 3 (September 1971): 179–88, quote at p. 180.
http://www.esalq.usp.br/lepse/imgs/conteudo_thumb/Energy-and-
Information.pdf.
CHAPTER 2 QUANTUM PHYSICS
1. Improbable Research (blog). “Yet Another Prize for Ig-Winning
Ponytail-Physics Researcher,” December 15, 2015.
https://www.improbable.com/2015/12/15/yet-another-prize-for-ig-
winning-ponytail-physics-researcher/.
2. Sebens, Charles T. “How Electrons Spin.” Studies in History and
Philosophy of Science Part B: Studies in History and Philosophy of
Modern Physics 68 (November 1, 2019): 40–50.
https://doi.org/10.1016/j.shpsb.2019.04.007.
3. Heisenberg, W. “Über den anschaulichen Inhalt der quantentheoretischen
Kinematik und Mechanik.” Zeitschrift für Physik 43, no. 3 (March 1,
1927): 172–98. https://doi.org/10.1007/BF01397280.
4. Kennard, E. H. “Zur Quantenmechanik einfacher Bewegungstypen.”
Zeitschrift für Physik 44, no. 4 (April 1, 1927): 326–52.
https://doi.org/10.1007/BF01391200.
5. Bell, J. S. “On the Einstein Podolsky Rosen Paradox.” Physics Physique
Fizika 1, no. 3 (November 1, 1964): 195–200.
https://doi.org/10.1103/PhysicsPhysiqueFizika.1.195.
6. Gribbin, John. Schrödinger’s Kittens and the Search for Reality. New
York: Back Bay Books, 1995.
CHAPTER 3 QUANTUM COMPUTATION
1. Feynman, Richard P. “Simulating Physics with Computers.”
International Journal of Theoretical Physics 21, no. 6 (June 1, 1982):
467–88. https://doi.org/10.1007/BF02650179.
2. Manin, Yuri. Computable and Uncomputable. Moscow: Sovetskoye
Radio, 1980.
3. Benioff, Paul. “The Computer as a Physical System: A Microscopic
Quantum Mechanical Hamiltonian Model of Computers as Represented
by Turing Machines.” Journal of Statistical Physics 22, no. 5 (May 1,
1980): 563–91. https://doi.org/10.1007/BF01011339.
4. Fredkin, Edward, and Tommaso Toffoli. “Conservative Logic.”
International Journal of Theoretical Physics 21, no. 3 (April 1, 1982):
219–53. https://doi.org/10.1007/BF01857727.
5. Deutsch, David. “Quantum Theory, the Church–Turing Principle and the
Universal Quantum Computer.” Proceedings of the Royal Society of
London A. Mathematical and Physical Sciences 400, no. 1818 (July 8,
1985): 97–117. https://doi.org/10.1098/rspa.1985.0070.
6. Altman, Ehud, Kenneth R. Brown, Giuseppe Carleo, Lincoln D. Carr,
Eugene Demler, Cheng Chin, et al. “Quantum Simulators: Architectures
and Opportunities.” PRX Quantum 2, no. 1 (February 24, 2021):
017003. https://doi.org/10.1103/PRXQuantum.2.017003.
CHAPTER 4 THERMODYNAMICS
1. Grahame, Kenneth. The Wind in the Willows. New York: Charles
Scribner’s Sons, 1913. https://www.gutenberg.org/files/27805/27805-
h/27805-h.htm.
2. Prigogine, Ilya. “Nobel Lecture: Time Structure and Fluctuations.”
Nobel Prize website, “The Nobel Prize in Chemistry 1977.” Accessed
April 4, 2021.
https://www.nobelprize.org/prizes/chemistry/1977/prigogine/lecture/.
3. Prigogine, Ilya. “Biographical.” In Nobel Lectures, Chemistry 1971–
1980, translated from the French, edited by Tore Frängsmyr and Sture
Forsén. Singapore: World Scientific Publishing, 1993.
https://www.nobelprize.org/prizes/chemistry/1977/prigogine/biographic
al/.
4. Fowler, R. H., and E. A. Guggenheim. Statistical Thermodynamics: A
Version of Statistical Mechanics for Students of Physics and
Chemistry. New York: Macmillan; Cambridge, UK: Cambridge
University Press, 1939.
5. Fernández-Pineda, C., and S. Velasco. “Comment on ‘Historical
Observations on Laws of Thermodynamics.’ ” Journal of Chemical &
Engineering Data 57, no. 4 (April 12, 2012): 1347–1347.
https://doi.org/10.1021/je300082q.
6. Eddington, Arthur. The Nature of the Physical World. 1928. Reprint,
Cambridge, UK: Cambridge University Press, 2007.
https://henry.pha.jhu.edu/Eddington.2008.pdf.
7. Lloyd, Seth. “Going into Reverse.” Nature 430, no. 7003 (August 2004):
971–971. https://doi.org/10.1038/430971a.
8. Son, Hyungmok, Juliana J. Park, Wolfgang Ketterle, and Alan O.
Jamison. “Collisional Cooling of Ultracold Molecules.” Nature 580, no.
7802 (April 2020): 197–200. https://doi.org/10.1038/s41586-020-2141-
z.
CHAPTER 5 A FINE MERGER
1. Szilard, Leo. “On the Decrease of Entropy in a Thermodynamic System
by the Intervention of Intelligent Beings.” Behavioral Science 9, no. 4
(1964): 301–10. https://doi.org/10.1002/bs.3830090402.
2. Landauer, R. “Irreversibility and Heat Generation in the Computing
Process.” IBM Journal of Research and Development 5, no. 3 (July
1961): 183–91. https://doi.org/10.1147/rd.53.0183.
3. Bennett, Charles H. “Demons, Engines and the Second Law.” Scientific
American 257, no. 5 (November 1987): 108–116.
https://www.jstor.org/stable/24979551.
4. Bender, Carl M., Dorje C. Brody, and Bernhard J. Meister. “Unusual
Quantum States: Non–Locality, Entropy, Maxwell’s Demon and
Fractals.” Proceedings of the Royal Society A: Mathematical, Physical
and Engineering Sciences 461, no. 2055 (March 8, 2005): 733–53.
https://doi.org/10.1098/rspa.2004.1351.
5. Rio, Lídia del, Johan Åberg, Renato Renner, Oscar Dahlsten, and Vlatko
Vedral. “The Thermodynamic Meaning of Negative Entropy.” Nature
476, no. 7361 (August 2011): 476–476.
https://doi.org/10.1038/nature10395.
6. Kim, Sang Wook, Takahiro Sagawa, Simone De Liberato, and Masahito
Ueda. “Quantum Szilard Engine.” Physical Review Letters 106, no. 7
(February 14, 2011): 070401.
https://doi.org/10.1103/PhysRevLett.106.070401.
7. Szilard, Leo. “On the Decrease of Entropy in a Thermodynamic System
by the Intervention of Intelligent Beings.” Behavioral Science 9, no. 4
(1964): 301–10. https://doi.org/10.1002/bs.3830090402.
8. Bennett, Charles H. “The Thermodynamics of Computation—A
Review.” International Journal of Theoretical Physics 21, no. 12
(December 1, 1982): 905–40. https://doi.org/10.1007/BF02084158.
CHAPTER 6 THE PHYSICS OF YESTERDAY’S TOMORROW
1. Watanabe, Satoshi, and Louis de Broglie. Le Deuxième Théorème de La
Thermodynamique et La Mécanique Ondulatoire. Hermann, 1935.
2. Slater, J. C. Introduction to Chemical Physics. 1st ed. New York:
McGraw-Hill, 1939, 46.
3. Demers, Pierre. “Le Second Principe et La Théorie Des Quanta.”
Canadian Journal of Research 11, no. 50 (1944): 27–51.
4. Ramsey, Norman F. “Thermodynamics and Statistical Mechanics at
Negative Absolute Temperatures.” Physical Review 103, no. 1 (July 1,
1956): 20–28. https://doi.org/10.1103/PhysRev.103.20.
5. Scovil, H. E. D., and E. O. Schulz-DuBois. “Three-Level Masers as
Heat Engines.” Physical Review Letters 2, no. 6 (March 15, 1959):
262–63. https://doi.org/10.1103/PhysRevLett.2.262.
6. Geusic, J. E., E. O. Schulz-DuBios, and H. E. D. Scovil. “Quantum
Equivalent of the Carnot Cycle.” Physical Review 156, no. 2 (April 10,
1967): 343–51. https://doi.org/10.1103/PhysRev.156.343.
7. Lindblad, G. “On the Generators of Quantum Dynamical Semigroups.”
Communications in Mathematical Physics 48, no. 2 (June 1, 1976):
119–30. https://doi.org/10.1007/BF01608499.
8. Gorini, Vittorio, Andrzej Kossakowski, and E. C. G. Sudarshan.
“Completely Positive Dynamical Semigroups of N-level Systems.”
Journal of Mathematical Physics 17, no. 5 (May 1, 1976): 821–25.
https://doi.org/10.1063/1.522979.
9. Park, James L., and William Band. “Generalized Two-Level Quantum
Dynamics. III. Irreversible Conservative Motion.” Foundations of
Physics 8, no. 3 (April 1, 1978): 239–54.
https://doi.org/10.1007/BF00715210.
10. Kraus, K. “General State Changes in Quantum Theory.” Annals of
Physics 64, no. 2 (June 1, 1971): 311–35. https://doi.org/10.1016/0003-
4916(71)90108-4.
11. Davies, E. B. “Markovian Master Equations.” Communications in
Mathematical Physics 39, no. 2 (June 1, 1974): 91–110.
https://doi.org/10.1007/BF01608389.
12. Kosloff, Ronnie. “A Quantum Mechanical Open System as a Model of a
Heat Engine.” Journal of Chemical Physics 80, no. 4 (February 15,
1984): 1625–31. https://doi.org/10.1063/1.446862.
13. Alicki, Robert. “The Quantum Open System as a Model of the Heat
Engine.” Journal of Physics A: Mathematical and General 12, no. 5
(1979): L103–7. https://iopscience.iop.org/article/10.1088/0305-
4470/12/5/007.
14. Scully, Robert J., and Marlan O. Scully. The Demon and the Quantum:
From the Pythagorean Mystics to Maxwell’s Demon and Quantum
Mystery. 2nd ed. Weinheim, Germany: Wiley-VCH, 2010.
15. Lloyd, Seth. “Black Holes, Demons, and the Loss of Coherence: How
Complex Systems Get Information, and What They Do with It.” PhD
diss., Rockefeller University, 1988.
16. Goldstein, Sheldon, Joel L. Lebowitz, Roderich Tumulka, and Nino
Zanghì. “Canonical Typicality.” Physical Review Letters 96, no. 5
(February 8, 2006): 050403.
https://doi.org/10.1103/PhysRevLett.96.050403.
17. Popescu, Sandu, Anthony J. Short, and Andreas Winter. “Entanglement
and the Foundations of Statistical Mechanics.” Nature Physics 2, no. 11
(November 2006): 754–58. https://doi.org/10.1038/nphys444.
18. Page, Don N. “Black Hole Information.” ArXiv:Hep-Th/9305040,
February 25, 1995. http://arxiv.org/abs/hep-th/9305040.
19. Prigogine, I., and C. George. “The Second Law as a Selection Principle:
The Microscopic Theory of Dissipative Processes in Quantum Systems.”
Proceedings of the National Academy of Sciences 80, no. 14 (July 1,
1983): 4590–94. https://doi.org/10.1073/pnas.80.14.4590.
20. Anderson, P. W. “More Is Different.” Science 177, no. 4047 (August 4,
1972): 393–96. https://doi.org/10.1126/science.177.4047.393.
21. Frenzel, Max F., David Jennings, and Terry Rudolph. “Reexamination of
Pure Qubit Work Extraction.” Physical Review E 90, no. 5 (November
18, 2014): 052136. https://doi.org/10.1103/PhysRevE.90.052136.
CHAPTER 7 PEDAL TO THE METAL
1. Scovil, H. E. D., and E. O. Schulz-DuBois. “Three-Level Masers as
Heat Engines.” Physical Review Letters 2, no. 6 (March 15, 1959):
262–63. https://doi.org/10.1103/PhysRevLett.2.262.
2. Geusic, J. E., E. O. Schulz-DuBios, and H. E. D. Scovil. “Quantum
Equivalent of the Carnot Cycle.” Physical Review 156, no. 2 (April 10,
1967): 343–51. https://doi.org/10.1103/PhysRev.156.343.
3. Kalaee, Alex Arash Sand, Andreas Wacker, and Patrick P. Potts.
“Violating the Thermodynamic Uncertainty Relation in the Three-Level
Maser.” ArXiv:2103.07791 [Quant-Ph], March 13, 2021.
http://arxiv.org/abs/2103.07791.
4. Campisi, Michele, and Rosario Fazio. “The Power of a Critical Heat
Engine.” Nature Communications 7, no. 1 (June 20, 2016): 11895.
https://doi.org/10.1038/ncomms11895.
5. Oz-Vogt, J., A. Mann, and M. Revzen. “Thermal Coherent States and
Thermal Squeezed States.” Journal of Modern Optics 38, no. 12
(December 1, 1991): 2339–47.
https://doi.org/10.1080/09500349114552501.
6. Roßnagel, J., O. Abah, F. Schmidt-Kaler, K. Singer, and E. Lutz.
“Nanoscale Heat Engine beyond the Carnot Limit.” Physical Review
Letters 112, no. 3 (January 22, 2014): 030602.
https://doi.org/10.1103/PhysRevLett.112.030602.
7. Niedenzu, Wolfgang, David Gelbwaser-Klimovsky, Abraham G.
Kofman, and Gershon Kurizki. “On the Operation of Machines Powered
by Quantum Non-Thermal Baths.” New Journal of Physics 18, no. 8
(August 2, 2016): 083012. https://doi.org/10.1088/1367-
2630/18/8/083012.
8. Gardas, Bartłomiej, and Sebastian Deffner. “Thermodynamic
Universality of Quantum Carnot Engines.” Physical Review E 92, no. 4
(October 12, 2015): 042126.
https://doi.org/10.1103/PhysRevE.92.042126.
9. Klaers, Jan, Stefan Faelt, Atac Imamoglu, and Emre Togan. “Squeezed
Thermal Reservoirs as a Resource for a Nanomechanical Engine beyond
the Carnot Limit.” Physical Review X 7, no. 3 (September 13, 2017):
031044. https://doi.org/10.1103/PhysRevX.7.031044.
10. Yunger Halpern, Nicole, Christopher David White, Sarang
Gopalakrishnan, and Gil Refael. “Quantum Engine Based on Many-Body
Localization.” Physical Review B 99, no. 2 (January 22, 2019): 024203.
https://doi.org/10.1103/PhysRevB.99.024203.
11. Palao, José P., Ronnie Kosloff, and Jeffrey M. Gordon. “Quantum
Thermodynamic Cooling Cycle.” Physical Review E 64, no. 5 (October
30, 2001): 056130. https://doi.org/10.1103/PhysRevE.64.056130.
12. Linden, Noah, Sandu Popescu, and Paul Skrzypczyk. “How Small Can
Thermal Machines Be? The Smallest Possible Refrigerator.” Physical
Review Letters 105, no. 13 (September 21, 2010): 130401.
https://doi.org/10.1103/PhysRevLett.105.130401.
13. Binder, Felix C., Sai Vinjanampathy, Kavan Modi, and John Goold.
“Quantacell: Powerful Charging of Quantum Batteries.” New Journal of
Physics 17, no. 7 (July 22, 2015): 075015.
https://doi.org/10.1088/1367-2630/17/7/075015.
14. Maslennikov, Gleb, Shiqian Ding, Roland Hablützel, Jaren Gan,
Alexandre Roulet, Stefan Nimmrichter et al. “Quantum Absorption
Refrigerator with Trapped Ions.” Nature Communications 10, no. 1
(January 14, 2019): 202. https://doi.org/10.1038/s41467-018-08090-0.
CHAPTER 8 TICK TOCK
1. Brewer, S. M., J.-S. Chen, A. M. Hankin, E. R. Clements, C. W. Chou,
D. J. Wineland, D. B. Hume, and D. R. Leibrandt. “²⁷Al⁺ Quantum-Logic
Clock with a Systematic Uncertainty below 10⁻¹⁸.” Physical Review
Letters 123, no. 3 (July 15, 2019): 033201.
https://doi.org/10.1103/PhysRevLett.123.033201.
2. Dubé, Pierre. “Ion Clock Busts into New Precision Regime.” Physics 12
(July 15, 2019). https://physics.aps.org/articles/v12/79.
3. Newell, David B., and Eite Tiesinga. “Reference on Constants, Units,
and Uncertainty: International System of Units (SI).” NIST website.
Accessed May 6, 2021. https://physics.nist.gov/cuu/Units/current.html.
4. Pauli, Wolfgang. Handbuch der Physik. 1st ed. Vol. 23. Berlin:
Springer, 1926.
5. Pauli, Wolfgang. Handbuch der Physik. 2nd ed. Vol. 24. Berlin:
Springer, 1933.
6. Pauli, Wolfgang. Handbuch der Physik. Vol. 5, Part 1: Prinzipien der
Quantentheorie I. Berlin: Springer, 1958.
7. Woods, Mischa P., Ralph Silva, and Jonathan Oppenheim. “Autonomous
Quantum Machines and Finite-Sized Clocks.” Annales Henri Poincaré
20, no. 1 (January 1, 2019): 125–218. https://doi.org/10.1007/s00023-
018-0736-9.
8. Yunger Halpern, Nicole, and David T. Limmer. “Fundamental
Limitations on Photoisomerization from Thermodynamic Resource
Theories.” Physical Review A 101, no. 4 (April 17, 2020): 042116.
https://doi.org/10.1103/PhysRevA.101.042116.
CHAPTER 9 UNSTEADY AS SHE GOES
1. “Selling the Victorians.” The National Archives website. Accessed April
5, 2021.
https://www.nationalarchives.gov.uk/education/resources/selling-the-
victorians/.
2. Mossa, A., M. Manosas, N. Forns, J. M. Huguet, and F. Ritort. “Dynamic
Force Spectroscopy of DNA Hairpins: I. Force Kinetics and Free
Energy Landscapes.” Journal of Statistical Mechanics: Theory and
Experiment 2009, no. 2 (February 25, 2009): P02060.
https://doi.org/10.1088/1742-5468/2009/02/P02060.
3. Schroeder, Daniel V. An Introduction to Thermal Physics. San
Francisco: Pearson, 1999.
4. Jarzynski, C. “Nonequilibrium Equality for Free Energy Differences.”
Physical Review Letters 78, no. 14 (April 7, 1997): 2690–93.
https://doi.org/10.1103/PhysRevLett.78.2690.
5. Crooks, Gavin E. “Entropy Production Fluctuation Theorem and the
Nonequilibrium Work Relation for Free Energy Differences.” Physical
Review E 60, no. 3 (September 1, 1999): 2721–26.
https://doi.org/10.1103/PhysRevE.60.2721.
6. Liphardt, Jan, Sophie Dumont, Steven B. Smith, Ignacio Tinoco, and
Carlos Bustamante. “Equilibrium Information from Nonequilibrium
Measurements in an Experimental Test of Jarzynski’s Equality.” Science
296, no. 5574 (June 7, 2002): 1832–35.
https://doi.org/10.1126/science.1071152.
7. Hummer, Gerhard, and Attila Szabo. “Free Energy Reconstruction from
Nonequilibrium Single-Molecule Pulling Experiments.” Proceedings of
the National Academy of Sciences 98, no. 7 (March 27, 2001): 3658–
61. https://doi.org/10.1073/pnas.071034098.
8. Blickle, V., T. Speck, L. Helden, U. Seifert, and C. Bechinger.
“Thermodynamics of a Colloidal Particle in a Time-Dependent
Nonharmonic Potential.” Physical Review Letters 96, no. 7 (February
23, 2006): 070603. https://doi.org/10.1103/PhysRevLett.96.070603.
9. Douarche, F., S. Ciliberto, A. Petrosyan, and I. Rabbiosi. “An
Experimental Test of the Jarzynski Equality in a Mechanical
Experiment.” EPL (Europhysics Letters) 70, no. 5 (April 29, 2005):
593. https://doi.org/10.1209/epl/i2005-10024-4.
10. Misof, K., W. J. Landis, K. Klaushofer, and P. Fratzl. “Collagen from the
Osteogenesis Imperfecta Mouse Model (OIM) Shows Reduced
Resistance against Tensile Stress.” Journal of Clinical Investigation
100, no. 1 (July 1, 1997): 40–45. https://doi.org/10.1172/JCI119519.
11. Herczenik, Eszter, and Martijn F. B. G. Gebbink. “Molecular and
Cellular Aspects of Protein Misfolding and Disease.” FASEB Journal
22, no. 7 (2008): 2115–33. https://doi.org/10.1096/fj.07-099671.
12. Utsumi, Y., D. S. Golubev, M. Marthaler, K. Saito, T. Fujisawa, and
Gerd Schön. “Bidirectional Single-Electron Counting and the Fluctuation
Theorem.” Physical Review B 81, no. 12 (March 29, 2010): 125331.
https://doi.org/10.1103/PhysRevB.81.125331.
13. Küng, B., C. Rössler, M. Beck, M. Marthaler, D. S. Golubev, Y. Utsumi
et al. “Irreversibility on the Level of Single-Electron Tunneling.”
Physical Review X 2, no. 1 (January 13, 2012): 011001.
https://doi.org/10.1103/PhysRevX.2.011001.
14. Saira, O.-P., Y. Yoon, T. Tanttu, M. Möttönen, D. V. Averin, and J. P.
Pekola. “Test of the Jarzynski and Crooks Fluctuation Relations in an
Electronic System.” Physical Review Letters 109, no. 18 (October 31,
2012): 180601. https://doi.org/10.1103/PhysRevLett.109.180601.
15. Bartolotta, Anthony, and Sebastian Deffner. “Jarzynski Equality for
Driven Quantum Field Theories.” Physical Review X 8, no. 1 (February
27, 2018): 011033. https://doi.org/10.1103/PhysRevX.8.011033.
16. Ortega, Alvaro, Emma McKay, Álvaro M. Alhambra, and Eduardo
Martín-Martínez. “Work Distributions on Quantum Fields.” Physical
Review Letters 122, no. 24 (June 21, 2019): 240604.
https://doi.org/10.1103/PhysRevLett.122.240604.
17. Bruschi, David, Benjamin Morris, and Ivette Fuentes. “Thermodynamics
of Relativistic Quantum Fields Confined in Cavities.” Physics Letters A
384, no. 25 (September 7, 2020): 126601.
https://doi.org/10.1016/j.physleta.2020.126601.
18. Teixidó-Bonfill, Adam, Alvaro Ortega, and Eduardo Martín-Martínez.
“First Law of Quantum Field Thermodynamics.” Physical Review A 102,
no. 5 (November 18, 2020): 052219.
https://doi.org/10.1103/PhysRevA.102.052219.
19. Liu, Nana, John Goold, Ivette Fuentes, Vlatko Vedral, Kavan Modi, and
David Bruschi. “Quantum Thermodynamics for a Model of an Expanding
Universe.” Classical and Quantum Gravity 33, no. 3 (January 11,
2016): 035003.
20. An, Shuoming, Jing-Ning Zhang, Mark Um, Dingshun Lv, Yao Lu, Junhua
Zhang et al. “Experimental Test of the Quantum Jarzynski Equality with a
Trapped-Ion System.” Nature Physics 11, no. 2 (February 2015): 193–
99. https://doi.org/10.1038/nphys3197.
21. Batalhão, Tiago B., Alexandre M. Souza, Laura Mazzola, Ruben
Auccaise, Roberto S. Sarthour, Ivan S. Oliveira et al. “Experimental
Reconstruction of Work Distribution and Study of Fluctuation Relations
in a Closed Quantum System.” Physical Review Letters 113, no. 14
(October 3, 2014): 140601.
https://doi.org/10.1103/PhysRevLett.113.140601.
22. Naghiloo, M., J. J. Alonso, A. Romito, E. Lutz, and K. W. Murch.
“Information Gain and Loss for a Quantum Maxwell’s Demon.” Physical
Review Letters 121, no. 3 (July 17, 2018): 030604.
https://doi.org/10.1103/PhysRevLett.121.030604.
23. Zhang, Zhenxing, Tenghui Wang, Liang Xiang, Zhilong Jia, Peng Duan,
Weizhou Cai et al. “Experimental Demonstration of Work Fluctuations
along a Shortcut to Adiabaticity with a Superconducting Xmon Qubit.”
New Journal of Physics 20, no. 8 (August 2, 2018): 085001.
https://doi.org/10.1088/1367-2630/aad4e7.
24. Cerisola, Federico, Yair Margalit, Shimon Machluf, Augusto J.
Roncaglia, Juan Pablo Paz, and Ron Folman. “Using a Quantum Work
Meter to Test Non-Equilibrium Fluctuation Theorems.” Nature
Communications 8, no. 1 (November 1, 2017): 1241.
https://doi.org/10.1038/s41467-017-01308-7.
25. Hernández-Gómez, S., S. Gherardini, F. Poggiali, F. S. Cataliotti, A.
Trombettoni, P. Cappellaro et al. “Experimental Test of Exchange
Fluctuation Relations in an Open Quantum System.” Physical Review
Research 2, no. 2 (June 12, 2020): 023327.
https://doi.org/10.1103/PhysRevResearch.2.023327.
CHAPTER 10 ENTROPY, ENERGY, AND A TINY POSSIBILITY
1. Rényi, Alfréd. “On Measures of Entropy and Information,” Proceedings
of the Fourth Berkeley Symposium on Mathematical Statistics and
Probability, Vol. 1, 547–61. University of California, Berkeley:
University of California Press, 1961.
2. Faist, Philippe. “Welcome to the Entropy Zoo.” Personal website of
Philippe Faist. Accessed April 5, 2021.
https://phfaist.com/d/entropyzoo/TheEntropyZoo.pdf.
3. Rio, Lídia del, Johan Åberg, Renato Renner, Oscar Dahlsten, and Vlatko
Vedral. “The Thermodynamic Meaning of Negative Entropy.” Nature
476, no. 7361 (August 2011): 476–476.
https://doi.org/10.1038/nature10395.
4. Yunger Halpern, Nicole, Andrew J. P. Garner, Oscar C. O. Dahlsten, and
Vlatko Vedral. “Introducing One-Shot Work into Fluctuation Relations.”
New Journal of Physics 17, no. 9 (September 11, 2015): 095003.
https://doi.org/10.1088/1367-2630/17/9/095003.
5. Burnette, Joyce. “Women Workers in the British Industrial Revolution.”
EH.Net Encyclopedia, edited by Robert Whaples, March 26, 2008.
https://eh.net/encyclopedia/women-workers-in-the-british-industrial-
revolution/.
6. Lamb, Evelyn. “5 Sigma: What’s That?” Scientific American (blog),
July 17, 2012. https://blogs.scientificamerican.com/observations/five-
sigmawhats-that/.
7. Jarzynski, Christopher. “Rare Events and the Convergence of
Exponentially Averaged Work Values.” Physical Review E 73, no. 4
(April 5, 2006): 046105. https://doi.org/10.1103/PhysRevE.73.046105.
8. Yunger Halpern, Nicole, and Christopher Jarzynski. “Number of Trials
Required to Estimate a Free-Energy Difference, Using Fluctuation
Relations.” Physical Review E 93, no. 5 (May 26, 2016): 052144.
https://doi.org/10.1103/PhysRevE.93.052144.
CHAPTER 11 RESOURCE THEORIES
1. Yirka, Bob. “Researchers Demonstrate Teleportation Using On-Demand
Photons from Quantum Dots.” Science X website. Accessed April 5,
2021. https://phys.org/news/2018-12-teleportation-on-demand-photons-
quantum-dots.html.
2. Horodecki, Ryszard, Paweł Horodecki, Michał Horodecki, and Karol
Horodecki. “Quantum Entanglement.” Reviews of Modern Physics 81,
no. 2 (June 17, 2009): 865–942.
https://doi.org/10.1103/RevModPhys.81.865.
3. Chitambar, Eric, and Gilad Gour. “Quantum Resource Theories.”
Reviews of Modern Physics 91, no. 2 (April 4, 2019): 025001.
https://doi.org/10.1103/RevModPhys.91.025001.
4. Marshall, Albert W., Ingram Olkin, and Barry C. Arnold. Inequalities:
Theory of Majorization and Its Applications. Springer Series in
Statistics. New York: Springer, 2011.
5. Ruch, Ernst, Rudolf Schranner, and Thomas H. Seligman. “The Mixing
Distance.” Journal of Chemical Physics 69, no. 1 (July 1, 1978): 386–
92. https://doi.org/10.1063/1.436364.
6. Janzing, D., P. Wocjan, R. Zeier, R. Geiss, and T. Beth. “Thermodynamic
Cost of Reliability and Low Temperatures: Tightening Landauer’s
Principle and the Second Law.” International Journal of Theoretical
Physics 39, no. 12 (December 1, 2000): 2717–53.
https://doi.org/10.1023/A:1026422630734.
7. Horodecki, Michał, and Jonathan Oppenheim. “Fundamental Limitations
for Quantum and Nanoscale Thermodynamics.” Nature Communications
4, no. 1 (June 26, 2013): 2059. https://doi.org/10.1038/ncomms3059.
8. Gour, Gilad, David Jennings, Francesco Buscemi, Runyao Duan, and
Iman Marvian. “Quantum Majorization and a Complete Set of Entropic
Conditions for Quantum Thermodynamics.” Nature Communications 9,
no. 1 (December 17, 2018): 5352. https://doi.org/10.1038/s41467-018-
06261-7.
9. Gour, Gilad, Markus P. Müller, Varun Narasimhachar, Robert W.
Spekkens, and Nicole Yunger Halpern. “The Resource Theory of
Informational Nonequilibrium in Thermodynamics.” Physics Reports
583 (July 2, 2015): 1–58.
https://doi.org/10.1016/j.physrep.2015.04.003.
10. Brandão, Fernando, Michał Horodecki, Nelly Ng, Jonathan Oppenheim,
and Stephanie Wehner. “The Second Laws of Quantum
Thermodynamics.” Proceedings of the National Academy of Sciences
112, no. 11 (March 17, 2015): 3275–79.
https://doi.org/10.1073/pnas.1411728112.
11. Yunger Halpern, Nicole, and Joseph M. Renes. “Beyond Heat Baths:
Generalized Resource Theories for Small-Scale Thermodynamics.”
Physical Review E 93, no. 2 (February 18, 2016): 022126.
https://doi.org/10.1103/PhysRevE.93.022126.
12. Yunger Halpern, Nicole. “Beyond Heat Baths II: Framework for
Generalized Thermodynamic Resource Theories.” Journal of Physics A:
Mathematical and Theoretical 51, no. 9 (February 1, 2018): 094001.
13. Vaccaro, Joan A., and Stephen M. Barnett. “Information Erasure without
an Energy Cost.” Proceedings of the Royal Society A: Mathematical,
Physical and Engineering Sciences 467, no. 2130 (June 8, 2011):
1770–78. https://doi.org/10.1098/rspa.2010.0577.
14. Yunger Halpern, Nicole. “Toward Physical Realizations of
Thermodynamic Resource Theories.” In Information and Interaction:
Eddington, Wheeler, and the Limits of Knowledge, edited by Ian T.
Durham and Dean Rickles, 135–66. The Frontiers Collection. Cham,
Germany: Springer International, 2017. https://doi.org/10.1007/978-3-
319-43760-6_8.
15. Yunger Halpern, Nicole, Andrew J. P. Garner, Oscar C. O. Dahlsten, and
Vlatko Vedral. “Introducing One-Shot Work into Fluctuation Relations.”
New Journal of Physics 17, no. 9 (September 11, 2015): 095003.
https://doi.org/10.1088/1367-2630/17/9/095003.
16. Alhambra, Álvaro M., Lluis Masanes, Jonathan Oppenheim, and
Christopher Perry. “Fluctuating Work: From Quantum Thermodynamical
Identities to a Second Law Equality.” Physical Review X 6, no. 4
(October 24, 2016): 041017.
https://doi.org/10.1103/PhysRevX.6.041017.
17. Kucharski, Timothy J., Nicola Ferralis, Alexie M. Kolpak, Jennie O.
Zheng, Daniel G. Nocera, and Jeffrey C. Grossman. “Templated
Assembly of Photoswitches Significantly Increases the Energy-Storage
Capacity of Solar Thermal Fuels.” Nature Chemistry 6, no. 5 (May
2014): 441–47. https://doi.org/10.1038/nchem.1918.
18. Yunger Halpern, Nicole, and David T. Limmer. “Fundamental
Limitations on Photoisomerization from Thermodynamic Resource
Theories.” Physical Review A 101, no. 4 (April 17, 2020): 042116.
https://doi.org/10.1103/PhysRevA.101.042116.
CHAPTER 12 THE UNSEEN KINGDOM
1. Yunger Halpern, Nicole, and Joseph M. Renes. “Beyond Heat Baths:
Generalized Resource Theories for Small-Scale Thermodynamics.”
Physical Review E 93, no. 2 (February 18, 2016): 022126.
https://doi.org/10.1103/PhysRevE.93.022126.
2. Yunger Halpern, Nicole. “Beyond Heat Baths II: Framework for
Generalized Thermodynamic Resource Theories.” Journal of Physics A:
Mathematical and Theoretical 51, no. 9 (February 1, 2018): 094001.
3. Lostaglio, Matteo. “The Resource Theory of Quantum
Thermodynamics.” Master’s thesis, Imperial College London, 2014.
4. Jaynes, E. T. “Information Theory and Statistical Mechanics.” Physical
Review 106, no. 4 (May 15, 1957): 620–30.
https://doi.org/10.1103/PhysRev.106.620.
5. Jaynes, E. T. “Information Theory and Statistical Mechanics. II.”
Physical Review 108, no. 2 (October 15, 1957): 171–90.
https://doi.org/10.1103/PhysRev.108.171.
6. Balian, Roger, and N. L. Balazs. “Equiprobability, Inference, and
Entropy in Quantum Theory.” Annals of Physics 179, no. 1 (October 1,
1987): 97–144. https://doi.org/10.1016/S0003-4916(87)80006-4.
7. Balian, Roger, Yoram Alhassid, and Hugo Reinhardt. “Dissipation in
Many-Body Systems: A Geometric Approach Based on Information
Theory.” Physics Reports 131, no. 1 (January 1, 1986): 1–146.
https://doi.org/10.1016/0370-1573(86)90005-0.
8. Lostaglio, Matteo, David Jennings, and Terry Rudolph. “Thermodynamic
Resource Theories, Non-Commutativity and Maximum Entropy
Principles.” New Journal of Physics 19, no. 4 (April 6, 2017): 043008.
https://doi.org/10.1088/1367-2630/aa617f.
9. Guryanova, Yelena, Sandu Popescu, Anthony J. Short, Ralph Silva, and
Paul Skrzypczyk. “Thermodynamics of Quantum Systems with Multiple
Conserved Quantities.” Nature Communications 7, no. 1 (July 7, 2016):
12049. https://doi.org/10.1038/ncomms12049.
10. Yunger Halpern, Nicole, Philippe Faist, Jonathan Oppenheim, and
Andreas Winter. “Microcanonical and Resource-Theoretic Derivations
of the Thermal State of a Quantum System with Noncommuting Charges.”
Nature Communications 7, no. 1 (July 7, 2016): 12051.
https://doi.org/10.1038/ncomms12051.
11. Yunger Halpern, Nicole, Michael E. Beverland, and Amir Kalev.
“Noncommuting Conserved Charges in Quantum Many-Body
Thermalization.” Physical Review E 101, no. 4 (April 15, 2020):
042117. https://doi.org/10.1103/PhysRevE.101.042117.
12. Manzano, Gonzalo, Juan M. R. Parrondo, and Gabriel T. Landi. “Non-
Abelian Quantum Transport and Thermosqueezing Effects.”
arXiv:2011.04560 [cond-mat, quant-ph], November 9, 2020.
http://arxiv.org/abs/2011.04560.
CHAPTER 13 ALL OVER THE MAP
1. Sørensen, Ole W. “A Universal Bound on Spin Dynamics.” Journal of
Magnetic Resonance (1969) 86, no. 2 (February 1, 1990): 435–40.
https://doi.org/10.1016/0022-2364(90)90278-H.
2. Schulman, Leonard J., and Umesh V. Vazirani. “Molecular Scale Heat
Engines and Scalable Quantum Computation.” In Proceedings of the
Thirty-First Annual ACM Symposium on Theory of Computing (STOC),
322–29. Atlanta, Georgia: Association for Computing Machinery, 1999.
https://doi.org/10.1145/301250.301332.
3. Park, Daniel K., Nayeli A. Rodriguez-Briones, Guanru Feng, Robabeh
Rahimi, Jonathan Baugh, and Raymond Laflamme. “Heat Bath
Algorithmic Cooling with Spins: Review and Prospects.” In Electron
Spin Resonance (ESR) Based Quantum Computing, edited by Takeji
Takui, Lawrence Berliner, and Graeme Hanson, 227–55. Biological
Magnetic Resonance series, vol. 31. New York: Springer, 2016.
https://doi.org/10.1007/978-1-4939-3658-8_8.
4. Boykin, P. Oscar, Tal Mor, Vwani Roychowdhury, Farrokh Vatan, and
Rutger Vrijen. “Algorithmic Cooling and Scalable NMR Quantum
Computers.” Proceedings of the National Academy of Sciences 99, no.
6 (March 19, 2002): 3388–93. https://doi.org/10.1073/pnas.241641898.
5. Horowitz, Jordan M., and Todd R. Gingrich. “Thermodynamic
Uncertainty Relations Constrain Non-Equilibrium Fluctuations.” Nature
Physics 16, no. 1 (January 2020): 15–20.
https://doi.org/10.1038/s41567-019-0702-6.
6. Pietzonka, Patrick, Andre C. Barato, and Udo Seifert. “Universal Bound
on the Efficiency of Molecular Motors.” Journal of Statistical
Mechanics: Theory and Experiment 2016, no. 12 (December 30, 2016):
124004. https://doi.org/10.1088/1742-5468/2016/12/124004.
7. Ptaszyński, Krzysztof. “Coherence-Enhanced Constancy of a Quantum
Thermoelectric Generator.” Physical Review B 98, no. 8 (August 20,
2018): 085425. https://doi.org/10.1103/PhysRevB.98.085425.
8. Agarwalla, Bijay Kumar, and Dvira Segal. “Assessing the Validity of the
Thermodynamic Uncertainty Relation in Quantum Systems.” Physical
Review B 98, no. 15 (October 26, 2018): 155438.
https://doi.org/10.1103/PhysRevB.98.155438.
9. Macieszczak, Katarzyna, Kay Brandner, and Juan P. Garrahan. “Unified
Thermodynamic Uncertainty Relations in Linear Response.” Physical
Review Letters 121, no. 13 (September 24, 2018): 130601.
https://doi.org/10.1103/PhysRevLett.121.130601.
10. Guéry-Odelin, D., A. Ruschhaupt, A. Kiely, E. Torrontegui, S. Martínez-
Garaot, and J. G. Muga. “Shortcuts to Adiabaticity: Concepts, Methods,
and Applications.” Reviews of Modern Physics 91, no. 4 (October 24,
2019): 045001. https://doi.org/10.1103/RevModPhys.91.045001.
11. Albash, Tameem, and Daniel A. Lidar. “Adiabatic Quantum
Computation.” Reviews of Modern Physics 90, no. 1 (January 29, 2018):
015002. https://doi.org/10.1103/RevModPhys.90.015002.
12. Bäumer, Elisa, Matteo Lostaglio, Martí Perarnau-Llobet, and Rui
Sampaio. “Fluctuating Work in Coherent Quantum Systems: Proposals
and Limitations.” In Thermodynamics in the Quantum Regime:
Fundamental Aspects and New Directions, edited by Felix Binder, Luis
A. Correa, Christian Gogolin, Janet Anders, and Gerardo Adesso, 275–
300. Fundamental Theories of Physics series. Cham, Germany: Springer
International, 2018. https://doi.org/10.1007/978-3-319-99046-0_11.
13. Wakakuwa, Eyuri. “Operational Resource Theory of Non-
Markovianity.” arXiv:1709.07248 [quant-ph], October 3, 2017.
http://arxiv.org/abs/1709.07248.
14. Pezzutto, Marco, Mauro Paternostro, and Yasser Omar. “Implications of
Non-Markovian Quantum Dynamics for the Landauer Bound.” New
Journal of Physics 18, no. 12 (December 15, 2016): 123018.
https://doi.org/10.1088/1367-2630/18/12/123018.
15. Mehboudi, Mohammad, Anna Sanpera, and Luis A. Correa.
“Thermometry in the Quantum Regime: Recent Theoretical Progress.”
Journal of Physics A: Mathematical and Theoretical 52, no. 30 (July
26, 2019): 303001. https://doi.org/10.1088/1751-8121/ab2828.
16. Jevtic, Sania, David Newman, Terry Rudolph, and T. M. Stace. “Single-
Qubit Thermometry.” Physical Review A 91, no. 1 (January 22, 2015):
012331. https://doi.org/10.1103/PhysRevA.91.012331.
17. Stace, Thomas M. “Quantum Limits of Thermometry.” Physical Review
A 82, no. 1 (July 30, 2010): 011611.
https://doi.org/10.1103/PhysRevA.82.011611.
CHAPTER 14 STEPPING OFF THE MAP
1. “Black Holes.” Science Mission Directorate, NASA Science website.
Accessed April 5, 2021. https://science.nasa.gov/astrophysics/focus-
areas/black-holes.
2. Hawking, S. W. “Particle Creation by Black Holes.” Communications
in Mathematical Physics 43, no. 3 (August 1, 1975): 199–220.
https://doi.org/10.1007/BF02345020.
3. Kitaev, Alexei. “A Simple Model of Quantum Holography (Part 1).”
Conference presentation at Entanglement in Strongly Correlated Quantum
Matter, Kavli Institute for Theoretical Physics, April 7, 2015.
https://online.kitp.ucsb.edu/online/entangled15/kitaev/.
4. Larkin, A. I., and Yu. N. Ovchinnikov. “Quasiclassical Method in the
Theory of Superconductivity.” Soviet Journal of Experimental and
Theoretical Physics 28 (June 1, 1969): 1200.
5. “This Month in Physics History: Circa January 1961: Lorenz and the
Butterfly Effect.” APS News 12, no. 1 (January 2003).
http://www.aps.org/publications/apsnews/200301/history.cfm.
6. Yunger Halpern, Nicole. “Jarzynski-like Equality for the Out-of-Time-
Ordered Correlator.” Physical Review A 95, no. 1 (January 17, 2017):
012120. https://doi.org/10.1103/PhysRevA.95.012120.
7. Solinas, P., and S. Gasparinetti. “Full Distribution of Work Done on a
Quantum System for Arbitrary Initial States.” Physical Review E 92, no.
4 (October 23, 2015): 042150.
https://doi.org/10.1103/PhysRevE.92.042150.
8. Campisi, Michele, and John Goold. “Thermodynamics of Quantum
Information Scrambling.” Physical Review E 95, no. 6 (June 20, 2017):
062127. https://doi.org/10.1103/PhysRevE.95.062127.
9. Touil, Akram, and Sebastian Deffner. “Quantum Scrambling and the
Growth of Mutual Information.” Quantum Science and Technology 5,
no. 3 (May 26, 2020): 035005. https://doi.org/10.1088/2058-
9565/ab8ebb.
10. Arute, Frank, Kunal Arya, Ryan Babbush, Dave Bacon, Joseph C.
Bardin, Rami Barends et al. “Quantum Supremacy Using a
Programmable Superconducting Processor.” Nature 574, no. 7779
(October 2019): 505–10. https://doi.org/10.1038/s41586-019-1666-5.
EPILOGUE WHERE TO NEXT?
1. Choi, Joonhee, Hengyun Zhou, Renate Landig, Hai-Yin Wu, Xiaofei Yu,
Stephen E. Von Stetina et al. “Probing and Manipulating Embryogenesis
via Nanoscale Thermometry and Temperature Control.” Proceedings of
the National Academy of Sciences 117, no. 26 (June 30, 2020): 14636–
41. https://doi.org/10.1073/pnas.1922730117.
2. Fujiwara, Masazumi, Simo Sun, Alexander Dohms, Yushi Nishimura,
Ken Suto, Yuka Takezawa et al. “Real-Time Nanodiamond Thermometry
Probing in Vivo Thermogenic Responses.” Science Advances 6, no. 37
(September 1, 2020): eaba9636.
https://doi.org/10.1126/sciadv.aba9636.
INDEX
Dalton, John, 89
Darwin, Charles, 200
data compression, 21–23, 260
Davies, Paul, 115
decoherence, 48, 261
defects in diamond, 67, 261
Democritus, 89
demon, Maxwell’s, 107–10, 115, 264
deterministic strategies, 180
Deutsch, David, 59–60
Devs, 61
diamonds, quantum-computing, 67, 254
Dickens, Charles, 72
dilution refrigerators, 66, 261
disagreeing quantum observables, 208–19
Disney World, 75
dissipation, 76, 85, 110, 261
disturbance. See measurement disturbance
DNA hairpins: Boltzmann balance, 182–83
definition of, 261
fluctuation relations, 163–67, 169–70, 172
driving, Floquet, 123–24, 262
economics, 196
Eddington, Sir Arthur, 86–87, 160
efficiency: Carnot, 80, 85, 260
of quantum engines, 133–34
Egypt, ancient, 155
Einstein, Albert: on entanglement, 61n
on measurement, 8
theory of general relativity, 27, 47, 86, 169, 262
electrical energy, 76, 261
electrodynamics, 27, 261
electrons, 90
electron spin, 32–34, 51, 68
elephants, 120–22, 126, 174
Eliot, T. S., 37, 86
Eminem, 177
encryption, 58, 261
energy: chemical, 76, 261
definition of, 261
dissipation of, 110
electrical, 76, 261
free, 92, 163, 165, 262
gravitational potential, 28–29, 31, 261
kinetic, 76, 97, 261
one-shot thermodynamics, 175–85
quanta of, 29
in quantum theory, 215n
szilards of, 98, 267
energy ladders, 28–31, 194
engine cycles: Carnot, 77–80, 260
definition of, 261
Otto, 138, 141
engines: Carnot, 98–99, 108
heat, 76–80, 263
quantum, 127–37
steam, 72–75, 77–80
Szilard’s, 94–99, 267
entanglement: definition of, 261
Einstein on, 61n
maximal, 47–48, 264
monogamy of, 48–49, 264
in quantum computation, 50, 52, 56, 63, 65
in quantum physics, 4–6, 41–49, 111–12
in quantum thermal machines, 128, 140, 146
traits of, 46–48
entropy(-ies), 94, 213–14, 234
concept of, 23–25
definition of, 20, 261
one-shot, 179–80, 184, 261
Rényi, 178–79, 198–99, 261
Shannon, 21, 23–24, 39–40, 69–70, 79, 178, 261–62
thermodynamic, 73, 83–84
von Neumann, 69–70, 79, 262
equilibrium, 80–81
definition of, 262
thermal, 81, 267
equilibrium states, 213–14
erasure, Landauer, 99–102, 110, 263
events, 16
experiments, 202, 253
Halloween, 28
hardware, 65
Harvard University, 10
Hatsopoulos, George, 114
Hawking, Stephen, 115, 241
heat, 76, 94–95, 132–33
definition of, 262
quantum, 117–26
heat baths, 234–35
definition of, 262
hot baths, 77–80, 262
infinite-temperature, 124n*, 131–33
heat capacity, 87–88, 262
heat engines, 76–77
Carnot cycle, 77–80, 260
definition of, 262
quantum, 128–34
heat pumps, 233
heat reservoirs, 77–78, 262
Hebrew University in Jerusalem, 114
Heisenberg, Werner, 39
Hero of Alexandria, 73
Higgs boson, 183
His Dark Materials (Pullman), 8
history, 73–76, 113–17
Honeywell, 6, 61
Hood, Robin, 196
hot baths, 77–80, 262. See also heat baths
hummingbirds, 124–26, 174
Huntington Library, Art Museum, and Botanical Gardens, 9–10
machines: mathematical, 18
quantum thermal, 127–48
thermal, 127–28, 267
macrostates, 81, 263
magnetic resonance imaging (MRI), 67, 174
Manchester, England, 89
Manin, Yuri, 59
many-body localization (MBL), 137–44, 263
many-body-localized phase, 139, 141–42
many-body systems, 139
mascots, 107–10
masers, 128, 263
mathematical machines, 18
matrix, 211, 263. See also übernumbers
The Matrix (1999), 156
maximal entanglement, 47–48, 263
Maxwell, James Clerk, 27, 89–90, 107–9, 264
Maxwell’s demon, 107–10, 115, 264
MBL (many-body localization), 137–44, 263
MBL-mobile engine, 137–45, 239–40, 253, 264
measurement: of information, 15–19
of spin, 51–52
measurement disturbance, 6, 41, 49, 62, 236, 264
mechanics: classical, 27, 85, 260
statistical, 90–91, 267
memory, quantum, 139–40
metrology, 63, 264
microcanonical ensembles, 215–16, 264
microcanonical states, 215–16, 264
Microsoft, 6, 61
microstates, 81, 264
MIT, 114
molecular motors, 147
molecular switches, 156–59, 204–5, 240, 264
momentum, 38
angular, 32–33, 259
definition of, 264
monogamy of entanglement, 48–49, 264
motorcars, 75–76
motors, molecular, 147
MRI (magnetic resonance imaging), 67, 174
Mr. Toad’s Wild Ride (Disney World), 75
Munroe, Randall, 15
Museum of Natural History (London), 158n
Pasadena, California, 9
passwords, 13–25
Pauli, Wolfgang, 105
Pauli’s exclusion principle, 105, 265
Perimeter Institute for Theoretical Physics, 8–9
perpetual-motion machines, 109–10
perpetuum mobile, 109–10, 265
phase(s) of matter, 63
definition of, 265
many-body-localized, 139, 141–42
thermal, 141–42, 267
phase transitions, 135
photodetectors, 124–25, 265
photoisomers, 204, 240, 265
photons, 29, 265
physics: atomic, 248–49
classical, 27, 44–46
condensed-matter, 249
steampunk, 1–3. See also quantum physics
Physics Modified, 8
The Pinhoe Egg (Jones), 24–25
platforms, 65–68, 265
Plato, 156
Poe, Edgar Allan, 198–99
popular culture, 61
postquantum cryptography, 5
power, 134, 265
Preskill, John, 9, 244, 247
Prigogine, Ilya, 82, 115
prime factoring, 57–58
prime numbers: definition of, 57, 265
factoring numbers into, 57
principle of no signaling, 47, 265
probabilistic strategies, 180
probability, 13–25
probability distribution, 21, 265
probability weight, 133
proof-of-principle experiments, 253
protons, 90
Prufrock, J. Alfred, 37, 86
Pullman, Philip, 8
Washington Post, 61
Washington University in St. Louis, 213
Waterloo, Canada, 8–9
Watt, James, 74–75
wave function, 41, 268
wave-function collapse. See measurement disturbance
wavelength, 34–35, 268
wave-particle duality, 34–37, 49, 268
waves, gravitational, 63
weak measurement, 125
Wells, H. G., 2–3
White, Christopher D., 140
wildebeests, 122–24, 126
Wild West, 60–61, 169
Wild Wild West (1999), 7–8
Woods, Mischa, 154
work, 76–80, 95
definition of, 268
quantum, 117–26
quantum engine, 133
szilards of, 98, 267
work extraction, 77–80, 268
Charles Bennett resolved the paradox of Maxwell's demon by linking information processing to thermodynamics: erasing the demon's memory produces at least as much entropy as the demon's sorting removes. This resolution highlighted the role of information theory in thermodynamics and established the principle that information processing carries thermodynamic costs, extending thermodynamic analysis to data processing and computation.
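Bennett's argument rests on Landauer's principle: erasing one bit of information at temperature T dissipates at least k_B T ln 2 of heat. A minimal sketch of that bound (the function name is ours, for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin


def landauer_bound(temperature_kelvin: float, bits: int = 1) -> float:
    """Minimum heat (joules) dissipated when erasing `bits` bits at temperature T."""
    return bits * k_B * temperature_kelvin * math.log(2)


# Erasing one bit at room temperature (300 K) costs at least ~2.87e-21 J,
# tiny per bit but a real floor for any computer, demon or otherwise.
print(landauer_bound(300.0))
```

The bound is linear in the number of bits erased, which is why the demon's memory, not its sorting, is where the entropy bookkeeping balances.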
Entanglement in quantum mechanics is significant because it correlates particles in ways that classical physics cannot explain, enabling phenomena such as quantum speedups in computing tasks. Unlike classical correlations, which can always be traced to preexisting shared information, the correlations between entangled particles cannot be accounted for by any shared classical information. This property lets quantum systems perform information-processing tasks that classical systems cannot, providing advantages in fields like quantum computing and cryptography. Moreover, whereas classical interactions are local and classical correlations arise from common causes, entangled particles exhibit correlations stronger than any local classical model allows, although these correlations cannot be used to transmit signals.
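One standard way to make "stronger than any classical model allows" quantitative is the CHSH test: any local classical model obeys |S| ≤ 2, while a Bell state reaches 2√2 at suitable measurement angles. A small numerical sketch, using spin observables in the x–z plane (the helper names are ours):

```python
import numpy as np

# Bell state |Φ+> = (|00> + |11>)/√2, as a 4-component vector.
phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)


def spin(theta: float) -> np.ndarray:
    """Spin observable along an axis at angle theta in the x-z plane."""
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]], dtype=complex)


def corr(a: float, b: float) -> float:
    """Expectation value of A(a) tensor B(b) in the Bell state."""
    ab = np.kron(spin(a), spin(b))
    return float(np.real(phi.conj() @ ab @ phi))


# CHSH combination: classical (local hidden-variable) models satisfy |S| <= 2,
# but at these angles the Bell state gives S = 2*sqrt(2) ~ 2.828.
a0, a1, b0, b1 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = corr(a0, b0) + corr(a0, b1) + corr(a1, b0) - corr(a1, b1)
print(S)
```

For this state the correlation reduces to cos(a − b), so the four terms each contribute √2/2 with the signs above, totaling 2√2.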
The Carnot cycle is a theoretical model that establishes the maximum possible efficiency of a heat engine operating between two heat reservoirs at different temperatures. That efficiency depends only on the temperatures of the hot and cold baths; the larger the temperature difference, the higher the efficiency. However, the Carnot cycle describes an ideal that no real engine achieves, because it requires the engine to run infinitely slowly to avoid dissipating energy. Despite its impracticality, the Carnot cycle is significant because it sets an upper limit on engine efficiency, guiding the design of more realistic engines and highlighting the fundamental constraints that thermodynamics imposes.
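The Carnot bound is a one-line formula, η = 1 − T_cold/T_hot, with both temperatures on an absolute scale. A quick sketch (function name ours):

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum efficiency of a heat engine between baths at t_hot and t_cold (kelvin)."""
    if not (t_hot > t_cold > 0):
        raise ValueError("require t_hot > t_cold > 0 (absolute temperatures)")
    return 1.0 - t_cold / t_hot


# A 500 K boiler exhausting to a 300 K environment can convert at most
# 40% of the absorbed heat into work, however cleverly the engine is built.
print(carnot_efficiency(500.0, 300.0))  # → 0.4
```

Note that the bound improves either by heating the hot bath or by cooling the cold one; only the ratio of absolute temperatures matters.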
Small and out-of-equilibrium systems challenge traditional thermodynamics by operating outside the conventions of large-system, equilibrium-based thermodynamics. These systems can exhibit behaviors like fluctuations that dominate overall dynamics, defying classical expectations. Their behavior necessitates a recalibration of concepts such as temperature and entropy and prompts the development of quantum thermodynamics to incorporate these non-classical behaviors.
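A toy illustration of why fluctuations dominate small systems: for N independent two-level particles, the relative spread of the "up" count scales as 1/√N, negligible for macroscopic N but order-one for a handful of particles. A rough sampling sketch (names ours):

```python
import math
import random


def relative_fluctuation(n_particles: int, trials: int = 2000, seed: int = 0) -> float:
    """Estimate (std dev / mean) of the 'up' count for n fair two-level particles."""
    rng = random.Random(seed)
    counts = [sum(rng.random() < 0.5 for _ in range(n_particles))
              for _ in range(trials)]
    mean = sum(counts) / trials
    var = sum((c - mean) ** 2 for c in counts) / trials
    return math.sqrt(var) / mean


# Relative fluctuations shrink roughly as 1/sqrt(N):
small = relative_fluctuation(10)    # near 1/sqrt(10), about 0.3
large = relative_fluctuation(1000)  # near 1/sqrt(1000), about 0.03
print(small, large)
```

At N around 10, the "typical" value barely constrains any single observation, which is why quantities like temperature need rethinking at that scale.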
Quantum information theory enhances our understanding of quantum thermodynamics by providing a framework to explore how quantum phenomena like entanglement and superposition affect thermodynamic processes. It allows scientists to study the thermodynamic efficiency of quantum systems and uncover advantages over classical systems, such as improved work extraction due to quantum correlations. Quantum information theory offers insights into the interplay between quantum mechanics and classical thermodynamic laws, leading to advancements like the concept of quantum steampunk.
Quantum steampunk differs from classical thermodynamics by integrating quantum physics and information theory to reinterpret thermodynamic tasks. It emphasizes quantum systems' ability to perform thermodynamic tasks better than classical systems, such as using entanglement to achieve work gains and employing bosons for efficient engine performance. Quantum steampunk also focuses on small and out-of-equilibrium systems, unlike classical thermodynamics' traditional focus on large, equilibrium systems.
The uncertainty principle in quantum mechanics implies that the more precisely a quantum system's position is known, the less precisely its momentum can be known, a consequence of the non-commutative nature of quantum observables. This principle parallels thermodynamic uncertainty relations, inequalities that set lower limits on uncertainties in classical and quantum systems. These relations show that fluctuations, like those of currents in particle flow or electricity, cannot be arbitrarily small; entropy production sets a floor on them. Quantum thermodynamics, as a field, applies these concepts to small systems, blending statistical mechanics with information theory and impacting our understanding of processes like cooling in quantum computers.
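The "non-commutative nature" of observables is concrete: two observables fail to cooperate exactly when their commutator is nonzero, and then no state assigns both of them sharp values. A minimal check with the Pauli matrices σx and σz:

```python
import numpy as np

# Pauli observables for a single qubit: spin along x and spin along z.
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

# The commutator [X, Z] = XZ - ZX measures the failure to cooperate.
commutator = sigma_x @ sigma_z - sigma_z @ sigma_x

# A nonzero commutator means the order of measurements matters,
# which is the algebraic root of uncertainty relations.
print(np.allclose(commutator, 0))  # False; the observables do not commute
```

For these matrices the commutator works out to −2i σy, so the Robertson uncertainty relation gives a nontrivial bound on the product of the two spreads.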
Scientific cross-disciplinary collaboration played a vital role in the development of quantum thermodynamics by enabling the fusion of concepts from quantum physics, information theory, and traditional thermodynamics. For instance, during the 1980s, a collaboration between mathematicians and physicists led to the modeling of how quantum systems reach equilibrium, advancing the understanding of quantum engines and their theoretical underpinnings. Furthermore, the maturation of quantum information theory as a tool allowed scientists to apply its principles to other fields such as chemistry and material science, rejuvenating thermodynamics through a quantum lens. This interdisciplinary approach facilitated tackling foundational problems and experimenting with new ideas, thereby accelerating the field's growth and acceptance globally. The broader application of quantum information tools across different scientific disciplines exemplifies the critical importance of cross-disciplinary collaboration in advancing quantum thermodynamics.
Monogamous entanglement refers to a limitation in quantum systems: if a particle is maximally entangled with one other particle, it cannot be maximally entangled with any other. Entanglement is thus a finite resource that must be shared among particles. For example, if Baxter's particle is maximally entangled with Audrey's, it cannot simultaneously be maximally entangled with Caspian's particle. This limitation shapes how quantum information is managed and shared: the more entanglement a particle shares with its surroundings, the less it can share elsewhere, which underlies decoherence, in which quantum systems come to resemble classical ones. Such monogamy of entanglement underscores the challenges in developing quantum technologies like quantum computers, where managing entanglement among many particles is essential yet difficult.
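The three-particle scenario can be checked directly: when A and B are maximally entangled, the joint state of A and C factors into a product of marginals, so A shares no entanglement at all with C. A small sketch with NumPy (variable names ours; qubit order is A, B, C):

```python
import numpy as np

# Audrey (A) and Baxter (B) share a maximally entangled pair |Φ+>;
# Caspian (C) holds an unentangled qubit |0>.
phi_ab = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
psi = np.kron(phi_ab, np.array([1, 0], dtype=complex))
rho = np.outer(psi, psi.conj()).reshape([2] * 6)  # axes (a, b, c, a', b', c')

# Reduced states: trace out (sum over) the qubits we ignore.
rho_ac = np.einsum('abcdbf->acdf', rho).reshape(4, 4)  # trace out B
rho_a = np.einsum('abcdbc->ad', rho)                   # trace out B and C
rho_c = np.einsum('abcabf->cf', rho)                   # trace out A and B

# Because A is maximally entangled with B, the A-C state is a plain
# product of its marginals: zero entanglement left over for Caspian.
print(np.allclose(rho_ac, np.kron(rho_a, rho_c)))  # True
```

Here A's marginal is the maximally mixed state I/2, the signature of being maximally entangled elsewhere; everything A "knows" is tied up with B.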
The principle of no signaling reconciles quantum entanglement with the theory of relativity by ensuring that information does not travel faster than the speed of light, as required by relativity. While quantum entanglement allows for correlations between distant particles, these correlations do not enable the transmission of information between the particles at superluminal speeds. Thus, entangled systems obey the no signaling principle, maintaining consistency with Einstein's theory of relativity.
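The no-signaling claim can be verified by computing Bob's reduced state: whatever Alice does on her side, Bob's marginal is unchanged, so her choice carries no information to him. A sketch for a shared Bell state (helper name ours):

```python
import numpy as np

# Bell state |Φ+> = (|00> + |11>)/√2 shared by Alice (first qubit) and Bob.
phi = np.zeros(4, dtype=complex)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi.conj())


def bob_marginal(rho_ab: np.ndarray) -> np.ndarray:
    """Trace out Alice's qubit from a two-qubit density matrix."""
    r = rho_ab.reshape(2, 2, 2, 2)   # axes (a, b, a', b')
    return np.einsum('abad->bd', r)  # sum over Alice's index


# Before Alice does anything, Bob holds the maximally mixed state I/2.
before = bob_marginal(rho)

# If Alice measures in the z basis, the joint state becomes a 50/50 mixture
# of the two outcomes; Bob's marginal is identical, so no signal is sent.
p0 = np.outer([1, 0, 0, 0], [1, 0, 0, 0])  # outcome |00><00|
p1 = np.outer([0, 0, 0, 1], [0, 0, 0, 1])  # outcome |11><11|
after = bob_marginal(0.5 * p0 + 0.5 * p1)

print(np.allclose(before, after))  # True
```

The same calculation goes through for any measurement Alice might choose, which is why entanglement is consistent with relativity despite its long-range correlations.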