Chasing Ed Jaynes’s ghost

You can’t escape him, working where information theory meets statistical mechanics.

Information theory concerns how efficiently we can encode information, compute, evade eavesdroppers, and communicate. Statistical mechanics is the physics of many particles. We can’t track every particle in a material, such as a sheet of glass. Instead, we reason about how the conglomerate likely behaves. Since we can’t know how all the particles behave, uncertainty blunts our predictions. Uncertainty underlies also information theory: You might think that your brother wished you a happy birthday on the phone. But noise corroded the signal; he might have wished you a madcap Earth Day.

Edwin Thompson Jaynes united the fields, in two 1957 papers entitled “Information theory and statistical mechanics.” I’ve cited the papers in at least two of mine. Those 1957 papers, and Jaynes’s philosophy, permeate pockets of quantum information theory, statistical mechanics, and biophysics. Say you know a little about some system, Jaynes wrote, like a gas’s average energy. Say you want to describe the gas’s state mathematically. Which state can you most reasonably ascribe to the gas? The state that, upon satisfying the average-energy constraint, reflects our ignorance of the rest of the gas’s properties. Information theorists quantify ignorance with a function called entropy, so we ascribe to the gas a large-entropy state. Jaynes’s Principle of Maximum Entropy has spread from statistical mechanics to image processing and computer science and beyond. You can’t evade Ed Jaynes.
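
In textbook form (modern notation, not a quotation from Jaynes): maximize the entropy S = - \sum_i p_i \ln p_i over probability distributions \{ p_i \}, subject to normalization and to the average-energy constraint \sum_i p_i E_i = \langle E \rangle. Lagrange multipliers yield the Gibbs distribution,

p_i = \frac{ e^{ - \beta E_i } }{ Z }, \qquad Z = \sum_i e^{ - \beta E_i },

with the multiplier \beta fixed by the average energy.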

I decided to turn the tables on him this December. I was visiting Washington University in St. Louis, where Jaynes worked until six years before his 1998 death. Haunted by Jaynes, I’d hunt down his ghost.

WashU

I began with my host, Kater Murch. Kater’s lab performs experiments with superconducting qubits. These quantum circuits sustain currents that can flow forever, without dissipating. I questioned Kater over hummus, the evening after I presented a seminar about quantum uncertainty and equilibration. Kater had arrived at WashU a decade-and-a-half after Jaynes’s passing but had kept his ears open.

Ed Jaynes, Kater said, consulted for a startup, decades ago. The company lacked the funds to pay him, so it offered him stock. That company was Varian, and Jaynes wound up with a pretty penny. He bought a mansion, across the street from campus, where he hosted the physics faculty and grad students every Friday. He’d play a grand piano, and guests would accompany him on instruments they’d bring. The department doubled as his family. 

The library kept a binder of Jaynes’s papers, which Kater had skimmed the previous year. What clarity shined through those papers! With a touch of pride, Kater added that he inhabited Jaynes’s former office. Or the office next door. He wasn’t certain.

I passed the hummus to a grad student of Kater’s. Do you hear stories about Jaynes around the department? I asked. I’d heard plenty about Feynman, as a PhD student at Caltech.

Not many, he answered. Just in conversations like this.

Later that evening, I exchanged emails with Kater. A contemporary of Jaynes’s had attended my seminar, he mentioned. Pity that I’d missed meeting the contemporary.

The following afternoon, I climbed to the physics library on the third floor of Crow Hall. Portraits of suited men greeted me. At the circulation desk, I asked for the binders of Jaynes’s papers.

Who? asked the student behind the granola bars advertised as “Free study snacks—help yourself!” 

E.T. Jaynes, I repeated. He worked here as a faculty member.

She turned to her computer. Can you spell that?

I obeyed while typing the name into the computer for patrons. The catalogue proffered several entries, one of which resembled my target. I wrote down the call number, then glanced at the notes over which the student was bending: “The harmonic oscillator.” An undergrad studying physics, I surmised. Maybe she’ll encounter Jaynes in a couple of years. 

I hiked upstairs, located the statistical-mechanics section, and ran a finger along the shelf. Hurt and Hermann, Itzykson and Drouffe, …Kadanoff and Baym. No Jaynes? I double-checked. No Jaynes. 

Library books

Upon descending the stairs, I queried the student at the circulation desk. She checked the catalogue entry, then ahhhed. You’d have to go to the main campus library for this, she said. Do you want directions? I declined, thanked her, and prepared to return to Kater’s lab. Calculations awaited me there; I’d have no time for the main library.

As I reached the physics library’s door, a placard caught my eye. It appeared to list the men whose portraits lined the walls. Arthur Compton…I only glanced at the placard, but I didn’t notice any “Jaynes.”

Arthur Compton greeted me also from an engraving en route to Kater’s lab. Down the hall lay a narrow staircase on whose installation, according to Kater, Jaynes had insisted. Physicists would have, in the stairs’ absence, had to trek down the hall to access the third floor. Of course I wouldn’t photograph the staircase for a blog post. I might belong to the millennial generation, but I aim and click only with purpose. What, though, could I report in a blog post? 

That night, I googled “e.t. jaynes.” His Wikipedia page contained only introductory and “Notes” sections. A WashU website offered a biography and unpublished works. But another tidbit I’d heard in the department yielded no Google hits, at first glance. I forbore a second glance, navigated to my inbox, and emailed Kater about plans for the next day.

I’d almost given up on Jaynes when Kater responded. After agreeing to my suggestion, he reported feedback about my seminar: A fellow faculty member “thought that Ed Jaynes (his contemporary) would have been very pleased.” 

The email landed in my “Nice messages” folder within two shakes. 

Leaning back, I reevaluated my data about Jaynes. I’d unearthed little, and little surprise: According to the WashU website, Jaynes “would undoubtedly be uncomfortable with all of the attention being lavished on him now that he is dead.” I appreciate privacy and modesty. Nor does Jaynes need portraits or engravings. His legacy lives in ideas, in people. Faculty from across his department attended a seminar about equilibration and about how much we can know about quantum systems. Kater might or might not inhabit Jaynes’s office. But Kater wears a strip cut from Jaynes’s mantle: Kater’s lab probes the intersection of information theory and statistical mechanics. They’ve built a Maxwell demon, a device that uses information as a sort of fuel to perform thermodynamic work. 

I’ve blogged about legacies that last. Assyrian reliefs carved in alabaster survive for millennia, as do ideas. Jaynes’s ideas thrive; they live even in me.

Did I find Ed Jaynes’s ghost at WashU? I think I honored it, by pursuing calculations instead of pursuing his ghost further. I can’t say whether I found his ghost. But I gained enough information.

 

With thanks to Kater and to the Washington University Department of Physics for their hospitality.

Theoretical physics has not gone to the dogs.

I was surprised to learn, last week, that my profession has gone to the dogs. I’d introduced myself to a nonscientist as a theoretical physicist.

“I think,” he said, “that theoretical physics has lost its way in symmetry and beauty and math. It’s too far from experiments to be science.”

The accusation triggered an identity crisis. I lost my faith in my work, bit my nails to the quick, and enrolled in workshops about machine learning and Chinese.

Or I might have, if all theoretical physicists pursued quantum gravity.

Quantum-gravity physicists attempt to reconcile two physical theories, quantum mechanics and general relativity. Quantum theory manifests on small length scales, such as atoms’ and electrons’. General relativity manifests in massive systems, such as the solar system. A few settings unite smallness with massiveness, such as black holes and the universe’s origin. Understanding these settings requires a unification of quantum theory and general relativity.

Try to unify the theories, and you’ll find yourself writing equations that contain infinities. Such infinities can’t describe physical reality, but they’ve withstood decades of onslaughts. For guidance, many quantum-gravity theorists appeal to mathematical symmetries. Symmetries, they reason, helped 20th-century particle theorists predict experimental outcomes with accuracies better than any achieved with any other scientific theory. Perhaps symmetries can extend particle physics to a theory of quantum gravity.

Some physicists have criticized certain approaches to quantum gravity, certain approaches to high-energy physics more generally, and the high-energy community’s philosophy and sociology. Much criticism has centered on string theory, according to which our space-time has up to 26 dimensions, most too small for you to notice. Critics include Lee Smolin, the author of The Trouble with Physics, Peter Woit, who blogs on Not Even Wrong, and Sabine Hossenfelder, who published Lost in Math this year. This article contains no criticism of their crusade. I see merit in arguments of theirs, as in arguments of string theorists.

Science requires criticism to progress. So thank goodness that Smolin, Woit, Hossenfelder, and others are criticizing string theory. Thank goodness that the criticized respond. Thank goodness that debate rages, like the occasional wildfire needed to maintain a forest’s health.

The debate might appear to impugn the integrity of theoretical physics. But quantum gravity constitutes one pot in the greenhouse of theoretical physics. Theoretical physicists study lasers, star formation, atomic clocks, biological cells, gravitational waves, artificial materials, and more. Theoretical physicists are explaining, guiding, and collaborating on experiments. So many successes have piled up recently, I had trouble picking examples for this article. 

One example—fluctuation relations—I’ve blogged about before. These equalities generalize the second law of thermodynamics, which illuminates why time flows in just one direction. Fluctuation relations also provide a route to measuring an energetic quantity applied in pharmacology, biology, and chemistry. Experimentalists have shown, over the past 15 years, that fluctuation relations govern RNA, DNA, electronic systems, and trapped ions (artificial atoms).

Second, experimentalists are exercising, over quantum systems, control that physicists didn’t dream of decades ago. Harvard physicists can position over 50 atoms however they please, using tweezers formed from light. Google has built a noisy quantum computer of 72 superconducting qubits, circuits through which charge flows without resistance. Also trapped ions, defects in diamonds, photonics, and topological materials are breaking barriers. These experiments advance partially due to motivation from theorists and partially through collaborations with theorists. In turn, experimental data guide theorists’ explanations and our proposals of experiments.

In one example, theorists teamed with experimentalists to probe quantum correlations spread across space and time. In another example, theorists posited a mechanism by which superconducting qubits interact with a hot environment. Other illustrations from the past five years include discrete time crystals, many-body scars, magic-angle materials, and quantum chaos. 

These collaborations even offer hope for steering quantum gravity with experiments. Certain quantum-gravity systems share properties with certain many-particle quantum systems. This similarity, we call “the AdS/CFT duality.” Experimentalists have many-particle quantum systems and are stretching those systems toward the AdS/CFT regime. Experimental results, with the duality, might illuminate where quantum-gravity theorists should and shouldn’t search. Perhaps no such experiments will take place for decades. Perhaps AdS/CFT can’t shed light on our universe. But theorists and experimentalists are partnering to try.

These illustrations demonstrate that theoretical physics, on the whole, remains healthy, grounded, and thriving. This thriving is failing to register with part of the public. Evidence thwacked me in the face last week, as explained at the start of this article. The Wall Street Journal published another example last month: John Horgan wrote that “physics, which should serve as the bedrock of science, is in some respects the most troubled field of” science. The evidence presented consists of one neighborhood in the theoretical fraction of the metropolis of physics: string and multiverse models.

Horgan’s article reflects decades of experience in science journalism, a field I respect. I sympathize, moreover, with those who interface so much with quantum gravity, the subfield appears to eclipse the rest of theoretical physics. Horgan was reviewing books by Stephen Hawking and Martin Rees, who discuss string and multiverse models. Smolin, Woit, Hossenfelder, and others garner much press, which they deserve: They provoke debate and articulate their messages eloquently. Such press can blot out, say, profiles of the theoretical astrophysicists licking their lips over gravitational-wave data.

If any theory bears flaws, those flaws need correcting. But most theoretical physicists don’t pursue quantum gravity, let alone string theory. Any flaws of string theory do not mar all theoretical physics. These points need a megaphone, because misconceptions about theoretical physics endanger society. First, companies need workers who have technical skills and critical reasoning. Both come from training in theoretical physics. Besmirching theoretical physics can divert students from programs that can benefit the economy and nurture thoughtful citizens.1 

Second, some nonscientists are attempting to discredit the scientific community for political gain. Misconceptions about theoretical physics can appear to support these nonscientists’ claims. The ensuing confusion can lead astray voters and parents who face choices about vaccination, global health, national security, and budget allocations.

Last week, I heard that my profession has wandered too far from experiments. Hours earlier, I’d skyped with an experimentalist with whom I’m collaborating. A disconnect separates the reality of theoretical physicists from impressions harbored by part of the public. Let’s clear up the misconceptions. Theoretical physics, as a whole, remains healthy, grounded, and thriving.

 

 

1Nurturing thoughtful citizens takes also humanities, social-sciences, language, and arts programs.

“Methinks, I know one kind like you.”

I was expecting to pore over a poem handwritten by one of history’s most influential chemists. Sir Humphry Davy lived in Britain around the turn of the 19th century. He invented a lamp that saved miners’ lives, discovered and isolated chemical elements, coined the term “laughing gas,” and inspired younger researchers through public lectures.

Davy

Humphry Davy

Davy wrote not only scientific papers, but also poetry. He befriended contemporaries known today as “Romantic poets,” including Samuel Taylor Coleridge. English literature and the history of science rank among the specialties of the Huntington Library in San Marino, CA. The Huntington collects manuscripts and rare books, and I secured a reader card this July. I aspired to find a poem by Davy.

Bingo: The online catalogue contained an entry entitled “To the glow worm.” I requested the manuscript and settled into the hushed, wood-paneled reading room.

Davy had written scarcely legibly, in black ink, on a page that had creased and torn. I glanced over the lines, then realized that the manuscript folder contained two other pages. The pages had stuck together, so I gently flipped the lot over.

Davy poem

Poem “To the glow worm,” by Humphry Davy

A line at the top of the back page seized the wheel of my attention.

“Methinks, I know one kind like you.”

The line’s intimacy arrested me. I heard a speaker contemplating someone whom he or she had met recently, turning the person over in the speaker’s mind, gaining purchase on the person’s identity. “I know you,” I heard the speaker saying, and I saw the speaker wagging a finger at the person. “I know your type…I think.”

The line’s final six words suggested impulsiveness. How can you know someone you’re still wrapping your head around? I felt inclined to suggest a spoonful of circumspection. But perhaps the speaker was reflecting more than I’d allowed: “Methinks” suggested temperance, an acknowledgement of uncertainty.

I backpedaled to the folder’s cover. “Includes verse and letter by Lady Davy,” it read. Jane Apreece, a wealthy widow, acquired the title Lady Davy upon marrying Sir Humphry. She enjoyed a reputation for social savvy, fashionableness, and sharpness. I’d intruded on her poem, a response to Davy’s. Apreece’s pages begged for a transcription, which I struggled through until the reading room closed 45 minutes later. Dan Lewis, the Huntington’s Dibner Senior Curator of the History of Science and Technology, later improved upon my attempt (parenthesized text ours):

Methinks, I know one kind like you,

Thine(?) to peace, & Nature true;

Kindled by Feeling’s purest flame,

In Storm, or Calm, for ages(?) the same.

Bestowing most its brilliant Light,

Amidst the tranquil shades of Night;

And prompt to solace, raise, & cheer(?),

The heart, subdued by Doubt or Care.

Though not of busy Life afraid

Yet loving best, the pastoral Shade;

Shedding a Ray, more clear & pure,

A Ray, which longer shall endure,

As Friendships light must ever prove

More steadfast than the Flame of Love.

Light recurs throughout the verse: The speaker refers to two flames, to a “Ray,” and to a “brilliant Light // Amidst the tranquil shades of Night.” Comparisons with light suit a scientist, who reveals aspects of nature never witnessed before. (I expect that the speaker directs the apostrophe toward Davy.) Comparisons with light suit Davy not only professionally, but also, to Apreece, personally: Each member of the couple inspired the other to learn. Their poems reflect their intellectual symbiosis: Apreece’s references to light complement the glow worm, which Davy called “lively living lamp of night.”

The final two lines arrested me as the first line did. The speaker contrasts “Friendship[’]s light” with “the Flame of Love.” Finite resources can’t sustain flames, which consume candles, wood, and oxygen. Once its fuel disappears, flame proves less than “steadfast.” Similarly, love can’t survive on passion’s flames. Love should rest on friendship, which sheds the “light” extolled throughout the poem. Light enhances our vision, providing the wisdom needed to sustain love throughout life’s vicissitudes. 

These two lines reveal the temperance hinted at by the “Methinks.” The speaker argues for levelheadedness, for balancing emotion with sustainability. Spoonful of circumspection retracted.

IMG_2354

The clock struck 4:45, and readers began returning their manuscripts and books to the circulation desk. I stood up—and pricked myself on a thorn of realization. The catalogue dated the manuscript to “perhaps [ . . . ] 1811 – they [Davy and Apreece] were married in 1812.” The lovers exchanged these poems without knowing that their marriage would sour years later. I’d read about their relationship—as about Davy’s science and poetry—in Richard Holmes’s The Age of Wonder. 

At least the Davys reunited when Sir Humphry’s last illness struck. At least they remained together until he died. At least a reader can step, through the manuscript, into the couple’s patch of happiness. One can hope to see more clearly for their—a scientist’s, a societal navigator’s, and two human beings’—light.

Jane letter + p.1 of poem.JPG

Letter and poem by Jane Apreece (p. 1). The top segment constitutes a letter written “by Lady Davy to a ‘Miss Talbot’ (1852, January 2),” according to the catalogue.

Jane poem, p. 2.JPG

Poem by Jane Apreece (p. 2)

If anyone has insights or has corrections to the transcription, please comment. I haven’t transcribed Davy’s poem, which might illuminate Lady Davy’s response.

With thanks to the Huntington Library of San Marino, CA, for the use of its collection. With thanks to Dan Lewis for improving upon my transcription and for prodding, for five years, toward a reader card.

Doctrine of the (measurement) mean

Don’t invite me to dinner the night before an academic year begins.

You’ll find me in an armchair or sitting on my bed, laptop on my lap, journaling. I initiated the tradition the night before beginning college. I take stock of the past year, my present state, and hopes for the coming year.

Much of the exercise fosters what my high-school physics teacher called “an attitude of gratitude”: I reflect on cities I’ve visited, projects firing me up, family events attended, and subfields sampled. Other paragraphs, I want off my chest: Have I pushed this collaborator too hard or that project too little? Miscommunicated or misunderstood? Strayed too far into heuristics or into mathematical formalisms?

If only the “too much” errors, I end up thinking, could cancel the “too little.”

In one quantum-information context, they can.

Seesaw

Imagine that you’ve fabricated the material that will topple steel and graphene; let’s call it a supermetatopoconsulator. How, you wonder, do charge, energy, and particles move through this material? You’ll learn by measuring correlators.

A correlator signals how much, if you poke this piece here, that piece there responds. At least, a two-point correlator does: \langle A(0) B(\tau) \rangle. A(0) represents the poke, which occurs at time t = 0. B(\tau) represents the observable measured there at t = \tau. The \langle . \rangle encapsulates which state \rho the system started in.
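
For concreteness, here’s a brute-force numerical sketch of such a two-point correlator, for a toy two-qubit model I’ve invented for illustration (and with one common ordering convention):

import numpy as np
from scipy.linalg import expm

# Toy ingredients, invented for illustration
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
H = np.kron(Z, Z) + 0.5 * (np.kron(X, I2) + np.kron(I2, X))  # a two-qubit Hamiltonian
A = np.kron(Z, I2)                   # the poke "here," on qubit 1, at t = 0
B = np.kron(I2, Z)                   # the observable "there," on qubit 2
rho = np.eye(4, dtype=complex) / 4   # the initial state: maximally mixed, for simplicity

def two_point(tau):
    # <A(0) B(tau)> = Tr[rho A B(tau)], with B(tau) = exp(iH tau) B exp(-iH tau)
    U = expm(-1j * H * tau)
    B_tau = U.conj().T @ B @ U
    return np.trace(rho @ A @ B_tau)

print(two_point(1.0))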

Condensed-matter, quantum-optics, and particle experimentalists have measured two-point correlators for years. But consider the three-point correlator \langle A(0) B(\tau) C (\tau' ) \rangle, or a k-point \langle \underbrace{ A(0) \ldots M (\tau^{(k)}) }_k \rangle, for any k \geq 2. Higher-point correlators reflect more-complicated relationships amongst events. Four-point correlators1 associated with multiple times signal quantum chaos and information scrambling. Quantum information scrambles upon spreading across a system through many-body entanglement. Could you measure arbitrary-point, arbitrary-time correlators?

New material

Supermetatopoconsulator (artist’s conception)

Yes, collaborators and I have written, using weak measurements. Weak measurements barely disturb the system being measured. But they extract little information about the measured system. So, to measure a correlator, you’d have to perform many trials. Moreover, your postdocs and students might have little experience with weak measurements. They might not want to learn the techniques required, to recalibrate their detectors, etc. Could you measure these correlators easily?

Yes, if the material consists of qubits,2 according to a paper I published with Justin Dressel, José Raúl González Alonso, and Mordecai Waegell this summer. You could build such a system from, e.g., superconducting circuits, trapped ions, or quantum dots.

You can measure \langle \underbrace{ A(0) B (\tau') C (\tau'') \ldots M (\tau^{(k)}) }_k \rangle, we show, by measuring A at t = 0, waiting until t = \tau', measuring B, and so on until measuring M at t = \tau^{(k)}. The t-values needn’t increase sequentially: \tau'' could be less than \tau', for instance. You’d have to effectively reverse the flow of time experienced by the qubits. Experimentalists can do so by, for example, flipping magnetic fields upside-down.

Each measurement requires an ancilla, or helper qubit. The ancilla acts as a detector that records the measurement’s outcome. Suppose that A is an observable of qubit #1 of the system of interest. You bring an ancilla to qubit 1, entangle the qubits (force them to interact), and look at the ancilla. (Experts: You perform a controlled rotation on the ancilla, conditioning on the system qubit.)

Each trial yields k measurement outcomes. They form a sequence S, such as (1, 1, 1, -1, -1, \ldots). You should compute a number \alpha, according to a formula we provide, from each measurement outcome and from the measurement’s settings. These numbers form a new sequence S' = \mathbf{(} \alpha_S(1), \alpha_S(2), \ldots \mathbf{)}. Why bother? So that you can force errors to cancel.

Multiply the \alpha’s together, \alpha_S(1) \times \alpha_S(2) \times \ldots, and average the product over the possible sequences S. This average equals the correlator \langle \underbrace{ A(0) \ldots M (\tau^{(k)}) }_k \rangle. Congratulations; you’ve characterized transport in your supermetatopoconsulator.

Success

When measuring, you can couple the ancillas to the system weakly or strongly, disturbing the system a little or a lot. Wouldn’t strong measurements perturb the state \rho whose properties you hope to measure? Wouldn’t the perturbations by measurements one through \ell throw off measurement \ell + 1?

Yes. But the errors introduced by those perturbations cancel in the average. The reason stems from how we construct \alpha’s: Our formula makes some products positive and some negative. The positive and negative terms sum to zero.

Balance 2

The cancellation offers hope for my journal assessment: Errors can come out in the wash. Not of their own accord, not without forethought. But errors can cancel out in the wash—if you soap your \alpha’s with care.

 

1and six-point, eight-point, etc.

2Rather, each measured observable must square to the identity, e.g., A^2 = 1. Qubit Pauli operators satisfy this requirement.

 

With apologies to Aristotle.

I get knocked down…

“You’ll have to have a thick skin.”

Marcelo Gleiser, a college mentor of mine, emailed the warning. I’d sent a list of physics PhD programs and requested advice about which to attend. Marcelo’s and my department had fostered encouragement and consideration.

Suit up, Marcelo was saying.

Criticism fuels science, as Oxford physicist David Deutsch has written. We have choices about how we criticize. Some criticism styles reflect consideration for the criticized work’s creator. Tufts University philosopher Daniel Dennett has devised guidelines for “criticizing with kindness”:1

1. You should attempt to re-express your target’s position so clearly, vividly, and fairly that your target says, “Thanks, I wish I’d thought of putting it that way.”

2. You should list any points of agreement (especially if they are not matters of general or widespread agreement).

3. You should mention anything you have learned from your target.

4. Only then are you permitted to say so much as a word of rebuttal or criticism.

Scientists skip to step four often—when refereeing papers submitted to journals, when posing questions during seminars, when emailing collaborators, when colleagues sketch ideas at a blackboard. Why? Listening and criticizing require time, thought, and effort—three of a scientist’s most valuable resources. Should any scientist spend those resources on an idea of mine, s/he deserves my gratitude. Spending empathy atop time, thought, and effort can feel supererogatory. Nor do all scientists prioritize empathy and kindness. Others of us prioritize empathy but—as I have over the past five years—have grown so used to its latency, we forget to demonstrate it.

Doing science requires facing not only criticism, but also “That doesn’t make sense,” “Who cares?” “Of course not,” and other morale boosters.

Doing science requires resilience.

Resilience

So do measurements of quantum information (QI) scrambling. Scrambling is a subtle, late, quantum stage of equilibration2 in many-body systems. Example systems include chains of spins,3 such as in ultracold atoms, that interact with each other strongly. Exotic examples include black holes in anti-de Sitter space.4

Imagine whacking one side of a chain of interacting spins. Information about the whack will disseminate throughout the chain via entanglement.5 After a long interval (the scrambling time, t_*), spins across the system will share many-body entanglement. No measurement of any few, close-together spins can disclose much about the whack. Information will have scrambled across the system.

QI scrambling has the subtlety of an assassin treading a Persian carpet at midnight. Can we observe scrambling?

Carpet

A Stanford team proposed a scheme for detecting scrambling using interferometry.6 Justin Dressel, Brian Swingle, and I proposed a scheme based on weak measurements, which refrain from disturbing the measured system much. Other teams have proposed alternatives.

Many schemes rely on effective time reversal: The experimentalist must perform the quantum analog of inverting particles’ momenta. One must negate the Hamiltonian \hat{H}, the observable that governs how the system evolves: \hat{H} \mapsto - \hat{H}.

At least, the experimentalist must try. The experimentalist will likely map \hat{H} to - \hat{H} + \varepsilon. The small error \varepsilon could wreak havoc: QI scrambling relates to chaos, exemplified by the butterfly effect. Tiny perturbations, such as the flap of a butterfly’s wings, can snowball in chaotic systems, as by generating tornadoes. Will the \varepsilon snowball, obscuring observations of scrambling?

Snowball

It needn’t, Brian and I wrote in a recent paper. You can divide out much of the error until t_*.

You can detect scrambling by measuring an out-of-time-ordered correlator (OTOC), an object I’ve effused about elsewhere. Let’s denote the time-t correlator by F(t). You can infer an approximation \tilde{F}(t) to F(t) upon implementing an \varepsilon-ridden interferometry or weak-measurement protocol. Remove some steps from that protocol, Brian and I say. Infer a simpler, easier-to-measure object \tilde{F}_{\rm simple}(t). Divide the two measurement outcomes to approximate the OTOC:

F(t)  \approx \frac{ \tilde{F}(t) }{ \tilde{F}_{\rm simple}(t) }.
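
A toy illustration of the idea (my own crude error model, not the paper’s analysis): suppose the \varepsilon-ridden protocol multiplies the ideal value by an unknown factor, and the simpler object consists of that factor alone. The ratio then washes the factor out.

import numpy as np

rng = np.random.default_rng(0)
F_true = 0.8                              # stand-in for the ideal OTOC value at some time t
for _ in range(3):
    c = 1 + 0.3 * rng.standard_normal()   # unknown factor introduced by the imperfect protocol
    F_measured = c * F_true               # toy model: the error enters multiplicatively
    F_simple = c                          # toy model: the simpler object equals the factor itself
    print(F_measured / F_simple)          # recovers F_true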

OTOC measurements exhibit resilience to error.

Arm 2

Physicists need resilience. Brian criticizes with such grace, he could serve as the poster child for Daniel Dennett’s guidelines. But not every scientist could. How can we withstand kindness-lite criticism?

By drawing confidence from what we’ve achieved, with help from mentors like Marcelo. I couldn’t tell what about me—if anything—could serve as a rock on which to plant a foot, as an undergrad. Mentors identified what I had too little experience to appreciate. You question what you don’t understand, they said. You assimilate perspectives from textbooks, lectures, practice problems, and past experiences. You scrutinize details while keeping an eye on the big picture. So don’t let so-and-so intimidate you.

I still lack my mentors’ experience, but I’ve imbibed a drop of their insight. I savor calculations that I nail, congratulate myself upon nullifying referees’ concerns, and celebrate the theorems I prove.

I’ve also created an email folder entitled “Nice messages.” In go “I loved your new paper; combining those topics was creative,” “Well done on the seminar; I’m now thinking of exploring that field,” and other rarities. The folder affords an umbrella when physics clouds gather.

Finally, I try to express appreciation of others’ work.7 Science thrives on criticism, but scientists do science. And scientists are human—undergrads, postdocs, senior researchers, and everyone else.

Doing science—and attempting to negate Hamiltonians—we get knocked down. But we can get up again.

 

Around the time Brian and I released “Resilience,” two other groups proposed related renormalizations. Check out their schemes here and here.

1Thanks to Sean Carroll for alerting me to this gem of Dennett’s.

2A system equilibrates as its large-scale properties, like energy, flatline.

3Angular-momentum-like quantum properties

4Certain space-times different from ours

5Correlations, shareable by quantum systems, stronger than any achievable by classical systems

6The cancellation (as by a crest of one wave and a trough of another) of components of a quantum state, or the addition of components (as two waves’ crests)

7Appreciation of specific qualities. “Nice job” can reflect a speaker’s belief but often reflects a desire to buoy a receiver whose work has few merits to elaborate on. I applaud that desire and recommend reinvesting it. “Nice job” carries little content, which evaporates under repetition. Specificity provides content: “Your idea is alluringly simple but could reverberate across multiple fields” has gristle.

So long, and thanks for all the Fourier transforms

The air conditioning in Caltech’s Annenberg Center for Information Science and Technology broke this July. Pasadena reached 87°F on the fourth, but my office missed the memo. The thermostat read 62°.

Hyperactive air conditioning suits a thermodynamicist’s office as jittery wifi suits an electrical-engineering building. Thermodynamicists call air conditioners “heat pumps.” A heat pump funnels heat—the energy of random motion—from cooler bodies to hotter. Heat flows spontaneously only from hot to cold on average, according to the Second Law of Thermodynamics. Pumping heat against its inclination costs work, organized energy drawn from a reliable source.

Reliable sources include batteries, coiled springs, and ACME anvils hoisted into the air. Batteries have chemical energy that powers electric fans. ACME anvils have gravitational potential energy that splats coyotes.
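
How much work does the pumping cost? For heat Q_{\rm c} pumped out of a body at temperature T_{\rm c} into surroundings at T_{\rm h}, the Second Law bounds the work from below:

W \geq Q_{\rm c} \left( \frac{T_{\rm h}}{T_{\rm c}} - 1 \right).

An ideal (Carnot) heat pump saturates the bound.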

Thermostat

I hoisted binder after binder onto my desk this July. The binders felt like understudies for ACME anvils, bulging with papers. They contained notes I’d written, and articles I’d read, for research throughout the past five years. My Caltech sojourn was switching off its lights and drawing its shutters. A control theorist was inheriting my desk. I had to move my possessions to an office downstairs, where I’d moonlight until quitting town.

Quitting town.

I hadn’t expected to feel at home in southern California, after stints in New and old England. But research and researchers drew me to California and then hooked me. Caltech’s Institute for Quantum Information and Matter (IQIM) has provided an intellectual home, colleagues-cum-friends, and a base from which to branch out to other scholars and institutions.

The IQIM has provided also the liberty to deck out my research program as a college dorm room with posters—according to my tastes, values, and exuberances. My thesis demanded the title “Quantum steampunk: Quantum information, thermodynamics, their intersection, and applications thereof across physics.” I began developing the concept of quantum steampunk on this blog. Writing a manifesto for the concept, in the thesis’s introduction, proved a delight:

The steampunk movement has invaded literature, film, and art over the past three decades. Futuristic technologies mingle, in steampunk works, with Victorian and wild-west settings. Top hats, nascent factories, and grimy cities counterbalance time machines, airships, and automata. The genre arguably originated in 1895, with the H.G. Wells novel The Time Machine. Recent steampunk books include the best-selling The Invention of Hugo Cabret; films include the major motion picture Wild Wild West; and artwork ranges from painting to jewelry to sculpture.

Steampunk captures the romanticism of fusing the old with the cutting-edge. Technologies proliferated during the Victorian era: locomotives, Charles Babbage’s analytical engine, factories, and more. Innovation facilitated exploration. Add time machines, and the spirit of adventure sweeps you away. Little wonder that fans flock to steampunk conventions, decked out in overcoats, cravats, and goggles.

What steampunk fans dream, quantum-information thermodynamicists live.

Thermodynamics budded during the late 1800s, when steam engines drove the Industrial Revolution. Sadi Carnot, Ludwig Boltzmann, and other thinkers wondered how efficiently engines could operate. Their practical questions led to fundamental insights—about why time flows; how much one can know about a physical system; and how simple macroscopic properties, like temperature, can capture complex behaviors, like collisions by steam particles. An idealization of steam—the classical ideal gas—exemplifies the conventional thermodynamic system. Such systems contain many particles, behave classically, and are often assumed to remain in equilibrium.

But thermodynamic concepts—such as heat, work, and equilibrium—characterize small scales, quantum systems, and out-of-equilibrium processes. Today’s experimentalists probe these settings, stretching single DNA strands with optical tweezers [4], cooling superconducting qubits to build quantum computers [5, 6], and extracting work from single-electron boxes [7]. These settings demand reconciliation with 19th-century thermodynamics. We need a toolkit for fusing the old with the new.

Quantum information (QI) theory provides such a toolkit. Quantum phenomena serve as resources for processing information in ways impossible with classical systems. Quantum computers can solve certain computationally difficult problems quickly; quantum teleportation transmits information as telephones cannot; quantum cryptography secures messages; and quantum metrology centers on high-precision measurements. These applications rely on entanglement (strong correlations between quantum systems), disturbances by measurements, quantum uncertainty, and discreteness.

Technological promise has driven fundamental insights, as in thermodynamics. QI theory has blossomed into a mathematical toolkit that includes entropies, uncertainty relations, and resource theories. These tools are reshaping fundamental science, in applications across physics, computer science, and chemistry.

QI is being used to update thermodynamics, in the field of quantum thermodynamics (QT) [8, 9]. QT features entropies suited to small scales; quantum engines; the roles of coherence in thermalization and transport; and the transduction of information into work, à la Maxwell’s demon [10].

This thesis (i) contributes to the theory of QI thermodynamics and (ii) applies the theory, as a toolkit, across physics. Spheres touched on include atomic, molecular, and optical (AMO) physics; nonequilibrium statistical mechanics; condensed matter; chemistry; and high-energy physics. I propose the name quantum steampunk for this program…

Never did I anticipate, in college, that a PhD could reflect my identity and style. I feared losing myself and my perspective in a subproblem of a subproblem of a subproblem. But I found myself blessed with the chance to name the aesthetic that’s guided my work, the scent I’ve unconsciously followed from book to class to research project to conversation, to paper, since…middle school, come to think of it. I’m grateful for that opportunity.

Q. steampunk

Whump, went my quantum-engine binder on my desk. I’d stuck an address label, pointing to Annenberg, to the binder. If the binder walked away, whoever found it would know where it belonged. Scratching at the label with a fingernail failed to budge the sticker. I stuck a label addressed to Cambridge, Massachusetts alongside the Pasadena address.

I’m grateful to be joining Harvard as an ITAMP (Institute for Theoretical Atomic, Molecular, and Optical Physics) Postdoctoral Fellow. You’ll be able to catch me in Harvard’s physics department, in ITAMP, or at MIT, starting this September.

While hunting for a Cambridge apartment, I skyped with potential roommates. I’d inquire about locations, about landlords and landladies, about tidiness, and about heating. The heating system’s pretty old, most tenants would admit. We keep the temperature between 60 and 65 degrees, to keep costs down. I’d nod and extol the layering of sweaters, but I shivered inside.

One tenant surprised me. The heating…works too well, she said. It’s pretty warm, to tell the truth. I thought about heat pumps and quantum engines, about picnics in the Pasadena sunshine, about the Julys I’d enjoyed while the world around me had sweated. Within an hour, I’d committed to sharing the apartment.

Boxes

Some of you have asked whether I’ll continue blogging for Quantum Frontiers. Yes: Extricating me from the IQIM requires more than 3,000 miles.

See you in Cambridge.

 

With apologies to Douglas Adams.

The World Cup from a Quantum Perspective

Two weeks into the Football World Cup and the group stages are over: 16 teams have gone home, leaving the top 16 teams to contend the knock-out stages. Those fans who enjoy a bet will be poring over the odds in search of a bargain—a mis-calculation on the part of the bookmakers. Is now the time to back Ronaldo for the golden boot, whilst Harry Kane dominates the headlines and sports betting markets? Will the hosts Russia continue to defy lowly pre-tournament expectations and make the semi-finals? Are France about to emerge from an unconvincing start to the tournament and blossom as clear front-runners?

But, whilst for most the sports betting markets may lead to the enhanced enjoyment of the tournament that a bet can bring (as well as the possibility of making a little money), for others they represent a window into the fascinating world of sports statistics. A fascination that can be captured by the simple question: how do they set the odds?

Suppose that a bookmaker has in their possession a list of outcome probabilities for matches between each pair of national football teams in the world and wants to predict the overall winner. There are 32768 possible ways for the tournament knock-out rounds to pan out—a large, but not insurmountable number of iterations by modern computational standards.

However, if the bookmaker instead considers the tennis grand-slams, with 128 competitors in the first round, then there are a colossal 1.7 × 10^38 permutations. Indeed, in a knock-out format there are 2^(n-1) permutations, where n is the number of entrants. And for those of a certain mindset, this exponentially growing space immediately raises the question of whether a quantum algorithm can yield a speed-up for the related prediction tasks.
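
A quick sanity check of those counts (plain arithmetic, nothing quantum yet):

# 2^(n-1) possible knockout outcomes for n entrants
for n in (16, 128):
    print(n, 2 ** (n - 1))
# 16 teams -> 32768; 128 entrants -> about 1.7e38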

A Tiny Cup

The immediate question which we want to answer here is, perhaps, who will win the World Cup. We will walk through the idea on the blackboard first, and then implement it as a quantum algorithm—which, hopefully, will give some insight into how and where quantum computers can outperform classical ones, for this particular way of answering the question.

Let us take a toy setup with four teams A, B, C and D;
the knockout stage starts with a game A vs. B, and C vs. D.
Whoever wins each game will play against each other, so here we have four possible final games: A vs. C, A vs. D, B vs. C, or B vs. D.
Let’s denote by p(X, Y) the probability that X wins when playing against Y.

The likelihood of A winning the cup is then simply given by

p(A, B) × ( p(C, D) × p(A, C) + p(D, C) × p(A, D) ),

i.e. the probability that A wins against B, times the probabilities of A winning against C in case C won against D, plus the probability of A winning against D in case D won.
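
In code, with a made-up probability table standing in for the bookmaker’s (the numbers below are invented for illustration):

# Invented pairwise win probabilities: p[(X, Y)] = probability that X beats Y
p = {("A", "B"): 0.7, ("B", "A"): 0.3,
     ("C", "D"): 0.6, ("D", "C"): 0.4,
     ("A", "C"): 0.55, ("A", "D"): 0.65}

p_A_wins = p[("A", "B")] * (p[("C", "D")] * p[("A", "C")]
                            + p[("D", "C")] * p[("A", "D")])
print(p_A_wins)   # likelihood of A winning this tiny cup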

How can we obtain the same quantity with a quantum algorithm?

First, we set up our Hilbert space so that it can represent all possible Cup scenarios.
Since we have four teams, we need a four-dimensional quantum system as our smallest storage unit—we commonly call such systems qudits, generalizations of the qubit, which, having dimension 2, could store only two teams (we can always “embed” a qudit into a few qubits of the same combined dimension.

Remember: k qubits have dimension 2^k, so we could also store the qudit as two qubits).
If we write |A\rangle, this simply stands for a qudit representing team A; if we write |A\rangle |B\rangle, then we have a state representing two teams.

To represent a full knockout tree, we follow the same logic: Take four qudits for the initial draw; add two qudits for the winners of the first two matches, and one qudit for the final winner.

For instance, one possible knockout scenario would be

|\text{Game 1}\rangle = \underbrace{|A\rangle |B\rangle |C\rangle |D\rangle}_\text{Initial Draw} \ \underbrace{|A\rangle |D\rangle}_\text{Finals} \ |D\rangle.

The probability associated with Game 1 is then precisely p(A, B) × p(D, C) × p(D, A).

Here is where quantum computing comes in.

Starting from an initial state |A\rangle |B\rangle |C\rangle |D\rangle, we create two new slots in a superposition over all possible match outcomes, weighted by the square-root of their probabilities (which we call q instead of p):

\begin{aligned} |\text{Step 1}\rangle = |A\rangle |B\rangle |C\rangle |D\rangle \big(\ \ &\text{q(A, B)q(C, D)} \,|A\rangle\ |C\rangle +\\ &\text{q(A, B)q(D, C)} \,|A\rangle\ |D\rangle +\\ &\text{q(B, A)q(C, D)} \,|B\rangle\ |C\rangle +\\ &\text{q(B, A)q(D, C)} \,|B\rangle\ |D\rangle\ \big). \end{aligned}

For the final round, we perform the same operation on those two last slots; e.g. we would map |A\rangle |C\rangle to a state |A\rangle |C\rangle ( q(A, C) |A\rangle + q(C, A) |C\rangle ). The final state is thus a superposition over eight possible weighted games (as we would expect).

So you can tell me who wins the World Cup?

Yes. Or well, probably. We find out by measuring the rightmost qudit.
As we know, the probability of obtaining a certain measurement outcome, say A, will then be determined by the square of the weights in front of the measured state; since we put in the square-roots initially we recover the original probabilities. Neat!

And since there are two possible game trees that lead to a victory of A, we have to sum them up—and we get precisely the probability we calculated by hand above. This means the team that is most likely to win will be the most likely measurement outcome.
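
Here is a classical simulation of that bookkeeping for the four-team toy cup (an invented probability table again; with real qubits the amplitudes would live in a quantum register rather than a Python dictionary):

import math

# Invented pairwise win probabilities for every ordered pair that could meet
p = {("A", "B"): 0.7, ("B", "A"): 0.3, ("C", "D"): 0.6, ("D", "C"): 0.4,
     ("A", "C"): 0.55, ("C", "A"): 0.45, ("A", "D"): 0.65, ("D", "A"): 0.35,
     ("B", "C"): 0.5, ("C", "B"): 0.5, ("B", "D"): 0.45, ("D", "B"): 0.55}

def q(x, y):
    return math.sqrt(p[(x, y)])   # amplitude = square root of probability

# One amplitude per knockout scenario: (winner of A-B, winner of C-D, champion)
amplitude = {}
for w1 in ("A", "B"):
    loser1 = "B" if w1 == "A" else "A"
    for w2 in ("C", "D"):
        loser2 = "D" if w2 == "C" else "C"
        for champ, runner_up in ((w1, w2), (w2, w1)):
            amplitude[(w1, w2, champ)] = q(w1, loser1) * q(w2, loser2) * q(champ, runner_up)

# "Measuring" the last qudit: square the amplitudes, sum over trees sharing a champion
prob = {team: sum(a ** 2 for key, a in amplitude.items() if key[2] == team)
        for team in ("A", "B", "C", "D")}
print(prob)   # prob["A"] reproduces the hand calculation above; the values sum to 1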

So what about the World Cup? We have 16 teams; one team can thus be stored in four qubits. The knockout tree has 31 vertices, and a naive implementation can be done on a quantum computer with 124 qubits. Of course we are only a bit naive, so we can simulate this quantum computer on a classical one and obtain the following winning probabilities:

0.194 Brazil
0.168 Spain
0.119 France
0.092 Belgium
0.082 Argentina
0.075 England
0.049 Croatia
0.041 Colombia
0.04 Portugal
0.032 Uruguay
0.031 Russia
0.022 Switzerland
0.019 Denmark
0.018 Sweden
0.012 Mexico
0.006 Japan

It is worth noting that all operations we described can be implemented efficiently with a quantum computer, and the number of required qubits is quite small; for the four teams, we could get away with seven qudits, or fourteen qubits (and we could even save some, by ignoring the first four qudits which are always the same).

So for this particular algorithm there is an exponential speedup over its non-probabilistic classical counterpart: as mentioned, one would have to iterate over all trees; tedious for the World Cup, practically impossible for tennis. However…

Classical vs. Quantum

Does using a quantum algorithm give us a speedup for this task? Here, the answer is no; one could obtain similar results in comparable time using probabilistic methods, for instance, by doing Monte Carlo sampling.

But there are several interesting related questions that we could ask for which there might be a quantum advantage.

For some team A, we can easily create a state that has all game trees in superposition that lead to a victory of A—even weighting them using their respective probabilities.
Given this state as a resource, we can think of questions like “which game tree is most likely, given that we fix A and B as semifinalists”, or “which team should A play in the knockout stages to maximize the probability that B wins the tournament”.

Or, more controversially: can we optimize the winning chances for some team by rearranging the initial draw?

Some questions like these lend themselves to applying Grover search, for which there is a known speedup over classical computers. To inquire deeper into the utility of quantum algorithms, we need to invent the right kind of question to ask of this state.

Let us think of one more toy example. Being part physicists, we assume cows are spheres—so we might as well also assume that if A is likely to win a match against B, it always wins—even if the probability is only 51%. Let’s call this exciting sport “deterministic football”. For a set of teams playing a tournament of deterministic football, does there exist a winning initial draw for every team?

This becomes an especially interesting question in cases where there is a non-trivial cyclic relation between the teams’ abilities, a simple example being: A always beats B, B always beats C, and C always beats A. For example, if this problem turns out to be NP-hard, then it would be reasonable to expect that the quadratic improvement achieved by quantum search is the best we can hope for in using a quantum algorithm for the task of finding a winning initial draw for a chosen team—at least for deterministic football (phew).

To the finals and beyond

World Cup time is an exciting time: whatever the question, we are essentially dealing with binary trees, and making predictions can be translated into finding partitions or cuts that satisfy certain properties defined through a function of the edge weights (here the pairwise winning probabilities). We hope this quantum take on classical bookmaking might point us in the direction of new and interesting applications for quantum algorithms.

Hopefully a good bargain!

(Written with Steven Herbert and Sathyawageeswar Subramanian)