Long live Yale’s cemetery

Call me morbid, but, the moment I arrived at Yale, I couldn’t wait to visit the graveyard.

I visited campus last February, to present the Yale Quantum Institute (YQI) Colloquium. The YQI occupies a building whose stone exterior honors Yale’s Gothic architecture and whose sleekness defies it. The YQI has theory and experiments, seminars and colloquia, error-correcting codes and small-scale quantum computers, mugs and laptop bumper stickers. Those assets would have drawn me like honey. But my host, Steve Girvin, piled molasses, fudge, and cookie dough on top: “you should definitely reserve some time to go visit Josiah Willard Gibbs, Jr., Lars Onsager, and John Kirkwood in the Grove Street Cemetery.”


Gibbs, Onsager, and Kirkwood pioneered statistical mechanics. Statistical mechanics is the physics of many-particle systems, energy, efficiency, and entropy, a measure of disorder. Statistical mechanics helps us understand why time flows in only one direction. As a colleague reminded me at a conference about entropy, “You are young. But you will grow old and die.” That conference featured a field trip to a cemetery at the University of Cambridge. My next entropy-centric conference took place next to a cemetery in Banff, Canada. A quantum-thermodynamics conference included a tour of an Oxford graveyard.1 (That conference reincarnated in Santa Barbara last June, but I found no cemeteries nearby. No wonder I haven’t blogged about it.) Why shouldn’t a quantum-thermodynamics colloquium lead to the Grove Street Cemetery?

Home of the Yale Quantum Institute

The Grove Street Cemetery lies a few blocks from the YQI. I walked from the latter to the former on a morning whose sunshine spoke more of springtime than of February. At one entrance stood a gatehouse that looked older than many of the cemetery’s residents.

“Can you tell me where to find Josiah Willard Gibbs?” I asked the gatekeepers. They handed me a map, traced routes on it, and dispatched me from their lodge. Snow had fallen the previous evening but was losing its battle against the sunshine. I sloshed to a pathway labeled “Locust,” waded along Locust until passing Myrtle, and splashed back and forth until a name caught my eye: “Gibbs.” 

One entrance of the Grove Street Cemetery

Josiah Willard Gibbs stamped his name across statistical mechanics during the 1800s. Imagine a gas in a box, a system that illustrates much of statistical mechanics. Suppose that the gas exchanges heat with a temperature-T bath through the box’s walls. After exchanging heat for a long time, the gas reaches thermal equilibrium: Large-scale properties, such as the gas’s energy, quit changing much. Imagine measuring the gas’s energy. What probability does the measurement have of outputting E? The Gibbs distribution provides the answer, e^{ - E / (k_{\rm B} T) } / Z. The k_{\rm B} denotes Boltzmann’s constant, a fundamental constant of nature. The Z denotes a partition function, which ensures that the probabilities sum to one.
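
To make the formula concrete, here is a minimal numerical sketch in Python. The three energy levels and the temperature are made-up placeholder values, not data about any real gas:

import numpy as np

k_B = 1.380649e-23                       # Boltzmann constant, in joules per kelvin
T = 300.0                                # bath temperature in kelvin (placeholder)
energies = np.array([0.0, 1.0, 2.0]) * k_B * T   # toy energy levels

weights = np.exp(-energies / (k_B * T))  # Boltzmann factors e^{-E/(k_B T)}
Z = weights.sum()                        # partition function
probabilities = weights / Z              # the Gibbs distribution

print(probabilities)                     # sums to one, by construction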

Gibbs lent his name to more than probabilities. A function of probabilities, the Gibbs entropy, prefigured information theory. Entropy features in the Gibbs free energy, which dictates how much work certain thermodynamic systems can perform. A thermodynamic system has many properties, such as temperature and pressure. How many can you control? The answer follows from the Gibbs-Duhem relation. You’ll be able to follow the Gibbs walk, a Yale alumnus tells me, once construction on Yale’s physical-sciences complex ends.


Back I sloshed along Locust Lane. Turning left onto Myrtle, then right onto Cedar, led to a tree that sheltered two tombstones. They looked like buddies about to throw their arms around each other and smile for a photo. The lefthand tombstone reported four degrees, eight service positions, and three scientific honors of John Gamble Kirkwood. The righthand tombstone belonged to Lars Onsager:

NOBEL LAUREATE*

[ . . . ]

*ETC.

Onsager extended thermodynamics beyond equilibrium. Imagine gently poking one property of a thermodynamic system. For example, recall the gas in a box. Imagine connecting one end of the box to a temperature-T bath and the other end to a bath at a slightly higher temperature, T' \gtrsim T. You’ll have poked the system’s temperature out of equilibrium. Heat will flow from the hotter bath to the colder bath. Particles carry the heat, energy of motion. Suppose that the particles have electric charges. An electric current will flow because of the temperature difference. Similarly, heat can flow because of an electric potential difference, or a pressure difference, and so on. You can cause a thermodynamic system’s elbow to itch, Onsager showed, by tickling the system’s ankle.

To Onsager’s left lay John Kirkwood. Kirkwood had defined a quasiprobability distribution in 1933. Quasiprobabilities resemble probabilities but can assume negative and nonreal values. These behaviors can signal nonclassical physics, such as the ability to outperform classical computers. I generalized Kirkwood’s quasiprobability with collaborators. Our generalized quasiprobability describes quantum chaos, thermalization, and the spread of information through entanglement. Applying the quasiprobability across theory and experiments has occupied me for two-and-a-half years. Rarely has a tombstone pleased anyone as much as Kirkwood’s tickled me.
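
Here is a minimal sketch, in Python, of Kirkwood’s quasiprobability in its modern (Kirkwood-Dirac) form for a single qubit. The state and the two measurement bases below are illustrative choices of mine, not ones from our papers; the point is only that the entries can be nonreal yet still sum to one:

import numpy as np

# Eigenbases of two noncommuting qubit observables (Pauli Z and Pauli X)
z_basis = [np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)]
x_basis = [np.array([1, 1], dtype=complex) / np.sqrt(2),
           np.array([1, -1], dtype=complex) / np.sqrt(2)]

# A toy state: the +1 eigenstate of Pauli Y (any state would do)
psi = np.array([1, 1j], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Kirkwood-Dirac quasiprobability Q(a, b) = <b|a> <a|rho|b>
total = 0
for a in z_basis:
    for b in x_basis:
        Q = (b.conj() @ a) * (a.conj() @ rho @ b)
        total += Q
        print(Q)      # individual entries can be complex or negative
print(total)          # yet the entries sum to one, like probabilities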


The Grove Street Cemetery opened my morning with a whiff of rosemary. The evening closed with a shot of adrenaline. I met with four undergrad women who were taking Steve Girvin’s course, an advanced introduction to physics. I should have left the conversation bled of energy: Since visiting the cemetery, I’d held six discussions with nine people. But energy can flow backward. The students asked how I’d come to postdoc at Harvard; I asked what they might major in. They described the research they hoped to explore; I explained how I’d constructed my research program. They asked if I’d had to work as hard as they to understand physics; I confessed that I might have had to work harder.

I left the YQI content, that night. Such a future deserves its past; and such a past, its future.


With thanks to Steve Girvin, Florian Carle, and the Yale Quantum Institute for their hospitality.

1Thermodynamics is a physical theory that emerges from statistical mechanics.

“A theorist I can actually talk with”

Haunted mansions have ghosts, football teams have mascots, and labs have in-house theorists. I found myself posing as a lab’s theorist at Caltech. The gig began when Oskar Painter, a Caltech experimentalist, emailed that he’d read my first paper about quantum chaos. Would I discuss the paper with the group?

Oskar’s lab was building superconducting qubits, tiny circuits in which charge can flow forever. The lab aimed to control scores of qubits, to develop a quantum many-body system. Entanglement—strong correlations that quantum systems can sustain and everyday systems can’t—would spread throughout the qubits. The system could realize phases of matter—like many-particle quantum chaos—off-limits to most materials.

How could Oskar’s lab characterize the entanglement, the entanglement’s spread, and the phases? Expert readers will suggest measuring an entropy, a gauge of how much information this part of the system holds about that part. But experimentalists have had trouble measuring entropies. Besides, one measurement can’t capture many-body entanglement; such entanglement involves too many intricacies. Oskar was searching for arrows to add to his lab’s measurement quiver.
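
For instance, the entropy of half of a maximally entangled qubit pair equals ln 2, signaling that one qubit holds a full bit of information about the other. Here is a minimal sketch in Python; it is a generic toy calculation of mine, not the lab’s diagnostic:

import numpy as np

# Entanglement entropy of one half of a Bell pair (a two-qubit toy example)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
rho = np.outer(bell, bell.conj()).reshape(2, 2, 2, 2)
rho_A = np.trace(rho, axis1=1, axis2=3)     # trace out the second qubit

eigenvalues = np.linalg.eigvalsh(rho_A)
entropy = -sum(p * np.log(p) for p in eigenvalues if p > 1e-12)
print(entropy)                              # ln(2), about 0.693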

In-house theorist?

I’d proposed a protocol for measuring a characterization of many-body entanglement, quantum chaos, and thermalization—a property called “the out-of-time-ordered correlator.” The protocol appealed to Oskar. But practicalities limit quantum many-body experiments: The more qubits your system contains, the more the system can contact its environment, like stray particles. The stronger the interactions, the more the environment entangles with the qubits, and the less the qubits entangle with each other. Quantum information leaks from the qubits into their surroundings; what happens in Vegas doesn’t stay in Vegas. Would imperfections mar my protocol?

I didn’t know. But I knew someone who could help us find out.

Justin Dressel works at Chapman University as a physics professor. He’s received the highest praise that I’ve heard any experimentalist give a theorist: “He’s a theorist I can actually talk with.” With other collaborators, Justin and I simplified my scheme for measuring out-of-time-ordered correlators. Justin knew what superconducting-qubit experimentalists could achieve, and he’d been helping them reach for more.

How about, I asked Justin, we simulate our protocol on a computer? We’d code up virtual superconducting qubits, program in interactions with the environment, run our measurement scheme, and assess the results’ noisiness. Justin had the tools to simulate the qubits, but he lacked the time. 

Know any postdocs or students who’d take an interest? I asked.

Chapman University’s former science center. Don’t you wish you spent winters in California?

José Raúl González Alonso has a smile like a welcome sign and a coffee cup glued to one hand. He was moving to Chapman University to work as a Grand Challenges Postdoctoral Fellow. José had built simulations, and he jumped at the chance to study quantum chaos.

José confirmed Oskar’s fear and other simulators’ findings: The environment threatens measurements of the out-of-time-ordered correlator. Suppose that you measure this correlator at each of many instants, plot the correlator against time, and see the correlator drop. If you’ve isolated your qubits from their environment, you can expect them to carry many-body entanglement. Golden. But the correlator can drop if, instead, the environment is harassing your qubits. You can misdiagnose leaking as many-body entanglement.


Our triumvirate identified a solution. Justin and I had discovered another characterization of quantum chaos and many-body entanglement: a quasiprobability, a quantum generalization of a probability.  

The quasiprobability contains more information about the entanglement than the out-of-time-ordered-correlator does. José simulated measurements of the quasiprobability. The quasiprobability, he found, behaves one way when the qubits entangle independently of their environment and behaves another way when the qubits leak. You can measure the quasiprobability to decide whether to trust your out-of-time-ordered-correlator measurement or to isolate your qubits better. The quasiprobability enables us to avoid false positives.

Physical Review Letters published our paper last month. Working with Justin and José deepened my appetite for translating between the abstract and the concrete, for proving abstractions as a theorist’s theorist and realizing them experimentally as a lab’s theorist. Maybe, someday, I’ll earn the tag “a theorist I can actually talk with” from an experimentalist. For now, at least I serve better than a football-team mascot.

Humans can intuit quantum physics.

One evening this January, audience members packed into a lecture hall in MIT’s physics building. Undergraduates, members of the public, faculty members, and other scholars came to watch a film premiere and a panel discussion. NOVA had produced the film, “Einstein’s Quantum Riddle,” which stars entanglement. Entanglement is a relationship between quantum systems such as electrons. Measuring two entangled electrons yields two outcomes, analogous to the numbers that face upward after you roll two dice. The quantum measurements’ outcomes can exhibit correlations stronger than any measurements of any classical, or nonquantum, systems can. Which die faces point upward can share only so much correlation, even if the dice hit each other.
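
For readers who want numbers rather than dice, here is a minimal sketch of how entangled qubits beat classical correlations; it is a toy calculation of mine, unrelated to the film or to the Cosmic Bell experiment. The CHSH combination computed below reaches about 2.83 for a maximally entangled pair, whereas any classical, dice-like model stays at or below 2:

import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    """Spin observable along an axis at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet (maximally entangled) state of two qubits
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(theta_a, theta_b):
    """Correlator of the two measurement outcomes (each +1 or -1)."""
    observable = np.kron(spin(theta_a), spin(theta_b))
    return np.real(singlet.conj() @ observable @ singlet)

a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))   # about 2.83; classical correlations obey |S| <= 2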


Dice feature in the film’s explanations of entanglement. So does a variation on the shell game, in which one hides a ball under one of three cups, shuffles the cups, and challenges viewers to guess which cup is hiding the ball. The film derives its drama from the Cosmic Bell test. Bell tests are experiments crafted to show that classical physics can’t describe entanglement. Scientists recently enhanced Bell tests using light from quasars—ancient, bright, faraway galaxies. Mix astrophysics with quantum physics, and an edgy, pulsing soundtrack follows.

The Cosmic Bell test grew from a proposal by physicists at MIT and the University of Chicago. The coauthors include David Kaiser, a historian of science and a physicist on MIT’s faculty. Dave co-organized the premiere and the panel discussion that followed. The panel featured Dave; Paola Cappellaro, an MIT quantum experimentalist; Alan Guth, an MIT cosmologist who contributed to the Bell test; Calvin Leung, an MIT PhD student who contributed; Chris Schmidt, the film’s producer; and me. Brindha Muniappan, the Director of Education and Public Programs at the MIT Museum, moderated the discussion.

I think that the other panelists were laughing with me.

Brindha asked what challenges I face when explaining quantum physics, such as on this blog. Quantum theory wears the labels “weird,” “counterintuitive,” and “bizarre” in journalism, interviews, blogs, and films. But the thorn in my communicational side reflects quantum “weirdness” less than it reflects humanity’s self-limitation: Many people believe that we can’t grasp quantum physics. They shut down before asking me to explain.

Examples include a friend and Quantum Frontiers follower who asks, year after year, for books about quantum physics. I suggest literature, much of it by Dave Kaiser; he reads some, and we discuss his impressions. He’s learning, he harbors enough curiosity to have maintained this routine for years, and he has technical experience as a programmer. But he’s demurred, several times, along the lines of “But…I don’t know. I don’t think I’ll ever understand it. Humans can’t understand quantum physics, can we? It’s too weird.”

Quantum physics defies many expectations sourced from classical physics. Classical physics governs how basketballs arch, how paint dries, how sunlight slants through your window, and other everyday experiences. Yet we can gain intuition about quantum physics. If we couldn’t, how could we solve problems and accomplish research? Physicists often begin solving problems by trying to guess the answer from intuition. We reason our way toward a guess by stripping away complications, constructing toy models, and telling stories. We tell stories about particles hopping from site to site on lattices, particles trapped in wells, and arrows flipping upward and downward. These stories don’t capture all of quantum physics, but they capture the essentials. After grasping the essentials, we translate them into math, check how far our guesses lie from truth, and correct our understanding. Intuition about quantum physics forms the compass that guides problem solving.
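
As one concrete example of translating such a story into math (an illustrative choice of mine, not an example from the panel), consider a particle hopping between two lattice sites. The story becomes a two-by-two matrix, and diagonalizing the matrix hands you the intuition that the particle’s energy eigenstates spread across both sites:

import numpy as np

# A particle hopping between two sites with hopping amplitude J (toy value)
J = 1.0
H = np.array([[0.0, -J],
              [-J, 0.0]])      # basis: {particle on site 1, particle on site 2}

energies, states = np.linalg.eigh(H)
print(energies)                # [-J, +J]
print(np.round(states, 3))     # each eigenstate has equal weight on both sites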

Growing able to construct, use, and mathematize such stories requires work. You won’t come to understand quantum theory by watching NOVA films, though films can prime you for study. You can gain a facility with quantum theory through classes, problem sets, testing, research, seminars, and further processing. You might not have the time or inclination to. Even if you have, you might not come to understand why quantum theory describes our universe: Science can’t necessarily answer all “why” questions. But you can grasp what quantum theory implies about our universe.

People grasp physics arguably more exotic than quantum theory, without exciting the disbelief excited by a grasp of quantum theory. Consider the Voyager spacecraft launched in 1977. Voyager has survived solar winds and -452º F weather, imaged planets, and entered interstellar space. Classical physics—the physics of how basketballs arch—describes much of Voyager’s experience. But even if you’ve shot baskets, how much intuition do you have about interstellar space? I know physicists who claim to have more intuition about quantum physics than about much of classical physics. When astrophysicists discuss Voyager and interstellar space, moreover, listeners don’t fret that comprehension lies beyond them. No one need fret when quantum physicists discuss the electrons in us.

Fretting might not occur to future generations: Outreach teams are introducing kids to quantum physics through games and videos. Caltech’s Institute for Quantum Information and Matter has partnered with Google to produce QCraft, a quantum variation on Minecraft, and with the University of Southern California on quantum chess. In 2017, the American Physical Society’s largest annual conference featured a session called “Gamification and other Novel Approaches in Quantum Physics Outreach.” Such outreach exposes kids to quantum terminology and concepts early. Quantum theory becomes a playground to explore, rather than a source of intimidation. Players will grow up primed to think about quantum-mechanics courses not “Will my grade-point average survive this semester?” but “Ah, so this is the math under the hood of entanglement.”


Sociology, more than quantum theory itself, primes people to think quantum physics weird. But quantum theory defies classical expectations less than it could. Measurement outcomes could share correlations stronger than the correlations sourced by entanglement. How strong could the correlations grow? How else could physics depart farther from classical physics than quantum physics does? Imagine the worlds governed by all possible types of physics, called “generalized probabilistic theories” (GPTs). GPTs form a landscape in which quantum theory constitutes an island, on which classical physics constitutes a hill. Compared with the landscape’s outskirts, our quantum world looks tame.

GPTs fall under the research category of quantum foundations. Quantum foundations concerns why the math that describes quantum systems describes quantum systems, reformulations of quantum theory, how quantum theory differs from classical mechanics, how quantum theory could deviate but doesn’t, and what happens during measurements of quantum systems. Though questions about quantum foundations remain, they don’t block us from intuiting about quantum theory. A stable owner can sense when a horse has colic despite lacking a veterinary degree.

Moreover, quantum-foundations research has advanced over the past few decades. Collaborations and tools have helped: Theorists have been partnering with experimentalists, such as on the Cosmic Bell test and on studies of measurement. Information theory has engendered mathematical tools for quantifying entanglement and other quantum phenomena. Information theory has also firmed up an approach called “operationalism.” Operationalists emphasize preparation procedures, evolutions, and measurements. Focusing on actions and data concretizes arguments and facilitates comparisons with experiments. As quantum-foundations research has advanced, so have quantum information theory, quantum experiments, quantum technologies, and interdisciplinary cross-pollination. Twentieth-century quantum physicists didn’t imagine the community, perspectives, and knowledge that we’ve accrued. So don’t adopt 20th-century pessimism about understanding quantum theory. Einstein grasped much, but today’s scientific community grasps more. Richard Feynman said, “I think I can safely say that nobody understands quantum mechanics.” Feynman helped spur the quantum-information revolution; he died before its adolescence. Besides, Feynman understood plenty about quantum theory. Intuition jumps off the pages of his lecture notes and speeches.

Landscape beyond quantum theory

I’ve swum in oceans and lakes, studied how the moon generates tides, and canoed. But piloting a steamboat along the Mississippi would baffle me. I could learn, given time, instruction, and practice; so can you learn quantum theory. Don’t let “weirdness,” “bizarreness,” or “counterintuitiveness” intimidate you. Humans can intuit quantum physics.

Chasing Ed Jaynes’s ghost

You can’t escape him, working where information theory meets statistical mechanics.

Information theory concerns how efficiently we can encode information, compute, evade eavesdroppers, and communicate. Statistical mechanics is the physics of many particles. We can’t track every particle in a material, such as a sheet of glass. Instead, we reason about how the conglomerate likely behaves. Since we can’t know how all the particles behave, uncertainty blunts our predictions. Uncertainty also underlies information theory: You can think that your brother wished you a happy birthday on the phone. But noise corroded the signal; he might have wished you a madcap Earth Day.

Edwin Thompson Jaynes united the fields, in two 1957 papers entitled “Information theory and statistical mechanics.” I’ve cited the papers in at least two of mine. Those 1957 papers, and Jaynes’s philosophy, permeate pockets of quantum information theory, statistical mechanics, and biophysics. Say you know a little about some system, Jaynes wrote, like a gas’s average energy. Say you want to describe the gas’s state mathematically. Which state can you most reasonably ascribe to the gas? The state that satisfies the average-energy constraint while reflecting your ignorance of the gas’s other properties. Information theorists quantify ignorance with a function called entropy, so we ascribe to the gas the largest-entropy state consistent with the constraint. Jaynes’s Principle of Maximum Entropy has spread from statistical mechanics to image processing and computer science and beyond. You can’t evade Ed Jaynes.
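
Here is a minimal numerical sketch of Jaynes’s principle, with a made-up four-level spectrum of my own choosing. Among all distributions that satisfy a given average-energy constraint, the maximum-entropy one is a Gibbs distribution, so you can find it by tuning the inverse temperature until the constraint is met:

import numpy as np

energies = np.array([0.0, 1.0, 2.0, 3.0])   # toy spectrum (arbitrary units)
target_average_energy = 1.2                 # the one thing we know about the system

def gibbs(beta):
    weights = np.exp(-beta * energies)
    return weights / weights.sum()

# Bisect on the inverse temperature beta until <E> matches the constraint
low, high = 1e-6, 50.0
for _ in range(100):
    beta = 0.5 * (low + high)
    if gibbs(beta) @ energies > target_average_energy:
        low = beta        # average energy too high, so cool down (raise beta)
    else:
        high = beta

p = gibbs(beta)
entropy = -np.sum(p * np.log(p))
print(p, entropy)         # the maximum-entropy state consistent with the constraint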

I decided to turn the tables on him this December. I was visiting Washington University in St. Louis, where Jaynes worked until six years before his 1998 death. Haunted by Jaynes, I’d hunt down his ghost.


I began with my host, Kater Murch. Kater’s lab performs experiments with superconducting qubits. These quantum circuits sustain currents that can flow forever, without dissipating. I questioned Kater over hummus, the evening after I presented a seminar about quantum uncertainty and equilibration. Kater had arrived at WashU a decade-and-a-half after Jaynes’s passing but had kept his ears open.

Ed Jaynes, Kater said, consulted for a startup, decades ago. The company lacked the funds to pay him, so it offered him stock. That company was Varian, and Jaynes wound up with a pretty penny. He bought a mansion, across the street from campus, where he hosted the physics faculty and grad students every Friday. He’d play a grand piano, and guests would accompany him on instruments they’d bring. The department doubled as his family. 

The library kept a binder of Jaynes’s papers, which Kater had skimmed the previous year. What clarity shone through those papers! With a touch of pride, Kater added that he inhabited Jaynes’s former office. Or the office next door. He wasn’t certain.

I passed the hummus to a grad student of Kater’s. Do you hear stories about Jaynes around the department? I asked. I’d heard plenty about Feynman, as a PhD student at Caltech.

Not many, he answered. Just in conversations like this.

Later that evening, I exchanged emails with Kater. A contemporary of Jaynes’s had attended my seminar, he mentioned. Pity that I’d missed meeting the contemporary.

The following afternoon, I climbed to the physics library on the third floor of Crow Hall. Portraits of suited men greeted me. At the circulation desk, I asked for the binders of Jaynes’s papers.

Who? asked the student behind the granola bars advertised as “Free study snacks—help yourself!” 

E.T. Jaynes, I repeated. He worked here as a faculty member.

She turned to her computer. Can you spell that?

I obeyed while typing the name into the computer for patrons. The catalogue proffered several entries, one of which resembled my target. I wrote down the call number, then glanced at the notes over which the student was bending: “The harmonic oscillator.” An undergrad studying physics, I surmised. Maybe she’ll encounter Jaynes in a couple of years. 

I hiked upstairs, located the statistical-mechanics section, and ran a finger along the shelf. Hurt and Hermann, Itzykson and Drouffe, …Kadanoff and Baym. No Jaynes? I double-checked. No Jaynes. 


Upon descending the stairs, I queried the student at the circulation desk. She checked the catalogue entry, then ahhhed. You’d have to go to the main campus library for this, she said. Do you want directions? I declined, thanked her, and prepared to return to Kater’s lab. Calculations awaited me there; I’d have no time for the main library.

As I reached the physics library’s door, a placard caught my eye. It appeared to list the men whose portraits lined the walls. Arthur Compton…I only glanced at the placard, but I didn’t notice any “Jaynes.”

Arthur Compton greeted me also from an engraving en route to Kater’s lab. Down the hall lay a narrow staircase on whose installation, according to Kater, Jaynes had insisted. Physicists would have, in the stairs’ absence, had to trek down the hall to access the third floor. Of course I wouldn’t photograph the staircase for a blog post. I might belong to the millennial generation, but I aim and click only with purpose. What, though, could I report in a blog post?

That night, I googled “e.t. jaynes.” His Wikipedia page contained only introductory and “Notes” sections. A WashU website offered a biography and unpublished works. But another tidbit I’d heard in the department yielded no Google hits, at first glance. I forbore a second glance, navigated to my inbox, and emailed Kater about plans for the next day.

I’d almost given up on Jaynes when Kater responded. After agreeing to my suggestion, he reported feedback about my seminar: A fellow faculty member “thought that Ed Jaynes (his contemporary) would have been very pleased.” 

The email landed in my “Nice messages” folder within two shakes. 

Leaning back, I reevaluated my data about Jaynes. I’d unearthed little, and little surprise: According to the WashU website, Jaynes “would undoubtedly be uncomfortable with all of the attention being lavished on him now that he is dead.” I appreciate privacy and modesty. Nor does Jaynes need portraits or engravings. His legacy lives in ideas, in people. Faculty from across his department attended a seminar about equilibration and about how much we can know about quantum systems. Kater might or might not inhabit Jaynes’s office. But Kater wears a strip cut from Jaynes’s mantle: Kater’s lab probes the intersection of information theory and statistical mechanics. They’ve built a Maxwell demon, a device that uses information as a sort of fuel to perform thermodynamic work. 

I’ve blogged about legacies that last. Assyrian reliefs carved in alabaster survive for millennia, as do ideas. Jaynes’s ideas thrive; they live even in me.

Did I find Ed Jaynes’s ghost at WashU? I think I honored it, by pursuing calculations instead of pursuing his ghost further. I can’t say whether I found his ghost. But I gained enough information.

 

With thanks to Kater and to the Washington University Department of Physics for their hospitality.

Doctrine of the (measurement) mean

Don’t invite me to dinner the night before an academic year begins.

You’ll find me in an armchair or sitting on my bed, laptop on my lap, journaling. I initiated the tradition the night before beginning college. I take stock of the past year, my present state, and hopes for the coming year.

Much of the exercise fosters what my high-school physics teacher called “an attitude of gratitude”: I reflect on cities I’ve visited, projects firing me up, family events attended, and subfields sampled. Other paragraphs, I want off my chest: Have I pushed this collaborator too hard or that project too little? Miscommunicated or misunderstood? Strayed too far into heuristics or into mathematical formalisms?

If only the “too much” errors, I end up thinking, could cancel the “too little.”

In one quantum-information context, they can.


Imagine that you’ve fabricated the material that will topple steel and graphene; let’s call it a supermetatopoconsulator. How, you wonder, do charge, energy, and particles move through this material? You’ll learn by measuring correlators.

A correlator signals how much, if you poke this piece here, that piece there responds. At least, a two-point correlator does: \langle A(0) B(\tau) \rangle. A(0) represents the poke, which occurs at time t = 0. B(\tau) represents the observable measured there at t = \tau. The \langle . \rangle encapsulates which state \rho the system started in.
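
To make the notation concrete: in the Heisenberg picture, \langle A(0) B(\tau) \rangle = {\rm Tr} [ \rho \, A \, U^\dagger(\tau) B \, U(\tau) ], with U(\tau) the time-evolution operator. Below is a minimal sketch in Python for a single toy qubit; the Hamiltonian, observables, and state are illustrative choices of mine, not a simulation of any real material:

import numpy as np

# Toy system: one qubit with Hamiltonian H = sigma_z (hbar = 1, arbitrary units)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
H = sz

A, B = sx, sx                          # poke with sigma_x at t = 0, probe sigma_x at t = tau
rho = np.array([[0.7, 0.2j],
                [-0.2j, 0.3]])         # a toy initial state (Hermitian, unit trace, positive)

def time_evolution(tau):
    evals, evecs = np.linalg.eigh(H)   # U(tau) = exp(-i H tau), built by diagonalizing H
    return evecs @ np.diag(np.exp(-1j * evals * tau)) @ evecs.conj().T

def two_point(tau):
    U = time_evolution(tau)
    B_tau = U.conj().T @ B @ U         # Heisenberg-picture observable B(tau)
    return np.trace(rho @ A @ B_tau)

print(two_point(0.0))                  # equals Tr(rho A B) = 1 here, since A = B and A^2 = 1
print(two_point(0.7))                  # generally complex once A and B(tau) fail to commute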

Condensed-matter, quantum-optics, and particle experimentalists have measured two-point correlators for years. But consider the three-point correlator \langle A(0) B(\tau) C (\tau' ) \rangle, or a k-point \langle \underbrace{ A(0) \ldots M (\tau^{(k)}) }_k \rangle, for any k \geq 2. Higher-point correlators reflect more-complicated relationships amongst events. Four-point1 correlators associated with multiple times signal quantum chaos and information scrambling. Quantum information scrambles upon spreading across a system through many-body entanglement. Could you measure arbitrary-point, arbitrary-time correlators?

Supermetatopoconsulator (artist’s conception)

Yes, collaborators and I have written, using weak measurements. Weak measurements barely disturb the system being measured. But they extract little information about the measured system. So, to measure a correlator, you’d have to perform many trials. Moreover, your postdocs and students might have little experience with weak measurements. They might not want to learn the techniques required, to recalibrate their detectors, etc. Could you measure these correlators easily?

Yes, if the material consists of qubits,2 according to a paper I published with Justin Dressel, José Raúl González Alonso, and Mordecai Waegell this summer. You could build such a system from, e.g., superconducting circuits, trapped ions, or quantum dots.

You can measure \langle \underbrace{ A(0) B (\tau') C (\tau'') \ldots M (\tau^{(k)}) }_k \rangle, we show, by measuring A at t = 0, waiting until t = \tau', measuring B, and so on until measuring M at t = \tau^{(k)}. The t-values needn’t increase sequentially: \tau'' could be less than \tau', for instance. In that case, you’d have to effectively reverse the flow of time experienced by the qubits. Experimentalists can do so by, for example, flipping magnetic fields upside-down.

Each measurement requires an ancilla, or helper qubit. The ancilla acts as a detector that records the measurement’s outcome. Suppose that A is an observable of qubit #1 of the system of interest. You bring an ancilla to qubit 1, entangle the qubits (force them to interact), and look at the ancilla. (Experts: You perform a controlled rotation on the ancilla, conditioning on the system qubit.)

Each trial yields k measurement outcomes. They form a sequence S, such as (1, 1, 1, -1, -1, \ldots). You should compute a number \alpha, according to a formula we provide, from each measurement outcome and from the measurement’s settings. These numbers form a new sequence S' = \mathbf{(} \alpha_S(1), \alpha_S(2), \ldots \mathbf{)}. Why bother? So that you can force errors to cancel.

Multiply the \alpha’s together, \alpha_S(1) \times \alpha_S(2) \times \ldots, and average the product over the possible sequences S. This average equals the correlator \langle \underbrace{ A(0) \ldots M (\tau^{(k)}) }_k \rangle. Congratulations; you’ve characterized transport in your supermetatopoconsulator.


When measuring, you can couple the ancillas to the system weakly or strongly, disturbing the system a little or a lot. Wouldn’t strong measurements perturb the state \rho whose properties you hope to measure? Wouldn’t the perturbations by measurements one through \ell throw off measurement \ell + 1?

Yes. But the errors introduced by those perturbations cancel in the average. The reason stems from how we construct the \alpha’s: Our formula makes some error terms positive and some negative, and those terms sum to zero on average.


The cancellation offers hope for my journal assessment: Errors can come out in the wash. Not of their own accord, not without forethought. But errors can cancel out in the wash—if you soap your \alpha’s with care.

 

1and six-point, eight-point, etc.

2Rather, each measured observable must square to the identity, e.g., A^2 = 1. Qubit Pauli operators satisfy this requirement.

 

With apologies to Aristotle.

I get knocked down…

“You’ll have to have a thick skin.”

Marcelo Gleiser, a college mentor of mine, emailed the warning. I’d sent a list of physics PhD programs and requested advice about which to attend. Marcelo’s and my department had fostered encouragement and consideration.

Suit up, Marcelo was saying.

Criticism fuels science, as Oxford physicist David Deutsch has written. We have choices about how we criticize. Some criticism styles reflect consideration for the criticized work’s creator. Tufts University philosopher Daniel Dennett has devised guidelines for “criticizing with kindness”:1

1. You should attempt to re-express your target’s position so clearly, vividly, and fairly that your target says, “Thanks, I wish I’d thought of putting it that way.”

2. You should list any points of agreement (especially if they are not matters of general or widespread agreement).

3. You should mention anything you have learned from your target.

4. Only then are you permitted to say so much as a word of rebuttal or criticism.

Scientists skip to step four often—when refereeing papers submitted to journals, when posing questions during seminars, when emailing collaborators, when colleagues sketch ideas at a blackboard. Why? Listening and criticizing require time, thought, and effort—three of a scientist’s most valuable resources. Should any scientist spend those resources on an idea of mine, s/he deserves my gratitude. Spending empathy atop time, thought, and effort can feel supererogatory. Nor do all scientists prioritize empathy and kindness. Others of us prioritize empathy but—as I have over the past five years—grow so used to its latency that we forget to demonstrate it.

Doing science requires facing not only criticism, but also “That doesn’t make sense,” “Who cares?” “Of course not,” and other morale boosters.

Doing science requires resilience.


So do measurements of quantum information (QI) scrambling. Scrambling is a subtle, late, quantum stage of equilibration2 in many-body systems. Example systems include chains of spins3 that interact with each other strongly, such as in ultracold atoms. Exotic examples include black holes in anti-de Sitter space.4

Imagine whacking one side of a chain of interacting spins. Information about the whack will disseminate throughout the chain via entanglement.5 After a long interval (the scrambling time, t_*), spins across the system will share many-body entanglement. No measurement of any few, close-together spins can disclose much about the whack. Information will have scrambled across the system.

QI scrambling has the subtlety of an assassin treading a Persian carpet at midnight. Can we observe scrambling?


A Stanford team proposed a scheme for detecting scrambling using interferometry.6 Justin Dressel, Brian Swingle, and I proposed a scheme based on weak measurements, which refrain from disturbing the measured system much. Other teams have proposed alternatives.

Many schemes rely on effective time reversal: The experimentalist must perform the quantum analog of inverting particles’ momenta. One must negate the Hamiltonian \hat{H}, the observable that governs how the system evolves: \hat{H} \mapsto - \hat{H}.

At least, the experimentalist must try. The experimentalist will likely map \hat{H} to - \hat{H} + \varepsilon. The small error \varepsilon could wreak havoc: QI scrambling relates to chaos, exemplified by the butterfly effect. Tiny perturbations, such as the flap of a butterfly’s wings, can snowball in chaotic systems, as by generating tornadoes. Will the \varepsilon snowball, obscuring observations of scrambling?


It needn’t, Brian and I wrote in a recent paper. You can divide out much of the error until t_*.

You can detect scrambling by measuring an out-of-time-ordered correlator (OTOC), an object I’ve effused about elsewhere. Let’s denote the time-t correlator by F(t). You can infer an approximation \tilde{F}(t) to F(t) upon implementing an \varepsilon-ridden interferometry or weak-measurement protocol. Remove some steps from that protocol, Brian and I say. Infer a simpler, easier-to-measure object \tilde{F}_{\rm simple}(t). Divide the two measurement outcomes to approximate the OTOC:

F(t)  \approx \frac{ \tilde{F}(t) }{ \tilde{F}_{\rm simple}(t) }.

OTOC measurements exhibit resilience to error.


Physicists need resilience. Brian criticizes with such grace, he could serve as the poster child for Daniel Dennett’s guidelines. But not every scientist could. How can we withstand kindness-lite criticism?

By drawing confidence from what we’ve achieved, with help from mentors like Marcelo. I couldn’t tell what about me—if anything—could serve as a rock on which to plant a foot, as an undergrad. Mentors identified what I had too little experience to appreciate. You question what you don’t understand, they said. You assimilate perspectives from textbooks, lectures, practice problems, and past experiences. You scrutinize details while keeping an eye on the big picture. So don’t let so-and-so intimidate you.

I still lack my mentors’ experience, but I’ve imbibed a drop of their insight. I savor calculations that I nail, congratulate myself upon nullifying referees’ concerns, and celebrate the theorems I prove.

I’ve also created an email folder entitled “Nice messages.” In go “I loved your new paper; combining those topics was creative,” “Well done on the seminar; I’m now thinking of exploring that field,” and other rarities. The folder affords an umbrella when physics clouds gather.

Finally, I try to express appreciation of others’ work.7 Science thrives on criticism, but scientists do science. And scientists are human—undergrads, postdocs, senior researchers, and everyone else.

Doing science—and attempting to negate Hamiltonians—we get knocked down. But we can get up again.

 

Around the time Brian and I released “Resilience,” two other groups proposed related renormalizations. Check out their schemes here and here.

1Thanks to Sean Carroll for alerting me to this gem of Dennett’s.

2A system equilibrates as its large-scale properties, like energy, flatline.

3Angular-momentum-like quantum properties

4Certain space-times different from ours

5Correlations, shareable by quantum systems, stronger than any achievable by classical systems

6The cancellation (as by a crest of one wave and a trough of another) of components of a quantum state, or the addition of components (as two waves’ crests)

7Appreciation of specific qualities. “Nice job” can reflect a speaker’s belief but often reflects a desire to buoy a receiver whose work has few merits to elaborate on. I applaud that desire and recommend reinvesting it. “Nice job” carries little content, which evaporates under repetition. Specificity provides content: “Your idea is alluringly simple but could reverberate across multiple fields” has gristle.

What’s the worst that could happen?

The archaeologist Howard Carter discovered Tutankhamun’s burial site in 1922. No other Egyptian pharaoh’s tomb had survived mostly intact until the modern era. Gold and glass and faience, statues and pendants and chariots, had evaded looting. The discovery would revolutionize the world’s understanding of, and enthusiasm for, ancient Egypt.

First, the artifacts had to leave the tomb.

Tutankhamun lay in three coffins nested like matryoshka dolls. Carter describes the nesting in his book The Tomb of Tutankhamen. Lifting the middle coffin from the outer coffin raised his blood pressure:

Everything may seem to be going well until suddenly, in the crisis of the process, you hear a crack—little pieces of surface ornament fall. Your nerves are at an almost painful tension. What is happening? All available room in the narrow space is crowded by your men. What action is needed to avert a catastrophe?

In other words, “What’s the worst that could happen?”


Collaborators and I asked that question in a paper published last month. We had in mind less Egyptology than thermodynamics and information theory. But never mind the distinction; you’re reading Quantum Frontiers! Let’s mix the fields like flour and oil in a Biblical grain offering.

Carter’s team had trouble separating the coffins: Ancient Egyptian priests (presumably) had poured fluid atop the innermost, solid-gold coffin. The fluid had congealed into a brown gunk, gluing the gold coffin to the bottom of the middle coffin. Removing the gold coffin required work—thermodynamic work.

Work consists of “well-ordered” energy usable in tasks like levering coffins out of sarcophagi and motoring artifacts from Egypt’s Valley of the Kings toward Cairo. We can model the gunk as a spring, one end of which was fixed to the gold coffin and one end of which was fixed to the middle coffin. The work W required to stretch a spring depends on the spring’s stiffness (the gunk’s viscosity) and on the distance stretched through.
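
For a textbook Hookean spring, an idealization of the gunk that ignores all the messy details below, stretching quasistatically through a distance x against a stiffness \kappa costs

W = \frac{1}{2} \kappa x^2 .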

W depends also on details: How many air molecules struck the gold coffin from above, opposing the team’s effort? How quickly did Carter’s team pull? Had the gunk above Tutankhamun’s nose settled into a hump or spread out? How about the gunk above Tutankhamun’s left eye socket? Such details barely impact the work required to open a 6.15-foot-long coffin. But air molecules would strongly impact W if Tutankhamun measured a few nanometers in length. So imagine Egyptian matryoshka dolls as long as stubs of DNA.


Imagine that Carter found one million sets of these matryoshka dolls. Lifting a given set’s innermost coffin would require an amount W of work that would vary from set of coffins to set of coffins. W would satisfy fluctuation relations, equalities I’ve blogged about many times.

Fluctuation relations resemble the Second Law of Thermodynamics, which illuminates why time flows in just one direction. But fluctuation relations imply more-precise predictions about W than the Second Law does.

Some predictions concern dissipated work: Carter’s team could avoid spending much work by opening the coffin infinitesimally slowly. Speeding up would heat the gunk, roil air molecules, and more. The heating and roiling would cost extra work, called dissipated work, denoted by W_{\rm diss}.
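
One celebrated fluctuation relation, Jarzynski’s equality, illustrates the flavor of such predictions; I quote it in its standard textbook form, not as it appears in our paper. Average the exponentiated work over many repetitions of the same coffin-opening protocol, and you recover an equilibrium quantity, the free-energy difference \Delta F between the final and initial configurations:

\langle e^{ - W / (k_{\rm B} T) } \rangle = e^{ - \Delta F / (k_{\rm B} T) } .

The work dissipated in any one run is W_{\rm diss} = W - \Delta F, which vanishes in the infinitely slow limit.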

Suppose that Carter’s team has chosen a lid-opening speed v. Consider the greatest W_{\rm diss} that the team might have to waste on any nanoscale coffin. W_{\rm diss}^{\rm worst} is proportional to each of three information-theoretic quantities, my coauthors and I proved.

For experts: Each information-theoretic quantity is an order-infinity Rényi divergence D_\infty ( X || Y). The Rényi divergences generalize the relative entropy D ( X || Y ). D quantifies how efficiently one can distinguish between probability distributions, or quantum states, X and Y on average. The average is over many runs of a guessing game.

Imagine the worst possible run, which offers the lowest odds of guessing correctly. D_\infty quantifies your likelihood of winning. We related W_{\rm diss}^{\rm worst} to a D_\infty between two statistical-mechanical phase-space distributions (when we described classical systems), to a D_\infty between two quantum states (when we described quantum systems), and to a D_\infty between two probability distributions over work quantities W (when we described systems quantum and classical).
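
For classical probability distributions P = (p_1, p_2, \ldots) and Q = (q_1, q_2, \ldots), the order-infinity Rényi divergence reduces to the logarithm of the largest likelihood ratio. Here is a minimal sketch in Python of that textbook definition, alongside the ordinary relative entropy; the code is a generic illustration, not from our paper:

import numpy as np

def relative_entropy(p, q):
    """D(P || Q): average-case distinguishability of P from Q."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    support = p > 0
    return np.sum(p[support] * np.log(p[support] / q[support]))

def max_divergence(p, q):
    """D_infinity(P || Q): the log of the largest likelihood ratio p_i / q_i."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    support = p > 0
    return np.log(np.max(p[support] / q[support]))

p = [0.7, 0.2, 0.1]   # toy distributions
q = [0.4, 0.4, 0.2]
print(relative_entropy(p, q), max_divergence(p, q))   # D never exceeds D_infinity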


The worst case marks an extreme. How do the extremes consistent with physical law look? As though they’ve escaped from a mythologist’s daydream.

In an archaeologist’s worst case, arriving at home in the evening could lead to the following conversation:

“How was your day, honey?”

“The worst possible.”

“What happened?”

“I accidentally eviscerated a 3.5-thousand-year-old artifact—the most valuable, best-preserved, most information-rich, most lavishly wrought ancient Egyptian coffin that existed yesterday.”

Suppose that the archaeologist lived with a physicist. My group (guided by a high-energy physicist) realized that the conversation could continue as follows:

“And how was your day?”

“Also the worst possible.”

“What happened?”

“I created a black hole.”

General relativity and high-energy physics have begun breeding with quantum information and thermodynamics. The offspring bear extremes like few other systems imaginable. I wonder what our results would have to say about those offspring.


National Geographic reprinted Carter’s The Tomb of Tutankhamen in its “Adventure Classics” series. The series title fits Tomb as a mummy’s bandages fit the mummy. Carter’s narrative stretches from Egypt’s New Kingdom (of 3.5 thousand years ago) through the five-year hunt for the tomb (almost fruitless until the final season), to a water boy’s discovery of steps into the tomb, to the unsealing of the burial chamber, to the confrontation of Tutankhamun’s mummy.

Carter’s book guided me better than any audio guide could have at the California Science Center. The center is hosting the exhibition “King Tut: Treasures of the Golden Pharaoh.” After debuting in Los Angeles, the exhibition will tour the world. The tour showcases 150 artifacts from Tutankhamun’s tomb.

Those artifacts drove me to my desk—to my physics—as soon as I returned home from the museum. Tutankhamun’s tomb, Carter argues in his book, ranks amongst the 20th century’s most important scientific discoveries. I’d seen a smidgeon of the magnificence that Carter’s team—with perseverance, ingenuity, meticulousness, and buckets of sweat shed in Egypt’s heat—had discovered. I don’t expect to discover anything a tenth as magnificent. But how can a young scientist resist trying?

People say, “Prepare for the worst. Hope for the best.” I prefer “Calculate the worst. Hope and strive for a Tutankhamun.”


Postscript: Carter’s team failed to unglue the gold coffin by just “stretching” the gunky “spring.” The team resorted to heat, a thermodynamic quantity alternative to work: The team flipped the middle coffin upside-down above a heat lamp. The lamp raised the temperature to 932°F, melting the goo. The melting, with more work, caused the gold coffin to plop out of the middle coffin.