Chasing Ed Jaynes’s ghost

You can’t escape him, working where information theory meets statistical mechanics.

Information theory concerns how efficiently we can encode information, compute, evade eavesdroppers, and communicate. Statistical mechanics is the physics of many particles. We can’t track every particle in a material, such as a sheet of glass. Instead, we reason about how the conglomerate likely behaves. Since we can’t know how all the particles behave, uncertainty blunts our predictions. Uncertainty underlies also information theory: You might believe that your brother wished you a happy birthday over the phone. But noise corroded the signal; he might have wished you a madcap Earth Day.

Edwin Thompson Jaynes united the fields, in two 1957 papers entitled “Information theory and statistical mechanics.” I’ve cited the papers in at least two of mine. Those 1957 papers, and Jaynes’s philosophy, permeate pockets of quantum information theory, statistical mechanics, and biophysics. Say you know a little about some system, Jaynes wrote, like a gas’s average energy. Say you want to describe the gas’s state mathematically. Which state can you most reasonably ascribe to the gas? The state that satisfies the average-energy constraint while reflecting our ignorance about the gas’s remaining properties. Information theorists quantify ignorance with a function called entropy, so we ascribe to the gas the largest-entropy state consistent with what we know. Jaynes’s Principle of Maximum Entropy has spread from statistical mechanics to image processing and computer science and beyond. You can’t evade Ed Jaynes.
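Jaynes’s prescription is concrete enough to run on a laptop. Here’s a minimal sketch (my toy example, with made-up energy levels, not anything from his papers): fix an average energy, and find the distribution that maximizes the entropy subject to that constraint. The maximizer takes the Boltzmann form e^{-\beta E} / Z, so the problem reduces to tuning the Lagrange multiplier \beta until the average energy matches the constraint.

import numpy as np
from scipy.optimize import brentq

# Toy energy levels (arbitrary units) and a target average energy.
energies = np.array([0.0, 1.0, 2.0, 3.0])
target_avg_energy = 1.2

def avg_energy(beta):
    """Average energy of the Boltzmann distribution p(E) ~ exp(-beta * E)."""
    weights = np.exp(-beta * energies)
    p = weights / weights.sum()
    return p @ energies

# Solve <E>(beta) = target for the Lagrange multiplier beta (inverse temperature).
beta = brentq(lambda b: avg_energy(b) - target_avg_energy, -10.0, 10.0)

p_max_ent = np.exp(-beta * energies)
p_max_ent /= p_max_ent.sum()
entropy = -np.sum(p_max_ent * np.log(p_max_ent))

print("beta:", beta)
print("maximum-entropy distribution:", p_max_ent)
print("entropy:", entropy)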

I decided to turn the tables on him this December. I was visiting Washington University in St. Louis, where Jaynes worked until six years before his 1998 death. Haunted by Jaynes, I’d hunt down his ghost.

WashU

I began with my host, Kater Murch. Kater’s lab performs experiments with superconducting qubits. These quantum circuits sustain currents that can flow forever, without dissipating. I questioned Kater over hummus, the evening after I presented a seminar about quantum uncertainty and equilibration. Kater had arrived at WashU a decade-and-a-half after Jaynes’s passing but had kept his ears open.

Ed Jaynes, Kater said, consulted for a startup, decades ago. The company lacked the funds to pay him, so it offered him stock. That company was Varian, and Jaynes wound up with a pretty penny. He bought a mansion, across the street from campus, where he hosted the physics faculty and grad students every Friday. He’d play a grand piano, and guests would accompany him on instruments they’d bring. The department doubled as his family. 

The library kept a binder of Jaynes’s papers, which Kater had skimmed the previous year. What clarity shined through those papers! With a touch of pride, Kater added that he inhabited Jaynes’s former office. Or the office next door. He wasn’t certain.

I passed the hummus to a grad student of Kater’s. Do you hear stories about Jaynes around the department? I asked. I’d heard plenty about Feynman, as a PhD student at Caltech.

Not many, he answered. Just in conversations like this.

Later that evening, I exchanged emails with Kater. A contemporary of Jaynes’s had attended my seminar, he mentioned. Pity that I’d missed meeting the contemporary.

The following afternoon, I climbed to the physics library on the third floor of Crow Hall. Portraits of suited men greeted me. At the circulation desk, I asked for the binders of Jaynes’s papers.

Who? asked the student behind the granola bars advertised as “Free study snacks—help yourself!” 

E.T. Jaynes, I repeated. He worked here as a faculty member.

She turned to her computer. Can you spell that?

I obeyed while typing the name into the computer for patrons. The catalogue proffered several entries, one of which resembled my target. I wrote down the call number, then glanced at the notes over which the student was bending: “The harmonic oscillator.” An undergrad studying physics, I surmised. Maybe she’ll encounter Jaynes in a couple of years. 

I hiked upstairs, located the statistical-mechanics section, and ran a finger along the shelf. Hurt and Hermann, Itzykson and Drouffe, …Kadanoff and Baym. No Jaynes? I double-checked. No Jaynes. 

Library books

Upon descending the stairs, I queried the student at the circulation desk. She checked the catalogue entry, then ahhhed. You’d have to go to the main campus library for this, she said. Do you want directions? I declined, thanked her, and prepared to return to Kater’s lab. Calculations awaited me there; I’d have no time for the main library.

As I reached the physics library’s door, a placard caught my eye. It appeared to list the men whose portraits lined the walls. Arthur Compton…I only glanced at the placard, but I didn’t notice any “Jaynes.”

Arthur Compton greeted me also from an engraving en route to Kater’s lab. Down the hall lay a narrow staircase on whose installation, according to Kater, Jaynes had insisted. Physicists would have, in the stairs’ absence, had to trek down the hall to access the third floor. Of course I wouldn’t photograph the staircase for a blog post. I might belong to the millennial generation, but I aim and click only with purpose. What, though, could I report in a blog post?

That night, I googled “e.t. jaynes.” His Wikipedia page contained only introductory and “Notes” sections. A WashU website offered a biography and unpublished works. But another tidbit I’d heard in the department yielded no Google hits, at first glance. I forbore a second glance, navigated to my inbox, and emailed Kater about plans for the next day.

I’d almost given up on Jaynes when Kater responded. After agreeing to my suggestion, he reported feedback about my seminar: A fellow faculty member “thought that Ed Jaynes (his contemporary) would have been very pleased.” 

The email landed in my “Nice messages” folder within two shakes. 

Leaning back, I reevaluated my data about Jaynes. I’d unearthed little, and little surprise: According to the WashU website, Jaynes “would undoubtedly be uncomfortable with all of the attention being lavished on him now that he is dead.” I appreciate privacy and modesty. Nor does Jaynes need portraits or engravings. His legacy lives in ideas, in people. Faculty from across his department attended a seminar about equilibration and about how much we can know about quantum systems. Kater might or might not inhabit Jaynes’s office. But Kater wears a strip cut from Jaynes’s mantle: Kater’s lab probes the intersection of information theory and statistical mechanics. They’ve built a Maxwell demon, a device that uses information as a sort of fuel to perform thermodynamic work. 

I’ve blogged about legacies that last. Assyrian reliefs carved in alabaster survive for millennia, as do ideas. Jaynes’s ideas thrive; they live even in me.

Did I find Ed Jaynes’s ghost at WashU? I think I honored it, by pursuing calculations instead of pursuing his ghost further. I can’t say whether I found his ghost. But I gained enough information.

 

With thanks to Kater and to the Washington University Department of Physics for their hospitality.

Doctrine of the (measurement) mean

Don’t invite me to dinner the night before an academic year begins.

You’ll find me in an armchair or sitting on my bed, laptop on my lap, journaling. I initiated the tradition the night before beginning college. I take stock of the past year, my present state, and hopes for the coming year.

Much of the exercise fosters what my high-school physics teacher called “an attitude of gratitude”: I reflect on cities I’ve visited, projects firing me up, family events attended, and subfields sampled. Other paragraphs, I want off my chest: Have I pushed this collaborator too hard or that project too little? Miscommunicated or misunderstood? Strayed too far into heuristics or into mathematical formalisms?

If only the “too much” errors, I end up thinking, could cancel the “too little.”

In one quantum-information context, they can.

Seesaw

Imagine that you’ve fabricated the material that will topple steel and graphene; let’s call it a supermetatopoconsulator. How, you wonder, do charge, energy, and particles move through this material? You’ll learn by measuring correlators.

A correlator signals how much, if you poke this piece here, that piece there responds. At least, a two-point correlator does: \langle A(0) B(\tau) \rangle. A(0) represents the poke, which occurs at time t = 0. B(\tau) represents the observable measured there at t = \tau. The \langle . \rangle encapsulates which state \rho the system started in.
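To make the definition concrete, here’s a toy numerical evaluation (a single qubit, with a Hamiltonian, state, and observables I’ve made up): the correlator equals {\rm Tr} [ \rho \, A \, B(\tau) ], where B(\tau) = e^{i H \tau} B e^{-i H \tau} is the Heisenberg-picture observable.

import numpy as np
from scipy.linalg import expm

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

H = 0.5 * Z                            # toy single-qubit Hamiltonian (hbar = 1)
rho = 0.5 * np.eye(2, dtype=complex)   # maximally mixed initial state
A, B = X, X                            # the "poke" and the "response" observables
tau = 1.3

U = expm(-1j * H * tau)                # time-evolution operator
B_tau = U.conj().T @ B @ U             # Heisenberg-picture B(tau)

correlator = np.trace(rho @ A @ B_tau)
print("two-point correlator <A(0) B(tau)>:", correlator)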

Condensed-matter, quantum-optics, and particle experimentalists have measured two-point correlators for years. But consider the three-point correlator \langle A(0) B(\tau) C (\tau' ) \rangle, or a k-point \langle \underbrace{ A(0) \ldots M (\tau^{(k)}) }_k \rangle, for any k \geq 2. Higher-point correlators reflect more-complicated relationships amongst events. Four-point correlators1 associated with multiple times signal quantum chaos and information scrambling. Quantum information scrambles upon spreading across a system through many-body entanglement. Could you measure arbitrary-point, arbitrary-time correlators?

New material

Supermetatopoconsulator (artist’s conception)

Yes, collaborators and I have written, using weak measurements. Weak measurements barely disturb the system being measured. But they extract little information about the measured system. So, to measure a correlator, you’d have to perform many trials. Moreover, your postdocs and students might have little experience with weak measurements. They might not want to learn the required techniques, recalibrate their detectors, etc. Could you measure these correlators easily?

Yes, if the material consists of qubits,2 according to a paper I published with Justin Dressel, José Raúl González Alonso, and Mordecai Waegell this summer. You could build such a system from, e.g., superconducting circuits, trapped ions, or quantum dots.

You can measure \langle \underbrace{ A(0) B (\tau') C (\tau'') \ldots M (\tau^{(k)}) }_k \rangle, we show, by measuring A at t = 0, waiting until t = \tau', measuring B, and so on until measuring M at t = \tau^{(k)}. The times needn’t increase sequentially: \tau'' could be less than \tau', for instance. You’d have to effectively reverse the flow of time experienced by the qubits. Experimentalists can do so by, for example, flipping magnetic fields upside-down.

Each measurement requires an ancilla, or helper qubit. The ancilla acts as a detector that records the measurement’s outcome. Suppose that A is an observable of qubit #1 of the system of interest. You bring an ancilla to qubit 1, entangle the qubits (force them to interact), and look at the ancilla. (Experts: You perform a controlled rotation on the ancilla, conditioning on the system qubit.)

Each trial yields k measurement outcomes. They form a sequence S, such as (1, 1, 1, -1, -1, \ldots). You should compute a number \alpha, according to a formula we provide, from each measurement outcome and from the measurement’s settings. These numbers form a new sequence S' = \mathbf{(} \alpha_S(1), \alpha_S(2), \ldots \mathbf{)}. Why bother? So that you can force errors to cancel.

Multiply the \alpha’s together, \alpha_S(1) \times \alpha_S(2) \times \ldots, and average the product over the possible sequences S. This average equals the correlator \langle \underbrace{ A(0) \ldots M (\tau^{(k)}) }_k \rangle. Congratulations; you’ve characterized transport in your supermetatopoconsulator.
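To make the trial structure concrete, here’s a bare-bones simulation (a single qubit, strong projective measurements only, and a Hamiltonian, observables, and times I’ve made up). It omits the \alpha-weighting from our paper, so the raw average of outcome products printed at the end is not yet the correlator; the sketch only shows the evolve, measure, and record steps that each trial comprises, including the backward evolution when a later measurement time precedes an earlier one.

import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Single-qubit toy model: measure Pauli observables at a sequence of times.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = 0.7 * X                             # toy Hamiltonian
observables = [Z, Z, Z]                 # A, B, C (each squares to the identity)
times = [0.0, 0.9, 0.4]                 # measurement times; needn't increase

def projectors(obs):
    """Return the +1 and -1 eigenprojectors of a Pauli-like observable."""
    vals, vecs = np.linalg.eigh(obs)
    return {int(round(v)): np.outer(vecs[:, i], vecs[:, i].conj())
            for i, v in enumerate(vals)}

def run_trial(rho0):
    """One trial: evolve to each time, measure projectively, record the +/-1 outcomes."""
    outcomes, t_prev, rho = [], 0.0, rho0.copy()
    for obs, t in zip(observables, times):
        U = expm(-1j * H * (t - t_prev))            # negative duration = reversed evolution
        rho = U @ rho @ U.conj().T
        projs = projectors(obs)
        probs = {s: np.real(np.trace(P @ rho)) for s, P in projs.items()}
        s = rng.choice([+1, -1], p=[probs[+1], probs[-1]])
        rho = projs[s] @ rho @ projs[s] / probs[s]  # post-measurement state
        outcomes.append(s)
        t_prev = t
    return outcomes

rho0 = 0.5 * np.eye(2, dtype=complex)   # maximally mixed initial state
products = [np.prod(run_trial(rho0)) for _ in range(5000)]
print("raw average of outcome products (no alpha-correction):", np.mean(products))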

Success

When measuring, you can couple the ancillas to the system weakly or strongly, disturbing the system a little or a lot. Wouldn’t strong measurements perturb the state \rho whose properties you hope to measure? Wouldn’t the perturbations by measurements one through \ell throw off measurement \ell + 1?

Yes. But the errors introduced by those perturbations cancel in the average. The reason stems from how we construct the \alpha’s: Our formula makes some products positive and some negative, such that the erroneous contributions cancel out.

Balance 2

The cancellation offers hope for my journal assessment: Errors can come out in the wash. Not of their own accord, not without forethought. But errors can cancel out in the wash—if you soap your \alpha’s with care.

 

1and six-point, eight-point, etc.

2Rather, each measured observable must square to the identity, e.g., A^2 = 1. Qubit Pauli operators satisfy this requirement.

 

With apologies to Aristotle.

I get knocked down…

“You’ll have to have a thick skin.”

Marcelo Gleiser, a college mentor of mine, emailed the warning. I’d sent a list of physics PhD programs and requested advice about which to attend. Marcelo’s and my department had fostered encouragement and consideration.

Suit up, Marcelo was saying.

Criticism fuels science, as Oxford physicist David Deutsch has written. We have choices about how we criticize. Some criticism styles reflect consideration for the criticized work’s creator. Tufts University philosopher Daniel Dennett has devised guidelines for “criticizing with kindness”:1

1. You should attempt to re-express your target’s position so clearly, vividly, and fairly that your target says, “Thanks, I wish I’d thought of putting it that way.”

2. You should list any points of agreement (especially if they are not matters of general or widespread agreement).

3. You should mention anything you have learned from your target.

4. Only then are you permitted to say so much as a word of rebuttal or criticism.

Scientists skip to step four often—when refereeing papers submitted to journals, when posing questions during seminars, when emailing collaborators, when colleagues sketch ideas at a blackboard. Why? Listening and criticizing require time, thought, and effort—three of a scientist’s most valuable resources. Should any scientist spend those resources on an idea of mine, s/he deserves my gratitude. Spending empathy atop time, thought, and effort can feel supererogatory. Nor do all scientists prioritize empathy and kindness. Others of us prioritize empathy but have—as I have over the past five years—grown so used to its latency that we forget to demonstrate it.

Doing science requires facing not only criticism, but also “That doesn’t make sense,” “Who cares?” “Of course not,” and other morale boosters.

Doing science requires resilience.

Resilience

So do measurements of quantum information (QI) scrambling. Scrambling is a subtle, late, quantum stage of equilibration2 in many-body systems. Example systems include chains of spins,3 such as in ultracold atoms, that interact with each other strongly. Exotic examples include black holes in anti-de Sitter space.4

Imagine whacking one side of a chain of interacting spins. Information about the whack will disseminate throughout the chain via entanglement.5 After a long interval (the scrambling time, t_*), spins across the system will share many-body entanglement. No measurement of any few, close-together spins can disclose much about the whack. Information will have scrambled across the system.

QI scrambling has the subtlety of an assassin treading a Persian carpet at midnight. Can we observe scrambling?

Carpet

A Stanford team proposed a scheme for detecting scrambling using interferometry.6 Justin Dressel, Brian Swingle, and I proposed a scheme based on weak measurements, which refrain from disturbing the measured system much. Other teams have proposed alternatives.

Many schemes rely on effective time reversal: The experimentalist must perform the quantum analog of inverting particles’ momenta. One must negate the Hamiltonian \hat{H}, the observable that governs how the system evolves: \hat{H} \mapsto - \hat{H}.

At least, the experimentalist must try. The experimentalist will likely map \hat{H} to - \hat{H} + \varepsilon. The small error \varepsilon could wreak havoc: QI scrambling relates to chaos, exemplified by the butterfly effect. Tiny perturbations, such as the flap of a butterfly’s wings, can snowball in chaotic systems, as by generating tornadoes. Will the \varepsilon snowball, obscuring observations of scrambling?

Snowball

It needn’t, Brian and I wrote in a recent paper. You can divide out much of the error until t_*.

You can detect scrambling by measuring an out-of-time-ordered correlator (OTOC), an object I’ve effused about elsewhere. Let’s denote the time-t correlator by F(t). You can infer an approximation \tilde{F}(t) to F(t) upon implementing an \varepsilon-ridden interferometry or weak-measurement protocol. Remove some steps from that protocol, Brian and I say. Infer a simpler, easier-to-measure object \tilde{F}_{\rm simple}(t). Divide the two measurement outcomes to approximate the OTOC:

F(t)  \approx \frac{ \tilde{F}(t) }{ \tilde{F}_{\rm simple}(t) }.

OTOC measurements exhibit resilience to error.
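For readers who’d like to see the object itself, here’s a toy exact computation of an OTOC (my choice of a four-site mixed-field Ising chain at infinite temperature, with \hat{W} and \hat{V} Pauli operators on opposite ends; none of the interferometry or weak-measurement machinery appears). The correlator takes the standard form F(t) = \langle \hat{W}^\dagger(t) \hat{V}^\dagger \hat{W}(t) \hat{V} \rangle and drifts away from 1 as information scrambles.

import numpy as np
from scipy.linalg import expm
from functools import reduce

# Pauli matrices and a helper that places an operator on site j of an n-site chain.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def on_site(op, j, n):
    return reduce(np.kron, [op if k == j else I2 for k in range(n)])

n = 4
# Mixed-field Ising Hamiltonian, a common toy model for scrambling.
H = sum(-on_site(Z, j, n) @ on_site(Z, j + 1, n) for j in range(n - 1))
H += sum(-1.05 * on_site(X, j, n) + 0.5 * on_site(Z, j, n) for j in range(n))

W = on_site(Z, 0, n)              # "butterfly" operator on the first site
V = on_site(Z, n - 1, n)          # probe operator on the last site
rho = np.eye(2 ** n) / 2 ** n     # infinite-temperature state

for t in [0.0, 1.0, 2.0, 3.0]:
    U = expm(-1j * H * t)
    Wt = U.conj().T @ W @ U                              # Heisenberg-picture W(t)
    F = np.trace(rho @ Wt.conj().T @ V.conj().T @ Wt @ V)
    print(f"t = {t:.1f}: Re F(t) = {F.real:.4f}")        # F is real here up to numerics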

Arm 2

Physicists need resilience. Brian criticizes with such grace, he could serve as the poster child for Daniel Dennett’s guidelines. But not every scientist could. How can we withstand kindness-lite criticism?

By drawing confidence from what we’ve achieved, with help from mentors like Marcelo. I couldn’t tell what about me—if anything—could serve as a rock on which to plant a foot, as an undergrad. Mentors identified what I had too little experience to appreciate. You question what you don’t understand, they said. You assimilate perspectives from textbooks, lectures, practice problems, and past experiences. You scrutinize details while keeping an eye on the big picture. So don’t let so-and-so intimidate you.

I still lack my mentors’ experience, but I’ve imbibed a drop of their insight. I savor calculations that I nail, congratulate myself upon nullifying referees’ concerns, and celebrate the theorems I prove.

I’ve also created an email folder entitled “Nice messages.” In go “I loved your new paper; combining those topics was creative,” “Well done on the seminar; I’m now thinking of exploring that field,” and other rarities. The folder affords an umbrella when physics clouds gather.

Finally, I try to express appreciation of others’ work.7 Science thrives on criticism, but scientists do science. And scientists are human—undergrads, postdocs, senior researchers, and everyone else.

Doing science—and attempting to negate Hamiltonians—we get knocked down. But we can get up again.

 

Around the time Brian and I released “Resilience,” two other groups proposed related renormalizations. Check out their schemes here and here.

1Thanks to Sean Carroll for alerting me to this gem of Dennett’s.

2A system equilibrates as its large-scale properties, like energy, flatline.

3Angular-momentum-like quantum properties

4Certain space-times different from ours

5Correlations, shareable by quantum systems, stronger than any achievable by classical systems

6The cancellation (as by a crest of one wave and a trough of another) of components of a quantum state, or the addition of components (as two waves’ crests)

7Appreciation of specific qualities. “Nice job” can reflect a speaker’s belief but often reflects a desire to buoy a receiver whose work has few merits to elaborate on. I applaud that desire and recommend reinvesting it. “Nice job” carries little content, which evaporates under repetition. Specificity provides content: “Your idea is alluringly simple but could reverberate across multiple fields” has gristle.

What’s the worst that could happen?

The archaeologist Howard Carter discovered Tutankhamun’s burial site in 1922. No other Egyptian pharaoh’s tomb had survived mostly intact until the modern era. Gold and glass and faience, statues and pendants and chariots, had evaded looting. The discovery would revolutionize the world’s understanding of, and enthusiasm for, ancient Egypt.

First, the artifacts had to leave the tomb.

Tutankhamun lay in three coffins nested like matryoshka dolls. Carter describes the nesting in his book The Tomb of Tutankhamen. Lifting the middle coffin from the outer coffin raised his blood pressure:

Everything may seem to be going well until suddenly, in the crisis of the process, you hear a crack—little pieces of surface ornament fall. Your nerves are at an almost painful tension. What is happening? All available room in the narrow space is crowded by your men. What action is needed to avert a catastrophe?

In other words, “What’s the worst that could happen?”

Matryoshka dolls

Collaborators and I asked that question in a paper published last month. We had in mind less Egyptology than thermodynamics and information theory. But never mind the distinction; you’re reading Quantum Frontiers! Let’s mix the fields like flour and oil in a Biblical grain offering.

Carter’s team had trouble separating the coffins: Ancient Egyptian priests (presumably) had poured fluid atop the innermost, solid-gold coffin. The fluid had congealed into a brown gunk, gluing the gold coffin to the bottom of the middle coffin. Removing the gold coffin required work—thermodynamic work.

Work consists of “well-ordered” energy usable in tasks like levering coffins out of sarcophagi and motoring artifacts from Egypt’s Valley of the Kings toward Cairo. We can model the gunk as a spring, one end of which was fixed to the gold coffin and one end of which was fixed to the middle coffin. The work W required to stretch a spring depends on the spring’s stiffness (the gunk’s viscosity) and on the distance stretched through.
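For an idealized Hooke’s-law spring of stiffness k stretched through a distance x (the textbook model, not a claim about the gunk’s actual rheology), that dependence takes the familiar form

W = \frac{1}{2} k x^2 .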

W depends also on details: How many air molecules struck the gold coffin from above, opposing the team’s effort? How quickly did Carter’s team pull? Had the gunk above Tutankhamun’s nose settled into a hump or spread out? How about the gunk above Tutankhamun’s left eye socket? Such details barely impact the work required to open a 6.15-foot-long coffin. But air molecules would strongly impact W if Tutankhamun measured a few nanometers in length. So imagine Egyptian matryoshka dolls as long as stubs of DNA.

DNA

Imagine that Carter found one million sets of these matryoshka dolls. Lifting a given set’s innermost coffin would require an amount W of work that would vary from set of coffins to set of coffins. W would satisfy fluctuation relations, equalities I’ve blogged about many times.

Fluctuation relations resemble the Second Law of Thermodynamics, which illuminates why time flows in just one direction. But fluctuation relations imply more-precise predictions about W than the Second Law does.

Some predictions concern dissipated work: Carter’s team could avoid spending much work by opening the coffin infinitesimally slowly. Speeding up would heat the gunk, roil air molecules, and more. The heating and roiling would cost extra work, called dissipated work, denoted by W_{\rm diss}.

Suppose that Carter’s team has chosen a lid-opening speed v. Consider the greatest W_{\rm diss} that the team might have to waste on any nanoscale coffin; call it W_{\rm diss}^{\rm worst}. This worst-case dissipated work is proportional to each of three information-theoretic quantities, my coauthors and I proved.

For experts: Each information-theoretic quantity is an order-infinity Rényi divergence D_\infty ( X || Y). The Rényi divergences generalize the relative entropy D ( X || Y ). D quantifies how efficiently one can distinguish between probability distributions, or quantum states, X and Y on average. The average is over many runs of a guessing game.

Imagine the worst possible run, which offers the lowest odds of guessing correctly. D_\infty quantifies your likelihood of winning. We related W_{\rm diss}^{\rm worst} to a D_\infty between two statistical-mechanical phase-space distributions (when we described classical systems), to a D_\infty between two quantum states (when we described quantum systems), and to a D_\infty between two probability distributions over work quantities W (when we described systems quantum and classical).
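For the classical case, the quantity in question has a simple closed form (this is the standard definition of the order-\infty Rényi divergence, not anything specific to our paper). For probability distributions X and Y over outcomes x,

D_\infty ( X || Y ) = \log \max_x \frac{ X(x) }{ Y(x) } ,

whereas the ordinary relative entropy is D ( X || Y ) = \sum_x X(x) \log \frac{ X(x) }{ Y(x) }. The maximum over x is what ties D_\infty to the single worst run of the guessing game rather than to the average run. For quantum states, D_\infty generalizes to the max-relative entropy, the logarithm of the smallest \lambda such that X \leq \lambda Y.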

Book-paper

The worst case marks an extreme. How do the extremes consistent with physical law look? As though they’ve escaped from a mythologist’s daydream.

In an archaeologist’s worst case, arriving at home in the evening could lead to the following conversation:

“How was your day, honey?”

“The worst possible.”

“What happened?”

“I accidentally eviscerated a 3.5-thousand-year-old artifact—the most valuable, best-preserved, most information-rich, most lavishly wrought ancient Egyptian coffin that existed yesterday.”

Suppose that the archaeologist lived with a physicist. My group (guided by a high-energy physicist) realized that the conversation could continue as follows:

“And how was your day?”

“Also the worst possible.”

“What happened?”

“I created a black hole.”

General relativity and high-energy physics have begun breeding with quantum information and thermodynamics. The offspring bear extremes like few other systems imaginable. I wonder what our results would have to say about those offspring.

Oops

National Geographic reprinted Carter’s The Tomb of Tutankhamen in its “Adventure Classics” series. The series title fits Tomb as a mummy’s bandages fit the mummy. Carter’s narrative stretches from Egypt’s New Kingdom (of 3.5 thousand years ago) through the five-year hunt for the tomb (almost fruitless until the final season), to a water boy’s discovery of steps into the tomb, to the unsealing of the burial chamber, to the confrontation of Tutankhamun’s mummy.

Carter’s book guided me better than any audio guide could have at the California Science Center. The center is hosting the exhibition “King Tut: Treasures of the Golden Pharaoh.” After debuting in Los Angeles, the exhibition will tour the world. The tour showcases 150 artifacts from Tutankhamun’s tomb.

Those artifacts drove me to my desk—to my physics—as soon as I returned home from the museum. Tutankhamun’s tomb, Carter argues in his book, ranks amongst the 20th century’s most important scientific discoveries. I’d seen a smidgeon of the magnificence that Carter’s team—with perseverance, ingenuity, meticulousness, and buckets of sweat shed in Egypt’s heat—had discovered. I don’t expect to discover anything a tenth as magnificent. But how can a young scientist resist trying?

People say, “Prepare for the worst. Hope for the best.” I prefer “Calculate the worst. Hope and strive for a Tutankhamun.”

Outside exhibition

Postscript: Carter’s team failed to unglue the gold coffin by just “stretching” the gunky “spring.” The team resorted to heat, a thermodynamic quantity alternative to work: The team flipped the middle coffin upside-down above a heat lamp. The lamp raised the temperature to 932°F, melting the goo. The melting, with more work, caused the gold coffin to plop out of the middle coffin.

Catching up with the quantum-thermo crowd

You have four hours to tour Oxford University.

What will you visit? The Ashmolean Museum, home to da Vinci drawings, samurai armor, and Egyptian mummies? The Bodleian, one of Europe’s oldest libraries? Turf Tavern, where former president Bill Clinton reportedly “didn’t inhale” marijuana?

Felix Binder showed us a cemetery.

Of course he showed us a cemetery. We were at a thermodynamics conference.

The Fifth Quantum Thermodynamics Conference took place in the City of Dreaming Spires.1 Participants enthused about energy, information, engines, and the flow of time. About 160 scientists attended—roughly 60 more than attended the first conference, co-organizer Janet Anders estimated.


Weak measurements and quasiprobability distributions were trending. The news delighted me, Quantum Frontiers regulars won’t be surprised to hear.

Measurements disturb quantum systems, as early-20th-century physicist Werner Heisenberg intuited. Measure a system’s position strongly, and you forfeit your ability to predict the outcomes of future momentum measurements. Weak measurements don’t disturb the system much. In exchange, weak measurements provide little information about the system. But you can recoup information by performing a weak measurement in each of many trials, then processing the outcomes.

Strong measurements lead to probability distributions: Imagine preparing a particle in some quantum state, then measuring its position strongly, in each of many trials. From the outcomes, you can infer a probability distribution \{ p(x) \}, wherein p(x) denotes the probability that the next trial will yield position x.

Weak measurements lead analogously to quasiprobability distributions. Quasiprobabilities resemble probabilities but can misbehave: Probabilities are real numbers no less than zero. Quasiprobabilities can dip below zero and can assume nonreal values.
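To see a quasiprobability misbehave, you can compute one in a few lines. The sketch below evaluates a Kirkwood-Dirac quasiprobability for a single qubit (an illustrative textbook example of my choosing, not data from any talk): q(a, f) = \langle f | a \rangle \langle a | \psi \rangle \langle \psi | f \rangle, which sums to one over the two bases yet contains entries with negative real parts and nonzero imaginary parts.

import numpy as np

# Kirkwood-Dirac quasiprobability of a qubit state |psi> over two bases:
# q(a, f) = <f|a><a|psi><psi|f>.  The entries sum to 1 but needn't be
# nonnegative real numbers.

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

theta, phi = np.pi / 8, np.pi / 6
psi = np.cos(theta) * ket0 + np.exp(1j * phi) * np.sin(theta) * ket1

basis_a = [ket0, ket1]      # eigenbasis of Z
basis_f = [plus, minus]     # eigenbasis of X

total = 0
for a in basis_a:
    for f in basis_f:
        q = np.vdot(f, a) * np.vdot(a, psi) * np.vdot(psi, f)
        total += q
        print("q =", np.round(q, 3))
print("sum of quasiprobabilities:", np.round(total, 10))   # equals 1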

Do not disturb 2

What relevance have weak measurements and quasiprobabilities to quantum thermodynamics? Thermodynamics involves work and heat. Work is energy harnessed to perform useful tasks, like propelling a train from London to Oxford. Heat is energy that jiggles systems randomly.

Quantum properties obscure the line between work and heat. (Here’s an illustration for experts: Consider an isolated quantum system, such as a spin chain. Let H(t) denote the Hamiltonian that evolves with the time t \in [0, t_f]. Consider preparing the system in an energy eigenstate | E_i(0) \rangle. This state has zero diagonal entropy: Measuring the energy yields E_i(0) deterministically. Consider tuning H(t), as by changing a magnetic field. This change constitutes work, we learn in electrodynamics class. But if H(t) changes quickly, the state can acquire weight on multiple energy eigenstates. The diagonal entropy rises. The system’s energetics have gained an unreliability characteristic of heat absorption. But the system has remained isolated from any heat bath. Work mimics heat.)

Quantum thermodynamicists have defined work in terms of a two-point measurement scheme: Initialize the quantum system, such as by letting heat flow between the system and a giant, fixed-temperature heat reservoir until the system equilibrates. Measure the system’s energy strongly, and call the outcome E_i. Isolate the system from the reservoir. Tune the Hamiltonian, performing the quantum equivalent of propelling the London train up a hill. Measure the energy, and call the outcome E_f.

Any change \Delta E in a system’s energy comes from heat Q and/or from work W, by the First Law of Thermodynamics, \Delta E = Q + W.  Our system hasn’t exchanged energy with any heat reservoir between the measurements. So the energy change consists of work: E_f - E_i =: W.

Train

Imagine performing this protocol in each of many trials. Different trials will require different amounts W of work. Upon recording the amounts, you can infer a distribution \{ p(W) \}. p(W) denotes the probability that the next trial will require an amount W of work.
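Here’s a minimal numerical sketch of the two-point measurement scheme (a single qubit undergoing a sudden quench from Hamiltonian Z to Hamiltonian X, with an inverse temperature I’ve picked arbitrarily; the Hamiltonians and numbers are toy choices, not from any particular paper). Each trial draws the first energy from the thermal distribution, draws the second from the overlaps between the old and new energy eigenbases, and records W = E_f - E_i.

import numpy as np

rng = np.random.default_rng(1)

# Two-point measurement scheme for a single qubit under a sudden quench,
# H_i = Z  ->  H_f = X  (toy Hamiltonians; hbar = 1).
Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
beta = 1.0                                    # inverse temperature of the reservoir

E_i, V_i = np.linalg.eigh(Z)                  # initial energy eigenbasis
E_f, V_f = np.linalg.eigh(X)                  # final energy eigenbasis

# Thermal (Gibbs) occupation probabilities with respect to the initial Hamiltonian.
p_thermal = np.exp(-beta * E_i)
p_thermal /= p_thermal.sum()

works = []
for _ in range(20000):
    i = rng.choice(2, p=p_thermal)            # first strong energy measurement
    # Sudden quench: the post-measurement eigenstate doesn't evolve, so the
    # second measurement's outcome probabilities are the squared overlaps.
    p_final = np.abs(V_f.conj().T @ V_i[:, i]) ** 2
    f = rng.choice(2, p=p_final)              # second strong energy measurement
    works.append(E_f[f] - E_i[i])             # W = E_f - E_i

values, counts = np.unique(np.round(works, 6), return_counts=True)
for W, c in zip(values, counts):
    print(f"p(W = {W:+.0f}) ~ {c / len(works):.3f}")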

Measuring the system’s energy disturbs the system, squashing some of its quantum properties. (The measurement eliminates coherences, relative to the energy eigenbasis, from the state.) Quantum properties star in quantum thermodynamics. So the two-point measurement scheme doesn’t satisfy everyone.

Enter weak measurements. They can provide information about the system’s energy without disturbing the system much. Work probability distributions \{ p(W) \} give way to quasiprobability distributions \{ \tilde{p}(W) \}.

So propose Solinas and Gasparinetti, in these papers. Other quantum thermodynamicists apply weak measurements and quasiprobabilities differently.2 I proposed applying them to characterize chaos, and the scrambling of quantum information in many-body systems, at the conference.3 Feel free to add your favorite applications to the “comments” section.

Dinner

All the quantum ladies: The conference’s female participants gathered for dinner one conference night.

Wednesday afforded an afternoon for touring. Participants congregated at the college of conference co-organizer Felix Binder.4 His tour evoked, for me, the ghosts of thermo conferences past: One conference, at the University of Cambridge, had brought me to the grave of thermodynamicist Arthur Eddington. Another conference, about entropies in information theory, had convened near Canada’s Banff Cemetery. Felix’s tour began with St. Edmund Hall’s cemetery. Thermodynamics highlights equilibrium, a state in which large-scale properties—like temperature and pressure—remain constant. Some things never change.


 

With thanks to Felix, Janet, and the other coordinators for organizing the conference.

1Oxford derives its nickname from an elegy by Matthew Arnold. Happy National Poetry Month!

2https://arxiv.org/abs/1508.00438, https://arxiv.org/abs/1610.04285,
https://arxiv.org/abs/1607.02404,
https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.118.070601,
https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.120.040602

3Michele Campisi joined me in introducing out-of-time-ordered correlators (OTOCs) into the quantum-thermo conference: He, with coauthor John Goold, combined OTOCs with the two-point measurement scheme.

4Oxford University contains 38 colleges, the epicenters of undergraduates’ social, dining, and housing experiences. Graduate students and postdoctoral scholars affiliate with colleges, and senior fellows—faculty members—govern the colleges.

Rock-paper-scissors, granite-clock-idea

I have a soft spot for lamassu. Ten-foot-tall statues of these winged bull-men guarded the entrances to ancient Assyrian palaces. Show me lamassu, or apkallu—human-shaped winged deities—or other reliefs from the Neo-Assyrian capital of Nineveh, and you’ll have trouble showing me the door.

Assyrian art fills a gallery in London’s British Museum. Lamassu flank the gallery’s entrance. Carvings fill the interior: depictions of soldiers attacking, captives trudging, and kings hunting lions. The artwork’s vastness, its endurance, and the contact with a three-thousand-year-old civilization floor me. I tore myself away as the museum closed one Sunday night.

Lamassu

I visited the British Museum the night before visiting Jonathan Oppenheim’s research group at University College London (UCL). Jonathan combines quantum information theory with thermodynamics. He and others co-invented thermodynamic resource theories (TRTs), which Quantum Frontiers regulars will know of. TRTs are quantum-information-theoretic models for systems that exchange energy with their environments.

Energy is conjugate to time: Hamiltonians, mathematical objects that represent energy, represent also translations through time. We measure time with clocks. Little wonder that one can study quantum clocks using a model for energy exchanges.

Mischa Woods, Ralph Silva, and Jonathan used a resource theory to design an autonomous quantum clock. “Autonomous” means that the clock contains all the parts it needs to operate, needs no periodic winding-up, etc. When might we want an autonomous clock? When building quantum devices that operate independently of classical engineers. Or when performing a quantum computation: Computers must perform logical gates at specific times.

Clock

Wolfgang Pauli and others studied quantum clocks, the authors recall. How, Pauli asked, would an ideal clock look? Its Hamiltonian, \hat{H}_{\rm C}, would have eigenstates | E \rangle. The labels E denote possible amounts of energy.

The Hamiltonian would be conjugate to a “time operator” \hat{t}. Let | \theta \rangle denote an eigenstate of \hat{t}. This “time state” would equal an even superposition over the | E \rangle’s. The clock would occupy the state | \theta \rangle at time t_\theta.

Imagine measuring the clock, to learn the time, or controlling another system with the clock. The interaction would disturb the clock, changing the clock’s state. The disturbance wouldn’t mar the clock’s timekeeping, if the clock were ideal. What would enable an ideal clock to withstand the disturbances? The ability to have any amount of energy: E must stretch from - \infty to \infty. Such clocks can’t exist.

Approximations to them can. Mischa, Ralph, and Jonathan designed a finite-size clock, then characterized how accurately the clock mimics the ideal. (Experts: The clock corresponds to a Hilbert space of finite dimensionality d. The clock begins in a Gaussian state that peaks at one time state | \theta \rangle. The finite-width Gaussian offers more stability than a single time state would.)
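Here’s a back-of-the-envelope numerical version of such a clock (the standard discrete construction, with time states as Fourier transforms of energy eigenstates; the dimension, width, and other details are my own toy choices, not the paper’s). Starting the clock in a Gaussian superposition of energy eigenstates and evolving it in steps of T/d makes the most likely time state advance by one tick per step.

import numpy as np

d = 16                      # clock dimension
T = 1.0                     # clock period
n = np.arange(d)

# Energy eigenvalues of the clock Hamiltonian H_C = sum_n (2*pi*n/T) |n><n|.
energies = 2 * np.pi * n / T

# Time states: discrete Fourier transforms of the energy eigenstates.
# Column k of time_states is |theta_k>, associated with the time t_k = k*T/d.
time_states = np.exp(-2j * np.pi * np.outer(n, n) / d) / np.sqrt(d)

# Start the clock in a Gaussian superposition of energy eigenstates
# (a finite-width Gaussian, loosely modeled on the finite-size-clock idea).
sigma = d / 6
psi = np.exp(-((n - d / 2) ** 2) / (4 * sigma ** 2)).astype(complex)
psi /= np.linalg.norm(psi)

# Evolve one "tick" at a time, delta_t = T/d, and read out the time basis.
delta_t = T / d
for step in range(3):
    populations = np.abs(time_states.conj().T @ psi) ** 2
    print(f"step {step}: most likely time state k = {populations.argmax()}")
    psi = np.exp(-1j * energies * delta_t) * psi   # evolution is diagonal in the energy basis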

Disturbances degrade our ability to distinguish instants by measuring the clock. Imagine gazing at a kitchen clock through blurry lenses: You couldn’t distinguish 6:00 from 5:59 or 6:01. Disturbances also hinder the clock’s ability to implement processes, such as gates in a computation, at desired instants.

Mischa & co. quantified these degradations. The errors made by the clock, they found, decay inverse-exponentially with the clock’s size: Grow the clock a little, and the errors shrink a lot.

Reliefs

Time has degraded the lamassu, but only a little. You can distinguish feathers in their wings and strands in their beards. People portray such artifacts as having “withstood the flow of time,” or “evaded,” or “resisted.” Such portrayals have never appealed to me. I prefer to think of the lamassu as surviving not because they clash with time, but because they harmonize with it. The prospect of harmonizing with time—whatever that means—has enticed me throughout my life. The prospect partially underlies my research into time—perhaps childishly, foolishly, as I recognize if I remove my blurry lenses before gazing in the mirror.

The creation of lasting works, like lamassu, has enticed me throughout my life. I’ve scrapbooked, archived, and recorded, and tended memories as though they were Great-Grandma’s cookbook. Ancient civilizations began alluring me at age six, partially due to artifacts’ longevity. No wonder I study the second law of thermodynamics.

Yet doing theoretical physics makes no sense from another perspective. The ancient Egyptians sculpted granite, when they could afford it. Gudea, king of the ancient city-state of Lagash, immortalized himself in diorite. I fashion ideas, which lack substance. Imagine playing, rather than rock-paper-scissors, granite-diorite-idea. The idea wouldn’t stand a chance.

Would it? Because an idea lacks substance, it can manifest in many forms. Plato’s cave allegory has manifested as a story, as classroom lectures, on handwritten pages, on word processors and websites, in cartloads of novels, in the film The Matrix, in one of the four most memorable advertisements I received from colleges as a high-school junior, and elsewhere. Plato’s allegory has survived since about the fourth century BCE. King Ashurbanipal’s lion-hunt reliefs have survived for only about 200 years longer.

The lion-hunt reliefs—and lamassu—exude a grandness, a majesty that’s attracted me as their longevity has. The nature of time and the perfect clock have as much grandness. Leaving the British Museum’s Assyrian gallery at 6 PM one Sunday, I couldn’t have asked for a more fitting place to find myself, 24 hours later, than a theoretical-physics conversation.

 

With thanks to Jonathan, to Álvaro Martín-Alhambra, and to Mischa for their hospitality at UCL; to Ada Cohen for the “Art history of ancient Egypt and the ancient Near East” course for which I’d been hankering for years; to my brother, for transmitting the ancient-civilizations bug; and to my parents, who fed the infection with museum visits.

Click here for a follow-up to the quantum-clock paper.

What makes extraordinary science extraordinary?

My article for this month appears on Sean Carroll’s blog, Preposterous Universe. Sean is a theoretical physicist who practices cosmology at Caltech. He interfaces with philosophy, which tinges the question I confront: What distinguishes extraordinary science from good science? The topic seemed an opportunity to take Sean up on an invitation to guest-post on Preposterous Universe. Head there for my article. Thanks to Sean for hosting!

Big Dipper