Memories of things past

My best friend—who’s held the title of best friend since kindergarten—calls me the keeper of her childhood memories. I recall which toys we played with, the first time I visited her house,1 and which beverages our classmates drank during snack time in kindergarten.2 She wouldn’t be surprised to learn that the first workshop I’ve co-organized centered on memory.

Memory—and the loss of memory—stars in thermodynamics. As an example, take what my husband will probably do this evening: bake tomorrow’s breakfast. I don’t know whether he’ll bake fruit-and-oat cookies, banana muffins, pear muffins, or pumpkin muffins. Whichever he chooses, his baking will create a scent. That scent will waft across the apartment, seep into air vents, and escape into the corridor—will disperse into the environment. By tomorrow evening, nobody will be able to tell by sniffing what my husband will have baked. 

That is, the kitchen’s environment lacks a memory. This lack contributes to our experience of time’s arrow: We sense that time passes partially by smelling less and less of breakfast. Physicists call memoryless systems and processes Markovian.

Our kitchen’s environment is Markovian because it’s large and particles churn through it randomly. But not all environments share these characteristics. Metaphorically speaking, a dispersed memory of breakfast may recollect, return to a kitchen, and influence the following week’s baking. For instance, imagine an atom in a quantum computer, rather than a kitchen in an apartment. A few other atoms may form our atom’s environment. Quantum information may leak from our atom into that environment, swish around in the environment for a time, and then return to haunt our atom. We’d call the atom’s evolution and environment non-Markovian.
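The contrast can be sketched with a toy model; the parameters and the single-mode environment below are my own illustrative choices, not anything from the workshop:

```python
import numpy as np

# Toy contrast between a memoryless environment and one that hands
# information back to the system. Both curves track the excited-state
# population of a two-level atom; parameter values are illustrative.

def markovian_population(t, gamma=1.0):
    """Memoryless environment: exponential decay, no revivals."""
    return np.exp(-gamma * t)

def non_markovian_population(t, g=1.0):
    """Single-mode environment: the excitation sloshes back and forth
    (a vacuum Rabi oscillation), so the atom's population revives."""
    return np.cos(g * t) ** 2

times = np.linspace(0, 5, 501)
p_markov = markovian_population(times)
p_nonmarkov = non_markovian_population(times)

assert all(np.diff(p_markov) <= 0)  # Markovian: population only decreases
revival = p_nonmarkov[np.argmin(p_nonmarkov):].max()
assert revival > 0.9                # non-Markovian: the excitation returns
```

The monotone curve is what lets you read time off the kitchen's fading scent; the reviving curve is why an atom's past can return to haunt it.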

I had the good fortune to co-organize a workshop about non-Markovianity—about memory—this February. The workshop took place at the Banff International Research Station, abbreviated BIRS, which you pronounce like the plural of what you say when shivering outdoors in Canada. BIRS operates in the Banff Centre for Arts and Creativity, high in the Rocky Mountains. The Banff Centre could accompany a dictionary entry for pristine, to my mind. The air feels crisp, the trees on nearby peaks stand out against the snow like evergreen fringes on white velvet, and the buildings balance a rustic-mountain-lodge style with the avant-garde. 

The workshop balanced styles, too, but skewed toward the theoretical and abstract. We learned about why the world behaves classically in our everyday experiences; about information-theoretic measures of the distances between quantum states; and about how to simulate, on quantum computers, chemical systems that interact with environments. One talk, though, brought our theory back down to (the snow-dusted) Earth.

Gabriela Schlau-Cohen runs a chemistry lab at MIT. She wants to understand how plants transport energy. Energy arrives at a plant from the sun in the form of light. The light hits a pigment-and-protein complex. If the plant is lucky, the light transforms into a particle-like packet of energy called an exciton. The exciton traverses the receptor complex, then other complexes. Eventually, the exciton finds a spot where it can enable processes such as leaf growth. 

A high fraction of the impinging photons—85%—transforms into excitons. How do plants convert and transport energy as efficiently as they do?

Gabriela’s group aims to find out—not by testing natural light-harvesting complexes, but by building complexes themselves. The experimentalists mimic the complex’s protein using DNA. You can fold DNA into almost any shape you want, by choosing the DNA’s base pairs (basic units) adroitly and by using “staples” formed from more DNA scraps. The sculpted molecules are called DNA origami.

Gabriela’s group engineers different DNA structures, analogous to complexes’ proteins, to have different properties. For instance, the experimentalists engineer rigid structures and flexible structures. Then, the group assesses how energy moves through each structure. Each structure forms an environment that influences excitons’ behaviors, similarly to how a memory-containing environment influences an atom.

Courtesy of Gabriela Schlau-Cohen

The Banff environment influenced me, stirring up memories like powder displaced by a skier on the slopes above us. I first participated in a BIRS workshop as a PhD student, and then I returned as a postdoc. Now, I was co-organizing a workshop to which I brought a PhD student of my own. Time flows, as we’re reminded while walking down the mountain from the Banff Centre into town: A cemetery borders part of the path. Time flows, but we belong to that thermodynamically remarkable class of systems that retain memories…memories and a few other treasures that resist change, such as friendships held since kindergarten.

1Plushy versions of Simba and Nala from The Lion King. I remain grateful to her for letting me play at being Nala.

2I’d request milk, another kid would request apple juice, and everyone else would request orange juice.

A (quantum) complex legacy: Part deux

I didn’t fancy the research suggestion emailed by my PhD advisor.

A 2016 email from John Preskill led to my publishing a paper about quantum complexity in 2022, as I explained in last month’s blog post. But I didn’t explain what I thought of his email upon receiving it.

It didn’t float my boat. (Hence my not publishing on it until 2022.)

The suggestion contained ingredients that ordinarily would have caulked any cruise ship of mine: thermodynamics, black-hole-inspired quantum information, and the concept of resources. John had forwarded a paper drafted by Stanford physicists Adam Brown and Lenny Susskind. They act as grand dukes of the community sussing out what happens to information swallowed by black holes. 

From Rare-Gallery

We’re not sure how black holes work. However, physicists often model a black hole with a clump of particles squeezed close together and so forced to interact with each other strongly. The interactions entangle the particles. The clump’s quantum state—let’s call it |ψ(t)⟩—grows not only complicated with time (t), but also complex in a technical sense: Imagine taking a fresh clump of particles and preparing it in the state |ψ(t)⟩ via a sequence of basic operations, such as quantum gates performable with a quantum computer. The number of basic operations needed is called the complexity of |ψ(t)⟩. A black hole’s state has a complexity believed to grow in time—and grow and grow and grow—until plateauing.
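To make the gate-counting definition concrete, here is a brute-force sketch on a single qubit; the two-gate set and the target state are my own toy choices (computing the complexity of a real black hole's state is far beyond any such search):

```python
import numpy as np
from itertools import product

# Brute-force toy: the exact complexity of a single-qubit state is the
# fewest gates, drawn from a fixed set, needed to prepare it from |0>.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]])
GATES = {"H": H, "T": T}

def complexity(target, max_depth=4):
    """Minimal number of gates mapping |0> to `target`, up to global phase."""
    for depth in range(max_depth + 1):
        for seq in product(GATES, repeat=depth):
            state = np.array([1, 0], dtype=complex)
            for name in seq:
                state = GATES[name] @ state
            if abs(np.vdot(state, target)) > 1 - 1e-9:
                return depth, list(seq)
    return None

target = T @ H @ np.array([1, 0], dtype=complex)
print(complexity(target))  # → (2, ['H', 'T'])
```

The search time blows up exponentially with the number of gates, which is one reason complexity is so hard to compute, and so interesting.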

This growth echoes the second law of thermodynamics, which helps us understand why time flows in only one direction. According to the second law, every closed, isolated system’s entropy grows until plateauing.1 Adam and Lenny drew parallels between the second law and complexity’s growth.

The less complex a quantum state is, the better it can serve as a resource in quantum computations. Recall, as we did last month, performing calculations in math class. You needed clean scratch paper on which to write the calculations. So does a quantum computer. “Scratch paper,” to a quantum computer, consists of qubits—basic units of quantum information, realized in, for example, atoms or ions. The scratch paper is “clean” if the qubits are in a simple, unentangled quantum state—a low-complexity state. A state’s greatest possible complexity, minus the actual complexity, we can call the state’s uncomplexity. Uncomplexity—a quantum state’s blankness—serves as a resource in quantum computation.

Manny Knill and Ray Laflamme realized this point in 1998, while quantifying the “power of one clean qubit.” Lenny arrived at a similar conclusion while reasoning about black holes and firewalls. For an introduction to firewalls, see this blog post by John. Suppose that someone—let’s call her Audrey—falls into a black hole. If it contains a firewall, she’ll burn up. But suppose that someone tosses a qubit into the black hole before Audrey falls. The qubit kicks the firewall farther away from the event horizon, so Audrey will remain safe for longer. Also, the qubit increases the uncomplexity of the black hole’s quantum state. Uncomplexity serves as a resource also to Audrey.

A resource is something that’s scarce, valuable, and useful for accomplishing tasks. Different things qualify as resources in different settings. For instance, imagine wanting to communicate quantum information to a friend securely. Entanglement will serve as a resource. How can we quantify and manipulate entanglement? How much entanglement do we need to perform a given communicational or computational task? Quantum scientists answer such questions with a resource theory, a simple information-theoretic model. Theorists have defined resource theories for entanglement, randomness, and more. In many a blog post, I’ve eulogized resource theories for thermodynamic settings. Can anyone define, Adam and Lenny asked, a resource theory for quantum uncomplexity?

Resource thinking pervades our world.

By late 2016, I was a quantum thermodynamicist, I was a resource theorist, and I’d just debuted my first black-hole–inspired quantum information theory. Moreover, I’d coauthored a review about the already-extant resource theory that looked closest to what Adam and Lenny sought. Hence John’s email, I expect. Yet that debut had uncovered reams of questions—questions that, as a budding physicist heady with the discovery of discovery, I could own. Why would I answer a question of someone else’s instead?

So I thanked John, read the paper draft, and pondered it for a few days. Then, I built a research program around my questions and waited for someone else to answer Adam and Lenny.

Three and a half years later, I was still waiting. The notion of uncomplexity as a resource had enchanted the black-hole-information community, so I was preparing a resource-theory talk for a quantum-complexity workshop. The preparations set wheels churning in my mind, and inspiration struck during a long walk.2

After watching my workshop talk, Philippe Faist reached out about collaborating. Philippe is a coauthor, a friend, and a fellow quantum thermodynamicist and resource theorist. Caltech’s influence had sucked him, too, into the black-hole community. We Zoomed throughout the pandemic’s first spring, widening our circle to include Teja Kothakonda, Jonas Haferkamp, and Jens Eisert of Freie Universität Berlin. Then, Anthony Munson joined from my nascent group in Maryland. Physical Review A published our paper, “Resource theory of quantum uncomplexity,” in January.

The next four paragraphs, I’ve geared toward experts. An agent in the resource theory manipulates a set of n qubits. The agent can attempt to perform any gate U on any two qubits. Noise corrupts every real-world gate implementation, though. Hence the agent effects a gate chosen randomly from near U. Such fuzzy gates are free. The agent can’t append or discard any system for free: Appending even a maximally mixed qubit increases the state’s uncomplexity, as Knill and Laflamme showed. 
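One plausible way to model such a fuzzy gate numerically (my assumption, not the paper's precise definition) is to perturb the intended gate by a small random Hermitian generator:

```python
import numpy as np

# Sketch of a "fuzzy gate": the agent aims for a two-qubit gate U but
# implements a unitary drawn at random from a small neighborhood of U.
# The perturbation model and noise strength below are illustrative.

rng = np.random.default_rng(1)

def random_hermitian(d):
    a = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return (a + a.conj().T) / 2

def fuzzy_gate(u, epsilon=0.01):
    """Sample exp(i * epsilon * H) @ u for a random Hermitian H."""
    w, v = np.linalg.eigh(random_hermitian(u.shape[0]))
    perturbation = v @ np.diag(np.exp(1j * epsilon * w)) @ v.conj().T
    return perturbation @ u

CNOT = np.eye(4, dtype=complex)[[0, 1, 3, 2]]
noisy = fuzzy_gate(CNOT)

assert np.allclose(noisy @ noisy.conj().T, np.eye(4))  # still unitary
deviation = np.linalg.norm(noisy - CNOT)
assert 0 < deviation < 0.5  # near, but never exactly at, the target gate
```

The randomness is the point: because the agent can never land exactly on the inverse of a long scrambling circuit, fuzzy gates can't uncompute complexity for free.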

Fuzzy gates’ randomness prevents the agent from mapping complex states to uncomplex states for free (with any considerable probability). Under appropriate conditions, fuzzy operations can only preserve or increase complexity. This growth echoes the second law of thermodynamics.

We also defined operational tasks—uncomplexity extraction and expenditure analogous to work extraction and expenditure. Then, we bounded the efficiencies with which the agent can perform these tasks. The efficiencies depend on a complexity entropy that we defined—and that’ll star in part trois of this blog-post series.

Now, I want to know what purposes the resource theory of uncomplexity can serve. Can we recast black-hole problems in terms of the resource theory, then leverage resource-theory results to solve the black-hole problem? What about problems in condensed matter? Can our resource theory, which quantifies the difficulty of preparing quantum states, merge with the resource theory of magic, which quantifies that difficulty differently?

Unofficial mascot for fuzzy operations

I don’t regret having declined my PhD advisor’s recommendation six years ago. Doing so led me to explore probability theory and measurement theory, collaborate with two experimental labs, and write ten papers with 21 coauthors whom I esteem. But I take my hat off to Adam and Lenny for their question. And I remain grateful to the advisor who kept my goals and interests in mind while checking his email. I hope to serve Anthony and his fellow advisees as well.

1At least, in the thermodynamic limit—if the system is infinitely large. If the system is finite-size, its entropy grows on average.

2…en route to obtaining a marriage license. My husband and I married four months after the pandemic throttled government activities. Hours before the relevant office’s calendar filled up, I scored an appointment to obtain our license. Regarding the metro as off-limits, my then-fiancé and I walked from Cambridge, Massachusetts to downtown Boston for our appointment. I thank him for enduring my requests to stop so that I could write notes.

A (quantum) complex legacy

Early in the fourth year of my PhD, I received a most John-ish email from John Preskill, my PhD advisor. The title read, “thermodynamics of complexity,” and the message was concise the way that the Amazon River is damp: “Might be an interesting subject for you.” 

Below the signature, I found a paper draft by Stanford physicists Adam Brown and Lenny Susskind. Adam is a Brit with an accent and a wit to match his Oxford degree. Lenny, known to the public for his books and lectures, is a New Yorker with an accent that reminds me of my grandfather. Before the physicists posted their paper online, Lenny sought feedback from John, who forwarded me the email.

The paper concerned a confluence of ideas that you’ve probably encountered in the media: string theory, black holes, and quantum information. String theory offers hope for unifying two physical theories: relativity, which describes large systems such as our universe, and quantum theory, which describes small systems such as atoms. A certain type of gravitational system and a certain type of quantum system participate in a duality, or equivalence, known since the 1990s. Our universe isn’t such a gravitational system, but never mind; the duality may still offer a toehold on a theory of quantum gravity. Properties of the gravitational system parallel properties of the quantum system and vice versa. Or so it seemed.

The gravitational system can have two black holes linked by a wormhole. The wormhole’s volume can grow linearly in time for a time exponentially long in the black holes’ entropy. Afterward, the volume hits a ceiling and approximately ceases changing. Which property of the quantum system does the wormhole’s volume parallel?

Envision the quantum system as many particles wedged close together, so that they interact with each other strongly. Initially uncorrelated particles will entangle with each other quickly. A quantum system has properties, such as average particle density, that experimentalists can measure relatively easily. Does such a measurable property—an observable of a small patch of the system—parallel the wormhole volume? No; such observables cease changing much sooner than the wormhole volume does. The same conclusion applies to the entanglement amongst the particles.
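A small numerical sketch, with system sizes and a random seed of my choosing, shows how fast entanglement saturates in such a model:

```python
import numpy as np

# Toy random-circuit simulation: track the entanglement entropy between
# the first and last four qubits as random two-qubit gates scramble an
# initially unentangled state. Sizes and seed are illustrative choices.

rng = np.random.default_rng(0)
n = 8
dim = 2 ** n

def random_unitary(d):
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # fix phases (Haar-random)

def apply_two_qubit(state, u, i, j):
    psi = state.reshape([2] * n)
    psi = np.moveaxis(psi, (i, j), (0, 1)).reshape(4, -1)
    psi = (u @ psi).reshape([2, 2] + [2] * (n - 2))
    return np.moveaxis(psi, (0, 1), (i, j)).reshape(dim)

def half_chain_entropy(state):
    """Entanglement entropy (in bits) across the middle cut."""
    singular = np.linalg.svd(state.reshape(2 ** (n // 2), -1),
                             compute_uv=False)
    p = singular[singular > 1e-12] ** 2
    return -np.sum(p * np.log2(p))

state = np.zeros(dim, dtype=complex)
state[0] = 1.0  # product state: zero entanglement
entropies = []
for _ in range(60):
    i, j = rng.choice(n, size=2, replace=False)
    state = apply_two_qubit(state, random_unitary(4), int(i), int(j))
    entropies.append(half_chain_entropy(state))

# The entropy climbs quickly, then hovers near its maximum of four bits.
assert entropies[0] < 1.1 and entropies[-1] > 2.5
```

The entropy flatlines after a handful of gates per qubit; the complexity, by contrast, is conjectured to keep climbing long afterward, like the wormhole's volume.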

What about a more sophisticated property of the particles’ quantum state? Researchers proposed that the state’s complexity parallels the wormhole’s volume. To grasp complexity, imagine a quantum computer performing a computation. When performing computations in math class, you needed blank scratch paper on which to write your calculations. A quantum computer needs the quantum equivalent of blank scratch paper: qubits (basic units of quantum information, realized, for example, as atoms) in a simple, unentangled, “clean” state. The computer performs a sequence of basic operations—quantum logic gates—on the qubits. These operations resemble addition and subtraction but can entangle the qubits. What’s the minimal number of basic operations needed to prepare a desired quantum state (or to “uncompute” a given state to the blank state)? The state’s quantum complexity.1 

Quantum complexity has loomed large over multiple fields of physics recently: quantum computing, condensed matter, and quantum gravity. The latter, we established, entails a duality between a gravitational system and a quantum system. The quantum system begins in a simple quantum state that grows complicated as the particles interact. The state’s complexity parallels the volume of a wormhole in the gravitational system, according to a conjecture.2 

The conjecture would hold more water if the quantum state’s complexity grew similarly to the wormhole’s volume: linearly in time, for a time exponentially large in the quantum system’s size. Does the complexity grow so? The expectation that it does became the linear-growth conjecture.

Evidence supported the conjecture. For instance, quantum information theorists modeled the quantum particles as interacting randomly, as though undergoing a quantum circuit filled with random quantum gates. Leveraging probability theory,3 the researchers proved that the state’s complexity grows linearly at short times. Also, the complexity grows linearly for long times if each particle can store a great deal of quantum information. But what if the particles are qubits, the smallest and most ubiquitous units of quantum information? The question lingered for years.

Jonas Haferkamp, a PhD student in Berlin, dreamed up an answer to an important version of the question.4 I had the good fortune to help formalize that answer with him and members of his research group: master’s student Teja Kothakonda, postdoc Philippe Faist, and supervisor Jens Eisert. Our paper, published in Nature Physics last year, marked step one in a research adventure catalyzed by John Preskill’s email 4.5 years earlier.

Imagine, again, qubits undergoing a circuit filled with random quantum gates. That circuit has some architecture, or arrangement of gates. Slotting different gates into the architecture effects different transformations5 on the qubits. Consider the set of all transformations implementable with one architecture. This set has some size, which we defined and analyzed.

What happens to the set’s size if you add more gates to the circuit—let the particles interact for longer? We can bound the size’s growth using the mathematical toolkits of algebraic geometry and differential topology. Upon bounding the size’s growth, we can bound the state’s complexity. The complexity, we concluded, grows linearly in time for a time exponentially long in the number of qubits.

Our result lends weight to the complexity-equals-volume hypothesis. The result also introduces algebraic geometry and differential topology into complexity as helpful mathematical toolkits. Finally, the set size that we bounded emerged as a useful concept that may elucidate circuit analyses and machine learning.

John didn’t have machine learning in mind when forwarding me an email in 2017. He didn’t even have in mind proving the linear-growth conjecture. The proof enables step two of the research adventure catalyzed by that email: thermodynamics of quantum complexity, as the email’s title stated. I’ll cover that thermodynamics in its own blog post. The simplest of messages can spin a complex legacy.

The links provided above scarcely scratch the surface of the quantum-complexity literature; for a more complete list, see our paper’s bibliography. For a seminar about the linear-growth paper, see this video hosted by Nima Lashkari’s research group.

1The term complexity has multiple meanings; forget the rest for the purposes of this article.

2According to another conjecture, the quantum state’s complexity parallels a certain space-time region’s action. (An action, in physics, isn’t a motion or a deed or something that Hamlet keeps avoiding. An action is a mathematical object that determines how a system can and can’t change in time.) The first two conjectures snowballed into a paper entitled “Does complexity equal anything?” Whatever it parallels, complexity plays an important role in the gravitational–quantum duality. 

3Experts: Such as unitary t-designs.

4Experts: Our work concerns quantum circuits, rather than evolutions under fixed Hamiltonians. Also, our work concerns exact circuit complexity, the minimal number of gates needed to prepare a state exactly. A natural but tricky extension eluded us: approximate circuit complexity, the minimal number of gates needed to approximate the state.

5Experts: Unitary operators.

If I could do science like Spider-Man

A few Saturdays ago, I traveled home from a summer school at which I’d been lecturing in Sweden. Around 8:30 AM, before the taxi arrived, I settled into an armchair in my hotel room and refereed a manuscript from a colleague. After reaching the airport, I read an experimental proposal for measuring a quantity that colleagues and I had defined. I drafted an article for New Scientist on my trans-Atlantic flight, composed several emails, and provided feedback about a student’s results (we’d need more data). Around 8 PM Swedish time, I felt satisfyingly exhausted—and about ten hours of travel remained. So I switched on Finnair’s entertainment system and navigated to Spider-Man: No Way Home.

I found much to delight. Actor Alfred Molina plays the supervillain Doc Ock with charisma and verve that I hadn’t expected from a tentacled murderer. Playing on our heartstrings, Willem Dafoe imbues the supervillain Norman Osborn with frailty and humanity. Three characters (I won’t say which, for the spoiler-sensitive) exhibit a playful chemistry. To the writers who thought to bring the trio together, I tip my hat. I tip my hat also to the special-effects coders who sweated over reconciling Spider-Man’s swoops and leaps with the laws of mechanics.

I’m not a physicist to pick bones with films for breaking physical laws. You want to imagine a Mirror Dimension controlled by a flying erstwhile surgeon? Go for it. Falling into a vat of electric eels endows you with the power to control electricity? Why not. Films like Spider-Man’s aren’t intended to portray physical laws accurately; they’re intended to portray people and relationships meaningfully. So I raised nary an eyebrow at characters’ zipping between universes (although I had trouble buying teenage New Yorkers who called adults “sir” and “ma’am”).

Anyway, no hard feelings about the portrayal of scientific laws. The portrayal of the scientific process, though, entertained me even more than Dr. Strange’s trademark facetiousness. In one scene, twelfth grader Peter Parker (Spider-Man’s alter-ego) commandeers a high-school lab with two buddies. In a fraction of a night, the trio concocts cures for four supervillains whose evil stems from physical, chemical, and biological accidents (e.g., falling into the aforementioned vat of electric eels).1 And they succeed. In a few hours. Without test subjects or even, as far as we could see, samples of their would-be test subjects. Without undergoing several thousand iterations of trying out their cures, failing, and tweaking their formulae—or even undergoing one iteration.

I once collaborated with an experimentalist renowned for his facility with superconducting qubits. He’d worked with a panjandrum of physics years before—a panjandrum who later reminisced to me, “A theorist would propose an experiment, [this experimentalist would tackle the proposal,] and boom—the proposal would work.” Yet even this experimentalist’s team invested a year in an experiment that he’d predicted would take a month.

Worse, the observatory LIGO announced its first detection of gravitational waves in 2016 after starting to take data in 2002…after beginning its life during the 1960s.2 

Recalling the toil I’d undertaken all day—and only as a theorist, not even as an experimentalist charged with taking data through the night—I thought, I want to be like Spider-Man. Specifically, I want to do science like Spider-Man. Never mind shooting webs out of my wrists or swooping through the air. Never mind buddies in the Avengers, a Greek-statue physique, or high-tech Spandex. I want to try out a radical new idea and have it work. On the first try. Four times in a row on the same day. 

Daydreaming in the next airport (and awake past my bedtime), I imagined what a theorist could accomplish with Spider-Man’s scientific superpowers. I could calculate any integral…write code free of bugs on the first try3…prove general theorems in a single appendix!

Too few hours later, I woke up at home, jet-lagged but free of bites from radioactive calculators. I got up, breakfasted, showered, and settled down to work. Because that’s what scientists do—work. Long and hard, including when those around us are dozing or bartering frequent-flyer miles, such that the satisfaction of discoveries is well-earned. I have to go edit a paper now, but, if you have the time, I recommend watching the latest Spider-Man movie. It’s a feast of fantasy.

1And from psychological disorders, but the therapy needed to cure those would doom any blockbuster.

2You might complain that comparing Peter Parker’s labwork with LIGO’s is unfair. LIGO required the construction of large, high-tech facilities; Parker had only to cure a lizard-man of his reptilian traits and so on. But Tony Stark built a particle accelerator in his basement within a few hours, in Iron Man; and superheroes are all of a piece, as far as their scientific exploits are concerned.

3Except for spiders?

Quantum connections

We were seated in the open-air back of a boat, motoring around the Stockholm archipelago. The Swedish colors fluttered above our heads; the occasional speedboat zipped past, rocking us in its wake; and wildflowers dotted the bank on either side. Suddenly, a wood-trimmed boat glided by, and the captain waved from his perch.

The gesture surprised me. If I were in a vehicle of the sort most familiar to me—a car—I wouldn’t wave to other drivers. In a tram, I wouldn’t wave to passengers on a parallel track. Granted, trams and cars are closed, whereas boats can be open-air. But even as a pedestrian in a downtown crossing, I wouldn’t wave to everyone I passed. Yet, as boat after boat pulled alongside us, we received salutation after salutation.

The outing marked the midpoint of the Quantum Connections summer school. Physicists Frank Wilczek, Antti Niemi, and colleagues coordinate the school, which draws students and lecturers from across the globe. Although sponsored by Stockholm University, the school takes place at a century-old villa whose name I wish I could pronounce: Högberga Gård. The villa nestles atop a cliff on an island in the archipelago. We ventured off the island after a week of lectures.

Charlie Marcus lectured about materials formed from superconductors and semiconductors; John Martinis, about superconducting qubits; Jianwei Pan, about quantum advantages; and others, about symmetries, particle statistics, and more. Feeling like an ant among giants, I lectured about quantum thermodynamics. Two other lectures linked quantum physics with gravity—and in a way you might not expect. I appreciated the opportunity to reconnect with the lecturer: Igor Pikovski.

Cruising around Stockholm

Igor doesn’t know it, but he’s one of the reasons why I joined the Harvard-Smithsonian Institute for Theoretical Atomic, Molecular, and Optical Physics (ITAMP) as an ITAMP Postdoctoral Fellow in 2018. He’d held the fellowship beginning a few years before, and he’d earned a reputation for kindness and consideration. Also, his research struck me as some of the most fulfilling that one could undertake.

If you’ve heard about the intersection of quantum physics and gravity, you’ve probably heard of approaches other than Igor’s. For instance, physicists are trying to construct a theory of quantum gravity, which would describe black holes and the universe’s origin. Such a “theory of everything” would reduce to Einstein’s general theory of relativity when applied to planets and would reduce to quantum theory when applied to atoms. In another example, physicists leverage quantum technologies to observe properties of gravity. Such technologies enabled the observatory LIGO to register gravitational waves—ripples in space-time. 

Igor and his colleagues pursue a different goal: to observe phenomena whose explanations depend on quantum theory and on gravity.

In his lectures, Igor illustrated with an experiment first performed in 1975. The experiment relies on what happens if you jump: You gain energy associated with resisting the Earth’s gravitational pull—gravitational potential energy. A quantum object’s energy determines how the object’s quantum state changes in time. The experimentalists applied this fact to a beam of neutrons. 

They put the beam in a superposition of two locations: closer to the Earth’s surface and farther away. The closer component changed in time in one way, and the farther component changed another way. After a while, the scientists recombined the components. The two interfered with each other similarly to the waves created by two raindrops falling near each other on a puddle. The interference evidenced gravity’s effect on the neutrons’ quantum state.
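A back-of-the-envelope estimate conveys the size of the effect; the beam speed and geometry below are illustrative stand-ins rather than the 1975 experiment's exact parameters:

```python
# Gravitational phase difference between the two neutron paths:
# phi = m * g * delta_h * t / hbar, where t is the traversal time.
# The numbers below are illustrative, not the original experiment's.

HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s
M_NEUTRON = 1.674_927e-27  # neutron mass, kg
G = 9.81                   # gravitational acceleration, m/s^2

def gravitational_phase(delta_h, path_length, speed):
    """Phase (radians) the higher path accumulates relative to the lower."""
    traversal_time = path_length / speed
    return M_NEUTRON * G * delta_h * traversal_time / HBAR

# Thermal neutrons (~2200 m/s), paths 2 cm apart along a 5 cm arm:
phi = gravitational_phase(delta_h=0.02, path_length=0.05, speed=2200.0)
print(f"phase difference ≈ {phi:.0f} rad")  # tens of radians: measurable
```

Tens of radians is a large, easily resolvable shift in an interferometer, which is why a tabletop experiment could see gravity acting on a quantum state back in 1975.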

Summer-school venue. I’d easily say it’s gorgeous but not easily pronounce its name.

The experimentalists approximated gravity as dominated by the Earth alone. But other masses can influence the gravitational field noticeably. What if you put a mass in a superposition of different locations? What would happen to space-time?

Or imagine two quantum particles too far apart to interact with each other significantly. Could a gravitational field entangle the particles by carrying quantum correlations from one to the other?

Physicists including Igor ponder these questions…and then ponder how experimentalists could test their predictions. The more an object influences gravity, the more massive the object tends to be, and the more easily the object tends to decohere—to spill the quantum information that it holds into its surroundings.

The “gravity-quantum interface,” as Igor entitled his lectures, epitomizes what I hoped to study in college, as a high-school student entranced by physics, math, and philosophy. What’s more curious and puzzling than superpositions, entanglement, and space-time? What’s more fundamental than quantum theory and gravity? Little wonder that connecting them inspires wonder.

But we humans are suckers for connections. I appreciated the opportunity to reconnect with a colleague during the summer school. Boaters on the Stockholm archipelago waved to our cohort as they passed. And who knows—gravitational influences may even have rippled between the boats, entangling us a little.

Requisite physicist-visiting-Stockholm photo

With thanks to the summer-school organizers, including Pouya Peighami and Elizabeth Yang, for their invitation and hospitality.

One equation to rule them all?

In lieu of composing a blog post this month, I’m publishing an article in Quanta Magazine. The article provides an introduction to fluctuation relations, souped-up variations on the second law of thermodynamics, which helps us understand why time flows in only one direction. The earliest fluctuation relations described classical systems, such as single strands of DNA. Many quantum versions have been proved since. Their proliferation contrasts with the stereotype of physicists as obsessed with unification—with slimming down a cadre of equations into one über-equation. Will one quantum fluctuation relation emerge to rule them all? Maybe, and maybe not. Maybe the multiplicity of quantum fluctuation relations reflects the richness of quantum thermodynamics.
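As a taste of what a fluctuation relation says, here is a toy numerical check of the best-known one, the Jarzynski equality, under an assumed Gaussian work distribution (my illustrative model, not anything from the article):

```python
import numpy as np

# Jarzynski equality: <exp(-W / kT)> = exp(-dF / kT). For a Gaussian work
# distribution with mean mu and variance sigma^2, dF = mu - sigma^2 / (2 kT).

rng = np.random.default_rng(42)
kT = 1.0
mu, sigma = 2.0, 1.0
delta_f = mu - sigma**2 / (2 * kT)  # free-energy difference for this model

work = rng.normal(mu, sigma, size=2_000_000)  # simulated work measurements
lhs = np.mean(np.exp(-work / kT))
rhs = np.exp(-delta_f / kT)

# The exponential average of the work recovers the free-energy difference,
# even though the average work (mu = 2) exceeds dF (= 1.5); the excess is
# the dissipation that the second law describes.
assert abs(lhs - rhs) / rhs < 0.05
```

The equality holds even far from equilibrium, which is what makes fluctuation relations "souped-up" versions of the second law rather than mere restatements of it.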

You can read more in Quanta Magazine here and yet more in chapter 9 of my book. For recent advances in fluctuation relations, as opposed to the broad introduction there, check out earlier Quantum Frontiers posts here, here, here, here, and here.


The power of being able to say “I can explain that”

Caltech condensed-matter theorist Gil Refael explained his scientific raison d'être early in my grad-school career: “What really gets me going is seeing a plot [of experimental data] and being able to say, ‘I can explain that.’” The quote has stuck with me almost word for word. When I heard it, I was working deep in abstract quantum information theory and thermodynamics, proving theorems about thought experiments. Embedding myself in pure ideas has always held an aura of romance for me, so I nodded along without seconding Gil’s view.

Roughly nine years later, I concede his point.

The revelation walloped me last month, as I was polishing a paper with experimental collaborators. Members of the Institute for Quantum Optics and Quantum Information (IQOQI) in Innsbruck, Austria—Florian Kranzl, Manoj Joshi, and Christian Roos—had performed an experiment in trapped-ion guru Rainer Blatt’s lab. Their work realized an experimental proposal that I’d designed with fellow theorists near the beginning of my postdoc stint. We aimed to observe signatures of particularly quantum thermalization.

Throughout the universe, small systems exchange stuff with their environments. For instance, the Earth exchanges heat and light with the rest of the solar system. After exchanging stuff for long enough, the small system equilibrates with the environment: Large-scale properties of the small system (such as its volume and energy) remain fairly constant; and as much stuff enters the small system as leaves, on average. The Earth remains far from equilibrium, which is why we aren’t dead yet.

Far from equilibrium and proud of it

In many cases, in equilibrium, the small system shares properties of the environment, such as the environment’s temperature. In these cases, we say that the small system has thermalized and, if it’s quantum, has reached a thermal state.

The stuff exchanged can consist of energy, particles, electric charge, and more. Unlike classical planets, quantum systems can exchange things that participate in quantum uncertainty relations (experts: that fail to commute). Quantum uncertainty mucks up derivations of the thermal state’s mathematical form. Some of us quantum thermodynamicists discovered the mucking up—and identified exchanges of quantum-uncertain things as particularly nonclassical thermodynamics—only a few years ago. We reworked conventional thermodynamic arguments to accommodate this quantum uncertainty. The small system, we concluded, likely equilibrates to near a thermal state whose mathematical form depends on the quantum-uncertain stuff—what we termed a non-Abelian thermal state. I wanted to see this equilibration in the lab. So I proposed an experiment with theory collaborators; and Manoj, Florian, and Christian took a risk on us.
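For concreteness: in the quantum-thermodynamics literature, the non-Abelian thermal state has the form ρ ∝ exp(−β(H − Σ_a μ_a Q_a)), where the charges Q_a fail to commute with each other. A minimal numerical sketch, for one qubit whose charges are the three spin components (parameter values here are illustrative, not the experiment's):

```python
import numpy as np
from scipy.linalg import expm

# Sketch of the non-Abelian thermal state for one qubit whose conserved
# "charges" are the three spin components (Pauli matrices), which fail to
# commute. Form assumed: rho = exp(-beta * (H - sum_a mu_a Q_a)) / Z.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

beta = 0.5                            # inverse temperature (illustrative)
mu = [0.2, -0.1, 0.3]                 # one chemical potential per charge
H = np.zeros((2, 2), dtype=complex)   # trivial Hamiltonian, for simplicity

exponent = -beta * (H - sum(m * Q for m, Q in zip(mu, [sx, sy, sz])))
unnormalized = expm(exponent)
rho = unnormalized / np.trace(unnormalized)   # the non-Abelian thermal state

print(np.trace(rho).real)                # 1.0: a valid quantum state
print(np.allclose(rho, rho.conj().T))    # True: Hermitian
```

Because the charges don't commute, no single basis diagonalizes all of them at once, which is precisely what mucks up the conventional derivations.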

The experimentalists arrayed between six and fifteen ions in a line. Two ions formed the small system, and the rest formed the quantum environment. The ions exchanged the x-, y-, and z-components of their spin angular momentum—stuff that participates in quantum uncertainty relations. The ions began with a fairly well-defined amount of each spin component, as described in another blog post. The ions exchanged stuff for a while, and then the experimentalists measured the small system’s quantum state.

The small system equilibrated to near the non-Abelian thermal state, we found. No conventional thermal state modeled the results as accurately. Score!

My postdoc and numerical-simulation wizard Aleks Lasek modeled the experiment on his computer. The small system, he found, remained farther from the non-Abelian thermal state in his simulation than in the experiment. Aleks plotted the small system’s distance to the non-Abelian thermal state against the ion chain’s length. The points produced experimentally sat lower down than the points produced numerically. Why?

I think I can explain that, I said. The two ions exchange stuff with the rest of the ions, which serve as a quantum environment. But the two ions exchange stuff also with the wider world, such as stray electromagnetic fields. The latter exchanges may push the small system farther toward equilibrium than the extra ions alone do.

Fortunately for the development of my explanatory skills, collaborators prodded me to hone my argument. The wider world, they pointed out, effectively has a very high temperature—an infinite temperature.1 Equilibrating with that environment, the two ions would acquire an infinite temperature themselves. The two ions would approach an infinite-temperature thermal state, which differs from the non-Abelian thermal state we aimed to observe.

Fair, I said. But the extra ions probably have a fairly high temperature themselves. So the non-Abelian thermal state is probably close to the infinite-temperature thermal state. Analogously, if someone cooks goulash similarly to his father, and the father cooks goulash similarly to his grandfather, then the youngest chef cooks goulash similarly to his grandfather. If the wider world pushes the two ions to equilibrate to infinite temperature, then, because the infinite-temperature state lies near the non-Abelian thermal state, the wider world pushes the two ions to equilibrate to near the non-Abelian thermal state.

Tasty, tasty thermodynamics

I plugged numbers into a few equations to check that the extra ions do have a high temperature. (Perhaps I should have done so before proposing the argument above, but my collaborators were kind enough not to call me out.) 

Aleks hammered the nail into the problem’s coffin by incorporating into his simulations the two ions’ interaction with an infinite-temperature wider world. His numerical data points dropped to near the experimental data points. The new plot supported my story.

I can explain that! Aleks’s results buoyed me the whole next day; I found myself smiling at random times throughout the afternoon. Not that I’d explained a grand mystery, like the unexpected hiss heard by Arno Penzias and Robert Wilson when they turned on a powerful antenna in 1964. The hiss turned out to come from the cosmic microwave background (CMB), a collection of photons that fill the visible universe. The CMB provided evidence for the then-controversial Big Bang theory of the universe’s origin. Discovering the CMB earned Penzias and Wilson a Nobel Prize. If the noise caused by the CMB was music to cosmologists’ ears, the noise in our experiment is the quiet wailing of a shy banshee. But it’s our experiment’s noise, and we understand it now.

The experience hasn’t weaned me off the romance of proving theorems about thought experiments. Theorems about thermodynamic quantum uncertainty inspired the experiment that yielded the plot that confused us. But I now second Gil’s sentiment. In the throes of an experiment, “I can explain that” can feel like a battle cry.

1Experts: The wider world effectively has an infinite temperature because (i) the dominant decoherence is dephasing relative to the \sigma_z product eigenbasis and (ii) the experimentalists rotate their qubits often, to simulate a rotationally invariant Hamiltonian evolution. So the qubits effectively undergo dephasing relative to the \sigma_x, \sigma_y, and \sigma_z eigenbases.

Space-time and the city

I felt like a gum ball trying to squeeze my way out of a gum-ball machine. 

I was one of 50-ish physicists crammed into the lobby—and in the doorway, down the stairs, and onto the sidewalk—of a Manhattan hotel last December. Everyone had received a COVID vaccine, and the omicron variant hadn’t yet begun chewing up North America. Everyone had arrived on the same bus that evening, feeding on the neon-bright views of Fifth Avenue through dinnertime. Everyone wanted to check in and offload suitcases before experiencing firsthand the reason for the nickname “the city that never sleeps.” So everyone was jumbled together in what passed for a line.

We’d just passed the halfway point of the week during which I was pretending to be a string theorist. I do that whenever my research butts up against black holes, chaos, quantum gravity (the attempt to unify quantum physics with Einstein’s general theory of relativity), and alternative space-times. These topics fall under the heading “It from Qubit,” which calls for understanding puzzling physics (“It”) by analyzing how quantum systems process information (“Qubit”). The “It from Qubit” crowd convenes for one week each December, to share progress and collaborate.1 The group spends Monday through Wednesday at Princeton’s Institute for Advanced Study (IAS), dogged by photographs of Einstein, busts of Einstein, and roads named after Einstein. A bus ride later, the group spends Thursday and Friday at the Simons Foundation in New York City.

I don’t usually attend “It from Qubit” gatherings, as I’m actually a quantum information theorist and quantum thermodynamicist. Having admitted as much during the talk I presented at the IAS, I failed at pretending to be a string theorist. Happily, I adore being the most ignorant person in a roomful of experts, as the experience teaches me oodles. At lunch and dinner, I’d plunk down next to people I hadn’t spoken to and ask what they see as trending in the “It from Qubit” community. 

One buzzword (replicas) I’d first picked up on shortly before the pandemic began. Having lived a frenetic life, that trend seemed to be declining. The rising buzzwords (factorization and islands) I hadn’t heard in black-hole contexts before. People were still tossing around terms from when I’d first forayed into “It from Qubit” (scrambling and out-of-time-ordered correlator), but differently from then. Five years ago, the terms identified the latest craze. Now, they sounded entrenched, as though everyone expected everyone else to know and accept their significance.

One buzzword labeled my excuse for joining the workshops: complexity. Complexity wears as many meanings as the stereotypical New Yorker wears items of black clothing. Last month, guest blogger Logan Hillberry wrote about complexity that emerges in networks such as brains and social media. To “It from Qubit,” complexity quantifies the difficulty of preparing a quantum system in a desired state. Physicists have conjectured that a certain quantum state’s complexity parallels properties of gravitational systems, such as the length of a wormhole that connects two black holes. The wormhole’s length grows steadily for a time exponentially large in the gravitational system’s size. So, to support the conjecture, researchers have been trying to prove that complexity typically grows similarly. Collaborators and I proved that it does, as I explained in my talk and as I’ll explain in a future blog post. Other speakers discussed experimental complexities, as well as the relationship between complexity and a simplified version of Einstein’s equations for general relativity.

Inside the Simons Foundation on Fifth Avenue in Manhattan

I learned a bushel of physics, moonlighting as a string theorist that week. The gum-ball-machine lobby, though, retaught me something I’d learned long before the pandemic. Around the time I squeezed inside the hotel, a postdoc struck up a conversation with the others of us who were clogging the doorway. We had a decent fraction of an hour to fill; so we chatted about quantum thermodynamics, grant applications, and black holes. I asked what the postdoc was working on, he explained a property of black holes, and it reminded me of a property of thermodynamics. I’d nearly reached the front desk when I realized that, out of the sheer pleasure of jawing about physics with physicists in person, I no longer wanted to reach the front desk. The moment dangles in my memory like a crystal ornament from the lobby’s tree—pendant from the pandemic, a few inches from the vaccines suspended on one side and from omicron on the other. For that moment, in a lobby buoyed by holiday lights, wrapped in enough warmth that I’d forgotten the December chill outside, I belonged to the “It from Qubit” community as I hadn’t belonged to any community in 22 months.

Happy new year.

Presenting at the IAS was a blast. Photo credit: Jonathan Oppenheim.

1In person or virtually, pandemic-dependently.

Thanks to the organizers of the IAS workshop—Ahmed Almheiri, Adam Bouland, Brian Swingle—for the invitation to present and to the organizers of the Simons Foundation workshop—Patrick Hayden and Matt Headrick—for the invitation to attend.

Balancing the tradeoff

So much to do, so little time. Tending to one task inevitably comes at the cost of another, so how does one decide how to spend one’s time? In the first few years of my PhD, I balanced problem sets, literature reviews, and group meetings, but to the detriment of my hobbies. I have played drums my entire life, but I largely fell out of practice in graduate school. Recently, I made time to play with a group of musicians, even landing a couple of gigs in downtown Austin, Texas, “live music capital of the world.” I have found that attending to my non-physics interests makes my research hours more productive and less taxing. Finding the right balance of on- versus off-time has been key to my success as my PhD enters its final year.

Of course, life within physics is also full of tradeoffs. My day job is as an experimentalist. I use tightly focused laser beams, known as optical tweezers, to levitate micrometer-sized glass spheres. I monitor a single microsphere’s motion as it undergoes collisions with air molecules, and I study the system as an environmental sensor of temperature, fluid flow, and acoustic waves; however, by night I am a computational physicist. I code simulations of interacting qubits subject to kinetic constraints, so-called quantum cellular automata (QCA). My QCA work started a few years ago for my Master’s degree, but my interest in the subject persists. I recently co-authored one paper summarizing the work so far and another detailing an experimental implementation.

The author doing his part to “keep Austin weird” by playing the drums dressed as grackle (note the beak), the central-Texas bird notorious for overrunning grocery store parking lots.
Balancing research interests: Trapping a glass microsphere with optical tweezers.
Balancing research interests: Visualizing the time evolution of four different QCA rules.

QCA, the subject of this post, are themselves tradeoff-aware systems. To see what I mean, first consider their classical counterparts, cellular automata. In their simplest construction, the system is a one-dimensional string of bits. Each bit takes a value of 0 or 1 (white or black). The bitstring changes in discrete time steps based on a simultaneously applied local update rule: Each bit, along with its two nearest neighbors, determines the next state of the central bit. Put another way, a bit either flips, i.e., changes 0 to 1 or 1 to 0, or remains unchanged over a timestep, depending on the state of that bit’s local neighborhood. Thus, by choosing a particular rule, one encodes a tradeoff between activity (bit flips) and inactivity (bit remains unchanged). Despite their simple construction, cellular automata dynamics are diverse; they can produce fractals and encryption-quality random numbers. One rule even has the ability to run arbitrary computer algorithms, a property known as universal computation.

Classical cellular automata. Left: Rule 90 produces the fractal Sierpiński triangle. Middle: Rule 30 can be used to generate random numbers. Right: Rule 110 is capable of universal computation.
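A minimal simulator for these classical rules takes only a few lines; the rule numbers follow Wolfram’s standard 0-255 convention, under which each rule’s binary expansion is its lookup table:

```python
import numpy as np

# Minimal elementary-cellular-automaton simulator: each bit updates based on
# itself and its two nearest neighbors, per a rule numbered 0-255 in
# Wolfram's convention. Rule 90 draws the Sierpinski triangle; rule 30
# yields pseudorandomness; rule 110 is computationally universal.
def step(bits, rule):
    left = np.roll(bits, 1)            # periodic boundary conditions
    right = np.roll(bits, -1)
    neighborhood = 4 * left + 2 * bits + right   # value 0-7 per site
    table = (rule >> np.arange(8)) & 1           # the rule's lookup table
    return table[neighborhood]

width, steps = 31, 15
bits = np.zeros(width, dtype=int)
bits[width // 2] = 1                   # single seed bit in the middle
history = [bits]
for _ in range(steps):
    bits = step(bits, 90)
    history.append(bits)
for row in history:
    print("".join(".#"[b] for b in row))   # rule 90's Sierpinski pattern
```

Swapping 90 for 30 or 110 in the loop reproduces the other two behaviors pictured above.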

In QCA, bits are promoted to qubits. Instead of being just 0 or 1 like a bit, a qubit can be a continuous mixture of both 0 and 1, a property called superposition. In QCA, a qubit’s two neighbors being 0 or 1 determine whether or not it changes. For example, when in an active neighborhood configuration, a qubit can be coded to change from 0 to “0 plus 1” or from 1 to “0 minus 1”. This is already a head-scratcher, but things get even weirder. If a qubit’s neighbors are in a superposition, then the center qubit can become entangled with those neighbors. Entanglement correlates qubits in a way that is not possible with classical bits.

Do QCA support the emergent complexity observed in their classical cousins? What are the effects of a continuous state space, superposition, and entanglement? My colleagues and I attacked these questions by re-examining many-body physics tools through the lens of complexity science. Singing the lead, we have a workhorse of quantum and solid-state physics: two-point correlations. Singing harmony we have the bread-and-butter of network analysis: complex-network measures. The duet between the two tells the story of structured correlations in QCA dynamics.

In a bit more detail, at each QCA timestep we calculate the mutual information between all qubits i and all other qubits j. Doing so reveals how much there is to learn about one qubit by measuring another, including effects of quantum entanglement. Visualizing each qubit as a node, the mutual information can be depicted as weighted links between nodes: the more correlated two qubits are, the more strongly they are linked. The collection of nodes and links makes a network. Some QCA form unstructured, randomly-linked networks while others are highly structured. 
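In code, the pipeline just described might look as follows. The sketch below substitutes a random pure state for a QCA-generated state, purely to illustrate the construction; the mutual information I(i:j) = S_i + S_j − S_ij comes from the von Neumann entropies of reduced density matrices:

```python
import numpy as np
from itertools import combinations

# Sketch of the mutual-information network construction: for an n-qubit
# pure state, compute I(i:j) = S_i + S_j - S_ij from reduced density
# matrices. The matrix of I values is the network's weighted adjacency.
def reduced_density_matrix(psi, keep, n):
    """Trace out all qubits except those in `keep` (sorted tuple of indices)."""
    psi = psi.reshape([2] * n)
    traced = [q for q in range(n) if q not in keep]
    rho = np.tensordot(psi, psi.conj(), axes=(traced, traced))
    d = 2 ** len(keep)
    return rho.reshape(d, d)

def entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]       # drop numerically zero eigenvalues
    return -np.sum(evals * np.log2(evals))

n = 4
rng = np.random.default_rng(1)
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)             # random pure state, a stand-in for a QCA state

adjacency = np.zeros((n, n))
for i, j in combinations(range(n), 2):
    S_i = entropy(reduced_density_matrix(psi, (i,), n))
    S_j = entropy(reduced_density_matrix(psi, (j,), n))
    S_ij = entropy(reduced_density_matrix(psi, (i, j), n))
    adjacency[i, j] = adjacency[j, i] = S_i + S_j - S_ij   # mutual information
print(adjacency.round(3))              # weighted links of the correlation network
```

Feeding this adjacency matrix into standard network analysis (clustering, path lengths, and so on) yields the complex-network measures discussed next.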

Complex-network measures are designed to highlight certain structural patterns within a network. Historically, these measures have been used to study diverse networked systems, like friend groups on Facebook, biomolecule pathways in metabolism, and functional connectivity in the brain. Remarkably, the most structured QCA networks we observed quantitatively resemble those of the complex systems just mentioned, despite the QCA’s simple construction and quantum unitary dynamics. 

Visualizing mutual-information networks. Left: A Goldilocks-QCA-generated network. Right: A random network.

What’s more, the particular QCA that generate the most complex networks are those that balance the activity-inactivity trade-off. From this observation, we formulate what we call the Goldilocks principle: QCA that generate the most complexity are those that change a qubit if and only if the qubit’s neighbors contain an equal number of 1’s and 0’s. The Goldilocks rules are neither too inactive nor too active, balancing the tradeoff to be “just right.”  We demonstrated the Goldilocks principle for QCA with nearest-neighbor constraints as well as QCA with nearest-and-next-nearest-neighbor constraints.
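A toy implementation of the nearest-neighbor Goldilocks idea helps make the rule concrete: apply a Hadamard to a qubit if and only if exactly one of its two neighbors is 1 (equal numbers of 1’s and 0’s). The sketch below sweeps the update across interior sites one at a time; the published work uses particular update orderings and boundary conditions, so treat this as a minimal illustration rather than the papers’ exact protocol:

```python
import numpy as np

# Toy nearest-neighbor Goldilocks QCA: Hadamard a qubit iff exactly one of
# its two neighbors is 1. A simple left-to-right sweep over interior sites,
# NOT the exact update ordering of the published work.
P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])   # neighbor projectors
I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Three-site conditional unitary: left neighbor (x) center (x) right neighbor.
U3 = sum(np.kron(np.kron([P0, P1][l], H if l != r else I2), [P0, P1][r])
         for l in (0, 1) for r in (0, 1))

def apply_3site(psi, U, i, n):
    """Apply the 8x8 operator U to qubits (i-1, i, i+1) of an n-qubit state."""
    psi = np.moveaxis(psi.reshape([2] * n), (i - 1, i, i + 1), (0, 1, 2))
    shape = psi.shape
    psi = (U @ psi.reshape(8, -1)).reshape(shape)
    return np.moveaxis(psi, (0, 1, 2), (i - 1, i, i + 1)).reshape(-1)

n = 5
psi = np.zeros(2**n, dtype=complex)
psi[int("00100", 2)] = 1.0            # single 1 at the central site
for i in range(1, n - 1):             # one sweep over interior sites
    psi = apply_3site(psi, U3, i, n)
print(np.linalg.norm(psi))            # 1.0: the update is unitary
```

Even after one sweep, the seed qubit’s activity spreads into superposition and entanglement with its neighbors, seeding the structured correlations described above.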

To my delight, the scientific conclusions of my QCA research resonate with broader lessons-learned from my time as a PhD student: Life is full of trade-offs, and finding the right balance is key to achieving that “just right” feeling.

Cutting the quantum mustard

I had a relative to whom my parents referred, when I was little, as “that great-aunt of yours who walked into a glass door at your cousin’s birthday party.” I was a small child in a large family that mostly lived far away; little else distinguished this great-aunt from other relatives, in my experience. She’d intended to walk from my grandmother’s family room to the back patio. A glass door stood in the way, but she didn’t see it. So my great-aunt whammed into the glass; spent part of the party on the couch, nursing a nosebleed; and earned the epithet via which I identified her for years.

After growing up, I came to know this great-aunt as a kind, gentle woman who adored her family and was adored in return. After growing into a physicist, I came to appreciate her as one of my earliest instructors in necessary and sufficient conditions.

My great-aunt’s intended path satisfied one condition necessary for her to reach the patio: Nothing visible obstructed the path. But the path failed to satisfy a sufficient condition: The invisible obstruction—the glass door—had been neither slid nor swung open. Sufficient conditions, my great-aunt taught me, mustn’t be overlooked.

Her lesson underlies a paper I published this month, with coauthors from the Cambridge other than mine—Cambridge, England: David Arvidsson-Shukur and Jacob Chevalier Drori. The paper concerns, rather than glass doors and patios, quasiprobabilities, which I’ve blogged about many times [1,2,3,4,5,6,7]. Quasiprobabilities are quantum generalizations of probabilities. Probabilities describe everyday, classical phenomena, from Monopoly to March Madness to the weather in Massachusetts (and especially the weather in Massachusetts). Probabilities are real numbers (not dependent on the square root of -1); they’re at least zero; and they compose in certain ways (the probability of sun or hail equals the probability of sun plus the probability of hail). Also, the probabilities that form a distribution, or a complete set, sum to one (if there’s a 70% chance of rain, there’s a 30% chance of no rain). 

In contrast, quasiprobabilities can be negative and nonreal. We call such values nonclassical, as they’re unavailable to the probabilities that describe classical phenomena. Quasiprobabilities represent quantum states: Imagine some clump of particles in a quantum state described by some quasiprobability distribution. We can imagine measuring the clump however we please. We can calculate the possible outcomes’ probabilities from the quasiprobability distribution.

Not from my grandmother’s house, although I wouldn’t mind if it were.

My favorite quasiprobability is an obscure fellow unbeknownst even to most quantum physicists: the Kirkwood-Dirac distribution. John Kirkwood defined it in 1933, and Paul Dirac defined it independently in 1945. Then, quantum physicists forgot about it for decades. But the quasiprobability has undergone a renaissance over the past few years: Experimentalists have measured it to infer particles’ quantum states in a new way. Also, colleagues and I have generalized the quasiprobability and discovered applications of the generalization across quantum physics, from quantum chaos to metrology (the study of how we can best measure things) to quantum thermodynamics to the foundations of quantum theory.
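One common form of the Kirkwood-Dirac distribution, Q(i, j) = ⟨b_j|a_i⟩⟨a_i|ρ|b_j⟩, built from the eigenbases {|a_i⟩} and {|b_j⟩} of two observables, is easy to compute for a single qubit. The entries sum to one, like probabilities, but can stray into negative and nonreal values when the state and bases fail to commute (the example state below is chosen arbitrarily for illustration):

```python
import numpy as np

# Toy computation of the Kirkwood-Dirac distribution for one qubit:
# Q[i, j] = <b_j|a_i> <a_i|rho|b_j>, with {|a_i>} the sigma_z eigenbasis
# and {|b_j>} the sigma_x eigenbasis. Entries sum to 1 but may be nonreal.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

a_basis = [ket0, ket1]                # sigma_z eigenbasis
b_basis = [plus, minus]               # sigma_x eigenbasis

# An arbitrary state that fails to commute with both observables:
psi = ket0 + 1j * plus
psi /= np.linalg.norm(psi)
rho = np.outer(psi, psi.conj())

Q = np.zeros((2, 2), dtype=complex)
for i, a in enumerate(a_basis):
    for j, b in enumerate(b_basis):
        Q[i, j] = (b.conj() @ a) * (a.conj() @ rho @ b)

print(Q.sum())                        # 1 (+0j): normalized like a probability
print(Q)                              # some entries are nonreal
```

Nonreal or negative entries like these are the nonclassicality that the paper’s necessary-and-sufficient conditions pin down.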

In some applications, nonclassical quasiprobabilities enable a system to achieve a quantum advantage—to usefully behave in a manner impossible for classical systems. Examples include metrology: Imagine wanting to measure a parameter that characterizes some piece of equipment. You’ll perform many trials of an experiment. In each trial, you’ll prepare a system (for instance, a photon) in some quantum state, send it through the equipment, and measure one or more observables of the system. Say that you follow the protocol described in this blog post. A Kirkwood-Dirac quasiprobability distribution describes the experiment.1 From each trial, you’ll obtain information about the unknown parameter. How much information can you obtain, on average over trials? Potentially more information if some quasiprobabilities are negative than if none are. The quasiprobabilities can be negative only if the state and observables fail to commute with each other. So noncommutation—a hallmark of quantum physics—underlies exceptional metrological results, as shown by Kirkwood-Dirac quasiprobabilities.

Exceptional results are useful, and we might aim to design experiments that achieve them. We can by designing experiments described by nonclassical Kirkwood-Dirac quasiprobabilities. When can the quasiprobabilities become nonclassical? Whenever the relevant quantum state and observables fail to commute, the quantum community used to believe. This belief turns out to mirror the expectation that one could access my grandmother’s back patio from the living room whenever no visible barriers obstructed the path. As a lack of visible barriers was necessary for patio access, noncommutation is necessary for Kirkwood-Dirac nonclassicality. But noncommutation doesn’t suffice, according to my paper with David and Jacob. We identified a sufficient condition, sliding back the metaphorical glass door on Kirkwood-Dirac nonclassicality. The condition depends on simple properties of the system, state, and observables. (Experts: Examples include the Hilbert space’s dimensionality.) We also quantified and upper-bounded the amount of nonclassicality that a Kirkwood-Dirac quasiprobability can contain.

From an engineering perspective, our results can inform the design of experiments intended to achieve certain quantum advantages. From a foundational perspective, the results help illuminate the sources of certain quantum advantages. To achieve certain advantages, noncommutation doesn’t cut the mustard—but we now know a condition that does.

For another take on our paper, check out this news article in Physics Today.  

1Really, a generalized Kirkwood-Dirac quasiprobability. But that phrase contains a horrendous number of syllables, so I’ll elide the “generalized.”