Ten lessons I learned from John Preskill

Last August, Toronto’s Centre for Quantum Information and Quantum Control (CQIQC) gave me 35 minutes to make fun of John Preskill in public. CQIQC was hosting its biennial conference, also called CQIQC, in Toronto. The conference features the awarding of the John Stewart Bell Prize for fundamental quantum physics. The prize derives its name from the thinker who transformed our understanding of entanglement. John received this year’s Bell Prize for identifying, with collaborators, how we can learn about quantum states from surprisingly few trials and measurements.

The organizers invited three Preskillites to present talks in John’s honor: Hoi-Kwong Lo, who’s helped steer quantum cryptography and communications; Daniel Gottesman, who’s helped lay the foundations of quantum error correction; and me. I believe that one of the most fitting ways to honor John is by sharing the most exciting physics you know of. I spoke about quantum thermodynamics for (simple models of) nuclear physics, along with ten lessons I learned from John. You can watch the talk here and check out the paper, recently published in Physical Review Letters, for technicalities.

John has illustrated this lesson by wrestling with the black-hole-information paradox, including alongside Stephen Hawking. Quantum information theory has informed quantum thermodynamics, as Quantum Frontiers regulars know. Quantum thermodynamics is the study of work (coordinated energy that we can harness directly) and heat (the energy of random motion). Systems exchange heat with heat reservoirs—large, fixed-temperature systems. As I draft this blog post, for instance, I’m radiating heat into the frigid air in Montreal Trudeau Airport.

So much for quantum information. How about high-energy physics? I’ll include nuclear physics in the category, as many of my European colleagues do. Much of nuclear physics and condensed matter involves gauge theories. A gauge theory is a model that contains more degrees of freedom than the physics it describes. Similarly, a friend’s description of the CN Tower could last twice as long as necessary, due to redundancies. Electrodynamics—the theory behind light bulbs—is a gauge theory. So is quantum chromodynamics, the theory of the strong force that holds together a nucleus’s constituents.

Every gauge theory obeys Gauss’s law. Gauss’s law interrelates the matter at a site to the gauge field around the site. For example, imagine a positive electric charge in empty space. An electric field—a gauge field—points away from the charge at every spot in space. Imagine a sphere that encloses the charge. How much of the electric field is exiting the sphere? The answer depends on the amount of charge inside, according to Gauss’s law.
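In symbols, the standard integral form of Gauss’s law for electrodynamics captures this statement:

```latex
\oint_{S} \vec{E} \cdot d\vec{A} \;=\; \frac{Q_{\text{enc}}}{\epsilon_0}
```

The left-hand side is the electric flux exiting the closed surface $S$; the right-hand side is the enclosed charge divided by the vacuum permittivity. Lattice gauge theories obey discretized analogues of this constraint at every site.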

Gauss’s law interrelates the matter at a site with the gauge field nearby…which is related to the matter at the next site…which is related to the gauge field farther away. So everything depends on everything else. So we can’t easily claim that over here are independent degrees of freedom that form a system of interest, while over there are independent degrees of freedom that form a heat reservoir. So how can we define the heat and work exchanged within a lattice gauge theory? If we can’t, we should start biting our nails: thermodynamics is the queen of the physical theories, a metatheory expected to govern all other theories. But how can we define the quantum thermodynamics of lattice gauge theories? My colleague Zohreh Davoudi and her group asked me this question.

I had the pleasure of addressing the question with five present and recent Marylanders…

…the mention of whom in my CQIQC talk invited…

I’m a millennial; social media took off with my generation. But I enjoy saying that my PhD advisor enjoys far more popularity on social media than I do.

How did we begin establishing a quantum thermodynamics for lattice gauge theories?

Someone who had a better idea than I, when I embarked upon this project, was my colleague Chris Jarzynski. So did Dvira Segal, a University of Toronto chemist and CQIQC’s director. So did everyone else who’d helped develop the toolkit of strong-coupling thermodynamics. I’d only heard of the toolkit, but I thought it sounded useful for lattice gauge theories, so I invited Chris to my conversations with Zohreh’s group.

I didn’t create this image for my talk, believe it or not. The picture already existed on the Internet, courtesy of this blog.

Strong-coupling thermodynamics concerns systems that interact strongly with reservoirs. System–reservoir interactions are weak, or encode little energy, throughout much of thermodynamics. For example, I exchange little energy with Montreal Trudeau’s air, relative to the amount of energy inside me. The reason is, I exchange energy only through my skin. My skin forms a small fraction of me because it forms my surface. My surface area is much smaller than my volume, which is proportional to the energy inside me. So I couple to Montreal Trudeau’s air weakly.

My surface would be comparable to my volume if I were extremely small—say, a quantum particle. My interaction with the air would encode loads of energy—an amount comparable to the amount inside me. Should we count that interaction energy as part of my energy or as part of the air’s energy? Could we even say that I existed, and had a well-defined form, independently of that interaction energy? Strong-coupling thermodynamics provides a framework for answering these questions.

Kevin Kuns, a former Quantum Frontiers blogger, described how John explains physics through simple concepts, like a ball attached to a spring. John’s gentle, soothing voice resembles a snake charmer’s, Kevin wrote. John charms his listeners into returning to their textbooks and brushing up on basic physics.

Little is more basic than the first law of thermodynamics, synopsized as energy conservation. The first law governs how much a system’s internal energy changes during any process. The energy change equals the heat absorbed, plus the work absorbed, by the system. Every formulation of thermodynamics should obey the first law—including strong-coupling thermodynamics. 
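In symbols, using the sign convention in which both heat $Q$ and work $W$ count as positive when absorbed by the system:

```latex
\Delta U \;=\; Q + W
```

Strong-coupling thermodynamics must reproduce this bookkeeping even when the system–reservoir interaction energy is too large to neglect.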

Which lattice-gauge-theory processes should we study, armed with the toolkit of strong-coupling thermodynamics? My collaborators and I implicitly followed

and

We don’t want to irritate experimentalists by asking them to run difficult protocols. Tom Rosenbaum, on the left of the previous photograph, is a quantum experimentalist. He’s also the president of Caltech, so John has multiple reasons to want not to irritate him.

Quantum experimentalists have run quench protocols on many quantum simulators, or special-purpose quantum computers. During a quench protocol, one changes a feature of the system quickly. For example, many quantum systems consist of particles hopping across a landscape of hills and valleys. One might flatten a hill during a quench.

We focused on a three-step quench protocol: (1) Set the system up in its initial landscape. (2) Quickly change the landscape within a small region. (3) Let the system evolve under its natural dynamics for a long time. Step 2 should cost work. How can we define the amount of work performed? By following

John wrote a blog post about how the typical physicist is a one-trick pony: they know one narrow subject deeply. John prefers to know two subjects. He can apply insights from one field to the other. A two-trick pony can show that Gauss’s law behaves like a strong interaction—that lattice gauge theories are strongly coupled thermodynamic systems. Using strong-coupling thermodynamics, the two-trick pony can define the work (and heat) exchanged within a lattice gauge theory. 
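As a toy numerical illustration (my own sketch, using the textbook sudden-quench work definition rather than the strong-coupling definition from the paper): for an instantaneous quench from a Hamiltonian H_i to H_f, the average work injected is the expectation value of H_f − H_i in the pre-quench state.

```python
import numpy as np

# Toy illustration (not the paper's strong-coupling definition): for a
# sudden quench H_i -> H_f applied to state |psi>, the average work
# injected is <psi| H_f - H_i |psi>.

def average_quench_work(H_i, H_f, psi):
    """Mean work injected by an instantaneous quench H_i -> H_f."""
    dH = H_f - H_i
    return np.real(np.vdot(psi, dH @ psi))

# Example: a single spin in a field along z, quenched to a field along x.
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

H_i = -1.0 * sz                         # initial Hamiltonian
H_f = -1.0 * sx                         # post-quench Hamiltonian
psi = np.array([1, 0], dtype=complex)   # ground state of H_i

W = average_quench_work(H_i, H_f, psi)
print(W)  # 1.0: the energy cost of suddenly rotating the field
```

In the lattice-gauge-theory setting, the subtlety is deciding how much of the interaction (Gauss-law) energy to attribute to the quenched region; that’s where the strong-coupling toolkit earns its keep.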

An experimentalist can easily measure the amount of work performed,1 we expect, for two reasons. First, the experimentalist need measure only the small region where the landscape changed. Measuring the whole system would be tricky, because it’s so large and it can contain many particles. But an experimentalist can control the small region. Second, we proved an equation that should facilitate experimental measurements. The equation interrelates the work performed1 with a quantity that seems experimentally accessible.

My team applied our work definition to a lattice gauge theory in one spatial dimension—a theory restricted to living on a line, like a caterpillar on a thin rope. You can think of the matter as qubits2 and the gauge field as more qubits. The system looks identical if you flip it upside-down; that is, the theory has a \mathbb{Z}_2 symmetry. The system has two phases, analogous to the liquid and ice phases of H_2O. Which phase the system occupies depends on the chemical potential—the average amount of energy needed to add a particle to the system (while the system’s entropy, its volume, and more remain constant).

My coauthor Connor simulated the system numerically, calculating its behavior on a classical computer. During the simulated quench process, the system began in one phase (like H_2O beginning as water). The quench steered the system around within the phase (as though changing the water’s temperature) or across the phase transition (as though freezing the water). Connor computed the work performed during the quench.1 The amount of work changed dramatically when the quench started steering the system across the phase transition. 

Not only could we define the work exchanged within a lattice gauge theory, using strong-coupling quantum thermodynamics. Also, that work signaled a phase transition—a large-scale, qualitative behavior.

What future do my collaborators and I dream of for our work? First, we want an experimentalist to measure the work1 spent on a lattice-gauge-theory system in a quantum simulation. Second, we should expand our definitions of quantum work and heat beyond sudden-quench processes. How much work and heat do particles exchange while scattering in particle accelerators, for instance? Third, we hope to identify other phase transitions and macroscopic phenomena using our work and heat definitions. Fourth—most broadly—we want to establish a quantum thermodynamics for lattice gauge theories.

Five years ago, I didn’t expect to be collaborating on lattice gauge theories inspired by nuclear physics. But this work is some of the most exciting I can think of to do. I hope you think it exciting, too. And, more importantly, I hope John thought it exciting in Toronto.

I was a student at Caltech during “One Entangled Evening,” the campus-wide celebration of Richard Feynman’s 100th birthday. So I watched John sing and dance onstage, exhibiting no fear of embarrassing himself. That observation seemed like an appropriate note on which to finish with my slides…and invite questions from the audience.

Congratulations on your Bell Prize, John.

1Really, the dissipated work.

2Really, hardcore bosons.

Finding Ed Jaynes’s ghost

You might have heard of the conundrum “What do you give the man who has everything?” I discovered a variation on it last October: how do you celebrate the man who studied (nearly) everything? Physicist Edwin Thompson Jaynes impacted disciplines from quantum information theory to biomedical imaging. I almost wrote “theoretical physicist,” instead of “physicist,” but a colleague insisted that Jaynes had a knack for electronics and helped design experiments, too. Jaynes worked at Washington University in St. Louis (WashU) from 1960 to 1992. I’d last visited the university in 2018, as a newly minted postdoc collaborating with WashU experimentalist Kater Murch. I’d scoured the campus for traces of Jaynes like a pilgrim seeking a saint’s forelock or humerus. The blog post “Chasing Ed Jaynes’s ghost” documents that hunt.

I found his ghost this October.

Kater and colleagues hosted the Jaynes Centennial Symposium on a brilliant autumn day when the campus’s trees were still contemplating shedding their leaves. The agenda featured researchers from across the sciences and engineering. We described how Jaynes’s legacy has informed 21st-century developments in quantum information theory, thermodynamics, biophysics, sensing, and computation. I spoke about quantum thermodynamics and information theory—specifically, incompatible conserved quantities, about which my research-group members and I have blogged many times.

Irfan Siddiqi spoke about quantum technologies. An experimentalist at the University of California, Berkeley, Irfan featured on Quantum Frontiers seven years ago. His lab specializes in superconducting qubits, tiny circuits in which current can flow forever, without dissipating. How can we measure a superconducting qubit? We stick the qubit in a box. Light bounces back and forth across the box. The light interacts with the qubit while traversing the box, in accordance with the Jaynes–Cummings model. We can’t seal any box perfectly, so some light will leak out. That light carries off information about the qubit. We can capture the light using a photodetector and infer the qubit’s state.
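In standard notation, the Jaynes–Cummings Hamiltonian couples one resonator mode (operators $a$, $a^\dagger$, frequency $\omega_c$) to one qubit (operators $\sigma_\pm$, $\sigma_z$, frequency $\omega_q$) with strength $g$:

```latex
H_{\text{JC}} \;=\; \hbar \omega_c\, a^\dagger a
\;+\; \frac{\hbar \omega_q}{2}\, \sigma_z
\;+\; \hbar g \left( a^\dagger \sigma_- + a\, \sigma_+ \right)
```

The interaction term swaps excitations between the light and the qubit, which is what lets the leaked light report on the qubit’s state.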

The first half of Jaynes–Cummings

Bill Bialek, too, spoke about inference. But Bill is a Princeton biophysicist, so fruit flies preoccupy him more than qubits do. A fruit fly metamorphoses from a maggot that hatches from an egg. As the maggot develops, its cells differentiate: some form a head, some form a tail, and so on. Yet all the cells contain the same genetic information. How can a head ever emerge, to differ from a tail? 

A fruit-fly mother, Bill revealed, injects molecules into an egg at certain locations. These molecules diffuse across the egg, triggering the synthesis of more molecules. The knock-on molecules’ concentrations can vary strongly across the egg: a maggot’s head cells contain molecules at certain concentrations, and the tail cells contain the same molecules at other concentrations.

At this point in Bill’s story, I was ready to take my hat off to biophysicists for answering the question above, which I’ll rephrase here: if we find that a certain cell belongs to a maggot’s tail, why does the cell belong to the tail? But I enjoyed even more how Bill turned the question on its head (pun perhaps intended): imagine that you’re a maggot cell. How can you tell where in the maggot you are, to ascertain how to differentiate? Nature asks this question (loosely speaking), whereas human observers ask Bill’s first question.

To answer the second question, Bill recalled which information a cell accesses. Suppose you know four molecules’ concentrations: c_1, c_2, c_3, and c_4. How accurately can you predict the cell’s location? That is, what probability does the cell have of sitting at some particular site, conditioned on the cs? That probability is large only at one site, biophysicists have found empirically. So a cell can accurately infer its position from its molecules’ concentrations.
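Here’s a cartoon of that inference (my own toy model, not Bill’s data): assume each position along the egg has a characteristic mean concentration profile for four molecules, and that a cell’s readings carry Gaussian noise. Bayes’ rule then yields the cell’s position posterior.

```python
import numpy as np

# Toy model of positional inference: each site x along the egg has assumed
# mean concentrations of 4 molecules; a cell reads noisy concentrations c
# and computes P(x | c) via Bayes' rule with a flat prior.

rng = np.random.default_rng(0)
positions = np.linspace(0, 1, 50)            # candidate sites along the egg
# Assumed smooth concentration profiles (illustrative decay lengths):
means = np.stack([np.exp(-positions / s) for s in (0.2, 0.4, 0.6, 0.8)])
sigma = 0.05                                 # measurement noise

true_x_index = 30
c = means[:, true_x_index] + sigma * rng.normal(size=4)  # the cell's readings

# Gaussian log-likelihood of the readings at each candidate position:
log_like = -np.sum((c[:, None] - means) ** 2, axis=0) / (2 * sigma**2)
posterior = np.exp(log_like - log_like.max())
posterior /= posterior.sum()                 # normalized P(x | c)

print(positions[np.argmax(posterior)])       # peaks near the true position
```

The empirical finding Bill described is analogous: the measured concentration profiles make this posterior sharply peaked at one site, so the cell can locate itself.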

I’m no biophysicist (despite minor evidence to the contrary), but I enjoyed Bill’s story as I enjoyed Irfan’s. Probabilities, information, and inference are abstract notions; yet they impact physical reality, from insects to quantum science. This tension between abstraction and concreteness arrested me when I first encountered entropy, in a ninth-grade biology lecture. The tension drew me into information theory and thermodynamics. These toolkits permeate biophysics as they permeate my disciplines. So, throughout the symposium, I spoke with engineers, medical-school researchers, biophysicists, thermodynamicists, and quantum scientists. They all struck me as my kind of people, despite our distribution across the intellectual landscape. Jaynes reasoned about distributions—probability distributions—and I expect he’d have approved of this one. The man who studied nearly everything deserves a celebration that illuminates nearly everything.

Beyond NISQ: The Megaquop Machine

On December 11, I gave a keynote address at the Q2B 2024 Conference in Silicon Valley. This is a transcript of my remarks. The slides I presented are here. The video of the talk is here.

NISQ and beyond

I’m honored to be back at Q2B for the 8th year in a row.

The Q2B conference theme is “The Roadmap to Quantum Value,” so I’ll begin by showing a slide from last year’s talk. As best we currently understand, the path to economic impact is the road through fault-tolerant quantum computing. And that poses a daunting challenge for our field and for the quantum industry.

We are in the NISQ era. And NISQ technology already has noteworthy scientific value. But as of now there is no proposed application of NISQ computing with commercial value for which quantum advantage has been demonstrated when compared to the best classical hardware running the best algorithms for solving the same problems. Furthermore, currently there are no persuasive theoretical arguments indicating that commercially viable applications will be found that do not use quantum error-correcting codes and fault-tolerant quantum computing.

NISQ, meaning Noisy Intermediate-Scale Quantum, is a deliberately vague term. By design, it has no precise quantitative meaning, but it is intended to convey an idea: We now have quantum machines such that brute force simulation of what the quantum machine does is well beyond the reach of our most powerful existing conventional computers. But these machines are not error-corrected, and noise severely limits their computational power.

In the future we can envision FASQ* machines, Fault-Tolerant Application-Scale Quantum computers that can run a wide variety of useful applications, but that is still a rather distant goal. What term captures the path along the road from NISQ to FASQ? Various terms retaining the ISQ format of NISQ have been proposed [here, here, here], but I would prefer to leave ISQ behind as we move forward, so I’ll speak instead of a megaquop or gigaquop machine and so on meaning one capable of executing a million or a billion quantum operations, but with the understanding that mega means not precisely a million but somewhere in the vicinity of a million.

Naively, a megaquop machine would have an error rate per logical gate of order 10^{-6}, which we don’t expect to achieve anytime soon without using error correction and fault-tolerant operation. Or maybe the logical error rate could be somewhat larger, as we expect to be able to boost the simulable circuit volume using various error mitigation techniques in the megaquop era just as we do in the NISQ era. Importantly, the megaquop machine would be capable of achieving some tasks beyond the reach of classical, NISQ, or analog quantum devices, for example by executing circuits with of order 100 logical qubits and circuit depth of order 10,000.
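The arithmetic behind that estimate is worth spelling out: a circuit with of order 100 logical qubits and depth of order 10,000 contains about a million logical operations, so keeping the whole circuit’s failure probability of order one demands a per-operation error rate of order $10^{-6}$.

```python
# Back-of-the-envelope check of the megaquop arithmetic:
# ~100 logical qubits x depth ~10,000 = ~10^6 logical operations ("quops"),
# so an error rate of ~10^-6 per operation yields an order-one success
# probability for the full circuit.

logical_qubits = 100
circuit_depth = 10_000
total_ops = logical_qubits * circuit_depth       # 1,000,000 operations

per_gate_error = 1e-6
# Probability that no logical error occurs anywhere in the circuit:
success_prob = (1 - per_gate_error) ** total_ops

print(total_ops)                 # 1000000
print(round(success_prob, 3))    # 0.368, i.e. about e^{-1}
```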

What resources are needed to operate it? That depends on many things, but a rough guess is that tens of thousands of high-quality physical qubits could suffice. When will we have it? I don’t know, but if it happens in just a few years a likely modality is Rydberg atoms in optical tweezers, assuming they continue to advance in both scale and performance.

What will we do with it? I don’t know, but as a scientist I expect we can learn valuable lessons by simulating the dynamics of many-qubit systems on megaquop machines. Will there be applications that are commercially viable as well as scientifically instructive? That I can’t promise you.

The road to fault tolerance

To proceed along the road to fault tolerance, what must we achieve? We would like to see many successive rounds of accurate error syndrome measurement such that when the syndromes are decoded the error rate per measurement cycle drops sharply as the code increases in size. Furthermore, we want to decode rapidly, as will be needed to execute universal gates on protected quantum information. Indeed, we will want the logical gates to have much higher fidelity than physical gates, and for the logical gate fidelities to improve sharply as codes increase in size. We want to do all this at an acceptable overhead cost in both the number of physical qubits and the number of physical gates. And speed matters — the time on the wall clock for executing a logical gate should be as short as possible.

A snapshot of the state of the art comes from the Google Quantum AI team. Their recently introduced Willow superconducting processor has improved transmon lifetimes, measurement errors, and leakage correction compared to its predecessor Sycamore. With it they can perform millions of rounds of surface-code error syndrome measurement with good stability, each round lasting about a microsecond. Most notably, they find that the logical error rate per measurement round improves by a factor of 2 (a factor they call Lambda) when the code distance increases from 3 to 5 and again from 5 to 7, indicating that further improvements should be achievable by scaling the device further. They performed accurate real-time decoding for the distance 3 and 5 codes. To further explore the performance of the device they also studied the repetition code, which corrects only bit flips, out to a much larger code distance. As the hardware continues to advance we hope to see larger values of Lambda for the surface code, larger codes achieving much lower error rates, and eventually not just quantum memory but also logical two-qubit gates with much improved fidelity compared to the fidelity of physical gates.
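The significance of Lambda is easy to see by extrapolation. With illustrative numbers (not Google’s measured values), a constant factor of 2 per distance-2 increment drives the logical error rate down exponentially in the code distance:

```python
# Illustrative extrapolation of the surface-code logical error rate using
# the Lambda factor described in the text: the error per measurement round
# drops by Lambda each time the code distance increases by 2.
# The distance-3 error rate below is an assumed, illustrative number.

def projected_error(eps_d3, Lambda, distance):
    """Logical error per round at odd distance d, extrapolated from d = 3."""
    steps = (distance - 3) // 2          # number of distance-2 increments
    return eps_d3 / Lambda**steps

eps_d3 = 3e-3                            # assumed error per round at d = 3
for d in (3, 5, 7, 11, 15):
    print(d, projected_error(eps_d3, Lambda=2.0, distance=d))
```

Larger Lambda means a steeper exponential, which is why hardware improvements that raise Lambda matter more than any fixed-distance error rate.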

Last year I expressed concern about the potential vulnerability of superconducting quantum processors to ionizing radiation such as cosmic ray muons. In these events, errors occur in many qubits at once, too many errors for the error-correcting code to fend off. I speculated that we might want to operate a superconducting processor deep underground to suppress the muon flux, or to use less efficient codes that protect against such error bursts.

The good news is that the Google team has demonstrated that so-called gap engineering of the qubits can reduce the frequency of such error bursts by orders of magnitude. In their studies of the repetition code they found that, in the gap-engineered Willow processor, error bursts occurred about once per hour, as opposed to once every ten seconds in their earlier hardware.  Whether suppression of error bursts via gap engineering will suffice for running deep quantum circuits in the future is not certain, but this progress is encouraging. And by the way, the origin of the error bursts seen every hour or so is not yet clearly understood, which reminds us that not only in superconducting processors but in other modalities as well we are likely to encounter mysterious and highly deleterious rare events that will need to be understood and mitigated.

Real-time decoding

Fast real-time decoding of error syndromes is important because when performing universal error-corrected computation we must frequently measure encoded blocks and then perform subsequent operations conditioned on the measurement outcomes. If it takes too long to decode the measurement outcomes, that will slow down the logical clock speed. That may be a more serious problem for superconducting circuits than for other hardware modalities where gates can be orders of magnitude slower.

For distance 5, Google achieves a latency, meaning the time from when data from the final round of syndrome measurement is received by the decoder until the decoder returns its result, of about 63 microseconds on average. In addition, it takes about another 10 microseconds for the data to be transmitted via Ethernet from the measurement device to the decoding workstation. That’s not bad, but considering that each round of syndrome measurement takes only a microsecond, faster would be preferable, and the decoding task becomes harder as the code grows in size.

Riverlane and Rigetti have demonstrated in small experiments that the decoding latency can be reduced by running the decoding algorithm on FPGAs rather than CPUs, and by integrating the decoder into the control stack to reduce communication time. Adopting such methods may become increasingly important as we scale further. Google DeepMind has shown that a decoder trained by reinforcement learning can achieve a lower logical error rate than a decoder constructed by humans, but it’s unclear whether that will work at scale because the cost of training rises steeply with code distance. Also, the Harvard / QuEra team has emphasized that performing correlated decoding across multiple code blocks can reduce the depth of fault-tolerant constructions, but this also increases the complexity of decoding, raising concern about whether such a scheme will be scalable.

Trading simplicity for performance

The Google processors use transmon qubits, as do superconducting processors from IBM and various other companies and research groups. Transmons are the simplest superconducting qubits and their quality has improved steadily; we can expect further improvement with advances in materials and fabrication. But a logical qubit with very low error rate surely will be a complicated object due to the hefty overhead cost of quantum error correction. Perhaps it is worthwhile to fashion a more complicated physical qubit if the resulting gain in performance might actually simplify the operation of a fault-tolerant quantum computer in the megaquop regime or well beyond. Several versions of this strategy are being pursued.

One approach uses cat qubits, in which the encoded 0 and 1 are coherent states of a microwave resonator, well separated in phase space, such that the noise afflicting the qubit is highly biased. Bit flips are exponentially suppressed as the mean photon number of the resonator increases, while the error rate for phase flips induced by loss from the resonator increases only linearly with the photon number. This year the AWS team built a repetition code to correct phase errors for cat qubits that are passively protected against bit flips, and showed that increasing the distance of the repetition code from 3 to 5 slightly improves the logical error rate. (See also here.)
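Schematically, with mean photon number $\bar{n} = |\alpha|^2$ and single-photon loss rate $\kappa$, the noise bias described above scales as (exact prefactors depend on the implementation):

```latex
p_{\text{bit flip}} \;\propto\; e^{-2|\alpha|^2},
\qquad
p_{\text{phase flip}} \;\propto\; \kappa\, |\alpha|^2
```

Pumping up the photon number thus buys exponential bit-flip protection at only a linear cost in phase flips, which a repetition code can then mop up.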

Another helpful insight is that error correction can be more effective if we know when and where the errors occur in a quantum circuit. We can apply this idea using a dual-rail encoding of the qubits. With two microwave resonators, for example, we can encode a qubit by placing a single photon in either the first resonator (the 10 state) or the second resonator (the 01 state). The dominant error is loss of a photon, causing either the 01 or 10 state to decay to 00. One can check whether the state is 00, detecting whether the error occurred without disturbing a coherent superposition of 01 and 10. In a device built by the Yale / QCI team, loss errors are detected over 99% of the time, and undetected errors are relatively rare. Similar results were reported by the AWS team, encoding a dual-rail qubit in a pair of transmons instead of resonators.

Another idea is encoding a finite-dimensional quantum system in a state of a resonator that is highly squeezed in two complementary quadratures, a so-called GKP encoding. This year the Yale group used this scheme to encode 3-dimensional and 4-dimensional systems with decay rate better by a factor of 1.8 than the rate of photon loss from the resonator. (See also here.)

A fluxonium qubit is more complicated than a transmon in that it requires a large inductance which is achieved with an array of Josephson junctions, but it has the advantage of larger anharmonicity, which has enabled two-qubit gates with better than three 9s of fidelity, as the MIT team has shown.

Whether this trading of simplicity for performance in superconducting qubits will ultimately be advantageous for scaling to large systems is still unclear. But it’s appropriate to explore such alternatives which might pay off in the long run.

Error correction with atomic qubits

We have also seen progress on error correction this year with atomic qubits, both in ion traps and optical tweezer arrays. In these platforms qubits are movable, making it possible to apply two-qubit gates to any pair of qubits in the device. This opens the opportunity to use more efficient coding schemes, and in fact logical circuits are now being executed on these platforms. The Harvard / MIT / QuEra team sampled circuits with 48 logical qubits on a 280-qubit device; that big news broke during last year’s Q2B conference. Atom Computing and Microsoft ran an algorithm with 28 logical qubits on a 256-qubit device. Quantinuum and Microsoft prepared entangled states of 12 logical qubits on a 56-qubit device.

However, so far in these devices it has not been possible to perform more than a few rounds of error syndrome measurement, and the results rely on error detection and postselection. That is, circuit runs are discarded when errors are detected, a scheme that won’t scale to large circuits. Efforts to address these drawbacks are in progress. Another concern is that the atomic movement slows the logical cycle time. If all-to-all coupling enabled by atomic movement is to be used in much deeper circuits, it will be important to speed up the movement quite a lot.

Toward the megaquop machine

How can we reach the megaquop regime? More efficient quantum codes like those recently discovered by the IBM team might help. These require geometrically nonlocal connectivity and are therefore better suited for Rydberg optical tweezer arrays than superconducting processors, at least for now. Error mitigation strategies tailored for logical circuits, like those pursued by Qedma, might help by boosting the circuit volume that can be simulated beyond what one would naively expect based on the logical error rate. Recent advances from the Google team, which reduce the overhead cost of logical gates, might also be helpful.

What about applications? Impactful applications to chemistry typically require rather deep circuits so are likely to be out of reach for a while yet, but applications to materials science provide a more tempting target in the near term. Taking advantage of symmetries and various circuit optimizations like the ones Phasecraft has achieved, we might start seeing informative results in the megaquop regime or only slightly beyond.

As a scientist, I’m intrigued by what we might conceivably learn about quantum dynamics far from equilibrium by doing simulations on megaquop machines, particularly in two dimensions. But when seeking quantum advantage in that arena we should bear in mind that classical methods for such simulations are also advancing impressively, including in the past year (for example, here and here).

To summarize, advances in hardware, control, algorithms, error correction, error mitigation, etc. are bringing us closer to megaquop machines, raising a compelling question for our community: What are the potential uses for these machines? Progress will require innovation at all levels of the stack.  The capabilities of early fault-tolerant quantum processors will guide application development, and our vision of potential applications will guide technological progress. Advances in both basic science and systems engineering are needed. These are still the early days of quantum computing technology, but our experience with megaquop machines will guide the way to gigaquops, teraquops, and beyond and hence to widely impactful quantum value that benefits the world.

I thank Dorit Aharonov, Sergio Boixo, Earl Campbell, Roland Farrell, Ashley Montanaro, Mike Newman, Will Oliver, Chris Pattison, Rob Schoelkopf, and Qian Xu for helpful comments.

*The acronym FASQ was suggested to me by Andrew Landahl.

The megaquop machine (image generated by ChatGPT).

Sculpting quantum steampunk

In 2020, many of us logged experiences that we’d never anticipated. I wrote a nonfiction book and got married outside the Harvard Faculty Club (because nobody was around to shoo us away). Equally unexpectedly, I received an invitation to collaborate with a professional artist. One Bruce Rosenbaum emailed me out of the blue:

I watched your video on Quantum Steampunk: Quantum Information Meets Thermodynamics. [ . . . ] I’d like to explore collaborating with you on bringing together the fusion of Quantum physics and Thermodynamics into the real world with functional Steampunk art and design.

This Bruce Rosenbaum, I reasoned, had probably seen some colloquium of mine that a university had recorded and posted online. I’d presented a few departmental talks about how quantum thermodynamics is the real-world incarnation of steampunk.

I looked Bruce up online. Wired Magazine had called the Massachusetts native “the steampunk evangelist,” and The Wall Street Journal had called him “the steampunk guru.” He created sculptures for museums and hotels, in addition to running workshops that riffed on the acronym STEAM (science, technology, engineering, art, and mathematics). MTV’s Extreme Cribs had spotlighted his renovation of a Victorian-era church into a home and workshop.

The Rosenbaums’ kitchen (photo from here)

All right, I replied, I’m game. But research fills my work week, so can you talk at an unusual time?

We Zoomed on a Saturday afternoon. Bruce Zooms from precisely the room that you’d hope to find a steampunk artist in: a workshop filled with brass bits and bobs spread across antique-looking furniture. Something intricate is usually spinning atop a table behind him. And no, none of it belongs to a virtual background. Far from an overwrought inventor, though, Bruce exudes a vibe as casual as the T-shirt he often wears—when not interviewing in costume. A Boston-area accent completes the feeling of chatting with a neighbor.

Bruce proposed building a quantum-steampunk sculpture. I’d never dreamed of the prospect, but it sounded like an adventure, so I agreed. We settled on a sculpture centered on a quantum engine. Classical engines inspired the development of thermodynamics around the time of the Industrial Revolution. One of the simplest engines—the heat engine—interacts with two environments, or reservoirs: one cold and one hot. Heat—the energy of random atomic motion—flows from the hot to the cold. The engine siphons off part of the heat, converting it into work—coordinated energy that can, say, turn a turbine. 
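As a back-of-the-envelope aside (my own illustration, not part of the sculpture’s design): the reservoir temperatures alone bound how much of the absorbed heat any such engine can convert into work, via the Carnot efficiency η = 1 − T_cold/T_hot. A minimal sketch:

```python
def carnot_efficiency(t_cold: float, t_hot: float) -> float:
    """Upper bound on a heat engine's efficiency, set solely by the
    cold- and hot-reservoir temperatures (in kelvins)."""
    if not 0 < t_cold < t_hot:
        raise ValueError("require 0 < t_cold < t_hot")
    return 1.0 - t_cold / t_hot

# An engine run between 300 K (roughly room temperature) and 600 K
# can convert at most half the heat it absorbs into work.
print(carnot_efficiency(300.0, 600.0))  # → 0.5
```

Real engines, classical or quantum, fall short of this bound; the interesting question is by how much, and why.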

Can a quantum system convert random heat into useful work? Yes, quantum thermodynamicists have shown. During the 1950s and 1960s, Bell Labs scientists designed a quantum engine formed from one atom. Since then, physicists have co-opted superconducting qubits, trapped ions, and more into quantum engines. Entanglement can enhance quantum engines, which can both suffer and benefit from quantum coherences (wave-like properties, in the spirit of wave–particle duality). Experimentalists have realized quantum engines in labs. So Bruce and I placed (an artistic depiction of) a quantum engine at our sculpture’s center. The engine consists of a trapped ion—a specialty of Maryland, where I accepted a permanent position that spring.

Bruce engaged an illustrator, Jim Su, to draw the sculpture. We iterated through draft after draft, altering shapes and fixing scientific content. Versions from the cutting-room floor now adorn the Maryland Quantum-Thermodynamics Hub’s website.

Designing the sculpture was a lark. Finding funding to build it has required more grit. During the process, our team grew to include scientific-computing expert Alfredo Nava-Tudelo, physicist Bill Phillips, senior faculty specialist Daniel Serrano, and Quantum Frontiers gatekeeper Spiros Michalakis. We secured a grant from the University of Maryland’s Arts for All program this spring. The program is promoting quantum-inspired art this year, in honor of the UN’s designation of 2025 as the International Year of Quantum Science and Technology.

Through the end of 2024, we’re building a tabletop version of the sculpture. We were expecting a 3D-printed version to consume our modest grant. But quantum steampunk captured the imagination of Empire Group, the design-engineering company hired by Bruce to create and deploy technical drawings. Empire now plans to include metal and moving parts in the sculpture.

The Quantum-Steampunk Engine sculpture (drawing by Jim Su)

Empire will create CAD (computer-aided–design) drawings this November, in dialogue with the scientific team and Bruce. The company will fabricate the sculpture in December. The scientists will create educational materials that explain the thermodynamics and quantum physics represented in the sculpture. Starting in 2025, we’ll exhibit the sculpture everywhere possible. Plans include the American Physical Society’s Global Physics Summit (March Meeting), the quantum-steampunk creative-writing course I’m co-teaching next spring, and the Quantum World Congress. Bruce will incorporate the sculpture into his STEAMpunk workshops. Drop us a line if you want the Quantum-Steampunk Engine sculpture at an event as a centerpiece or teaching tool. And stay tuned for updates on the sculpture’s creation process and outreach journey.

Our team’s schemes extend beyond the tabletop sculpture: we aim to build an 8’-by-8’-by-8’ version. The full shebang will contain period antiques, lasers, touchscreens, and moving and interactive parts. We hope that a company, university, or individual will request the full-size version upon seeing its potential in the tabletop.

A sculpture, built by ModVic for a corporate office, of the scale we have in mind. The description on Bruce’s site reads, “A 300 lb. Clipper of the Clouds sculpture inspired by a Jules Verne story. The piece suspends over the corporate lobby.”

After all, what are steampunk and science for, if not dreaming?

Now published: Building Quantum Computers

Building Quantum Computers: A Practical Introduction by Shayan Majidy, Christopher Wilson, and Raymond Laflamme has been published by Cambridge University Press and will be released in the US on September 30. The authors invited me to write a Foreword for the book, which I was happy to do. The publisher kindly granted permission for me to post the Foreword here on Quantum Frontiers.

Foreword

The principles of quantum mechanics, which as far as we know govern all natural phenomena, were discovered in 1925. For 99 years we have built on that achievement to reach a comprehensive understanding of much of the physical world, from molecules to materials to elementary particles and much more. No comparably revolutionary advance in fundamental science has occurred since 1925. But a new revolution is in the offing.

Up until now, most of what we have learned about the quantum world has resulted from considering the behavior of individual particles — for example a single electron propagating as a wave through a crystal, unfazed by barriers that seem to stand in its way. Understanding that single-particle physics has enabled us to explore nature in unprecedented ways, and to build information technologies that have profoundly transformed our lives.

What’s happening now is we’re learning how to instruct particles to evolve in coordinated ways that can’t be accurately described in terms of the behavior of one particle at a time. The particles, as we like to say, can become entangled. Many particles, like electrons or photons or atoms, when highly entangled, exhibit an extraordinary complexity that we can’t capture with the most powerful of today’s supercomputers, or with our current theories of how nature works. That opens extraordinary opportunities for new discoveries and new applications.

Most temptingly, we anticipate that by building and operating large-scale quantum computers, which control the evolution of very complex entangled quantum systems, we will be able to solve some computational problems that are far beyond the reach of today’s digital computers. The concept of a quantum computer was proposed over 40 years ago, and the task of building quantum computing hardware has been pursued in earnest since the 1990s. After decades of steady progress, quantum information processors with hundreds of qubits have become feasible and are scientifically valuable. But we may need quantum processors with millions of qubits to realize practical applications of broad interest. There is still a long way to go.

Why is it taking so long? A conventional computer processes bits, where each bit could be, say, a switch which is either on or off. To build highly complex entangled quantum states, the fundamental information-carrying component of a quantum computer must be what we call a “qubit” rather than a bit. The trouble is that qubits are much more fragile than bits — when a qubit interacts with its environment, the information it carries is irreversibly damaged, a process called decoherence. To perform reliable logical operations on qubits, we need to prevent decoherence by keeping the qubits nearly perfectly isolated from their environment. That’s very hard to do. And because a qubit, unlike a bit, can change continuously, precisely controlling a qubit is a further challenge, even when decoherence is in check.

While theorists may find it convenient to regard a qubit (or a bit) as an abstract object, in an actual processor a qubit needs to be encoded in a particular physical system. There are many options. It might, for example, be encoded in a single atom which can be in either one of two long-lived internal states. Or the spin of a single atomic nucleus or electron which points either up or down along some axis. Or a single photon that occupies either one of two possible optical modes. These are all remarkable encodings, because the qubit resides in a very simple single quantum system, yet, thanks to technical advances over several decades, we have learned to control such qubits reasonably well. Alternatively, the qubit could be encoded in a more complex system, like a circuit conducting electricity without resistance at very low temperature. This is also remarkable, because although the qubit involves the collective motion of billions of pairs of electrons, we have learned to make it behave as though it were a single atom.

To run a quantum computer, we need to manipulate individual qubits and perform entangling operations on pairs of qubits. Once we can perform such single-qubit and two-qubit “quantum gates” with sufficient accuracy, and measure and initialize the qubits as well, then in principle we can perform any conceivable quantum computation by assembling sufficiently many qubits and executing sufficiently many gates.
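To make the gate picture concrete, here is a minimal sketch (an illustration of my own, not drawn from the book): a Hadamard gate on one qubit, followed by a CNOT on the pair, turns the unentangled state |00⟩ into a maximally entangled Bell state.

```python
import numpy as np

# Single-qubit Hadamard and two-qubit CNOT gates as matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start both qubits in |0>; apply H to the first, then CNOT.
state = np.kron([1, 0], [1, 0])         # |00>
state = np.kron(H, np.eye(2)) @ state   # (|00> + |10>)/sqrt(2)
state = CNOT @ state                    # Bell state (|00> + |11>)/sqrt(2)
print(state)  # ≈ [0.707, 0, 0, 0.707]
```

The resulting state cannot be written as one qubit’s state times the other’s—exactly the entanglement that, repeated across many qubits, makes quantum computation hard to simulate classically.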

It’s a daunting engineering challenge to build and operate a quantum system of sufficient complexity to solve very hard computation problems. That systems engineering task, and the potential practical applications of such a machine, are both beyond the scope of Building Quantum Computers. Instead the focus is on the computer’s elementary constituents for four different qubit modalities: nuclear spins, photons, trapped atomic ions, and superconducting circuits. Each type of qubit has its own fascinating story, told here expertly and with admirable clarity.

For each modality a crucial question must be addressed: how to produce well-controlled entangling interactions between two qubits. Answers vary. Spins have interactions that are always on, and can be “refocused” by applying suitable pulses. Photons hardly interact with one another at all, but such interactions can be mocked up using appropriate measurements. Because of their Coulomb repulsion, trapped ions have shared normal modes of vibration that can be manipulated to generate entanglement. Couplings and frequencies of superconducting qubits can be tuned to turn interactions on and off. The physics underlying each scheme is instructive, with valuable lessons for the quantum informationists to heed.

Various proposed quantum information processing platforms have characteristic strengths and weaknesses, which are clearly delineated in this book. For now it is important to pursue a variety of hardware approaches in parallel, because we don’t know for sure which ones have the best long term prospects. Furthermore, different qubit technologies might be best suited for different applications, or a hybrid of different technologies might be the best choice in some settings. The truth is that we are still in the early stages of developing quantum computing systems, and there is plenty of potential for surprises that could dramatically alter the outlook.

Building large-scale quantum computers is a grand challenge facing 21st-century science and technology. And we’re just getting started. The qubits and quantum gates of the distant future may look very different from what is described in this book, but the authors have made wise choices in selecting material that is likely to have enduring value. Beyond that, the book is highly accessible and fun to read. As quantum technology grows ever more sophisticated, I expect the study and control of highly complex many-particle systems to become an increasingly central theme of physical science. If so, Building Quantum Computers will be treasured reading for years to come.

John Preskill
Pasadena, California

Version 1.0.0

Announcing the quantum-steampunk creative-writing course!

Why not run a quantum-steampunk creative-writing course?

Quantum steampunk, as Quantum Frontiers regulars know, is the aesthetic and spirit of a growing scientific field. Steampunk is a subgenre of science fiction. In it, futuristic technologies invade Victorian-era settings: submarines, time machines, and clockwork octopodes populate La Belle Époque, a recently liberated Haiti, and Sherlock Holmes’s London. A similar invasion characterizes my research field, quantum thermodynamics: thermodynamics is the study of heat, work, temperature, and efficiency. The Industrial Revolution spurred the theory’s development during the 1800s. The theory’s original subjects—nineteenth-century engines—were large and massive and contained enormous numbers of particles. Such engines obey the classical mechanics developed during the 1600s. Hence thermodynamics needs re-envisioning for quantum systems. To extend the theory’s laws and applications, quantum thermodynamicists use mathematical and experimental tools from quantum information science. Quantum information science is, in part, the understanding of quantum systems through how they store and process information. The toolkit is partially cutting-edge and partially futuristic, as full-scale quantum computers remain under construction. So applying quantum information to thermodynamics—quantum thermodynamics—strikes me as the real-world incarnation of steampunk.

But the thought of a quantum-steampunk creative-writing course had never occurred to me, and I hesitated over it. Quantum-steampunk blog posts, I could handle. A book, I could handle. Even a short-story contest, I’d handled. But a course? The idea yawned like the pitch-dark mouth of an unknown cavern in my imagination.

But the more I mulled over Edward Daschle’s suggestion, the more I warmed to it. Edward was completing a master’s degree in creative writing at the University of Maryland (UMD), specializing in science fiction. His mentor Emily Brandchaft Mitchell had sung his praises via email. In 2023, Emily had served as a judge for the Quantum-Steampunk Short-Story Contest. She works as a professor of English at UMD, writes fiction, and specializes in the study of genre. I reached out to her last spring about collaborating on a grant for quantum-inspired art, and she pointed to her protégé.

Who won me over. Edward and I are co-teaching “Writing Quantum Steampunk: Science-Fiction Workshop” during spring 2025.

The course will alternate between science and science fiction. Under Edward’s direction, we’ll read and discuss published fiction. We’ll also learn about what genres are and how they come to be. Students will try out writing styles by composing short stories themselves. Everyone will provide feedback about each other’s writing: what works, what’s confusing, and opportunities for improvement. 

The published fiction chosen will mirror the scientific subjects we’ll cover: quantum physics; quantum technologies; and thermodynamics, including quantum thermodynamics. I’ll lead this part of the course. The scientific studies will interleave with the story reading, writing, and workshopping. Students will learn about the science behind the science fiction while contributing to the growing subgenre of quantum steampunk.

We aim to attract students from across campus: physics, English, the Jiménez-Porter Writers’ House, computer science, mathematics, and engineering—plus any other departments whose students have curiosity and creativity to spare. The course already has four cross-listings—Arts and Humanities 270, Physics 299Q, Computer Science 298Q, and Mechanical Engineering 299Q—and will probably acquire a fifth (Chemistry 298Q). You can earn a Distributive Studies: Scholarship in Practice (DSSP) General Education requirement, and undergraduate and graduate students are welcome. QuICS—the Joint Center for Quantum Information and Computer Science, my home base—is paying Edward’s salary through a seed grant. Ross Angelella, the director of the Writers’ House, arranged logistics and doused us with enthusiasm. I’m proud of how organizations across the university are uniting to support the course.

The diversity we seek, though, poses a challenge. The course lacks prerequisites, so I’ll need to teach at a level comprehensible to the non-science students. I’d enjoy doing so, but I’m concerned about boring the science students. Ideally, the science students will help me teach, while the non-science students will challenge us with foundational questions that force us to rethink basic concepts. Also, I hope that non-science students will galvanize discussions about ethical and sociological implications of quantum technologies. But how can one ensure that conversation will flow?

This summer, Edward and I traded candidate stories for the syllabus. Based on his suggestions, I recommend touring science fiction under an expert’s guidance. I enjoyed, for a few hours each weekend, sinking into the worlds of Ted Chiang, Ursula K. Le Guin, N. K. Jemisin, Ken Liu, and others. My scientific background informed my reading more than I’d expected. Some authors, I could tell, had researched their subjects thoroughly. When they transitioned from science into fiction, I trusted and followed them. Other authors tossed jargon into their writing but evidenced a lack of deep understanding. One author nailed technical details about quantum computation, initially impressing me, but missed the big picture: his conflict hinged on a misunderstanding about entanglement. I see all these stories as affording opportunities for learning and teaching, in different ways.

Students can begin registering for “Writing Quantum Steampunk: Science-Fiction Workshop” on October 24. We can offer only 15 seats, due to Writers’ House standards, so secure yours as soon as you can. Part of me still wonders how the Hilbert space I came to be co-teaching a quantum-steampunk creative-writing course.1 But I look forward to reading with you next spring!


1A Hilbert space is a mathematical object that represents a quantum system. But you needn’t know that to succeed in the course.

Let gravity do its work

One day, early this spring, I found myself in a hotel elevator with three other people. The cohort consisted of two theoretical physicists, one computer scientist, and what appeared to be a normal person. I pressed the elevator’s 4 button, as my husband (the computer scientist) and I were staying on the hotel’s fourth floor. The button refused to light up.

“That happened last time,” the normal person remarked. He was staying on the fourth floor, too.

The other theoretical physicist pressed the 3 button.

“Should we press the 5 button,” the normal person continued, “and let gravity do its work?”

I took a moment to realize that he was suggesting we ascend to the fifth floor and then induce the elevator to fall under gravity’s influence to the fourth. We were reaching floor three, so I exchanged a “have a good evening” with the other physicist, who left. The door shut, and we began to ascend.

“As it happens,” I remarked, “he’s an expert on gravity.” The other physicist was Herman Verlinde, a professor at Princeton.

Such is a side effect of visiting the Simons Center for Geometry and Physics. The Simons Center graces the Stony Brook University campus, which was awash in daffodils and magnolia blossoms last month. The Simons Center derives its name from hedge-fund manager Jim Simons (who passed away during the writing of this article). He performed landmark physics and math research before earning his fortune on Wall Street as a quant. Simons supported his early loves by funding the Simons Center and other scientific initiatives. The center reminded me of the Perimeter Institute for Theoretical Physics, down to the café’s linen napkins, so I felt at home.

I was participating in the Simons Center workshop “Entanglement, thermalization, and holography.” It united researchers from quantum information and computation, black-hole physics and string theory, quantum thermodynamics and many-body physics, and nuclear physics. We were to share our fields’ approaches to problems centered on thermalization, entanglement, quantum simulation, and the like. I presented about the eigenstate thermalization hypothesis, which elucidates how many-particle quantum systems thermalize. The hypothesis fails, I argued, if a system’s dynamics conserve quantities (analogous to energy and particle number) that can’t be measured simultaneously. Herman Verlinde discussed the ER=EPR conjecture.

My PhD advisor, John Preskill, blogged about ER=EPR almost exactly eleven years ago. Read his blog post for a detailed introduction. Briefly, ER=EPR posits an equivalence between wormholes and entanglement. 

The ER stands for Einstein–Rosen, as in Einstein–Rosen bridge. Sean Carroll provided the punchiest explanation I’ve heard of Einstein–Rosen bridges. He served as the scientific advisor for the 2011 film Thor. Sean suggested that the film feature a wormhole, a connection between two black holes. The filmmakers replied that wormholes were passé. So Sean suggested that the film feature an Einstein–Rosen bridge. “What’s an Einstein–Rosen bridge?” the filmmakers asked. “A wormhole.” So Thor features an Einstein–Rosen bridge.

EPR stands for Einstein–Podolsky–Rosen. The three authors published a quantum paradox in 1935. Their EPR paper galvanized the community’s understanding of entanglement.

ER=EPR is a conjecture that entanglement is closely related to wormholes. As Herman said during his talk, “You probably need entanglement to realize a wormhole.” Or, more strongly: any two maximally entangled particles are connected by a wormhole. The idea crystallized in a paper by Juan Maldacena and Lenny Susskind. They drew on work by Mark Van Raamsdonk (who masterminded the workshop behind this Quantum Frontiers post) and Brian Swingle (who’s appeared in further posts).

Herman presented four pieces of evidence for the conjecture, as you can hear in the video of his talk. One piece emerges from the AdS/CFT duality, a parallel between certain space-times (called anti–de Sitter, or AdS, spaces) and quantum theories that have a certain symmetry (called conformal field theories, or CFTs). A CFT, being quantum, can contain entanglement. One entangled state is called the thermofield double. Suppose that a quantum system is in a thermofield double and you discard half the system. The remaining half looks thermal—we can attribute a temperature to it. Evidence indicates that, if a CFT has a temperature, then it parallels an AdS space that contains a black hole. So entanglement appears connected to black holes via thermality and temperature.
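As a toy illustration (my own, using a single qubit rather than a CFT): build the thermofield double |TFD⟩ ∝ Σₙ e^(−βEₙ/2) |n⟩|n⟩ across two copies of a system, discard one copy, and check that the remainder is the Gibbs (thermal) state at inverse temperature β.

```python
import numpy as np

beta = 1.0                           # inverse temperature
energies = np.array([0.0, 1.0])      # qubit energy levels

# Thermofield double: sum_n e^{-beta E_n / 2} |n>|n> / sqrt(Z).
weights = np.exp(-beta * energies / 2)
Z = np.sum(weights**2)               # partition function
tfd = np.zeros((2, 2))
for n, w in enumerate(weights):
    tfd[n, n] = w / np.sqrt(Z)       # amplitude on |n>|n>
psi = tfd.reshape(4)                 # normalized state on both halves

# Discard (trace out) the second half of the system.
rho = tfd @ tfd.T                    # reduced density matrix

# The remainder is thermal: the Gibbs state at inverse temperature beta.
gibbs = np.diag(np.exp(-beta * energies)) / Z
print(np.allclose(rho, gibbs))       # → True
```

The same calculation goes through for any Hamiltonian’s spectrum; the CFT case in the AdS/CFT argument is the field-theory version of this observation.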

Despite the evidence—and despite the eleven years since John’s publication of his blog post—ER=EPR remains a conjecture. Herman remarked, “It’s more like a slogan than anything else.” His talk’s abstract contains more hedging than a suburban yard. I appreciated the conscientiousness, a college acquaintance having once observed that I spoke carefully even over sandwiches with a friend.

A “source of uneasiness” about ER=EPR, to Herman, is measurability. We can’t check whether a quantum state is entangled via any single measurement. We have to prepare many identical copies of the state, measure the copies, and process the outcome statistics. In contrast, we seem able to conclude that a space-time is connected without measuring multiple copies of the space-time. We can check that a hotel’s first floor is connected to its fourth, for instance, by riding in an elevator once.

Or by riding an elevator to the fifth floor and descending by one story. My husband, the normal person, and I took the stairs instead of falling. The hotel fixed the elevator within a day or two, but who knows when we’ll fix on the truth value of ER=EPR?

With thanks to the conference organizers for their invitation, to the Simons Center for its hospitality, to Jim Simons for his generosity, and to the normal person for inspiration.

How I didn’t become a philosopher (but wound up presenting a named philosophy lecture anyway)

Many people ask why I became a theoretical physicist. The answer runs through philosophy—which I thought, for years, I’d left behind in college.

My formal relationship with philosophy originated with Mr. Bohrer. My high school classified him as a religion teacher, but he co-opted our junior-year religion course into a philosophy course. He introduced us to Plato’s cave, metaphysics, and the pursuit of the essence beneath the skin of appearance. The essence of reality overlaps with quantum theory and relativity, which fascinated him. Not that he understood them, he’d hasten to clarify. But he passed along that fascination to me. I’d always loved dealing in abstract ideas, so the notion of studying the nature of the universe attracted me. A friend and I joked about growing up to be philosophers and—on account of not being able to find jobs—living in cardboard boxes next to each other.

After graduating from high school, I searched for more of the same in Dartmouth College’s philosophy department. I began with two prerequisites for the philosophy major: Moral Philosophy and Informal Logic. I adored those courses, but I adored all my courses.

As a sophomore, I embarked upon an upper-level philosophy course: philosophy of mind. I was one of the course’s youngest students, but the professor assured me that I’d accumulated enough background information in science and philosophy classes. Yet he and the older students threw around technical terms, such as qualia, that I’d never heard of. Those terms resurfaced in the assigned reading, again without definitions. I struggled to follow the conversation.

Meanwhile, I’d been cycling through the sciences. I’d taken my high school’s highest-level physics course, senior year—AP Physics C: Mechanics and Electromagnetism. So, upon enrolling in college, I made the rounds of biology, chemistry, and computer science. I cycled back to physics at the beginning of sophomore year, taking Modern Physics I in parallel with Informal Logic. The physics professor, Miles Blencowe, told me, “I want to see physics in your major.” I did, too, I assured him. But I wanted to see most subjects in my major.

Miles, together with department chair Jay Lawrence, helped me incorporate multiple subjects into a physics-centric program. The major, called “Physics Modified,” stood halfway between the physics major and the create-your-own major offered at some American liberal-arts colleges. The program began with heaps of prerequisite courses across multiple departments. Then, I chose upper-level physics courses, a math course, two history courses, and a philosophy course. I could scarcely believe that I’d planted myself in a physics department; although I’d loved physics since my first course in it, I loved all subjects, and nobody in my family did anything close to physics. But my major would provide a well-rounded view of the subject.

From shortly after I declared my Physics Modified major. Photo from outside the National Academy of Sciences headquarters in Washington, DC.

The major’s philosophy course was an independent study on quantum theory. In one project, I dissected the “EPR paper” published by Einstein, Podolsky, and Rosen (EPR) in 1935. It introduced the paradox that now underlies our understanding of entanglement. But who reads the EPR paper in physics courses nowadays? I appreciated having the space to grapple with the original text. Still, I wanted to understand the paper more deeply; the philosophy course pushed me toward upper-level physics classes.

What I thought of as my last chance at philosophy evaporated during my senior spring. I wanted to apply to graduate programs soon, but I hadn’t decided which subject to pursue. The philosophy and history of physics remained on the table. A history-of-physics course, taught by cosmologist Marcelo Gleiser, settled the matter. I worked my rear off in that course, and I learned loads—but I already knew some of the material from physics courses. Moreover, I knew the material more deeply than the level at which the course covered it. I couldn’t stand the thought of understanding the rest of physics only at this surface level. So I resolved to burrow into physics in graduate school. 

Appropriately, Marcelo published a book with a philosopher (and an astrophysicist) this March.

Burrow I did: after a stint in condensed-matter research, I submerged up to my eyeballs in quantum field theory and differential geometry at the Perimeter Scholars International master’s program. My research there bridged quantum information theory and quantum foundations. I appreciated the balance of fundamental thinking and possible applications to quantum-information-processing technologies. The rigorous mathematical style (lemma-theorem-corollary-lemma-theorem-corollary) appealed to my penchant for abstract thinking. Eating lunch with the Perimeter Institute’s quantum-foundations group, I felt at home.

Craving more research at the intersection of quantum thermodynamics and information theory, I enrolled at Caltech for my PhD. As I’d scarcely believed that I’d committed myself to my college’s physics department, I could scarcely believe that I was enrolling in a tech school. I was such a child of the liberal arts! But the liberal arts include the sciences, and I ended up wrapping Caltech’s hardcore vibe around myself like a favorite denim jacket.

Caltech kindled interests in condensed matter; atomic, molecular, and optical physics; and even high-energy physics. Theorists at Caltech thought not only abstractly, but also about physical platforms; so I started to, as well. I began collaborating with experimentalists as a postdoc, and I’m now working with as many labs as I can interface with at once. I’ve collaborated on experiments performed with superconducting qubits, photons, trapped ions, and jammed grains. Developing an abstract idea, then nursing it from mathematics to reality, satisfies me. I’m even trying to redirect quantum thermodynamics from foundational insights to practical applications.

At the University of Toronto in 2022, with my experimental collaborator Batuhan Yılmaz—and a real optics table!

So I did a double-take upon receiving an invitation to present a named lecture at the University of Pittsburgh Center for Philosophy of Science. Even I, despite not being a philosopher, had heard of the cachet of Pitt’s philosophy-of-science program. Why on Earth had I received the invitation? I felt the same incredulity as when I’d handed my heart to Dartmouth’s physics department and then to a tech school. But now, instead of laughing at the image of myself as a physicist, I couldn’t see past it.

Why had I received that invitation? I did a triple-take. At Perimeter, I’d begun undertaking research on resource theories—simple, information-theoretic models for situations in which constraints restrict the operations one can perform. Hardly anyone worked on resource theories then, although they form a popular field now. Philosophers like them, and I’ve worked with multiple classes of resource theories by now.

More recently, I’ve worked with contextuality, a feature that distinguishes quantum theory from classical theories. And I’ve even coauthored papers about closed timelike curves (CTCs), hypothetical worldlines that travel backward in time. CTCs are consistent with general relativity, but we don’t know whether they exist in reality. Regardless, one can simulate CTCs, using entanglement. Collaborators and I applied CTC simulations to metrology—to protocols for measuring quantities precisely. So we kept a foot in practicality and a foot in foundations.

Perhaps the idea of presenting a named lecture on the philosophy of science wasn’t hopelessly bonkers. All right, then. I’d present it.

Presenting at the Center for Philosophy of Science

This March, I presented an ALS Lecture (an Annual Lecture Series Lecture, redundantly) entitled “Field notes on the second law of quantum thermodynamics from a quantum physicist.” Scientists formulated the second law in the early 1800s. It helps us understand why time appears to flow in only one direction. I described three enhancements of that understanding, which have grown from quantum thermodynamics and nonequilibrium statistical mechanics: resource-theory results, fluctuation theorems, and thermodynamic applications of entanglement. I also enjoyed talking with Center faculty and graduate students during the afternoon and evening. Then—being a child of the liberal arts—I stayed in Pittsburgh for half the following Saturday to visit the Carnegie Museum of Art.

With a copy of a statue of the goddess Sekhmet. She lives in the Carnegie Museum of Natural History, which shares a building with the art museum. I detoured from the art museum to see the natural-history museum’s ancient-Egypt area (as Quantum Frontiers regulars won’t be surprised to hear).

Don’t get me wrong: I’m a physicist, not a philosopher. I don’t have the training to undertake philosophy, and I have enough work to do in pursuit of my physics goals. But my high-school self would approve—that self is still me.

The quantum gold rush

Even if you don’t recognize the name, you probably recognize the saguaro cactus. It’s the archetype of the cactus, a column from which protrude arms bent at right angles like elbows. As my husband pointed out, the cactus emoji is a saguaro: 🌵. In Tucson, Arizona, even the airport has a saguaro crop sufficient for staging a Western short film. I didn’t have a film to shoot, but the garden set the stage for another adventure: the ITAMP winter school on quantum thermodynamics.

Tucson airport

ITAMP is the Institute for Theoretical Atomic, Molecular, and Optical Physics (the Optical is silent). Harvard University and the Smithsonian Institution share ITAMP, where I worked as a postdoc. ITAMP hosted the first quantum-thermodynamics conference to take place on US soil, in 2017. Also, ITAMP hosts a winter school in Arizona every February. (If you lived in the Boston area, you might want to escape to the southwest then, too.) The winter school’s topic varies from year to year.

How about a winter school on quantum thermodynamics? ITAMP’s director, Hossein Sadeghpour, asked me when I visited Cambridge, Massachusetts last spring.

Let’s do it, I said. 

Lecturers came from near and far. Kanu Sinha, of the University of Arizona, spoke about how electric charges fluctuate in the quantum vacuum. Fluctuations feature also in extensions of the second law of thermodynamics, which helps explain why time flows in only one direction. Gabriel Landi, from the University of Rochester, lectured about these fluctuation relations. ITAMP Postdoctoral Fellow Ceren Dag explained why many-particle quantum systems register time’s arrow. Ferdinand Schmidt-Kaler described the many-particle quantum systems—the trapped ions—in his lab at the University of Mainz.

Ronnie Kosloff, of Hebrew University in Jerusalem, lectured about quantum engines. Nelly Ng, an Assistant Professor at Nanyang Technological University, has featured on Quantum Frontiers at least three times. She described resource theories—information-theoretic models—for thermodynamics. Information and energy both serve as resources in thermodynamics and computation, I explained in my lectures.

The 2024 ITAMP winter school

The winter school took place at the conference center adjacent to Biosphere 2. Biosphere 2 is an enclosure that contains several miniature climate zones, including a coastal fog desert, a rainforest, and an ocean. You might have heard of Biosphere 2 due to two experiments staged there during the 1990s: in each experiment, a group of people was sealed in the enclosure. The experimentalists harvested their own food and weren’t supposed to receive any matter from outside. The first experiment lasted for two years. The group, though, ran out of oxygen, which a support crew pumped in. Research at Biosphere 2 contributes to our understanding of ecosystems and space colonization.

Fascinating as the landscape inside Biosphere 2 is, so is the landscape outside. The winter school included an afternoon hike, and my husband and I explored the territory around the enclosure.

Did you see any snakes? my best friend asked after I returned home.

No, I said. But we were chased by a vicious beast. 

On our first afternoon, my husband and I followed an overgrown path away from the biosphere to an almost deserted-looking cluster of buildings. We eventually encountered what looked like a warehouse from which noises were emanating. Outside hung a sign with which I resonated.

Scientists, I thought. Indeed, a researcher emerged from the warehouse and described his work to us. His group was preparing to seal off a building where they were simulating a Martian environment. He also warned us about the territory we were about to enter, especially the creature that roosted there. We were too curious to retreat, though, so we set off into a ghost town.

At least, that’s what the other winter-school participants called the area, later in the week—a ghost town. My husband and I had already surveyed the administrative offices, conference center, and other buildings used by biosphere personnel today. Personnel in the 1980s used a different set of buildings. I don’t know why one site gave way to the other. But the old buildings survive—as what passes for ancient ruins to many Americans. 

Weeds have grown up in the cracks in an old parking lot’s tarmac. A sign outside one door says, “Classroom”; below it is a sign that must not have been correct in decades: “Class in progress.” Through the glass doors of the old visitors’ center, we glimpsed cushioned benches and what appeared to be a diorama exhibit; outside, feathers and bird droppings covered the ground. I searched for a tumbleweed emoji, to illustrate the atmosphere, but found only a tumbler one: 🥃.

After exploring, my husband and I rested in the shade of an empty building, drank some of the water we’d brought, and turned around. We began retracing our steps past the defunct visitors’ center. Suddenly, a monstrous Presence loomed on our right. 

I can’t tell you how large it was; I only glimpsed it before turning and firmly not running away. But the Presence loomed. And it confirmed what I’d guessed upon finding the feathers and droppings earlier: the old visitors’ center now served as the Lair of the Beast.

The Mars researcher had warned us about the aggressive male turkey who ruled the ghost town. The turkey, the researcher had said, hated men—especially men wearing blue. My husband, naturally, was wearing a blue shirt. You might be able to outrun him, the researcher added pensively.

My husband zipped up his black jacket over the blue shirt. I advised him to walk confidently and not too quickly. Hikes in bear country, as well as summers at Busch Gardens Zoo Camp, gave me the impression that we mustn’t run; the turkey would probably chase us, get riled up, and excite himself to violence. So we walked, and the monstrous turkey escorted us. For surprisingly and frighteningly many minutes. 

The turkey kept scolding us in monosyllabic squawks, which sounded increasingly close to the back of my head. I didn’t turn around to look, but he sounded inches away. I occasionally responded in the soothing voice I was taught to use on horses. But my husband and I marched increasingly quickly.

We left the old visitors’ center, curved around, and climbed most of a hill before ceasing to threaten the turkey—or before he ceased to threaten us. He squawked a final warning and fell back. My husband and I found ourselves amid the guest houses of workshops past, shaky but unmolested. Not that the turkey wreaks much violence, according to the Mars researcher: at most, he beats his wings against people and scratches up their cars (especially blue ones). But we were relieved to return to civilization.

Afternoon hike at Catalina State Park, a drive away from Biosphere 2. (Yes, that’s a KITP hat.)

The ITAMP winter school reminded me of Roughing It, a Mark Twain book I finished this year. Twain chronicled the adventures he’d experienced out West during the 1860s. The Gold Rush, he wrote, attracted the top young men of all nations. The quantum-technologies gold rush has been attracting the top young people of all nations, and the winter school evidenced their eagerness. Yet the winter school also evidenced how many women have risen to the top: 10 of the 24 registrants were women, as were four of the seven lecturers.1 

The winter-school participants in the shuttle I rode from the Tucson airport to Biosphere 2

We’ll see to what extent the quantum-technologies gold rush plays out like Mark Twain’s. Ours at least involves a ghost town and ferocious southwestern critters.

1For reference, when I applied to graduate programs, I was told that approximately 20% of physics PhD students nationwide were women. The percentage of women drops as one progresses up the academic chain to postdocs and then to faculty members. And primarily PhD students and postdocs registered for the winter school.

The rain in Portugal

My husband taught me how to pronounce the name of the city where I’d be presenting a talk late last July: Aveiro, Portugal. Having studied Spanish, I pronounced the name as Ah-VEH-roh, with a v partway to a hard b. But my husband had studied Portuguese, so he recommended Ah-VAI-roo.

His accuracy impressed me when I heard the name pronounced by the organizer of the conference I was participating in—Theory of Quantum Computation, or TQC. Lídia del Rio grew up in Portugal and studied at the University of Aveiro, so I bow to her in matters of Portuguese pronunciation. I bow to her also for organizing one of the world’s largest annual quantum-computation conferences (with substantial help—fellow quantum physicist Nuriya Nurgalieva shared the burden). Moreover, Lídia cofounded Quantum, a journal that’s risen from a Gedankenexperiment to a go-to venue in six years. So she gives the impression of being able to manage anything.

Aveiro architecture

Watching Lídia open TQC gave me pause. I met her in 2013, the summer before beginning my PhD at Caltech. She was pursuing her PhD at ETH Zürich, which I was visiting. Lídia took me dancing at an Argentine-tango studio one evening. Now, she’d invited me to speak at an international conference that she was coordinating.

Lídia and me in Zürich as PhD students
Lídia opening TQC

Not only Lídia gave me pause; so did the three other invited speakers. I’d met every one of them when each of us was a grad student or a postdoc.

Richard Küng described classical shadows, a technique for extracting information about quantum states via measurements. Suppose we wish to infer diverse properties of a quantum state \rho (diverse observables’ expectation values). We have to measure many copies of \rho—some number n of copies. The community expected n to grow exponentially with the system’s size—for instance, with the number of qubits in a quantum computer’s register. We can get away with far fewer copies, Richard and collaborators showed, by randomizing our measurements.
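To make the sample-efficiency idea concrete, here’s a toy single-qubit sketch in the spirit of classical shadows—my own illustration, not Richard’s actual protocol. Each shot, we measure the state |+⟩ in a uniformly random Pauli basis; reweighting the outcomes yields an unbiased estimate of any single-qubit Pauli expectation value, without committing in advance to which observable we’ll ask about.

```python
import random

def measure_plus_state(basis):
    """Simulate measuring |+> = (|0> + |1>)/sqrt(2) in a Pauli eigenbasis.
    Born-rule probabilities: X gives +1 always; Y and Z give +/-1 with
    probability 1/2 each. Returns the measured eigenvalue, +1 or -1."""
    if basis == "X":
        return 1
    return 1 if random.random() < 0.5 else -1

def shadow_estimate(observable, n_shots):
    """Estimate <observable> of |+> via randomized Pauli measurements.
    When the random basis matches the observable (probability 1/3), the
    unbiased single-qubit estimator is 3 * outcome; otherwise it is 0."""
    total = 0.0
    for _ in range(n_shots):
        basis = random.choice(["X", "Y", "Z"])
        outcome = measure_plus_state(basis)
        if basis == observable:
            total += 3 * outcome
    return total / n_shots

random.seed(0)
print(shadow_estimate("X", 30000))  # close to the true value <X> = 1
print(shadow_estimate("Y", 30000))  # close to the true value <Y> = 0
```

The same batch of randomized measurements serves for every Pauli observable we later care about; that reuse, scaled up to many qubits, is where the surprising savings in trials comes from.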

Richard postdocked at Caltech while I was a grad student there. Two properties of his stand out in my memory: his describing, during group meetings, the math he’d been exploring and the Austrian accent in which he described that math.

Did this restaurant’s owners realize that quantum physicists were descending on their city? I have no idea.

Also while I was a grad student, Daniel Stilck França visited Caltech. Daniel’s TQC talk conveyed skepticism about whether near-term quantum computers can beat classical computers in optimization problems. Near-term quantum computers are NISQ (noisy, intermediate-scale quantum) devices. Daniel studied how noise (particularly, local depolarizing noise) propagates through NISQ circuits. Imagine a quantum computer suffering from a 1% error rate. The quantum computer loses its advantage over classical competitors after 10 layers of gates, Daniel concluded. Nor does he expect error mitigation—a bandaid en route to the sutures of quantum error correction—to help much.
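A back-of-the-envelope calculation—my own toy heuristic, not Daniel’s actual analysis—suggests why a handful of layers suffices to wash out the signal: if each noisy gate application retains roughly a fraction 1 − p of the signal, a depth-D circuit on n qubits retains about (1 − p)^(nD).

```python
# Toy heuristic for signal decay under local depolarizing noise:
# each noisy gate keeps roughly a fraction (1 - p) of the signal,
# so a depth-D circuit on n qubits keeps about (1 - p)**(n * D).
p = 0.01       # 1% error per gate
n_qubits = 50  # a hypothetical register size, for illustration

for depth in (1, 5, 10, 20):
    signal = (1 - p) ** (n_qubits * depth)
    print(f"depth {depth:2d}: surviving signal ~ {signal:.4f}")
```

At depth 10 on 50 qubits, under 1% of the signal survives in this crude model—consistent in spirit with Daniel’s pessimism about deep NISQ circuits.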

I’d coauthored a paper with the fourth invited speaker, Adam Bene Watts. He was a PhD student at MIT, and I was a postdoc. At the time, he resembled the 20th-century entanglement guru John Bell. Adam still resembles Bell, but he’s moved to Canada.

Adam speaking at TQC
From a 2021 Quantum Frontiers post of mine. I was tickled to see that TQC’s organizers used the photo from my 2021 post as Adam’s speaker photo.

Adam distinguished what we can compute using simple quantum circuits but not using simple classical ones. His results fall under the heading of complexity theory, about which one can rarely prove anything. Complexity theorists cling to their jobs by assuming conjectures widely expected to be true. Atop the assumptions, or conditions, they construct “conditional” proofs. Adam proved unconditional claims in complexity theory, thanks to the simplicity of the circuits he compared.

In my estimation, the talks conveyed cautious optimism: according to Adam, we can prove modest claims unconditionally in complexity theory. According to Richard, we can spare ourselves trials while measuring certain properties of quantum systems. Even Daniel’s talk inspired more optimism than he intended: a few years ago, the community couldn’t predict how noisy short-depth quantum circuits could perform. So his defeatism, rooted in evidence, marks an advance.

Aveiro nurtures optimism, I expect most visitors would agree. Sunshine drenches the city, and the canals sparkle—literally sparkle, as though devised by Elsa at a higher temperature than usual. Fresh fruit seems to wend its way into every meal.1 Art nouveau flowers scale the architecture, and fanciful designs pattern the tiled sidewalks.

What’s more, quantum information theorists of my generation were making good. Three riveted me in their talks, and another co-orchestrated one of the world’s largest quantum-computation gatherings. To think that she’d taken me dancing years before ascending to the global stage.

My husband and I made do, during our visit, by cobbling together our Spanish, his Portuguese, and occasional English. Could I hold a conversation with the Portuguese I gleaned? As adroitly as a NISQ circuit could beat a classical computer. But perhaps we’ll return to Portugal, and experimentalists are doubling down on quantum error correction. I remain cautiously optimistic.

1As do eggs, I was intrigued to discover. Enjoyed a hardboiled egg at breakfast? Have a fried egg on your hamburger at lunch. And another on your steak at dinner. And candied egg yolks for dessert.

This article takes its title from a book by former US Poet Laureate Billy Collins. The title alludes to a song in the musical My Fair Lady, “The Rain in Spain.” The song has grown so famous that I don’t think twice upon hearing the name. “The rain in Portugal” did lead me to think twice—and so did TQC.

With thanks to Lídia and Nuriya for their hospitality. You can submit to TQC2024 here.