What distinguishes quantum thermodynamics from quantum statistical mechanics?

Yoram Alhassid asked the question at the end of my Yale Quantum Institute colloquium last February. I knew two facts about Yoram: (1) He belongs to Yale’s theoretical-physics faculty. (2) His PhD thesis’s title—“On the Information Theoretic Approach to Nuclear Reactions”—ranks among my three favorites.1 

Over the past few months, I’ve grown to know Yoram better. He had reason to ask about quantum statistical mechanics, because his research stands up to its ears in the field. If forced to synopsize quantum statistical mechanics in five words, I’d say, “study of many-particle quantum systems.” Examples include gases of ultracold atoms. If given another five words, I’d add, “Calculate and use partition functions.” A partition function is a measure of the number of states, or configurations, accessible to the system. Calculate a system’s partition function, and you can calculate the system’s average energy, the average number of particles in the system, how the system responds to magnetic fields, etc.
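If you like seeing such claims in the flesh, here's a minimal sketch in Python; the energy levels, temperature, and units are invented purely for illustration:

```python
import numpy as np

k_B = 1.0                              # Boltzmann's constant (natural units; an assumption)
T = 2.0                                # temperature, arbitrary illustrative value
energies = np.array([0.0, 1.0, 3.0])  # hypothetical energy levels of a toy system

# Partition function: the sum of Boltzmann weights over accessible states.
weights = np.exp(-energies / (k_B * T))
Z = weights.sum()

# From Z follow the state probabilities and the system's average energy.
probs = weights / Z
avg_energy = (probs * energies).sum()
print(Z, probs, avg_energy)
```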

Line in the sand

My colloquium concerned quantum thermodynamics, which I’ve blogged about many times. So I should have been able to distinguish quantum thermodynamics from its neighbors. But the answer I gave Yoram didn’t satisfy me. I mulled over the exchange for a few weeks, then emailed Yoram a 502-word essay. The exercise grew my appreciation for the question and my understanding of my field. 

An adaptation of the email appears below. The adaptation should suit readers who’ve majored in physics, but don’t worry if you haven’t. Bits of what distinguishes quantum thermodynamics from quantum statistical mechanics should come across to everyone—as should, I hope, the value of question-and-answer sessions:

One distinction is a return to the operational approach of 19th-century thermodynamics. Thermodynamicists such as Sadi Carnot wanted to know how effectively engines could operate. Their practical questions led to fundamental insights, such as the Carnot bound on an engine’s efficiency. Similarly, quantum thermodynamicists often ask, “How can this state serve as a resource in thermodynamic tasks?” This approach helps us identify what distinguishes quantum theory from classical mechanics.

For example, quantum thermodynamicists found an advantage in charging batteries via nonlocal operations. Another example is the “MBL-mobile” that I designed with collaborators. Many-body localization (MBL), we found, can enhance an engine’s reliability and scalability. 

Asking, “How can this state serve as a resource?” leads quantum thermodynamicists to design quantum engines, ratchets, batteries, etc. We analyze how these devices can outperform classical analogues, identifying which aspects of quantum theory power the outperformance. This question and these tasks contrast with the questions and tasks of many non-quantum-thermodynamicists who use statistical mechanics. They often calculate response functions and (e.g., ground-state) properties of Hamiltonians.

These goals of characterizing what nonclassicality is and what it can achieve in thermodynamic contexts resemble upshots of quantum computing and cryptography. As a 21st-century quantum information scientist, I understand what makes quantum theory quantum partially by understanding which problems quantum computers can solve efficiently and classical computers can’t. Similarly, I understand what makes quantum theory quantum partially by understanding how much more work you can extract from a singlet \frac{1}{ \sqrt{2} } ( | 0 1 \rangle - |1 0 \rangle ) (a maximally entangled state of two qubits) than from a product state in which the reduced states have the same forms as in the singlet, \frac{1}{2} ( | 0 \rangle \langle 0 | + | 1 \rangle \langle 1 | ).
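You can check the claim about the reduced states numerically. Here's a quick Python sketch; the partial-trace helper is my own, written just for this check:

```python
import numpy as np

# The singlet (1/sqrt(2)) (|01> - |10>) as a vector in C^4.
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)
rho = np.outer(singlet, singlet.conj())

def reduced_state(rho_2q):
    """Trace out the second qubit of a two-qubit density matrix."""
    return rho_2q.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

# Prints (1/2)(|0><0| + |1><1|): each half of the singlet looks maximally mixed.
print(reduced_state(rho))
```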

As quantum thermodynamics shares its operational approach with quantum information theory, quantum thermodynamicists use mathematical tools developed in quantum information theory. An example consists of generalized entropies. Entropies quantify the optimal efficiency with which we can perform information-processing and thermodynamic tasks, such as data compression and work extraction.

Most statistical-mechanics researchers use just the Shannon and von Neumann entropies, H_{\rm Sh} and H_{\rm vN}, and perhaps the occasional relative entropy. These entropies quantify optimal efficiencies in large-system limits, e.g., as the number of messages compressed approaches infinity and in the thermodynamic limit.

Other entropic quantities have been defined and explored over the past two decades, in quantum and classical information theory. These entropies quantify the optimal efficiencies with which tasks can be performed (i) if the number of systems processed or the number of trials is arbitrary, (ii) if the systems processed share correlations, (iii) in the presence of “quantum side information” (if the system being used as a resource is entangled with another system, to which an agent has access), or (iv) if you can tolerate some probability \varepsilon that you fail to accomplish your task. Instead of limiting ourselves to H_{\rm Sh} and H_{\rm vN}, we use also “\varepsilon-smoothed entropies,” Rényi divergences, hypothesis-testing entropies, conditional entropies, etc.
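As a taste of the zoo, here's a toy Python sketch of the Rényi entropy of a classical probability distribution, one of the simplest generalizations; it tends to the Shannon entropy as the order \alpha approaches 1 (the distribution below is invented for illustration):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Order-alpha Rényi entropy, in bits, of a probability vector p (alpha != 1)."""
    p = np.asarray(p, dtype=float)
    return np.log2((p ** alpha).sum()) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]                # toy distribution; Shannon entropy = 1.5 bits
for alpha in [0.5, 0.99, 2.0]:       # alpha near 1 approaches the Shannon entropy
    print(alpha, renyi_entropy(p, alpha))
```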

Another hallmark of quantum thermodynamics is results' generality and simplicity. Thermodynamics characterizes a system with a few macroscopic observables, such as temperature, volume, and particle number. The simplicity of some quantum-thermodynamics results served a chemist collaborator and me, as explained in the introduction of https://arxiv.org/abs/1811.06551.

Yoram’s question reminded me of one reason why, as an undergrad, I adored studying physics in a liberal-arts college. I ate dinner and took walks with students majoring in economics, German studies, and Middle Eastern languages. They described their challenges, which I analyzed with the physics mindset that I was acquiring. We then compared our approaches. Encountering other disciplines’ perspectives helped me recognize what tools I was developing as a budding physicist. How can we know our corner of the world without stepping outside it and viewing it as part of a landscape?


1The title epitomizes clarity and simplicity. And I have trouble resisting anything advertised as “the information-theoretic approach to such-and-such.”

The importance of being open

Barcelona refused to stay indoors this May.

Merchandise spilled outside shops onto the streets, restaurateurs parked diners under trees, and ice-cream cones begged to be eaten on park benches. People thronged the streets, markets filled public squares, and the scents of flowers wafted from vendors’ stalls. I couldn’t blame the city. Its sunshine could have drawn Merlin out of his crystal cave. Insofar as a city lives, Barcelona epitomized a quotation by thermodynamicist Ilya Prigogine: “The main character of any living system is openness.”

Prigogine (1917–2003), who won the Nobel Prize for chemistry, had brought me to Barcelona. I was honored to receive, at the Joint European Thermodynamics Conference (JETC) there, the Ilya Prigogine Prize for a thermodynamics PhD thesis. The JETC convenes and awards the prize biennially; the last conference had taken place in Budapest. Barcelona suited the legacy of a thermodynamicist who illuminated open systems.

[Photo: The conference center. Not bad, eh?]

Ilya Prigogine began his life in Russia, grew up partially in Germany, settled in Brussels, and worked at American universities. His nobelprize.org biography reveals a mind open to many influences and disciplines: Before entering university, his “interest was more focused on history and archaeology, not to mention music, especially piano.” Yet Prigogine pursued chemistry. 

He helped extend thermodynamics outside equilibrium. Thermodynamics is the study of energy, order, and time’s arrow in terms of large-scale properties, such as temperature, pressure, and volume. Many physicists think that thermodynamics describes only equilibrium. Equilibrium is a state of matter in which (1) large-scale properties remain mostly constant and (2) stuff (matter, energy, electric charge, etc.) doesn’t flow in any particular direction much. Apple pies reach equilibrium upon cooling on a countertop. When I’ve described my research as involving nonequilibrium thermodynamics, some colleagues have asked whether I’ve used an oxymoron. But “nonequilibrium thermodynamics” appears in Prigogine’s Nobel Lecture. 

[Photo: Ilya Prigogine]

Another Nobel laureate, Lars Onsager, helped extend thermodynamics a little outside equilibrium. He imagined poking a system gently, as by putting a pie on a lukewarm stovetop or a magnet in a weak magnetic field. (Experts: Onsager studied the linear-response regime.) You can read about his work in my blog post “Long live Yale’s cemetery.” Systems poked slightly out of equilibrium tend to return to equilibrium: Equilibrium is stable. Systems flung far from equilibrium, as Prigogine showed, can behave differently. 

A system can stay far from equilibrium by interacting with other systems. Imagine placing an apple pie atop a blistering stove. Heat will flow from the stove through the pie into the air. The pie will stay out of equilibrium due to interactions with what we call a “hot reservoir” (the stove) and a “cold reservoir” (the air). Systems (like pies) that interact with other systems (like stoves and air), we call “open.”

You and I are open: We inhale air, ingest food and drink, expel waste, and radiate heat. Matter and energy flow through us; we remain far from equilibrium. A bumper sticker in my high-school chemistry classroom encapsulated our status: “Old chemists don’t die. They come to equilibrium.” We remain far from equilibrium—alive—because our environment provides food and absorbs heat. If I’m an apple pie, the yogurt that I ate at breakfast serves as my stovetop, and the living room in which I breakfasted serves as the air above the stove. We live because of our interactions with our environments, because we’re open. Hence Prigogine’s claim, “The main character of any living system is openness.”

[Photo of an apple pie, captioned: The author]

JETC 2019 fostered openness. The conference sessions spanned length scales and mass scales, from quantum thermodynamics to biophysics to gravitation. One could arrive as an expert in cell membranes and learn about astrophysics.

I remain grateful for the prize-selection committee’s openness. The topics of earlier winning theses include desalination, colloidal suspensions, and falling liquid films. If you tipped those topics into a tube, swirled them around, and capped the tube with a kaleidoscope glass, you might glimpse my thesis’s topic, quantum steampunk. Also, of the nine foregoing Prigogine Prize winners, only one had earned his PhD in the US. I’m grateful for the JETC’s consideration of something completely different.

When Prigogine said, “openness,” he referred to exchanges of energy and mass. Humans can exhibit openness also to ideas. The JETC honored Prigogine’s legacy in more ways than one. Here’s hoping I live up to their example.

[Photo: Outside La Sagrada Familia]

Thermodynamics of quantum channels

You would hardly think that a quantum channel could have any sort of thermodynamic behavior. We were surprised, too.

How do the laws of thermodynamics apply in the quantum regime? Thanks to novel ideas introduced in the context of quantum information, scientists have been able to develop new ways to characterize the thermodynamic behavior of quantum states. If you’re a Quantum Frontiers regular, you have certainly read about these advances in Nicole’s captivating posts on the subject.

Asking the same question for quantum channels, however, turned out to be more challenging than expected. A quantum channel is a way of representing how an input state can change into an output state according to the laws of quantum mechanics. Let’s picture it as a box with an input state and an output state, like so:

[Diagram: a quantum channel pictured as a box, with an input state entering and an output state leaving]

A computing gate, the building block of quantum computers, is described by a quantum channel. Or, if Alice sends a photon to Bob over an optical fiber, then the whole process is represented by a quantum channel. Thus, by studying quantum channels directly we can derive statements that are valid regardless of the physical platform used to store and process the quantum information—ion traps, superconducting qubits, photonic qubits, NV centers, etc.

We asked the following question: If I’m given a quantum channel, can I transform it into another, different channel by using something like a miniature heat engine? If so, how much work do I need to spend in order to accomplish this task? The answer is tricky because of a few aspects in which quantum channels are more complicated than quantum states.

In this post, I’ll try to give some intuition behind our results, which were developed with the help of Mario Berta and Fernando Brandão, and which were recently published in Physical Review Letters.

First things first, let’s worry about how to study the thermodynamic behavior of miniature systems.

Thermodynamics of small stuff

One of the important ideas that quantum information brought to thermodynamics is the idea of a resource theory. In a resource theory, we declare that there are certain kinds of states that are available for free, and that there are a set of operations that can be carried out for free. In a resource theory of thermodynamics, when we say “for free,” we mean “without expending any thermodynamic work.”

Here, the free states are those in thermal equilibrium at a fixed given temperature, and the free operations are those quantum operations that preserve energy and that introduce no noise into the system (we call those unitary operations). Faced with a task such as transforming one quantum state into another, we may ask whether or not it is possible to do so using the freely available operations. If that is not possible, we may then ask how much thermodynamic work we need to invest, in the form of additional energy at the input, in order to make the transformation possible.

Interestingly, the amount of work needed to go from one state ρ to another state σ might be unrelated to the work required to go back from σ to ρ. Indeed, the freely allowed operations can’t always be reversed; the reverse process usually requires a different sequence of operations, incurring an overhead. There is a mathematical framework to understand these transformations and this reversibility gap, in which generalized entropy measures play a central role. To avoid going down that road, let’s instead consider the macroscopic case in which we have a large number n of independent particles that are all in the same state ρ, a state which we denote by \rho^{\otimes n}. Then something magical happens: This macroscopic state can be reversibly converted to and from another macroscopic state \sigma^{\otimes n}, where all particles are in some other state σ. That is, the work invested in the transformation from \rho^{\otimes n} to \sigma^{\otimes n} can be entirely recovered by performing the reverse transformation:

[Diagram: reversible interconversion between \rho^{\otimes n} and \sigma^{\otimes n}]

If this rings a bell, that is because this is precisely the kind of thermodynamics that you will find in your favorite textbook. There is an optimal, reversible way of transforming any two thermodynamic states into each other, and the optimal work cost of the transformation is the difference of a corresponding quantity known as the thermodynamic potential. Here, the thermodynamic potential is a quantity known as the free energy F(\rho). Therefore, the optimal work cost per copy w of transforming \rho^{\otimes n} into \sigma^{\otimes n} is given by the difference in free energy w = F(\sigma) - F(\rho).
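To make the bookkeeping concrete, here's a minimal Python sketch. The post doesn't spell out F, so I'm assuming the standard form F(\rho) = {\rm tr}(\rho H) - k_{\rm B} T S(\rho), with S the von Neumann entropy; the Hamiltonian and states are invented for illustration:

```python
import numpy as np

kB_T = 1.0  # k_B times temperature, in natural units (an assumption)

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho), in nats."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -(evals * np.log(evals)).sum()

def free_energy(rho, H):
    """Assumed standard free energy F(rho) = tr(rho H) - k_B T S(rho)."""
    return np.real(np.trace(rho @ H)) - kB_T * von_neumann_entropy(rho)

H = np.diag([0.0, 1.0])      # toy qubit Hamiltonian
rho = np.diag([0.9, 0.1])    # initial state (per copy)
sigma = np.diag([0.5, 0.5])  # target state (per copy)

w = free_energy(sigma, H) - free_energy(rho, H)  # optimal work cost per copy
print(w)
```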

From quantum states to quantum channels

Can we repeat the same story for quantum channels? Suppose that we’re given a channel \mathcal{E}, which we picture as above as a box that transforms an input state into an output state. Using the freely available thermodynamic operations, can we “transform” \mathcal{E} into another channel \mathcal{F}? That is, can we wrap this box with some kind of procedure that uses free thermodynamic operations to pre-process the input and post-process the output, such that the overall new process corresponds (approximately) to the quantum channel \mathcal{F}? We might picture the situation like this:

[Diagram: the channel \mathcal{E} wrapped with free pre- and post-processing, simulating the channel \mathcal{F}]

Let us first simplify the question by supposing we don’t have a channel \mathcal{E} to start off with. How can we implement the channel \mathcal{F} from scratch, using only free thermodynamic operations and some invested work? That simple question led to pages and pages of calculations, lots of coffee, a few sleepless nights, and then more coffee. After finally overcoming several technical obstacles, we found that in the macroscopic limit of many copies of the channel, the corresponding amount of work per copy is given by the maximum difference of free energy F between the input and output of the channel. We decided to call this quantity the thermodynamic capacity of the channel:

T(\mathcal{E}) = \max_\sigma \left[ F( \mathcal{E}(\sigma) ) - F(\sigma) \right]

Intuitively, an implementation of \mathcal{F}^{\otimes n} must be prepared to expend an amount of work corresponding to the worst possible transformation of an input state to its corresponding output state. It’s kind of obvious in retrospect. However, what is nontrivial is that one can find a single implementation that works for all input states.
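In that spirit, here's a rough numerical sketch of the thermodynamic capacity of a toy channel. Everything is illustrative: the free energy takes the standard form assumed above, the amplitude-damping channel is an arbitrary choice of mine, and the maximization is brute-force sampling rather than an exact optimization:

```python
import numpy as np

rng = np.random.default_rng(0)
kB_T = 1.0               # k_B times temperature, natural units (assumption)
H = np.diag([0.0, 1.0])  # toy qubit Hamiltonian

def free_energy(rho):
    """F(rho) = tr(rho H) - k_B T S(rho), with S the von Neumann entropy."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return np.real(np.trace(rho @ H)) + kB_T * (evals * np.log(evals)).sum()

# Amplitude-damping channel with damping probability gamma (illustrative).
gamma = 0.3
K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1 - gamma)]])
K1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])
def channel(rho):
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

def random_qubit_state():
    """A random mixed state, from a complex Ginibre matrix."""
    G = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    rho = G @ G.conj().T
    return rho / np.trace(rho)

# Brute-force estimate: maximize the output-minus-input free energy over inputs.
T_cap = max(free_energy(channel(rho)) - free_energy(rho)
            for rho in (random_qubit_state() for _ in range(20000)))
print(T_cap)
```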

It turned out that this quantity had already been studied before. An earlier paper by Navascués and García-Pintos had shown that it was exactly this quantity that characterized the amount of work per copy that could be extracted by “consuming” many copies of a process \mathcal{E}^{\otimes n} provided as black boxes.

To our surprise, we realized that Navascués and García-Pintos’s result implied that the transformation of \mathcal{E}^{\otimes n} into \mathcal{F}^{\otimes n} is reversible. There is a simple procedure to convert \mathcal{E}^{\otimes n} into \mathcal{F}^{\otimes n} at a cost per copy that equals T(\mathcal{F}) - T(\mathcal{E}). The procedure consists in first extracting T(\mathcal{E}) work per copy of the first set of channels, and then preparing \mathcal{F}^{\otimes n} from scratch at a work cost of T(\mathcal{F}) per copy:

[Diagram: converting \mathcal{E}^{\otimes n} into \mathcal{F}^{\otimes n} by extracting T(\mathcal{E}) work per copy, then preparing \mathcal{F}^{\otimes n} at cost T(\mathcal{F}) per copy]

Clearly, the reverse transformation yields back all the work invested in the forward transformation, making the transformation reversible. That’s because we could have started with \mathcal{F}’s and finished with \mathcal{E}’s instead of the opposite, and the associated work cost per copy would be T(\mathcal{E}) - T(\mathcal{F}). Thus the transformation is, indeed, reversible:

[Diagram: reversible interconversion between \mathcal{E}^{\otimes n} and \mathcal{F}^{\otimes n}]

In turn, this implies that in the many-copy regime, quantum channels have a macroscopic thermodynamic behavior. That is, there is a thermodynamic potential—the thermodynamic capacity—that quantifies the minimal work required to transform one macroscopic set of channels into another.

Prospects for the thermodynamic capacity

Resource theories that are reversible are pretty rare. Reversibility is a coveted property because a reversible resource theory is one in which we can easily understand exactly which transformations are possible. Other than the thermodynamic resource theory of states mentioned above, most instances of a resource theory—especially resource theories of channels—typically produce the kind of overheads in the conversion cost that spoil reversibility. So it’s rather exciting when you do find a new reversible resource theory of channels.

Quantum information theorists, especially those working on the theory of quantum communication, care a lot about characterizing the capacity of a channel. This is the maximal amount of information that can be transmitted through a channel. Even though in our case we're talking about a different kind of capacity—one where we transmit thermodynamic energy and entropy, rather than quantum bits of messages—there are some close parallels between the two settings, from which both quantum communication and quantum thermodynamics can profit. Our result draws deep inspiration from the so-called quantum reverse Shannon theorem, an important result in quantum communication that tells us how two parties can communicate using one kind of a channel if they have access to another kind of a channel. On the other hand, the thermodynamic capacity at zero energy is a quantity that had already been studied in quantum communication, but it was not clear what that quantity represented concretely. This quantity gained even more importance when it was identified as the entropy of a channel. Now, we see that this quantity has a thermodynamic interpretation. Also, the thermodynamic capacity has a simple definition, is relatively easy to compute, and is additive—all desirable properties that other measures of capacity of a quantum channel do not necessarily share.

We still have a few rough edges that I hope we can resolve sooner or later. In fact, there is an important caveat that I have avoided mentioning so far—our argument only holds for special kinds of channels, those that do the same thing regardless of when they are applied in time. (Those channels are called time-covariant.) A lot of channels that we’re used to studying have this property, but we think it should be possible to prove a version of our result for any general quantum channel. In fact, we do have another argument that works for all quantum channels, but it uses a slightly different thermodynamic framework which might not be physically well-grounded.

That’s all very nice, I can hear you think, but is this useful for any quantum computing applications? The truth is, we’re still pretty far from founding a new quantum start-up. The levels of heat dissipation in quantum logic elements are still orders of magnitude away from the fundamental limits that we study in the thermodynamic resource theory.

Rather, our result teaches us about the interplay of quantum channels and thermodynamic concepts. We have not only gained useful insight into the structure of quantum channels, but also developed new tools for analyzing them. These tools will be useful for studying more involved resource theories of channels. And if, in the future, quantum technologies approach the thermodynamically reversible limit, it will be good to know how to implement a given quantum channel in such a way that good accuracy is guaranteed for any possible quantum input state, without any inherent overhead due to the fact that we don't know what the input state is.

Thermodynamics, a theory developed to study gases and steam engines, has turned out to be relevant from the most obvious to the most unexpected of situations—chemical reactions, electromagnetism, solid state physics, black holes, you name it. Trust the laws of thermodynamics to surprise you again by applying to a setting you'd never have imagined, like quantum channels.

Quantum information in quantum cognition

Some research topics, says conventional wisdom, a physics PhD student shouldn’t touch with an iron-tipped medieval lance: sinkholes in the foundations of quantum theory. Problems so hard, you’d have a snowball’s chance of achieving progress. Problems so obscure, you’d have a snowball’s chance of convincing anyone to care about progress. Whether quantum physics could influence cognition much.

Quantum physics influences cognition insofar as (i) quantum physics prevents atoms from imploding and (ii) implosion would inhibit atoms from contributing to cognition. But most physicists believe that useful entanglement can't survive in brains. Entanglement consists of correlations shareable by quantum systems and stronger than any achievable by classical systems. Useful entanglement dies quickly in hot, wet, random environments.

Brains form such environments. Imagine injecting entangled molecules A and B into someone’s brain. Water, ions, and other particles would bombard the molecules. The higher the temperature, the heavier the bombardment. The bombardiers would entangle with the molecules via electric and magnetic fields. Each molecule can share only so much entanglement. The more A entangled with the environment, the less A could remain entangled with B. A would come to share a tiny amount of entanglement with each of many particles. Such tiny amounts couldn’t accomplish much. So quantum physics seems unlikely to affect cognition significantly.

[Photo: lances. Do not touch.]

Yet my PhD advisor, John Preskill, encouraged me to consider whether the possibility interested me.

Try some completely different research, he said. Take a risk. If it doesn’t pan out, fine. People don’t expect much of grad students, anyway. Have you seen Matthew Fisher’s paper about quantum cognition? 

Matthew Fisher is a theoretical physicist at the University of California, Santa Barbara. He has plaudits out the wazoo, many for his work on superconductors. A few years ago, Matthew developed an interest in biochemistry. He knew that most physicists doubt whether quantum physics could affect cognition much. But suppose that it could, he thought. How could it? Matthew reverse-engineered a mechanism, in a paper published by Annals of Physics in 2015.

A PhD student shouldn’t touch such research with a ten-foot radio antenna, says conventional wisdom. But I trust John Preskill in a way in which I trust no one else on Earth.

I’ll look at the paper, I said.


Matthew proposed that quantum physics could influence cognition as follows. Experimentalists have performed quantum computation using one hot, wet, random system: that of nuclear magnetic resonance (NMR). NMR is the process that underlies magnetic resonance imaging (MRI), a technique used to image people's brains. A common NMR system consists of high-temperature liquid molecules. The molecules consist of atoms whose nuclei have quantum properties called spin. The nuclear spins encode quantum information (QI).

Nuclear spins, Matthew reasoned, might store QI in our brains. He catalogued the threats that could damage the QI. Hydrogen ions, he concluded, would threaten the QI most. They could entangle with (decohere) the spins via dipole-dipole interactions.

How can a spin avoid the threats? First, by having a quantum number s = 1/2. Such a quantum number zeroes out the nuclei’s electric quadrupole moments. Electric-quadrupole interactions can’t decohere such spins. Which biologically prevalent atoms have s = 1/2 nuclear spins? Phosphorus and hydrogen. Hydrogen suffers from other vulnerabilities, so phosphorus nuclear spins store QI in Matthew’s story. The spins serve as qubits, or quantum bits.

How can a phosphorus spin avoid entangling with other spins via magnetic dipole-dipole interactions? Such interactions depend on the spins’ orientations relative to their positions. Suppose that the phosphorus occupies a small molecule that tumbles in biofluids. The nucleus’s position changes randomly. The interaction can average out over tumbles.

The molecule contains atoms other than phosphorus. Those atoms have nuclei whose spins can interact with the phosphorus spins, unless every threatening spin has a quantum number s = 0. Which biologically prevalent atoms have s = 0 nuclear spins? Oxygen and calcium. The phosphorus should therefore occupy a molecule with oxygen and calcium.

Matthew designed this molecule to block decoherence. Then, he found the molecule in the scientific literature. The structure, {\rm Ca}_9 ({\rm PO}_4)_6, is called a Posner cluster or a Posner molecule. I'll call it a Posner, for short. Posners appear to exist in simulated biofluids, fluids created to mimic the fluids in us. Posners are believed to exist in us and might participate in bone formation. According to Matthew's estimates, Posners might protect phosphorus nuclear spins for 1-10 days.

[Image: a Posner molecule (image courtesy of Swift et al.)]

How can Posners influence cognition? Matthew proposed the following story.

Adenosine triphosphate (ATP) is a molecule that fuels biochemical reactions. “Triphosphate” means “containing three phosphate ions.” Phosphate ({\rm PO}_4^{3-}) consists of one phosphorus atom and four oxygen atoms. Two of an ATP molecule’s phosphates can break off while remaining joined to each other.

The phosphate pair can drift until encountering an enzyme called pyrophosphatase. The enzyme can break the pair into independent phosphates. Matthew, with Leo Radzihovsky, conjectured that, as the pair breaks, the phosphorus nuclear spins are projected onto a singlet. This state, represented by \frac{1}{ \sqrt{2} } ( | \uparrow \downarrow \rangle - | \downarrow \uparrow \rangle ), is maximally entangled. 

Imagine many entangled phosphates in a biofluid. Six phosphates can join nine calcium ions to form a Posner molecule. The Posner can share up to six singlets with other Posners. Clouds of entangled Posners can form.

One clump of Posners can enter one neuron while another clump enters another neuron. The protein VGLUT, or BNPI, sits in cell membranes and has the potential to ferry Posners in. The neurons will share entanglement. Imagine two Posners, P and Q, approaching each other in a neuron N. Quantum-chemistry calculations suggest that the Posners can bind together. Suppose that P shares entanglement with a Posner P’ in a neuron N’, while Q shares entanglement with a Posner Q’ in N’. The entanglement, with the binding of P to Q, can raise the probability that P’ binds to Q’.

Bound-together Posners will move slowly, having to push much water out of the way. Hydrogen and magnesium ions can latch onto the slow molecules easily. The Posners’ negatively charged phosphates will attract the {\rm H}^+ and {\rm Mg}^{2+} as the phosphates attract the Posner’s {\rm Ca}^{2+}. The hydrogen and magnesium can dislodge the calcium, breaking apart the Posners. Calcium will flood neurons N and N’. Calcium floods a neuron’s axon terminal (the end of the neuron) when an electrical signal reaches the axon. The flood induces the neuron to release neurotransmitters. Neurotransmitters are chemicals that travel to the next neuron, inducing it to fire. So entanglement between phosphorus nuclear spins in Posner molecules might stimulate coordinated neuron firing.


Does Matthew’s story play out in the body? We can’t know till running experiments and analyzing the results. Experiments have begun: Last year, the Heising-Simons Foundation granted Matthew and collaborators $1.2 million to test the proposal.

Suppose that Matthew conjectures correctly, John challenged me, or correctly enough. Posner molecules store QI. Quantum systems can process information in ways in which classical systems, like laptops, can’t. How adroitly can Posners process QI?

I threw away my iron-tipped medieval lance in year five of my PhD. I left Caltech for a five-month fellowship, bent on returning with a paper with which to answer John. I did, and Annals of Physics published the paper this month.


I had the fortune to interest Elizabeth Crosson in the project. Elizabeth, now an assistant professor at the University of New Mexico, was working as a postdoc in John’s group. Both of us are theorists who specialize in QI theory. But our backgrounds, skills, and specialties differ. We complemented each other while sharing a doggedness that kept us emailing, GChatting, and Google-hangout-ing at all hours.

Elizabeth and I translated Matthew’s biochemistry into the mathematical language of QI theory. We dissected Matthew’s narrative into a sequence of biochemical steps. We ascertained how each step would transform the QI encoded in the phosphorus nuclei. Each transformation, we represented with a piece of math and with a circuit-diagram element. (Circuit-diagram elements are pictures strung together to form circuits that run algorithms.) The set of transformations, we called Posner operations.

Imagine that you can perform Posner operations, by preparing molecules, trying to bind them together, etc. What QI-processing tasks can you perform? Elizabeth and I found applications to quantum communication, quantum error detection, and quantum computation. Our results rest on the assumption—possibly inaccurate—that Matthew conjectures correctly. Furthermore, we characterized what Posners could achieve if controlled. Randomness, rather than control, would direct Posners in biofluids. But what can happen in principle offers a starting point.

First, QI can be teleported from one Posner to another, while suffering noise.1 This noisy teleportation doubles as superdense coding: A trit is a random variable that assumes one of three possible values. A bit is a random variable that assumes one of two possible values. You can teleport a trit from one Posner to another effectively, while transmitting a bit directly, with help from entanglement. 


Second, Matthew argued that Posners’ structures protect QI. Scientists have developed quantum error-correcting and -detecting codes to protect QI. Can Posners implement such codes, in our model? Yes: Elizabeth and I (with help from erstwhile Caltech postdoc Fernando Pastawski) developed a quantum error-detection code accessible to Posners. One Posner encodes a logical qutrit, the quantum version of a trit. The code detects any error that slams any of the Posner’s six qubits.

Third, how complicated an entangled state can Posner operations prepare? A powerful one, we found: Suppose that you can measure this state locally, such that earlier measurements’ outcomes affect which measurements you perform later. You can perform any quantum computation. That is, Posner operations can prepare a state that fuels universal measurement-based quantum computation.

Finally, Elizabeth and I quantified effects of entanglement on the rate at which Posners bind together. Imagine preparing two Posners, P and P’, that share entanglement only with other particles. If the Posners approach each other with the right orientation, they have a 33.6% chance of binding, in our model. Now, suppose that every qubit in P is maximally entangled with a qubit in P’. The binding probability can rise to 100%.

[Figure: a biochemical process from Matthew Fisher’s 2015 paper, recast by Elizabeth and me as a quantum circuit]

I feared that other scientists would pooh-pooh our work as crazy. To my surprise, enthusiasm flooded in. Colleagues cheered the risk taken on a challenge in an emerging field that perks up our ears. Besides, Elizabeth’s and my work is far from crazy. We don’t assert that quantum physics affects cognition. We imagine that Matthew conjectures correctly, acknowledging that he might not, and explore his proposal’s implications. Being neither biochemists nor experimentalists, we restrict our claims to QI theory.

Maybe Posners can’t protect coherence for long enough. Would inaccuracy of Matthew’s conjectures beach our whale of research? No. Posners prompted us to propose ideas and questions within QI theory. For instance, our quantum circuits illustrate interactions (unitary gates, to experts) interspersed with measurements implemented by the binding of Posners. The circuits partially motivated a subfield that emerged last summer and is picking up speed: Consider interspersing random unitary gates with measurements. The unitaries tend to entangle qubits, whereas the measurements disentangle. Which influence wins? Does the system undergo a phase transition from “mostly entangled” to “mostly unentangled” at some measurement frequency? Researchers from Santa Barbara to Colorado; MIT; Oxford; Lancaster, UK; Berkeley; Stanford; and Princeton have taken up the challenge.
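To convey those models' flavor, here's a toy Python sketch of a monitored random circuit (my own construction, not code from any of the groups mentioned): brickwork layers of Haar-random two-qubit gates, a projective measurement on each qubit with probability p, and the half-chain entanglement entropy at the end. Sweeping p is how one probes the transition:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 8  # number of qubits; small enough for a statevector toy model

def haar_unitary(d):
    """Haar-random d x d unitary, via QR of a complex Ginibre matrix."""
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    Q, R = np.linalg.qr(G)
    return Q * (np.diag(R) / np.abs(np.diag(R)))

def apply_two_qubit(psi, U, i):
    """Apply a 4 x 4 unitary U to qubits i and i+1 of the state vector psi."""
    psi = np.moveaxis(psi.reshape([2] * N), [i, i + 1], [0, 1]).reshape(4, -1)
    psi = (U @ psi).reshape([2, 2] + [2] * (N - 2))
    return np.moveaxis(psi, [0, 1], [i, i + 1]).reshape(-1)

def measure(psi, i):
    """Projectively measure qubit i in the computational basis."""
    psi = np.moveaxis(psi.reshape([2] * N), i, 0)
    p0 = (np.abs(psi[0]) ** 2).sum()
    outcome = 0 if rng.random() < p0 else 1
    projected = np.zeros_like(psi)
    projected[outcome] = psi[outcome]
    projected /= np.linalg.norm(projected)
    return np.moveaxis(projected, 0, i).reshape(-1)

def half_chain_entropy(psi):
    """Entanglement entropy (bits) between the left and right halves."""
    schmidt = np.linalg.svd(psi.reshape(2 ** (N // 2), -1), compute_uv=False) ** 2
    schmidt = schmidt[schmidt > 1e-12]
    return -(schmidt * np.log2(schmidt)).sum()

p = 0.2  # measurement rate; sweep this to look for the transition
psi = np.zeros(2 ** N, dtype=complex)
psi[0] = 1.0
for layer in range(4 * N):
    for i in range(layer % 2, N - 1, 2):  # brickwork of random gates
        psi = apply_two_qubit(psi, haar_unitary(4), i)
    for i in range(N):                    # sprinkle in measurements
        if rng.random() < p:
            psi = measure(psi, i)
print(half_chain_entropy(psi))
```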

A physics PhD student, conventional wisdom says, shouldn’t touch quantum cognition with a Swiss guard’s halberd. I’m glad I reached out: I learned much, contributed to science, and had an adventure. Besides, if anyone disapproves of daring, I can blame John Preskill.


Annals of Physics published “Quantum information in the Posner model of quantum cognition” here. You can find the arXiv version here and can watch a talk about our paper here. 

1Experts: The noise arises because, if two Posners bind, they effectively undergo a measurement. This measurement transforms a subspace of the two-Posner Hilbert space as a coarse-grained Bell measurement. A Bell measurement yields one of four possible outcomes, or two bits. Discarding one of the bits amounts to coarse-graining the outcome. Quantum teleportation involves a Bell measurement. Coarse-graining the measurement introduces noise into the teleportation.

Long live Yale’s cemetery

Call me morbid, but, the moment I arrived at Yale, I couldn’t wait to visit the graveyard.

I visited campus last February, to present the Yale Quantum Institute (YQI) Colloquium. The YQI occupies a building whose stone exterior honors Yale’s Gothic architecture and whose sleekness defies it. The YQI has theory and experiments, seminars and colloquia, error-correcting codes and small-scale quantum computers, mugs and laptop bumper stickers. Those assets would have drawn me like honey. But my host, Steve Girvin, piled molasses, fudge, and cookie dough on top: “you should definitely reserve some time to go visit Josiah Willard Gibbs, Jr., Lars Onsager, and John Kirkwood in the Grove Street Cemetery.”


Gibbs, Onsager, and Kirkwood pioneered statistical mechanics. Statistical mechanics is the physics of many-particle systems, energy, efficiency, and entropy, a measure of disorder. Statistical mechanics helps us understand why time flows in only one direction. As a colleague reminded me at a conference about entropy, “You are young. But you will grow old and die.” That conference featured a field trip to a cemetery at the University of Cambridge. My next entropy-centric conference took place next to a cemetery in Banff, Canada. A quantum-thermodynamics conference included a tour of an Oxford graveyard.1 (That conference reincarnated in Santa Barbara last June, but I found no cemeteries nearby. No wonder I haven’t blogged about it.) Why shouldn’t a quantum-thermodynamics colloquium lead to the Grove Street Cemetery?

[Photo: Home of the Yale Quantum Institute]

The Grove Street Cemetery lies a few blocks from the YQI. I walked from the latter to the former on a morning whose sunshine spoke more of springtime than of February. At one entrance stood a gatehouse that looked older than many of the cemetery’s residents.

“Can you tell me where to find Josiah Willard Gibbs?” I asked the gatekeepers. They handed me a map, traced routes on it, and dispatched me from their lodge. Snow had fallen the previous evening but was losing its battle against the sunshine. I sloshed to a pathway labeled “Locust,” waded along Locust until passing Myrtle, and splashed back and forth until a name caught my eye: “Gibbs.” 

[Photo: One entrance of the Grove Street Cemetery]

Josiah Willard Gibbs stamped his name across statistical mechanics during the 1800s. Imagine a gas in a box, a system that illustrates much of statistical mechanics. Suppose that the gas exchanges heat with a temperature-T bath through the box’s walls. After exchanging heat for a long time, the gas reaches thermal equilibrium: Large-scale properties, such as the gas’s energy, quit changing much. Imagine measuring the gas’s energy. What probability does the measurement have of outputting E? The Gibbs distribution provides the answer, e^{ - E / (k_{\rm B} T) } / Z. The k_{\rm B} denotes Boltzmann’s constant, a fundamental constant of nature. The Z denotes a partition function, which ensures that the probabilities sum to one.
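As a quick worked example, consider a system with only two energy levels, 0 and E. The partition function is Z = 1 + e^{ - E / (k_{\rm B} T) }, so a measurement outputs E with probability e^{ - E / (k_{\rm B} T) } / Z. At a temperature such that k_{\rm B} T = E, that probability is e^{-1} / ( 1 + e^{-1} ) \approx 0.27.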

Gibbs lent his name to more than probabilities. A function of probabilities, the Gibbs entropy, prefigured information theory. Entropy features in the Gibbs free energy, which dictates how much work certain thermodynamic systems can perform. A thermodynamic system has many properties, such as temperature and pressure. How many can you control? The answer follows from the Gibbs-Duhem relation. You’ll be able to follow the Gibbs walk, a Yale alumnus tells me, once construction on Yale’s physical-sciences complex ends.


Back I sloshed along Locust Lane. Turning left onto Myrtle, then right onto Cedar, led to a tree that sheltered two tombstones. They looked like buddies about to throw their arms around each other and smile for a photo. The lefthand tombstone reported four degrees, eight service positions, and three scientific honors of John Gamble Kirkwood. The righthand tombstone belonged to Lars Onsager:

NOBEL LAUREATE*

[ . . . ]

*ETC.

Onsager extended thermodynamics beyond equilibrium. Imagine gently poking one property of a thermodynamic system. For example, recall the gas in a box. Imagine connecting one end of the box to a temperature-T bath and the other end to a bath at a slightly higher temperature, T' \gtrsim T. You’ll have poked the system’s temperature out of equilibrium. Heat will flow from the hotter bath to the colder bath. Particles carry the heat, energy of motion. Suppose that the particles have electric charges. An electric current will flow because of the temperature difference. Similarly, heat can flow because of an electric potential difference, or a pressure difference, and so on. You can cause a thermodynamic system’s elbow to itch, Onsager showed, by tickling the system’s ankle.

To Onsager’s left lay John Kirkwood. Kirkwood had defined a quasiprobability distribution in 1933. Quasiprobabilities resemble probabilities but can assume negative and nonreal values. These behaviors can signal nonclassical physics, such as the ability to outperform classical computers. I generalized Kirkwood’s quasiprobability with collaborators. Our generalized quasiprobability describes quantum chaos, thermalization, and the spread of information through entanglement. Applying the quasiprobability across theory and experiments has occupied me for two-and-a-half years. Rarely has a tombstone pleased anyone as much as Kirkwood’s tickled me.
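If you'd like to see nonclassical values appear, here's a small Python sketch of Kirkwood's quasiprobability for a qubit, Q(a, b) = \langle b | a \rangle \langle a | \rho | b \rangle; the state and the choice of bases are my own toy example:

```python
import numpy as np

# A qubit state with coherence (illustrative choice): (|0> + i|1>)/sqrt(2).
plus_i = np.array([1.0, 1.0j]) / np.sqrt(2)
rho = np.outer(plus_i, plus_i.conj())

Z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]  # computational basis
X_basis = [np.array([1.0, 1.0]) / np.sqrt(2),
           np.array([1.0, -1.0]) / np.sqrt(2)]          # Hadamard basis

# Kirkwood quasiprobability Q(a, b) = <b|a><a|rho|b>.
Q = np.array([[(b.conj() @ a) * (a.conj() @ rho @ b) for b in X_basis]
              for a in Z_basis])
print(Q)        # entries can be negative or even nonreal...
print(Q.sum())  # ...yet, like probabilities, they sum to 1
```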

[Photo: Kirkwood’s and Onsager’s tombstones]

The Grove Street Cemetery opened my morning with a whiff of rosemary. The evening closed with a shot of adrenaline. I met with four undergrad women who were taking Steve Girvin’s course, an advanced introduction to physics. I should have left the conversation bled of energy: Since visiting the cemetery, I’d held six discussions with nine people. But energy can flow backward. The students asked how I’d come to postdoc at Harvard; I asked what they might major in. They described the research they hoped to explore; I explained how I’d constructed my research program. They asked if I’d had to work as hard as they to understand physics; I confessed that I might have had to work harder.

I left the YQI content, that night. Such a future deserves its past; and such a past, its future.


With thanks to Steve Girvin, Florian Carle, and the Yale Quantum Institute for their hospitality.

1Thermodynamics is a physical theory that emerges from statistical mechanics.

Symmetries and quantum error correction

It’s always exciting when you can bridge two different physical concepts that seem to have nothing in common—and it’s even more thrilling when the results have as broad a range of possible fields of application as from fault-tolerant quantum computation to quantum gravity.

Physicists love to draw connections between distinct ideas, interconnecting concepts and theories to uncover new structure in the landscape of scientific knowledge. Put together information theory with quantum mechanics and you’ve opened a whole new field of quantum information theory. More recently, machine learning tools have been combined with many-body physics to find new ways to identify phases of matter, and ideas from quantum computing were applied to Posner molecules to obtain new plausible models of how the brain might work.

In a recent contribution, my collaborators and I took a shot at combining the two physical concepts of quantum error correction and physical symmetries. What can we say about a quantum error-correcting code that conforms to a physical symmetry? Surprisingly, a continuous symmetry prevents the code from doing its job: A code can conform well to the symmetry, or it can correct against errors accurately, but it cannot do both simultaneously.

By a continuous symmetry, we mean a transformation that is characterized by a set of continuous parameters, such as angles. For instance, if I am holding an atom in my hand (more realistically, it’ll be confined in some fancy trap with lots of lasers), then I can rotate it around and about in space:

A rotation like this is fully specified by an axis and an angle, which are continuous parameters. Other transformations that we could think of are, for instance, time evolution, or a continuous family of unitary gates that we might want to apply to the system.

On the other hand, a code is a way of embedding some logical information into physical systems:

By cleverly distributing the information that we care about over several physical systems, an error-correcting code is able to successfully recover the original logical information even if the physical systems are exposed to some noise. Quantum error-correcting codes are particularly promising for quantum computing, since qubits tend to lose their information really fast (current typical ones can hold their information for a few seconds). In this way, instead of storing the actual information we care about on a single qubit, we use extra qubits which we prepare in a complicated state that is designed to protect this information from the noise.
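As a minimal illustration of the idea (a pedagogical toy, not a code anyone would deploy), here's a Python sketch of the three-qubit bit-flip code: one logical qubit spread across three physical qubits, with a per-basis-state majority vote standing in for the syndrome measurement and recovery:

```python
import numpy as np

def encode(alpha, beta):
    """Encode alpha|0> + beta|1> as alpha|000> + beta|111>."""
    psi = np.zeros(8, dtype=complex)
    psi[0b000], psi[0b111] = alpha, beta
    return psi

def flip(psi, i):
    """Apply a bit-flip (X) error to qubit i (0 = leftmost)."""
    out = np.empty_like(psi)
    for idx in range(8):
        out[idx ^ (1 << (2 - i))] = psi[idx]
    return out

def correct(psi):
    """Majority vote: send each basis state to the nearest codeword."""
    out = np.zeros_like(psi)
    for idx in range(8):
        bits = [(idx >> k) & 1 for k in (2, 1, 0)]
        out[0b111 if sum(bits) >= 2 else 0b000] += psi[idx]
    return out

psi = encode(0.6, 0.8)
corrupted = flip(psi, 1)   # a bit-flip error strikes the middle qubit
print(correct(corrupted))  # recovers 0.6|000> + 0.8|111>
```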

Covariant codes for quantum computation

A code that is compatible with a physical symmetry is called covariant. This property ensures that if I apply a symmetry transformation to the logical information, that is equivalent to applying corresponding symmetry transformations to each of the physical systems.

Suppose I would like to flip my qubit from “0” to “1” and from “1” to “0”. If my information is stored in an encoded form, then in principle I first need to decode the information to uncover the original logical information, apply the flip operation, and then re-encode the new logical information back onto the physical qubits. A covariant code allows us to perform the transformation directly on the physical qubits, without having to decode the information first.

The advantage of this scheme is that the logical information is never exposed and remains protected all along the computation.

But here’s the catch: Eastin and Knill famously proved that error-correcting codes can be at most covariant with respect to a finite set of transformations, ruling out universal computation with transversal gates. In other words, the computations we can perform using this scheme are very limited because we can’t perform any continuous symmetry transformation.

Interestingly, however, there’s a loophole: If we consider macroscopic systems, such as a particle with a very large value of spin, then it becomes possible again to construct codes that are covariant with respect to continuous transformations.

How is that possible, you ask? How do we transition from the microscopic regime, where covariant codes are ruled out for continuous symmetries, to the macroscopic regime, where they are allowed? We provide an answer by resorting to approximate quantum error correction. Namely, we consider the situation where the code does not have to correct each error exactly, but only has to reconstruct a good approximation of the logical information. As it turns out, there is a quantitative limit to how accurately a code can correct against errors if it is covariant with respect to a continuous symmetry, represented by the following equation:

\epsilon \geq \frac{ \Delta T_{\rm logical} }{ 2 n \, \Delta T_{\rm physical} },

where \epsilon specifies how inaccurately the code error-corrects (\epsilon = 0 means the code can correct against errors perfectly), n is the number of physical subsystems, and \Delta T_{\rm logical} and \Delta T_{\rm physical} are measures of “how strongly” the symmetry transformation can act on the logical and physical subsystems.

Let’s try to understand the right-hand side of this equation. In physics, continuous symmetries are generated by what we call physical charges. These are physical quantities that are associated with the symmetry, and that characterize how the symmetry acts on each state of the system. For instance, the charge that corresponds to time evolution is simply energy: High-energy states have a rapidly varying phase, whereas the phase of low-energy states changes slowly in time. Above, we indicate by \Delta T_{\rm logical} the range of possible charge values on the logical system and by \Delta T_{\rm physical} the corresponding range of charge values on each physical subsystem. In typical settings, this range of charge values is related to the dimension of the system—the more states the system has, intuitively, the greater the range of charges it can accommodate.

The above equation states that the inaccuracy of the code must be larger than some value given on the right-hand side of the equation, which depends on the number of subsystems n and the ranges of charge values on the logical system and physical subsystems. The right-hand side becomes small in two regimes: if each subsystem can accommodate a large range of charge values, or if there is a large number of physical systems. In these regimes, our limitation vanishes, and we can circumvent the Eastin-Knill theorem and construct good covariant error-correcting codes. This allows us to connect the two regimes that seemed incompatible earlier, the microscopic regime where there cannot be any covariant codes, and the macroscopic regime where they are allowed.
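To get a feel for the numbers, plug invented values into the bound above: encoding logical information with \Delta T_{\rm logical} = 2 into n = 10 physical spins, each with \Delta T_{\rm physical} = 1, forces \epsilon \geq 2 / (2 \times 10 \times 1) = 0.1, whereas n = 1000 such spins relax the limit to \epsilon \geq 10^{-3}.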

From quantum computation to many-body physics and quantum gravity

Quantum error-correcting codes not only serve to protect information in a quantum computation against noise, but they also provide a conceptual toolbox to understand complex physical systems where a quantum state is delocalized over many physical subsystems. The tight connections between quantum error correction and many-body physics have been put to light following a long history of pioneering research at Caltech in these fields. And as if that weren’t enough, quantum error correcting codes were also shown to play a crucial role in understanding quantum gravity.

There is an abundance of natural physical symmetries to consider both in many-body physics and in quantum gravity, and that gives us a good reason to be excited about characterizing covariant codes. For instance, there are natural approximate quantum error correcting codes that appear in some statistical mechanical models by cleverly picking global energy eigenstates. These codes are covariant with respect to time evolution by construction, since the codewords are energy eigenstates. Now, we understand more precisely under which conditions such codes can be constructed.

Perhaps an even more illustrative example is that of time evolution in holographic quantum gravity, that is, in the AdS/CFT correspondence. This model of quantum gravity has the property that it is equivalent to a usual quantum field theory that lives on the boundary of the universe. What’s more, the correspondence which tells us how the bulk quantum gravity theory is mapped to the boundary is, in fact, a quantum error-correcting code. If we add a time axis, then the picture becomes a cylinder where the interior is the theory of quantum gravity, and where the cylinder itself represents a traditional quantum field theory:

[Diagram: the AdS/CFT correspondence as a cylinder, with bulk quantum gravity in the interior and the boundary quantum field theory on the surface]

Since the bulk theory and the boundary theory are equivalent, the action of time evolution must be faithfully represented in both pictures. But this is in apparent contradiction with the Eastin-Knill theorem, from which it follows that a quantum error-correcting code cannot be covariant with respect to a continuous symmetry. We now understand how this is, in fact, not a contradiction: As we’ve seen, codes may be covariant with respect to continuous symmetries in the presence of systems with a large number of degrees of freedom, such as a quantum field theory.

What’s next?

There are some further results in our paper that I have not touched upon in this post, including a precise approximate statement of the Eastin-Knill theorem in terms of system dimensions, and a fun machinery to construct covariant codes for more general systems such as oscillators and rotors.

We have only scratched the surface of the different applications I’ve mentioned, by studying the properties of covariant codes in general. I’m now excited to dive into more detail with our wonderful team to study deeper applications to correlations in many-body systems, global symmetries in quantum gravity, accuracy limits of quantum clocks and precision limits to quantum metrology in the presence of noise.

This has been an incredibly fun project to work on. Such a collaboration illustrates again the benefit of interacting with great scientists with a wide range of areas of expertise including representation theory, continuous variable systems, and quantum gravity. Thanks Sepehr, Victor, Grant, Fernando, Patrick, and John, for this fantastic experience.

My QIP 2019 After-Dinner Speech

Scientists who work on theoretical aspects of quantum computation and information look forward each year to the Conference on Quantum Information Processing (QIP), an annual event since 1998. This year’s meeting, QIP 2019, was hosted this past week by the University of Colorado at Boulder. I attended and had a great time, as I always do.

But this year, in addition to catching up with old friends and talking with colleagues about the latest research advances, I also accepted a humbling assignment: I was the after-dinner speaker at the conference banquet. Here is (approximately) what I said.

QIP 2019 After-Dinner Speech
16 January 2019

Thanks, it’s a great honor to be here, and especially to be introduced by Graeme Smith, my former student. I’m very proud of your success, Graeme. Back in the day, who would have believed it?

And I’m especially glad to join you for these holiday festivities. You do know this is a holiday, don’t you? Yes, as we do every January, we are once again celebrating Gottesman’s birthday! Happy Birthday, Daniel!

Look, I’m kidding of course. Yes, it really is Daniel’s birthday — and I’m sure he appreciates 500 people celebrating in his honor — but I know you’re really here for QIP. We’ve been holding this annual celebration of Quantum Information Processing since 1998 — this is the 22nd QIP. If you are interested in the history of this conference, it’s very helpful that the QIP website includes links to the sites for all previous QIPs. I hope that continues; it conveys a sense of history. For each of those past meetings, you can see what people were talking about, who was there, what they looked like in the conference photo, etc.

Some of you were there the very first time – I was not. But among the attendees at the first QIP, in Aarhus in 1998, were a number of brilliant up-and-coming young scientists who have since then become luminaries of our field. Including: Dorit Aharonov, Wim van Dam, Peter Hoyer (who was an organizer), Michele Mosca, John Smolin, Barbara Terhal, and John Watrous. Also somewhat more senior people were there, like Harry Buhrman and Richard Cleve. And pioneers so eminent that we refer to them by their first names alone: Umesh … Gilles … Charlie. It’s nice to know those people are still around, but it validates the health of our field that so many new faces are here, that so many young people are still drawn to QIP, 21 years after it all began. Over 300 students and postdocs are here this year, among nearly 500 attendees.

QIP has changed since the early days. It was smaller and more informal then; the culture was more like a theoretical physics conference, where the organizing committee brainstorms and conjures up a list of invited speakers. The system changed in 2006, when for the first time there were submissions and a program committee. That more formal system opened up opportunities to speak to a broader community, and the quality of the accepted talks has stayed very high — only 18% of 349 submissions were accepted this year.

In fact it has become a badge of honor to speak here — people put it on their CVs: “I gave a QIP contributed talk, or plenary talk, or invited talk.” But what do you think is the highest honor that QIP can bestow? Well, it’s obvious isn’t it? It’s the after-dinner speech! That’s the talk to rule them all. So Graeme told me, when he invited me to do this. And I checked, Gottesman put it on his website, and everyone knows Daniel is a very serious guy. So it must be important. Look, we’re having a banquet in honor of his birthday, and he can hardly crack a smile!

I hear the snickers. I know what you’re thinking. “John, wake up. Don’t you see what Graeme was trying to tell you: You’re too washed up to get a talk accepted to QIP! This is the only way to get you on the program now!” But no, you’re wrong. Graeme told me this is a great honor. And I trust Graeme. He’s an honest man. What? Why are you laughing? It’s true.

I asked Graeme, what should I talk about? He said, “Well, you might try to be funny.” I said, “What do you mean funny? You mean funny Ha Ha? Or do you mean funny the way cheese smells when it’s been in the fridge for too long?” He said, “No I mean really, really funny. You know, like Scott.”

So there it was, the gauntlet had been thrown. Some of you are too young to remember this, but the most notorious QIP after-dinner speech of them all was Scott Aaronson’s in Paris in 2006. Were you there? He used props, and he skewered his more senior colleagues with razor-sharp impressions. And remember, this was 2006, so everybody was Scott’s more senior colleague. He was 12 at the time, if memory serves.

He killed. Even I appreciated some of the jokes; for example, as a physicist I could understand this one: Scott said, “I don’t care about the fine structure constant, it’s just a constant.” Ba ding! So Scott set the standard, and though many have aspired to clear the bar since, few have come close.

But remember, this was Graeme I was talking to. And I guess many of you know that I’ve had a lot of students through the years, and I’m proud of all of them. But my memory isn’t what it once was; I need to use mnemonic tricks to keep track of them now. So I have a rating system: I rate them according to how funny they are. And Graeme is practically off the chart, that’s how funny he is. But his is what I call stealth humor. You can’t always tell that he’s being funny, but you assume it.

So I said, “Graeme, what’s the secret? Teach me how to be funny.” I meant it sincerely, and he responded sympathetically. Graeme said, “Well, if you want to be funny, you have to believe you are funny. So when I want to be funny, I think of someone who is funny, and I pretend to be that person.” I said, “Aha, so you go out there and pretend to be Graeme Smith?” And Graeme said, “No, that wouldn’t work for me. I close my eyes and pretend I’m … John Smolin!” I said, “Graeme, you mean you want me to be indistinguishable from John Smolin to an audience of computationally bounded quantum adversaries?” He nodded. “But Graeme, I don’t know any plausible cryptographic assumptions under which that’s possible!”

Fortunately, I had another idea. “I write poems,” I said. “What if I recite a poem? This would set a great precedent. From now on, everyone would know: the QIP after-dinner speech will be a poetry slam!”

Graeme replied, “Well, that sounds [long pause] really [pause] boring. But how about a limerick? People love limericks.” I objected, “Graeme, I don’t do limericks. I’m not good at limericks.” But he wouldn’t back down. “Try a limerick,” Graeme said. “People like limericks. They’re so [pause] short.”

But I don’t do limericks. You see:

I was invited to speak here by Graeme.
He knows me well, just as I am.
He was really quite nice
When he gave this advice:
Please don’t do a poetry slam.

Well, like I said, I don’t do limericks.

So now I’m starting to wonder: Why did they invite me to do this anyway? I think I’ve figured it out. See, Graeme asked me to speak just a few days ago. This must be what happened: like any smoothly functioning organizing committee, they lined up an after-dinner speaker months in advance, as is the usual practice.

But then, just a few days before the conference began, they began to worry. “We better comb through the speaker’s Twitter feed. Maybe, years ago, our speaker said something offensive, something disqualifying.” And guess what? They found something, something really bad. It turned out that the designated after-dinner speaker had once made a deeply offensive remark about something called “quantum supremacy” … No, wait … that can’t be it.

Can’t you picture the panicky meeting of the organizers? QIP is about to start, and there’s no after-dinner speaker! So people started throwing out suggestions, starting with the usual suspects.

“How about Schroedinger’s Rat?”
“No, he’s booked.”
“Are you telling me Schroedinger’s Rat has another gig that same night?”
“No, no, I mean they booked him. A high-profile journal filed a complaint and he’s in the slammer.”
“Well, how about RogueQIPConference?”
“No, same problem.”

“I’ve got it,” someone says: “How about the hottest quantum Twitter account out there? Yes, I’m talking about Quantum Computing Memes for QMA-Complete Teens!”

Are you all following that account? You should be. That’s where I go for all the latest fast-breaking quantum news. And that’s where you can get advice about what a quantumist should wear on Halloween. Your costume should combine Sexy with your greatest fear. Right, I mean Sexy P = BQP.

Hey, does that worry you? That maybe P = BQP? Does it keep you up at night? It’s possible, isn’t it? But it doesn’t worry me much. If it turns out that P = BQP, I’m just going to make up another word. How about NISP? Noisy Intermediate-Scale Polynomial.

I guess they weren’t able to smoke out whoever is behind Quantum Computing Memes for QMA-Complete Teens. So here I am.

Aside from limericks, Graeme had another suggestion. He said, “You can reminisce. Tell us what QIP was like in the old days.” “The old days?” I said. “Yes, you know. You could be one of those stooped-over white-haired old men who tell interminable stories that nobody cares about.” I hesitated. “Yeah, I think I could do that.”

Okay, if that’s what you want, I’ll tell a story about my first QIP; that was QIP 2000, which was actually held in Montreal in December 1999. It was back in the BPC era — Before Program Committee — and I was an invited speaker (I talked about decoding the toric code). Attending with me was Michael Nielsen, then a Caltech postdoc. Michael’s good friend Ike Chuang was also in the hotel, and they were in adjacent rooms. Both had brought laptops (not a given in 1999), and they wanted to share files. Well, hotels did not routinely offer Internet access back then, and certainly not wireless. But Ike had brought along a spool of Ethernet cable. So Ike and Mike both opened their windows, even though it was freezing cold. And Ike leaned out his window and made repeated attempts to toss the cable through Michael’s window before he finally succeeded, and they connected their computers.

I demanded to know: why the urgent need for a connection? And that was the day I found out what most of the rest of the quantum world already knew: Mike and Ike were writing a book! By then they were in the final stages of writing, after some four years of effort (they sent the final draft of the book off to Cambridge University Press the following June).

So, QIP really has changed. The Mike and Ike book is out now. And it’s no longer necessary to open your window on a frigid Montreal evening to share a file with your collaborator.

Boy, it was cold that week in Montreal. [How cold was it?] Well, we went to lunch one day during the conference, and we were walking single file down a narrow sidewalk toward the restaurant, when Harry Buhrman, who was right behind me, said: “John, there’s an icicle on your backpack!” You see, I hadn’t screwed the cap all the way shut on my water bottle; water was leaking out of the bottle, soaking through the backpack, and immediately freezing on contact with the air; hence the icicle. And ever since then I’ve always been sure to screw my bottle cap shut tight. But over the years since then, lots of other things have spilled in my backpack just the same, and I’d love to tell you about that, but …

Well, my stories may be too lacking in drama to carry the evening…. Look, I don’t care what Graeme says, I’m gonna recite some poems!

I can’t remember how this got started, but some years ago I started writing a poem whenever I needed to introduce a speaker at the Caltech physics colloquium. I don’t do this so much anymore. Partly because I realized that my poetry might reveal my disturbing innermost thoughts, which are best kept private.

Actually, one of my colleagues, after hearing one of my poems, suggested throwing the poem into a black hole. And when we tried it … boom … it bounced right back, but in a highly scrambled form! And ever since then I’ve had that excuse. If someone says “That’s not such a great poem,” I can shoot back, “Yeah, but it was better before it got scrambled.”

But anyway, here’s one I wrote to honor Ben Schumacher, the pioneer of quantum information theory who named the qubit, and whose compression theorem you all know well.

Ben.
He rocks.
I remember
When
He showed me how to fit
A qubit
In a small box.

I wonder how it feels
To be compressed.
And then to pass
A fidelity test.
Or does it feel
At all, and if it does
Would I squeal
Or be just as I was?

If not undone
I’d become as I’d begun
And write a memorandum
On being random.
Had it felt like a belt
Of rum?

And might it be predicted
That I’d become addicted,
Longing for my session
Of compression?

I’d crawl
To Ben again.
And call,
“Put down your pen!
Don’t stall!
Make me small!”

[Silence]

Yeah, that’s the response I usually get when I recite this poem — embarrassed silence, followed by a few nervous titters.

So, as you can see from Ben Schumacher’s case, I use poetry to acknowledge our debt to the guiding intellects of our discipline. It doesn’t always work, though. I once tried to write a poem about someone I admire very much, Daniel Gottesman, and it started like this:

When the weather’s hottest, then
I call for Daniel Gottesman.
My apples are less spotted when
Daniel eats the rottenest ten …

It just wasn’t working, so I stopped there. Someday, I’ll go back and finish it. But it’s tough to rhyme “Gottesman.”

More apropos of QIP, some of you may recall that about 12 years ago, one of the hot topics was quantum speedups for formula evaluation, a subject ignited by a brilliant paper by Eddie Farhi, Jeffrey Goldstone, and Sam Gutmann. They showed there’s a polynomial speedup if we use a quantum computer to, say, determine whether a two-player game has a winning strategy. That breakthrough inspired me to write an homage to Eddie, which went:

We’re very sorry, Eddie Farhi
Your algorithm’s quantum.
Can’t run it on those mean machines
Until we’ve actually got ‘em.

You’re not alone, so go on home,
Tell Jeffrey and tell Sam:
Come up with something classical
Or else it’s just a scam.

Unless … you think it’s on the brink
A quantum-cal device.
That solves a game and brings you fame.
Damn! That would be nice!

Now, one thing that Graeme explained to me is that the white-haired-old-man talk has a mandatory feature: It must go on too long. Maybe I have met that criterion by now. Except …

There’s one thing Graeme neglected to say. He never told me that I must not sing at QIP.

You see, there’s a problem: Tragically, though I like to sing, I don’t sing very well at all. And unfortunately, I am totally unaware of this fact. So I sometimes sing in public, despite strongly worded advice not to do so.

When I was about to leave home on my way to QIP, my wife Roberta asked me, “When are you going to prepare your after-dinner talk?” I said, “Well, I guess I’ll work on it on the plane.” She said, “LA to Denver, that’s not a long enough flight.” I said, “I know!”

What I didn’t say is that I was thinking of singing a song. If I had, Roberta would have tried to stop me from boarding the plane.

So I guess it’s up to you. What do you think? Should we stop here while I’m (sort of) ahead, or should we take the plunge? Song or no song? How many say song?

All right, that’s good enough for me! This is a song that I usually perform in front of a full orchestra, and I hoped the Denver Symphony Orchestra would be here to back me up. But it turns out they don’t exist anymore. So I’ll just have to do my best.

If you are a fan of Rodgers and Hammerstein, you’ll recognize the tune as a butchered version of “Some Enchanted Evening,” but the lyrics have changed. This song is called “One Entangled Evening.”

One entangled evening
We will see a qubit
And another qubit
Across a crowded lab.

And somehow we’ll know
We’ll know even then
This qubit’s entangled
Aligned with its friend.

One entangled evening
We’ll cool down a circuit
See if we can work it
At twenty milli-K.

A circuit that cold
Is worth more than gold
For qubits within it
Will do as they’re told.

Quantum’s inviting, just as Feynman knew.
The future’s exciting, if we see it through.

One entangled evening
Anyons will be braiding
And thereby evading
The noise that haunts the lab.

Then our quantum goods
Will work as they should
Solving the problems
No old gadget could!

Once we have dreamt it, we can make it so.
Once we have dreamt it, we can make it so!

The song lyrics are meant to be uplifting, and I admit they’re corny. No one can promise you that, in the words of another song, “the dreams that you dare to dream really do come true.” That’s not always the case.

At this time in the field of quantum information processing, there are very big dreams, and many of us worry about unrealistic expectations concerning the time scale for quantum computing to have a transformative impact on society. Progress will be incremental. New technology does not change the world all at once; it’s a gradual process.

But I do feel that from the perspective of the broad sweep of history, we (the QIP community and the broader quantum community) are very privileged to be working in this field at a pivotal time in the history of science and technology on earth. We should deeply cherish that good fortune, and the opportunities it affords. I’m confident that great discoveries lie ahead for us.

It’s been a great privilege for me to be a part of a thriving quantum community for more than 20 years. By now, QIP has become one of our venerable traditions, and I hope it continues to flourish for many years to come. Now it’s up to all of you to make our quantum dreams come true. We are on a great intellectual adventure. Let’s savor it and enjoy it to the hilt!

Thanks for putting up with me tonight.

[And here’s proof that I really did sing.]