# Interaction + Entanglement = Efficient Proofs of Halting

A couple of weeks ago my co-authors Zhengfeng Ji (UTS Sydney), Henry Yuen (University of Toronto), Anand Natarajan and John Wright (both at Caltech’s IQIM, with John soon moving to UT Austin) and I posted a manuscript on the arXiv preprint server entitled

MIP*=RE

The magic of the single-letter formula quickly had its effect, and our posting received some attention on the blogosphere (see links below). Within computer science, complexity theory is at an advantage in its ability to capture powerful statements in few letters: who has not heard of P, NP, and, for readers of this blog, BQP and QMA? (In contrast, I am under no illusion that my vague attempt at a more descriptive title has, by the time you reach this line, all but vanished from the reader’s memory.)

Even accounting for this popularity however, it is a safe bet that fewer of our readers have heard of MIP* or RE. Yet we are promised that the above-stated equality has great consequences for physics (“Tsirelson’s problem” in the study of nonlocality) and mathematics (“Connes’ embedding problem” in the theory of von Neumann algebras). How so — how can complexity-theoretic alphabet soup have any consequence for, on the one hand, physical reality, and on the other, abstract mathematics?

The goal of this post and the next one is to help the interested reader grasp the significance of interactive proofs (that lie between the symbols MIP*) and undecidability (that lies behind RE) for quantum mechanics.

The bulk of the present post is an almost identical copy of a post I wrote for my personal blog. To avoid accusations of self-plagiarism, I will supplement it with a little picture and a story; see below. The post gives a very personal take on the research that led to the aforementioned result. In the next post, my co-author Henry Yuen has offered to give a more scientific introduction to the result and its significance.

Before proceeding, it is important to make it clear that the research described in this post and the next has not been refereed or thoroughly vetted by the community. This process will take place over the coming months, and we should wait until it is completed before placing too much weight on the results. As an author, I am proud of my work; yet I am aware that there is due process to be followed before the claims can be made official. As such, these posts only represent my opinion (and Henry’s) and not necessarily that of the wider scientific community.

For more popular introductions to our result, see the blog posts of Scott Aaronson, Dick Lipton, and Gil Kalai and reporting by Davide Castelvecchi for Nature and Emily Conover for Science.

Now for the personal post…and the promised picture. Isn’t it beautiful? The design is courtesy of Tony Metger and Alexandru Gheorghiu, the first a visiting student and the second a postdoctoral scholar at Caltech’s IQIM. While Tony and Andru came up with the idea, the execution is courtesy of the bakery employee, who graciously implemented the custom design (apparently writing equations on top of cakes is not common enough to be part of the standard offerings, so they had to go for the custom option). Although it is unclear if the executioner grasped the full depth of the signs they were copying, note how perfect the execution is: not a single letter is out of place! Thanks to Tony, Andru, and the anonymous chef for the tasty souvenir.

Now for the story. In an earlier post on my personal research blog, I had reported on the beautiful recent result by Natarajan and Wright showing the astounding power of multi-prover interactive proofs with quantum provers sharing entanglement: in letters, $\text{NEEXP} \subseteq \text{MIP}^\star$. In the remainder of this post I will describe our follow-up work with Ji, Natarajan, Wright, and Yuen. In this post I will tell the story from a personal point of view, with all the caveats that this implies: the “hard science” will be limited (but there could be a hint as to how “science”, to use a big word, “progresses”, to use an ill-defined one; see also the upcoming post by Henry Yuen for more), the story is far too long, and it might be mostly of interest to me only. It’s a one-sided story, but that has to be. (In particular below I may at times attribute credit in the form “X had this idea”. This is my recollection only, and it is likely to be inaccurate. Certainly I am ignoring a lot of important threads.) I wrote this because I enjoyed recollecting some of the best moments in the story just as much as some of the hardest; it is fun to look back and find meanings in ideas that initially appeared disconnected. Think of it as an example of how different lines of work can come together in unexpected ways; a case for open-ended research. It’s also an antidote against despair that I am preparing for myself: whenever I feel I’ve been stuck on a project for far too long, I’ll come back to this post and ask myself if it’s been 14 years yet — if not, then press on.

It likely comes as a surprise to me only that I am no longer fresh out of the cradle. My academic life started in earnest some 14 years ago, when in the Spring of 2006 I completed my Masters thesis in Computer Science under the supervision of Julia Kempe, at Orsay in France. I had met Julia the previous term: her class on quantum computing was, by far, the best-taught and most exciting course in the Masters program I was attending, and she had gotten me instantly hooked. Julia agreed to supervise my thesis, and suggested that I look into an interesting recent result by Stephanie Wehner that linked the study of entanglement and nonlocality in quantum mechanics to complexity-theoretic questions about interactive proof systems (specifically, this was Stephanie’s paper showing that $\text{XOR-MIP}^\star \subseteq \text{QIP}(2)$).

At the time the topic was very new. It had been initiated the previous year with a beautiful paper by Cleve et al. (that I have recommended to many a student since!) It was a perfect fit for me: the mathematical aspects of complexity theory and quantum computing connected to my undergraduate background, while the relative concreteness of quantum mechanics (it is a physical theory after all) spoke to my desire for real-world connection (not “impact” or even “application” — just “connection”). Once I got myself up to speed in the area (which consisted of three papers: the two I already mentioned, together with a paper by Kobayashi and Matsumoto where they studied interactive proofs with quantum messages), Julia suggested looking into the “entangled-prover” class $\text{MIP}^\star$ introduced in the aforementioned paper by Cleve et al. Nothing was known about this class! Nothing besides the trivial inclusion of single-prover interactive proofs, IP, and the containment in…ALL, the trivial class that contains all languages.

Yet the characterization MIP=NEXP of its classical counterpart by Babai et al. in the 1990s had led to one of the most productive lines of work in complexity of the past few decades, through the PCP theorem and its use from hardness of approximation to efficient cryptographic schemes. Surely, studying $\text{MIP}^\star$ had to be a productive direction? In spite of its well-established connection to classical complexity theory, via the formalism of interactive proofs, this was a real gamble. The study of entanglement from the complexity-theoretic perspective was entirely new, and bound to be fraught with difficulty; very few results were available and the existing lines of work, from the foundations of non-locality to more recent endeavors in device-independent cryptography, provided little other starting point than strong evidence that even the simplest examples came with many unanswered questions. But my mentor was fearless, and far from a novice in terms of breaking ground in new areas, having done pioneering work in areas ranging from quantum random walks to Hamiltonian complexity through adiabatic computation. Surely this would lead to something?

It certainly did. More sleepless nights than papers, clearly, but then the opposite would only indicate dullness. Julia’s question led to far more unexpected consequences than I, or I believe she, could have imagined at the time. I am writing this post to celebrate, in a personal way, the latest step in 15 years of research by dozens of researchers: today my co-authors and I uploaded to the quant-ph arXiv what we consider a complete characterization of the power of entangled-prover interactive proof systems by proving the equality $\text{MIP}^\star = \text{RE}$, the class of all recursively enumerable languages (a complete problem for RE is the halting problem). Without going too much into the result itself (if you’re interested, look for an upcoming post here that goes into the proof a bit more), and since this is a more personal post, I will continue on with some personal thoughts about the path that got us there.

When Julia & I started working on the question, our main source of inspiration was the work by Cleve et al. showing that the non-local correlations of entanglement had interesting consequences when seen through the lens of interactive proof systems in complexity theory. Since the EPR paper, a lot of work in understanding entanglement had already been accomplished in the Physics community, most notably by Bell, Mermin, and Peres, and more recently the works in device-independent quantum cryptography by Acin, Pironio, Scarani and many others, stimulated by Ekert’s proposal for quantum key distribution and Mayers and Yao’s idea for “device-independent cryptography”. By then we certainly knew that “spooky action-at-a-distance” did not entail any faster-than-light communication, and indeed was not really “action-at-a-distance” in the first place but merely “correlation-at-a-distance”. What Cleve et al. recognized is that these “spooky correlations-at-a-distance” were sufficiently special so as to not only give numerically different values in “Bell inequalities”, the tool invented by Bell to evidence non-locality in quantum mechanics, but also have some potentially profound consequences in complexity theory.

In particular, examples such as the “Magic Square game” demonstrated that enough correlation could be gained from entanglement so as to defeat basic proof systems whose soundness relied only on the absence of communication between the provers, an assumption that until then had been wrongly equated with the assumption that any computation performed by the provers could be modeled entirely locally. I think that the fallacy of this implicit assumption came as a surprise to complexity theorists, who may still not have entirely internalized it. Yet the perfect quantum strategy for the Magic Square game provides a very concrete “counter-example” to the soundness of the “clause-vs-variable” game for 3SAT. Indeed this game, a reformulation by Aravind and Cleve-Mermin of a Bell Inequality discovered by Mermin and Peres in 1990, can be easily re-framed as a 3SAT system of equations that is not satisfiable, and yet is such that the associated two-player clause-vs-variable game has a perfect quantum strategy. It is this observation, made in the paper by Cleve et al., that gave the first strong hint that the use of entanglement in interactive proof systems could make many classical results in the area go awry.
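To make the obstruction concrete, here is a minimal brute-force check (my own illustration in Python, not taken from any of the papers) of why the Magic Square is classically unsatisfiable: with nine $\pm 1$ entries, each row required to multiply to $+1$ and each column to $-1$, the product of all six constraints forces the product of all entries to be both $+1$ and $-1$. So at least one constraint must fail for any classical assignment, and yet the associated two-prover game has a perfect quantum strategy.

```python
from itertools import product

# The Mermin-Peres magic square as a classical constraint system:
# a 3x3 grid of +/-1 entries; each of the 3 rows must multiply to +1,
# each of the 3 columns to -1 (6 parity constraints in total).
def satisfied(grid):
    rows = sum(grid[r][0] * grid[r][1] * grid[r][2] == +1 for r in range(3))
    cols = sum(grid[0][c] * grid[1][c] * grid[2][c] == -1 for c in range(3))
    return rows + cols

# Exhaustive search over all 2^9 classical assignments.
best = 0
for bits in product([-1, +1], repeat=9):
    grid = [bits[0:3], bits[3:6], bits[6:9]]
    best = max(best, satisfied(grid))

print(best)  # 5: one of the six constraints always fails classically
```

The search confirms that 5 of the 6 constraints is the classical optimum, which is exactly the gap that entangled provers close in the game version.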

By importing the study of non-locality into complexity theory Cleve et al. immediately brought it into the realm of asymptotic analysis. Complexity theorists don’t study fixed objects, they study families of objects that tend to have a uniform underlying structure and whose interesting properties manifest themselves “in the limit”. As a result of this new perspective, focus shifted from the study of single games or correlations to infinite families thereof. Some of the early successes of this translation include the “unbounded violations” that arose from translating asymptotic separations in communication complexity to the language of Bell inequalities and correlations (e.g. this paper). These early successes attracted the attention of some physicists working in foundations as well as some mathematical physicists, leading to a productive exploration that combined tools from quantum information, functional analysis and complexity theory.

The initial observations made by Cleve et al. had pointed to $\text{MIP}^\star$ as a possibly interesting complexity class to study. Rather amazingly, nothing was known about it! They had shown that under strong restrictions on the verifier’s predicate (it should be an XOR of two answer bits), a collapse took place: by the work of Håstad, XOR-MIP equals NEXP, but $\text{XOR-MIP}^\star$ is included in EXP. This seemed very fortuitous (the inclusion is proved via a connection with semidefinite programming that seems tied to the structure of XOR-MIP protocols): could entanglement induce a collapse of the entire, unrestricted class? We thought (at this point mostly Julia thought, because I had no clue) that this ought not to be the case, and so we set ourselves to show that the equality $\text{MIP}^\star=\text{NEXP}$, which would directly parallel Babai et al.’s characterization MIP=NEXP, holds. We tried to show this by introducing techniques to “immunize” games against entanglement: modify an interactive proof system so that its structure makes it “resistant” to the kind of “nonlocal powers” that can be used to defeat the clause-vs-variable game (witness the Magic Square). This was partially successful, and led to one of the papers I am most proud of — I am proud of it because I think it introduced elementary techniques (such as the use of the Cauchy-Schwarz inequality — inside joke — more seriously, basic things such as “prover-switching”, “commutation tests”, etc.) that are now routine manipulations in the area. The paper was a hard sell! It’s good to remember the first rejections we received. They were not unjustified: the main point of criticism was that we were only able to establish a hardness result for exponentially small completeness-soundness gap. A result for such a small gap in the classical setting follows directly from a very elementary analysis based on the Cook-Levin theorem.
So then why did we have to write so many pages (and so many applications of Cauchy-Schwarz!) to arrive at basically the same result (with a $^\star$)?

Eventually we got lucky and the paper was accepted to a conference. But the real problem, of establishing any non-trivial lower bound on the class $\text{MIP}^\star$ with constant (or, in the absence of any parallel repetition theorem, inverse-polynomial) completeness-soundness gap, remained. By that time I had transitioned from a Masters student in France to a graduate student in Berkeley, and the problem (pre-)occupied me during some of the most difficult years of my Ph.D. I distinctly remember spending my first year entirely thinking about this (oh and sure, that systems class I had to pass to satisfy the Berkeley requirements), and then my second year — yet, getting nowhere. (I checked the arXiv to make sure I’m not making this up: two full years, no posts.) I am forever grateful to my fellow student Anindya De for having taken me out of the cycle of torture by knocking on my door with one of the most interesting questions I have studied, which led me into quantum cryptography and quickly resulted in an enjoyable paper. It was good to feel productive again! (Though the paper had fun reactions as well: after putting it on the arXiv we quickly heard from experts in the area that we had solved an irrelevant problem, and that we had better learn about information theory — which we did, eventually leading to another paper, etc.) The project had distracted me and I set interactive proofs aside; clearly, I was stuck.

About a year later I visited IQC in Waterloo. I don’t remember in what context the visit took place. What I do remember is a meeting in the office of Tsuyoshi Ito, at the time a postdoctoral scholar at IQC. Tsuyoshi asked me to explain our result with Julia. He then asked a very pointed question: the bedrock for the classical analysis of interactive proof systems is the “linearity test” of Blum-Luby-Rubinfeld (BLR). Is there any sense in which we could devise a quantum version of that test?

What a question! This was great. At first it seemed fruitless: in what sense could one argue that quantum provers apply a “linear function”? Sure, quantum mechanics is linear, but that is beside the point. The linearity is a property of the prover’s answers as a function of their question. So what to make of the quantum state, the inherent randomness, etc.?

It took us a few months to figure it out. Once we got there however, the answer was relatively simple — the prover should be making a question-independent measurement that returns a linear function that it applies to its question in order to obtain the answer returned to the verifier — and it opened the path to our subsequent paper showing that the inclusion of NEXP in $\text{MIP}^\star$ indeed holds. Tsuyoshi’s question about linearity testing had allowed us to make the connection with PCP techniques; from there to MIP=NEXP there was only one step to make, which is to analyze multi-linearity testing. That step was suggested by my Ph.D. advisor, Umesh Vazirani, who was well aware of the many pathways towards the classical PCP theorem, since the theorem had been obtained in great part by his former student Sanjeev Arora. It took a lot of technical work, yet conceptually a single question from my co-author had sufficed to take me out of a 3-year slumber.
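For readers unfamiliar with the classical test Tsuyoshi was referring to, BLR is simple to state and simulate. The sketch below (my own illustration, not code from any of the papers) queries a function $f : \{0,1\}^n \to \{0,1\}$ at random $x$, $y$, and $x \oplus y$, and accepts iff $f(x) \oplus f(y) = f(x \oplus y)$; linear functions always pass, while functions far from linear are rejected with constant probability.

```python
import random

# The Blum-Luby-Rubinfeld linearity test over GF(2)^n:
# pick random x, y, query f at x, y, and x XOR y, and accept iff
# f(x) + f(y) = f(x + y) (addition mod 2). Returns the empirical
# acceptance rate over many random trials.
def blr_test(f, n, trials=1000, seed=0):
    rng = random.Random(seed)
    passed = 0
    for _ in range(trials):
        x = rng.getrandbits(n)
        y = rng.getrandbits(n)
        if f(x) ^ f(y) == f(x ^ y):
            passed += 1
    return passed / trials

n = 8
# Parity of a fixed bit-mask of the input: a GF(2)-linear function.
linear = lambda x: bin(x & 0b10110001).count("1") % 2
# AND of the two lowest bits: 1/4-far from every linear function.
nonlin = lambda x: (x & (x >> 1)) & 1

print(blr_test(linear, n))  # 1.0: a linear function always passes
print(blr_test(nonlin, n))  # roughly 5/8: the test rejects often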

This was in 2012, and I thought we were done. For some reason the converse inclusion, of $\text{MIP}^\star$ in NEXP, seemed to resist our efforts, but surely it couldn’t resist much longer. Navascues et al. had introduced a hierarchy of semidefinite programs that seemed to give the right answer (technically they could only show convergence to a relaxation, the commuting value, but that seemed like a technicality; in particular, the values coincide when restricted to finite-dimensional strategies, which is all we computer scientists cared about). There were no convergence bounds on the hierarchy, yet at the same time commutative SDP hierarchies were being used to obtain very strong results in combinatorial optimization, and it seemed like it would only be a matter of time before someone came up with an analysis of the quantum case. (I had been trying to solve a related “dimension reduction problem” with Oded Regev for years, and we were making no progress; yet it seemed someone ought to!)

In Spring 2014 during an open questions session at a workshop at the Simons Institute in Berkeley Dorit Aharonov suggested that I ask the question of the possible inclusion of QMA-EXP, the exponential-sized-proofs analogue of QMA, in $\text{MIP}^\star$. Being a stronger result than the inclusion of NEXP (under assumptions), wouldn’t it be a more natural “fully quantum” analogue of MIP=NEXP? Dorit’s suggestion was motivated by research on the “quantum PCP theorem”, which aims to establish similar hardness results in the realm of the local Hamiltonian problem; see e.g. this post for the connection. I had no idea how to approach the question — I also didn’t really believe the answer could be positive — but what can you do, if Dorit asks you something… So I reluctantly went to the board and asked the question. Joe Fitzsimons was in the audience, and he immediately picked it up! Joe had the fantastic idea of using quantum error-correction, or more specifically secret-sharing, to distribute a quantum proof among the provers. His enthusiasm overcame my skepticism, and we eventually showed the desired inclusion. Maybe $\text{MIP}^\star$ was bigger than $\text{NEXP}$ after all.

Our result, however, had a similar deficiency as the one with Julia, in that the completeness-soundness gap was exponentially small. Obtaining a result with a constant gap took a couple more years of work and the fantastic energy and insights of a Ph.D. student at MIT, Anand Natarajan. Anand is the first person I know of to have had the courage to dive into the most technical aspects of the analysis of the aforementioned results, while also bringing in the insights of a “true quantum information theorist” that were supported by Anand’s background in Physics and upbringing in the group of Aram Harrow at MIT. (In contrast I think of myself more as a “raw” mathematician; I don’t really understand quantum states other than as positive-semidefinite matrices…not that I understand math either of course; I suppose I’m some kind of a half-baked mish-mash.) Anand had many ideas but one of the most beautiful ones led to what he poetically called the “Pauli braiding test”, a “truly quantum” analogue of the BLR linearity test that amounts to doing two linearity tests in conjugate bases and piecing the results together into a robust test for $n$-qubit entanglement (I wrote about our work on this here).

At approximately the same time, Zhengfeng Ji had another wonderful idea that was in some sense orthogonal to our work. (My interpretation of) Zhengfeng’s idea is that one can see an interactive proof system as a computation (verifier-prover-verifier) and use Kitaev’s circuit-to-Hamiltonian construction to transform the entire computation into a “quantum CSP” (in the same sense that the local Hamiltonian problem is a quantum analogue of classical constraint satisfaction problems (CSP)) that could then itself be verified by a quantum multi-prover interactive proof system…with exponential gains in efficiency! Zhengfeng’s result implied an exponential improvement in complexity compared to the result by Julia and myself, showing inclusion of NEEXP, instead of NEXP, in $\text{MIP}^\star$. However, Zhengfeng’s technique suffered from the same exponentially small completeness-soundness gap as we had, so that the best lower bound on $\text{MIP}^\star$ per se remained NEXP.

Both works led to follow-ups. With Natarajan we promoted the Pauli braiding test into a “quantum low-degree test” that allowed us to show the inclusion of QMA-EXP into $\text{MIP}^\star$, with constant gap, thereby finally answering the question posed by Aharonov 4 years after it was asked. (I should also say that by then all results on $\text{MIP}^\star$ started relying on a sequence of parallel repetition results shown by Bavarian, Yuen, and others; I am skipping this part.) In parallel, with Ji, Fitzsimons, and Yuen we showed that Ji’s compression technique could be “iterated” an arbitrary number of times. In fact, by going back to “first principles” and representing verifiers uniformly as Turing machines we realized that the compression technique could be used iteratively to (up to small caveats) give a new proof of the fact (first shown by Slofstra using an embedding theorem for finitely presented groups) that the zero-gap version of $\text{MIP}^\star$ contains the halting problem. In particular, the entangled value is uncomputable! This was not the first time that uncomputability crops up in a natural problem in quantum computing (e.g. the spectral gap paper), yet it still surprises when it shows up. Uncomputable! How can anything be uncomputable!

As we were wrapping up our paper Henry Yuen realized that our “iterated compression of interactive proof systems” was likely optimal, in the following sense. Even a mild improvement of the technique, in the form of a slower closing of the completeness-soundness gap through compression, would yield a much stronger result: undecidability of the constant-gap class $\text{MIP}^\star$. It was already known by work of Navascues et al., Fritz, and others, that such a result would have consequences that, if not outright surprising, certainly seemed to take us out of our depth. In particular, undecidability of any language in $\text{MIP}^\star$ would imply a negative resolution to a series of equivalent conjectures in functional analysis, from Tsirelson’s problem to Connes’ Embedding Conjecture through Kirchberg’s QWEP conjecture. While we liked our result, I don’t think that we believed it could resolve any conjecture(s) in functional analysis.

So we moved on. At least I moved on, I did some cryptography for a change. But Anand Natarajan and his co-author John Wright did not stop there. They had the last major insight in this story, which underlies their recent STOC best paper described in the previous post. Briefly, they were able to combine the two lines of work, by Natarajan & myself on low-degree testing and by Ji et al. on compression, to obtain a compression that is specially tailored to the existing $\text{MIP}^\star$ protocol for NEXP and compresses that protocol without reducing its completeness-soundness gap. This then let them re-derive Ji’s result that $\text{MIP}^\star$ contains NEEXP, but this time with constant gap! The result received well-deserved attention. In particular, it is the first in this line of works to not suffer from any caveats (such as a closing gap, or randomized reductions, or some kind of “unfair” tweak on the model that one could attribute the gain in power to), and it implies an unconditional separation between MIP and $\text{MIP}^\star$.

As they were putting the last touches on their result, suddenly something happened, which is that a path towards a much bigger result opened up. What Natarajan & Wright had achieved is a one-step gapless compression. In our iterated compression paper we had observed that iterated gapless compression would lead to $\text{MIP}^\star=\text{RE}$, implying negative answers to the aforementioned conjectures. So then?

I suppose it took some more work, but in some way all the ideas had been laid out in the previous 15 years of work in the complexity of quantum interactive proof systems; we just had to put it together. And so a decade after the characterization QIP = PSPACE of single-prover quantum interactive proof systems, we have arrived at a characterization of quantum multiprover interactive proof systems, $\text{MIP}^\star = \text{RE}$. With one author in common between the two papers: congratulations Zhengfeng!

Even though we just posted a paper, in a sense there is much more left to do. I am hopeful that our complexity-theoretic result will attract enough interest from the mathematicians’ community, and especially operator algebraists, for whom CEP is a central problem, that some of them will be willing to devote time to understanding the result. I also recognize that much effort is needed on our own side to make it accessible in the first place! I don’t doubt that eventually complexity theory will not be needed to obtain the purely mathematical consequences; yet I am hopeful that some of the ideas may eventually find their way into the construction of interesting mathematical objects (such as, who knows, a non-hyperlinear group).

That was a good Masters project…thanks Julia!

# On the merits of flatworm reproduction

On my right sat a quantum engineer. She was facing a melanoma specialist who works at a medical school. Leftward of us sat a networks expert, a flatworm enthusiast, and a condensed-matter theorist.

Farther down sat a woman who slices up mouse brains.

Welcome to “Coherent Spins in Biology,” a conference that took place at the University of California, Los Angeles (UCLA) this past December. Two southern Californians organized the workshop: Clarice Aiello heads UCLA’s Quantum Biology Tech lab. Thorsten Ritz, of the University of California, Irvine, cofounded a branch of quantum biology.

Quantum biology served as the conference’s backdrop. According to conventional wisdom, quantum phenomena can’t influence biology significantly: Biological systems have high temperatures, many particles, and fluids. Quantum phenomena, such as entanglement (a relationship that quantum particles can share), die quickly under such conditions.

Yet perhaps some survive. Quantum biologists search for biological systems that might use quantum resources. Then, they model and measure the uses and resources. Three settings (at least) have held out promise during the past few decades: avian navigation, photosynthesis, and olfaction. You can read about them in this book, cowritten by a conference participant for the general public. I’ll give you a taste (or a possibly quantum smell?) by sketching the avian-navigation proposal, developed by Thorsten and colleagues.

Birds migrate southward during the autumn and northward during the spring. How do they know where to fly? At least partially by sensing the Earth’s magnetic field, which leads compass needles to point northward. How do birds sense the field?

Possibly with a protein called “cryptochrome.” A photon (a particle of light) could knock an electron out of part of the protein and into another part. Each part would have one electron that lacked a partner. The electrons would share entanglement. One electron would interact with the Earth’s magnetic field differently than its partner, because its surroundings would differ. (Experts: The electrons would form a radical pair. One electron would neighbor different atoms than the other, so the electron would experience a different local magnetic field. The discrepancy would change the relative phase between the electrons’ spins.) The discrepancy could affect the rate at which the chemical system could undergo certain reactions. Which reactions occur could snowball into larger and larger effects, eventually signaling the brain about where the bird should fly.

Quantum mechanics and life rank amongst the universe’s mysteries. How could a young researcher resist the combination? A postdoc warned me away, one lunchtime at the start of my PhD. Quantum biology had enjoyed attention several years earlier, he said, but noise had obscured the experimental data. Controversy marred the field.

I ate lunch with that postdoc in 2013. Interest in quantum biology is reviving, as evidenced by the conference. Two reasons suggested themselves: new technologies and new research avenues. For example, Thorsten described the disabling and deletion of genes that code for cryptochrome. Such studies require years’ more work but might illuminate whether cryptochrome affects navigation.

The keynote speaker, Harvard’s Misha Lukin, illustrated new technologies and new research avenues. Misha’s lab has diamonds that contain quantum defects, which serve as artificial atoms. The defects sense tiny magnetic fields and temperatures. Misha’s group applies these quantum sensors to biology problems.

For example, different cells in an embryo divide at different times. Imagine reversing the order in which the cells divide. Would the reversal harm the organism? You could find out by manipulating the temperatures in different parts of the embryo: Temperature controls the rate at which cells divide.

Misha’s team injected nanoscale diamonds into a worm embryo. (See this paper for a related study.) The diamonds reported the temperature at various points in the worm. This information guided experimentalists who heated the embryo with lasers.

The manipulated embryos grew into fairly normal adults. But their cells, and their descendants’ cells, cycled through the stages of life slowly. This study exemplified, to me, one of the most meaningful opportunities for quantum physicists interested in biology: to develop technologies and analyses that can answer biology questions.

I mentioned, in an earlier blog post, another avenue emerging in quantum biology: Physicist Matthew Fisher proposed a mechanism by which entanglement might enhance coordinated neuron firing. My collaborator Elizabeth Crosson and I analyzed how the molecules in Matthew’s proposal—Posner clusters—could process quantum information. The field of Posner quantum biology had a population of about two, when Elizabeth and I entered, and I wondered whether anyone would join us.

The conference helped resolve my uncertainty. Three speakers (including me) presented work based on Matthew’s; two other participants were tilling the Posner soil; and another speaker mentioned Matthew’s proposal. The other two Posner talks related data from three experiments. The experimentalists haven’t finished their papers, so I won’t share details. But stay tuned.

Posner molecule (image by Swift et al.)

Clarice and Thorsten’s conference reminded me of a conference I’d participated in at the end of my PhD: Last month, I moonlighted as a quantum biologist; in 2017, two years earlier, I’d moonlighted as a quantum-gravity theorist, dreaming about black holes and space-time. At UCLA, I was finishing the first paper I’ve coauthored with biophysicists. What a toolkit quantum information theory and thermodynamics provide, that they can unite such disparate fields.

The contrast—on top of what I learned at UCLA—filled my mind for weeks. And reminded me of the description of asexual reproduction that we heard from the conference’s flatworm enthusiast. According to Western Michigan University’s Wendy Beane, a flatworm “glues its butt down, pops its head off, and grows a new one. Y’know. As one does.”

I hope I never flinch from popping my head off and growing a new one—on my quantum-information-thermodynamics spine—whenever new science calls for figuring out.

With thanks to Clarice, Thorsten, and UCLA for their invitation and hospitality.

# An equation fit for a novel

Archana Kamal was hunting for an apartment in Cambridge, Massachusetts. She was moving to MIT, to work as a postdoc in physics. The first apartment she toured had housed John Updike, during his undergraduate career at Harvard. No other apartment could compete; Archana signed the lease.

The apartment occupied the basement of a red-brick building covered in vines. The rooms spanned no more than 350 square feet. Yet her window opened onto the neighbors’ garden, whose leaves she tracked across the seasons. And Archana cohabited with history.

She’s now studying the universe’s history, as an assistant professor of physics at the University of Massachusetts Lowell. The cosmic microwave background (CMB) pervades the universe. The CMB consists of electromagnetic radiation, or light. Light has particle-like properties and wavelike properties. The wavelike properties include wavelength, the distance between successive peaks. Long-wavelength light includes red light, infrared light, and radio waves. Short-wavelength light includes blue light, ultraviolet light, and X-rays. Light of one wavelength and light of another wavelength are said to belong to different modes.

Does the CMB have nonclassical properties, impossible to predict with classical physics but (perhaps) predictable with quantum theory? The CMB does, according to the theory of inflation. According to the theory, during a short time interval after the Big Bang, the universe expanded very quickly: Spacetime stretched. Inflation explains features of our universe, though we don’t know what mechanism would have effected the expansion.

According to inflation, around the Big Bang time, all the light in the universe crowded together. The photons (particles of light) interacted, entangling (developing strong quantum correlations). Spacetime then expanded, and the photons separated. But they might retain entanglement.

Detecting that putative entanglement poses challenges. For instance, the particles that you’d need to measure could produce a signal too weak to observe. Cosmologists have been scratching their heads about how to observe nonclassicality in the CMB. One team—Nishant Agarwal at UMass Lowell and Sarah Shandera at Pennsylvania State University—turned to Archana for help.

Archana studies the theory of open quantum systems, quantum systems that interact with their environments. She thinks most about systems such as superconducting qubits, tiny circuits with which labs are building quantum computers. But the visible universe constitutes an open quantum system.

We can see only part of the universe—or, rather, only part of what we believe is the whole universe. Why? We can see only stuff that’s emitted light that has reached us, and light has had only so long to travel. But the visible universe interacts (we believe) with stuff we haven’t seen. For instance, according to the theory of inflation, that rapid expansion stretched some light modes’ wavelengths. Those wavelengths grew longer than the visible universe. We can’t see those modes’ peak-to-peak variations or otherwise observe the modes, often called “frozen.” But the frozen modes act as an environment that exchanges information and energy with the visible universe.

We describe an open quantum system’s evolution with a quantum master equation, which I blogged about four-and-a-half years ago. Archana and collaborators constructed a quantum master equation for the visible universe. The frozen modes, they found, retain memories of the visible universe. (Experts: the bath is non-Markovian.) Next, they need to solve the equation. Then, they’ll try to use their solution to identify quantum observables that could reveal nonclassicality in the CMB.
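For readers meeting master equations for the first time, it may help to see the textbook point of reference (not the equation Archana’s team derived). For a memoryless (Markovian) environment, an open system’s state $\rho$ obeys the Lindblad master equation

$\frac{d\rho}{dt} = -\frac{i}{\hbar} [\hat{H}, \rho] + \sum_k \gamma_k \left( \hat{L}_k \rho \hat{L}_k^\dagger - \frac{1}{2} \{ \hat{L}_k^\dagger \hat{L}_k , \rho \} \right),$

where $\hat{H}$ is the system’s Hamiltonian and the operators $\hat{L}_k$ encode the environment’s kicks, occurring at rates $\gamma_k$. Archana and collaborators’ equation must generalize beyond this form, precisely because the frozen modes remember the visible universe.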

Frozen modes

Archana’s project caught my fancy for two reasons. First, when I visited her in October, I was collaborating on a related project. My coauthors and I were concocting a scheme for detecting nonclassical correlations in many-particle systems by measuring large-scale properties. Our paper debuted last month. It might—with thought and a dash of craziness—be applied to detect nonclassicality in the CMB. Archana’s explanation improved my understanding of our scheme’s potential.

Second, Archana and collaborators formulated a quantum master equation for the visible universe. A quantum master equation for the visible universe. The phrase sounded romantic to me.1 It merited a coauthor who’d seized on an apartment lived in by a Pulitzer Prize-winning novelist.

Archana’s cosmology and Updike stories reminded me of one reason why I appreciate living in the Boston area: History envelops us here. Last month, while walking to a grocery, I found a sign that marks the building in which the poet e. e. cummings was born. My walking partner then generously tolerated a recitation of cummings’s “anyone lived in a pretty how town.” History enriches our lives—and some of it might contain entanglement.

1It might sound like gobbledygook to you, if I’ve botched my explanations of the terminology.

With thanks to Archana and the UMass Lowell Department of Physics and Applied Physics for their hospitality and seminar invitation.

# Breaking up the band structure

Note from the editor: During the Summer of 2019, a group of thirteen undergraduate students from Caltech and universities around the world spent 10 weeks on campus performing research in experimental quantum physics. Below, Aiden Cullo, a student from Binghamton University in New York, shares his experience working in Professor Yeh’s lab. The program, termed QuantumSURF, will run again during the Summer of 2020.

This summer, I worked in Nai-Chang Yeh’s experimental condensed matter lab. The aim of my project was to observe the effects of a magnetic field on our topological insulator (TI) sample, ${(BiSb)}_2{Te}_3$. The motivation behind this project was to examine more closely the transformation between a topological insulator and a state exhibiting the anomalous Hall effect (AHE).

Both states of matter have garnered a good deal of interest in condensed matter research because of their interesting transport properties, among other things. TIs have gained popularity due to their applications in electronics (spintronics), superconductivity, and quantum computation. TIs are peculiar in that they simultaneously have insulating bulk states and conducting surface states. Due to time-reversal symmetry (TRS) and spin-momentum locking, these surface states have a very symmetric hourglass-like gapless energy band structure (Dirac cone).

The focus of our particular study was the effects of “c-plane” magnetization of our TI’s surface state. Theory predicts TRS and spin-momentum locking will be broken, resulting in a gapped spectrum with a single connection between the valence and conduction bands. This gapping has been theorized and shown experimentally in Chromium (Cr)-doped ${(BiSb)}_2{Te}_3$ and numerous other TIs with similar make-up.

In 2014, Nai-Chang Yeh’s group showed that Cr-doped ${Bi}_2{Se}_3$ exhibits this gap opening due to the surface state of ${Bi}_2{Se}_3$ interacting via the proximity effect with a ferromagnet. Our contention is that a similar material, Cr-doped ${(BiSb)}_2{Te}_3$, exhibits a similar effect, but more homogeneously because of reduced structural strain between atoms. Specifically, at temperatures below the Curie temperature (Tc), we expect to see a gap in the energy band and an overall increase in the gap magnitude. In short, the main goal of my summer project was to observe the gapping of our TI’s energy band.

Overall, my summer project entailed a combination of reading papers/textbooks and hands-on experimental work. It was difficult to understand fully the theory behind my project in such a short amount of time, but even with a cursory knowledge of topological insulators, I was able to provide a meaningful analysis/interpretation of our data.

Additionally, my experiment relied heavily on external factors such as our supplier for liquid helium, argon gas, etc. As a result, our progress was slowed if an order was delayed or not placed far enough in advance. Most of the issues we encountered were not related to the abstract theory of the materials/machinery, but rather problems with less complex mechanisms such as wiring, insulation, and temperature regulation.

While I expected to spend a good deal of time troubleshooting, I severely underestimated the amount of time that would be spent dealing with quotidian problems such as configuring software or etching STM tips. Working on a machine as powerful as an STM was frustrating at times, but also very rewarding as eventually we were able to collect a large amount of data on our samples.

An important (and extremely difficult) part of our analysis of STM data was whether patterns/features in our data set were artifacts or genuine phenomena, or a combination. I was fortunate enough to be surrounded by other researchers that helped me sift through the volumes of data and identify traits of our samples. Reflecting on my SURF, I believe it was a positive experience as it not only taught me a great deal about research, but also, more importantly, closely mimicked the experience of graduate school.

# The paper that begged for a theme song

A year ago, the “I’m a little teapot” song kept playing in my head.

I was finishing a collaboration with David Limmer, a theoretical chemist at the University of California Berkeley. David studies quantum and classical systems far from equilibrium, including how these systems exchange energy and information with their environments. Example systems include photoisomers.

A photoisomer is a molecular switch. These switches appear across nature and technologies. We have photoisomers in our eyes, and experimentalists have used photoisomers to boost solar-fuel storage. A photoisomer has two functional groups, or collections of bonded atoms, attached to a central axis.

Your average-Joe photoisomer spends much of its life in equilibrium, exchanging heat with room-temperature surroundings. The molecule has the shape above, called the cis configuration. Imagine shining a laser or sunlight on the photoisomer. The molecule can absorb a photon, or particle of light, gaining energy. The energized switch has the opportunity to switch: One chemical group can rotate downward. The molecule will occupy its trans configuration.

The molecule now has more energy than it had while in equilibrium, albeit less energy than it had right after absorbing the photon. The molecule can remain in this condition for a decent amount of time. (Experts: The molecule occupies a metastable state.) That is, the molecule can store sunlight. For that reason, experimentalists at Harvard and MIT attached photoisomers to graphene nanotubules, improving the nanotubules’ storage of solar fuel.

With what probability does a photoisomer switch upon absorbing a photon? This question has resisted easy answering, because photoisomers prove difficult to model: They’re small, quantum, and far from equilibrium. People have progressed by making assumptions, but such assumptions can lack justifications or violate physical principles. David wanted to derive a simple, general bound—of the sort in which thermodynamicists specialize—on a photoisomer’s switching probability.

He had a hunch as to how he could derive such a bound. I’ve blogged, many times, about thermodynamic resource theories. Thermodynamic resource theories are simple models, developed in quantum information theory, for exchanges of heat, particles, information, and more. These models involve few assumptions: the conservation of energy, quantum theory, and, to some extent, the existence of a large environment (Markovianity). With such a model, David suspected, he might derive his bound.

I knew nothing about photoisomers when I met David, but I knew about thermodynamic resource theories. I’d contributed to their development, to the theorems that have piled up in the resource-theory corner of quantum information theory. Then, the corner had given me claustrophobia. Those theorems felt so formal, abstract, and idealized. Formal, abstract theory has drawn me ever since I started studying physics in college. But did resource theories model physical reality? Could they impact science beyond our corner of quantum information theory? Did resource theories matter?

I called for connecting thermodynamic resource theories to physical reality four years ago, in a paper that begins with an embarrassing story about me. Resource theorists began designing experiments whose results should agree with our theorems. Theorists also tried to improve the accuracy with which resource theories model experimentalists’ limitations. See David’s and my paper for a list of these achievements. They delighted me, as a step toward the broadening of resource theories’ usefulness.

Like any first step, this step pointed toward opportunities. Experiments designed to test our theorems essentially test quantum mechanics. Scientists have tested quantum mechanics for decades; we needn’t test it much more. Such experimental proposals can push experimentalists to hone their abilities, but I hoped that the community could accomplish more. We should be able to apply resource theories to answer questions cultivated in other fields, such as condensed matter and chemistry. We should be useful to scientists outside our corner of quantum information.

David’s idea lit me up like photons on a solar-fuel-storage device. He taught me about photoisomers, I taught him about resource theories, and we derived his bound. Our proof relies on the “second laws of thermodynamics.” These abstract resource-theory results generalize the second law of thermodynamics, which helps us understand why time flows in only one direction. We checked our bound against numerical simulations (experts: of Lindbladian evolution). Our bound is fairly tight if the photoisomer has a low probability of absorbing a photon, as in the Harvard-MIT experiment.

Experts: We also quantified the photoisomer’s coherences relative to the energy eigenbasis. Coherences can’t boost the switching probability, we concluded. But, en route to this conclusion, we found that the molecule is a natural realization of a quantum clock. Our quantum-clock modeling extends to general dissipative Landau-Zener transitions, prevalent across condensed matter and chemistry.

As I worked on our paper one day, a jingle unfolded in my head. I recognized the tune first: “I’m a little teapot.” I hadn’t sung that much since kindergarten, I realized. Lyrics suggested themselves:

I’m a little isomer
with two hands.
Here is my cis pose;
here is my trans.

Stand me in the sunlight;
watch me spin.
I’ll keep solar
energy in!

The song lodged itself in my head for weeks. But if you have to pay an earworm to collaborate with David, do.

# Bas|ket>ball: A Game for Young Students Learning Quantum Computing

It is no secret that quantum computing has recently become one of the trendiest topics within the physics community, gaining financial support and good press at an ever increasing pace. The new technology not only promises huge advances in information processing, but it also – in theory – has the potential to crack the encryption that currently protects sensitive information inside governments and businesses around the world. Consequently, quantum research has extended beyond academic groups and has entered the technical industry, creating new job opportunities for both experimentalists and theorists. However, in order for this technology to become a reality, we need qualified engineers and scientists that can fill these positions.

Increasing the number of individuals with an interest in this field starts with educating our youth. While it does not take particularly advanced mathematics to explain the basics of quantum computing, there are still involved topics such as quantum superposition, unitary evolution, and projective measurement that can be difficult to conceptualize. In order to explain these topics at a middle and high school level to encourage more students to enter this area, we decided to design an educational game called Bas|ket>ball, which allows students to directly engage with quantum computing concepts outside of the classroom while being physically active.

After playing the game with students in our local Quantum Information High (QIHigh) Program at Stevens Institute of Technology, we realized that the game is a fun learning tool worth sharing with the broader physics community. Here, we describe a non-gender specific activity that can be used to effectively teach the basics of quantum computing at a high school level.

Quantum Basketball is something that helps you understand a very confusing topic, especially for a ninth grader! In the QIHigh Program at Stevens, I was approached with a challenge of learning about quantum computing, and while I was hesitant at first, my mentors made the topic so much more understandable by relating it to a sport that I love!

Grace Conlin, Freshman Student from High Tech High School

## The Rules of Bas|ket>ball

The game can have up to 10 student players and only requires one basketball. Each player acts as a quantum bit (qubit) in a quantum register and is initialized to the |0> position. During each turn, a player will perform one of the allowed quantum gates depending on their position on the court. A diagram of the court positions is displayed at the bottom.

There are four options of quantum gates from which players can choose to move around the court:

1. X Gate – This single qubit gate will take a player from the |0> to the |1> position, and vice versa.
2. Hadamard Gate – This single qubit gate will take a player from the |0> to the (|0> + |1>) / $\sqrt{2}$ position and the |1> to the (|0> - |1>) / $\sqrt{2}$ position, and vice versa.
3. Control-Not Gate – This two-qubit gate allows one player (the control) to act on another, but only if the control is in the |1> position or in a superposition of |0> and |1>. A control in the |1> position can move another player back and forth between the |0> and |1> positions. A control in superposition can choose to entangle with a player in the |0> position.
4. Z Measurement – The player takes a shot. The player measures a 1 if he/she makes the shot and measures a 0 if he/she misses. Once the player shoots, he/she has to return back to the |0> position no matter what was measured.

The first player to measure ten 1’s (make ten shots) wins! In order to make the game more interesting, the following additional rules are put in place:

1. Each player has one SWAP gate that can be utilized per game to switch positions with any other player, including players that are entangled. This is an effective way to replace yourself with someone in an entangled state.
2. Up to five players can be entangled at any given time. The only way to break the entanglement is to make a Z Measurement by taking a shot. If one of the entangled players makes a shot, each player entangled with that player receives a point value equal to the number of individuals they are entangled with (including themselves). If the player misses, the entanglement is broken and no points are awarded. Either way, all players go back to the |0> position.

## Example Bas|ket>ball Match

For example, let’s say that we have three student players. Each will start at the red marker, behind the basketball hoop. One by one, each student will choose from the list of gate operations above. If the first player chooses an X-gate, he/she will physically move to the blue marker and be in a better position to make a measurement (take a shot) during the next turn. If the second player chooses to perform a Hadamard gate, he/she will move to the green marker. Each of the students will continue to move around the court and score points by making measurements until 10 points are reached.

However, things can get more interesting when players start to form alliances by entangling. If player 1 is at the green marker and player 3 is at the red marker, then player 1 can perform a C-Not gate on player 3 to become entangled with them. Now if either player takes a shot and scores a point, both players will be awarded 2 points (1 x the number of players entangled).
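For students (or mentors) who want to connect the court positions to the underlying linear algebra, here is a minimal sketch in Python. The matrices are the standard quantum-computing gates; the marker colors come from the game. One hedge: in Bas|ket>ball a shot replaces the Born rule, whereas in real quantum mechanics the measurement probabilities come from the state’s amplitudes, as the last lines show.

```python
import numpy as np

# Court positions as quantum states (standard conventions).
ket0 = np.array([1, 0], dtype=complex)  # |0>: the red marker
ket1 = np.array([0, 1], dtype=complex)  # |1>: the blue marker

# The game's single-qubit gates.
X = np.array([[0, 1],
              [1, 0]], dtype=complex)                 # X gate: swaps |0> and |1>
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

# Player 1 applies an X gate and walks from the red to the blue marker.
state1 = X @ ket0
assert np.allclose(state1, ket1)

# Player 2 applies a Hadamard and stands at the green marker,
# the superposition (|0> + |1>)/sqrt(2).
state2 = H @ ket0

# Born rule: the probability of measuring 1 is |amplitude of |1>|^2.
p_one = abs(state2[1]) ** 2
print(round(p_one, 2))  # prints 0.5
```

So a player at the green marker who obeyed quantum mechanics, rather than basketball skill, would score half the time.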

We believe that simple games such as this, along with Quantum TiqTaqToe and Quantum Chess, will attract more young students to pursue degrees in physics and computer science, and eventually specialize in quantum information and computing fields. Not only is this important for the overall progression of the field, but also to encourage more diversity and inclusion in STEM.

# The quantum steampunker by Massachusetts Bay

Every spring, a portal opens between Waltham, Massachusetts and another universe.

The other universe has a Watch City dual to Waltham, known for its watch factories. The cities throw a festival to which explorers, inventors, and tourists flock. Top hats, goggles, leather vests, bustles, and lace-up boots dot the crowds. You can find pet octopodes, human-machine hybrids, and devices for bending space and time. Steam powers everything.

Watch City Steampunk Festival

So I learned thanks to Maxim Olshanyi, a professor of physics at the University of Massachusetts Boston. He hosted my colloquium, “Quantum steampunk: Quantum information meets thermodynamics,” earlier this month. Maxim, I discovered, has more steampunk experience than I. He digs up century-old designs for radios, builds the radios, and improves upon the designs. He exhibits his creations at the Watch City Steampunk Festival.

Maxim Olshanyi

I never would have guessed that Maxim moonlights with steampunkers. But his hobby makes sense: Maxim has transformed our understanding of quantum integrability.

Integrability is to thermalization as Watch City is to Waltham. A bowl of baked beans thermalizes when taken outside in Boston in October: Heat dissipates into the air. After half-an-hour, large-scale properties bear little imprint of their initial conditions: The beans could have begun at 112°F or 99°F or 120°F. Either way, the beans have cooled.

Integrable systems avoid thermalizing; more of their late-time properties reflect early times. Why? We can understand through an example, an integrable system whose particles don’t interact with each other (whose particles are noninteracting fermions). The dynamics conserve the particles’ momenta. Consider growing the system by adding particles. The number of conserved quantities grows as the system size. The conserved quantities retain memories of the initial conditions.

Imagine preparing an integrable system, analogously to preparing a bowl of baked beans, and letting it sit for a long time. Will the system equilibrate, or settle down to, a state predictable with a simple rule? We might expect not. Obeying the same simple rule would cause different integrable systems to come to resemble each other. Integrable systems seem unlikely to homogenize, since each system retains much information about its initial conditions.

Maxim and collaborators exploded this expectation. Integrable systems do relax to simple equilibrium states, which the physicists called the generalized Gibbs ensemble (GGE). Josiah Willard Gibbs cofounded statistical mechanics during the 1800s. He predicted the state to which nonintegrable systems, like baked beans in autumnal Boston, equilibrate. Gibbs’s theory governs classical systems, like baked beans, as does the GGE theory. But quantum systems also equilibrate to the GGE, and Gibbs’s conclusions translate into quantum theory with few adjustments. So I’ll explain in quantum terms.

Consider quantum baked beans that exchange heat with a temperature-$T$ environment. Let $\hat{H}$ denote the system’s Hamiltonian, which basically represents the beans’ energy. The beans equilibrate to a quantum Gibbs state, $e^{ - \hat{H} / ( k_{\rm B} T ) } / Z$. The $k_{\rm B}$ denotes Boltzmann’s constant, a fundamental constant of nature. The partition function $Z$ enables the quantum state to obey probability theory (normalizes the state).

Maxim and friends modeled their generalized Gibbs ensemble on the Gibbs state. Let $\hat{I}_m$ denote a quantum integrable system’s $m^{\rm th}$ conserved quantity. This system equilibrates to $e^{ - \sum_m \lambda_m \hat{I}_m } / Z_{\rm GGE}$. The $Z_{\rm GGE}$ normalizes the state. The intensive parameters $\lambda_m$ serve analogously to temperature and depend on the conserved quantities’ values. Maxim and friends predicted this state using information theory formalized by Ed Jaynes. Inventing the GGE, they unlocked a slew of predictions about integrable quantum systems.
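To make the two formulas concrete, here is a toy numerical sketch (all numbers hypothetical, and a two-level system stands in for the baked beans), assuming only NumPy and SciPy:

```python
import numpy as np
from scipy.linalg import expm

kT = 1.0                      # k_B * T in our energy units (hypothetical)

# Gibbs state e^{-H/(k_B T)} / Z for a toy two-level Hamiltonian.
H = np.diag([0.0, 1.0])       # energies of the two levels
rho_gibbs = expm(-H / kT)
rho_gibbs /= np.trace(rho_gibbs)   # dividing by Z normalizes the state

# Generalized Gibbs ensemble: one intensive parameter per conserved quantity.
I1 = np.diag([1.0, -1.0])     # a toy conserved quantity (commutes with H)
lam1 = 0.3                    # lambda_1, fixed by the initial value of <I1>
rho_gge = expm(-lam1 * I1)
rho_gge /= np.trace(rho_gge)       # Z_GGE normalizes, as above

# Both are valid quantum states (unit trace), and the Gibbs state
# weights the lower-energy level more heavily, as thermodynamics demands.
assert np.isclose(np.trace(rho_gibbs).real, 1.0)
assert rho_gibbs[0, 0].real > rho_gibbs[1, 1].real
```

A real integrable system has a number of $\hat{I}_m$’s growing with the system size; the exponent then carries one $\lambda_m \hat{I}_m$ term per conserved quantity, but the normalization works the same way.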

A radio built by Maxim. According to him, “The invention was to replace a diode with a diode bridge, in a crystal radio, thus gaining a factor of two in the output power.”

I define quantum steampunk as the intersection of quantum theory, especially quantum information theory, with thermodynamics, and the application of this intersection across science. Maxim has used information theory to cofound a branch of quantum statistical mechanics. Little wonder that he exhibits homemade radios at the Watch City Steampunk Festival. He also holds a license to drive steam engines and used to have my postdoc position. I appreciate having older cousins to look up to. Here’s hoping that I become half the quantum steampunker that I found by Massachusetts Bay.

With thanks to Maxim and the rest of the University of Massachusetts Boston Department of Physics for their hospitality.

The next Watch City Steampunk Festival takes place on May 9, 2020. Contact me if you’d attend a quantum-steampunk meetup!

# Sultana: The Girl Who Refused To Stop Learning

Caltech attracts some truly unique individuals from all across the globe with a passion for figuring things out. But there was one young woman on campus this past summer whose journey towards scientific research was uniquely inspiring.

Sultana spent the summer at Caltech in the SURF program, working on next generation quantum error correction codes under the supervision of Dr. John Preskill. As she wrapped up her summer project, returning to her “normal” undergraduate education in Arizona, I had the honor of helping her document her remarkable journey. This is her story:

Afghanistan

My name is Sultana. I was born in Afghanistan. For years I was discouraged, and outright prevented, from going to school by the war. It was not safe for me because of the active violence in the region, including suicide bombings. Society was still recovering from the decades-long civil war, the persistent influence of a dethroned, theocratically regressive regime, and the current non-functioning government. These forces combined to make for a very insecure environment for a woman. It was tacitly accepted that the only safe place for a woman was at home, staying quiet. Another consequence of these circumstances was that the teachers at local schools were all male and encouraged the girls not to come to school and study. What was the point if, at the end of the day, a woman’s destiny was to stay at home and cook?

For years, I would be up every day at 8am and every waking hour was devoted to housework and preparing the house to host guests, typically older women and my grandmother’s friends. I was destined to be a homemaker and mother. My life had no meaning outside of those roles.

My brothers would come home from school, excited about mathematics and other subjects. For them, it seemed like life was full of infinite possibilities. Meanwhile I had been confined to be behind the insurmountable walls of my family’s compound. All the possibilities for my life had been collapsed, limited to a single identity and purpose.

At fourteen I had had enough. I needed to find a way out of the mindless routine and depressing destiny. And more specifically, I wanted to understand how complex, and clearly powerful, human social systems, such as politics, economics and culture, combined to create overtly negative outcomes like imbalance and oppression. I made the decision to wake up two hours early every day to learn English, before taking on the day’s expected duties.

My grandfather had a saying, “If you know English, then you don’t have to worry about where the food is going to come from.”

He taught himself English and eventually became a professor of literature and humanities. He had even encouraged his five daughters to pursue advanced education. My aunts became medical doctors and chemists (one an engineer, another a teacher). My mother became a lecturer at a university, a profession she would be forced to leave when the Mujaheddin came to power.

I started by studying newspapers and any book I could get my hands on. My hunger for knowledge proved insatiable.

When my father got the internet, the floodgates of information opened. I found and took online courses through sites like Khan Academy and, later, Coursera.

I was intrigued by discussions between my brothers on mathematics. Countless pages of equations and calculations could propagate from a single, simple question; just like how a complex and towering tree can emerge from a single seed.

Khan Academy provided a superbly structured approach to learning mathematics from scratch. Most importantly, mathematics did not rely on a mastery of English as a prerequisite.

Over the next few years I consumed lesson after lesson, expanding my coursework into physics. I would supplement this unorthodox yet structured education with a more self-directed investigation into philosophy through books like Kant’s Critique of Pure Reason. While math and physics helped me develop confidence and ability, ultimately, I was still driven by trying to understand the complexities of human behavior and social systems.

Emily from Iowa

To further develop my hold on English I enrolled in a Skype-based student exchange program and made a critical friend in Emily from Iowa. After only a few conversations, Emily suggested that my English was so good that I should consider taking the SAT and start applying for schools. She soon became a kind of college counselor for me.

Even though my education was stonewalled by an increasingly repressive socio-political establishment, I had the full support of my family. There were no SAT testing locations in Afghanistan. So when it was clear to my family I had the potential to get a college education, my uncle took me across the border into Pakistan, to take the SAT. However, a passport from Afghanistan was required to take the test and, when it was finally granted, it had to be smuggled across the border. Considering that I had no formal education and little time to study for the SAT, I earned a surprisingly competitive score on the exam.

My confidence soared, and I convinced my family to make the long trek to the American embassy to apply for a student visa. I was denied in less than sixty seconds! They thought I would end up not studying and would become an economic burden. I was crushed. My immature vision of the world was clearly more idealized than the reality that slammed its door in my face. Confused about how the world worked, I immediately became invested in understanding politics.

The New York Times

Emily was constantly working in the background on my behalf, on the other side of the world, trying to get the word out about my struggle. It became her life’s project to somehow will me into a position to attend a university. New York Times columnist Nicholas Kristof heard about my story, and we conducted an interview over Skype.

The New York Times opinion piece ran in June of 2016. Ironically, I didn’t have much say or influence over it, and I felt that the piece was overly provocative.

Even now, because family members still live under the threat of violence, I will not allow myself to be photographed. Suffice to say, I never wanted to stir up trouble, or call attention to myself. Even so, the net results of that article are overwhelmingly positive. I was even offered a scholarship to attend Arizona State University; that was, if I could secure a visa.

I was pessimistic. I had been rejected twice already by what should have been the most logical and straightforward path toward formal education in America. How was this special asylum plea going to end any differently? But Nicholas Kristof was absolutely certain I would get it. He gave my case to an immigration lawyer with a relationship to the New York Times. In just a month and a half, I was awarded humanitarian parole. This came with some surprising constraints, including having to fly to the U.S. within ten days and a limit of four months’ stay while applying for asylum. As quickly as events were unfolding, I didn’t even hesitate.

As I was approaching America, I realized that over 5,000 miles of water would now separate me from the most influential forces in my life. The last of these flights took me deep into the center of America, about a third of the way around the planet.

The clock was ticking on my time in America – at some point, decisions outside of my control would deem it safe for me to return to Afghanistan – so I exhausted every opportunity to obtain knowledge while I was isolated from the forces that would keep me from formal education. I petitioned for an earlier-than-expected winter enrollment at Arizona State University. In the meantime, I continued my self-education through edX classes (coursework from MIT made available online), as well as with Khan Academy and Coursera.

Phoenix

The answer came back from Arizona State University. They had granted me enrollment for the winter quarter. In December of 2016, I flew to the next state in my journey for intellectual independence and began my first full year of formal education at the largest university in America. Mercifully, my tenure in Phoenix began in the cool winter months. In fact, the climate was very similar to what I knew in Afghanistan.

However, as summer approached, I began to have a much different experience. This was the first time I was living on my own, and it took me a while to get accustomed to it. I would generally stay in my room and study, even avoiding classes. The intensifying heat of the Arizona sun ensured that I would stay safely and comfortably encased inside. And I was actually doing okay. At first.

Happy as I was to finally be a part of formal education, it was in direct conflict with the way in which I had trained myself to learn. The rebellious spirit which helped me defy the cultural norms and risk harm to myself and my family, the same fire that I had to continuously stoke for years on my own, also made me rebel against the system that actively wanted me to learn. I constantly felt that I had better approaches to absorb the material and actively ignored the homework assignments. Naturally, my grades suffered and I was forced to make a difficult internal adjustment. I also benefited from advice from Emily, as well as a cousin who was pursuing education in Canada.

As I gritted my teeth and made my best attempts to adopt the relatively rigid structures of formal education, I began to feel more and more isolated. I found myself staying in my room day after day, focused simply on studying. But for what purpose? I was aimless. A machine of insatiable learning, but without any specific direction to guide my curiosity. I did not know it at the time, but I was desperate for something to motivate me.

The ripples from the New York Times piece were still reverberating, and I was contacted by the author Betsy Devine, who had written a couple of books with notable scientists. Betsy was particularly interested in introducing me to her husband, the Nobel prize winner in physics Frank Wilczek.

The first time I met Frank Wilczek was at lunch with him and his wife. Wilczek enjoys hiking in the mountains surrounding Phoenix, and Betsy suggested that I join Frank on an early morning hike. A hike. With Frank Wilczek. This was someone whose book, A Beautiful Question: Finding Nature’s Deep Design, I had read while in Afghanistan. To say that I was nervous is an understatement, but thankfully we fell into an easy flow of conversation. After going over my background and interests, he asked me if I was interested in physics. I told him that I was, but that I was principally interested in concepts that could be applied generally and broadly – so that I could better understand the underpinnings of how society functions.

He told me that I should pursue quantum physics, and more specifically, he got me very excited about the prospects of quantum computers. It felt like I was placed at the start of a whole new journey, and I was walking on clouds, filled with a confidence that could only be generated by finding oneself comfortable in casual conversation with a Nobel laureate.

Immediately after the hike I went and collected all of the relevant works Wilczek had suggested, including Dirac’s “The Principles of Quantum Mechanics.”

Reborn

With a new sense of purpose, I immersed myself in the formal coursework, as well as my own, self-directed exploration of quantum physics. My drive was rewarded with all A’s in the fall semester of my sophomore year.

That same winter, Nicholas Kristof published his annual New York Times opinion review of the previous year, titled “Why 2017 Was the Best Year in Human History.” I was mentioned briefly.

It was the start of the second semester of my sophomore year, and I was starting to feel a desire to explore applied physics. I was enrolled in a graduate-level seminar class in quantum theory that spring. One of the lecturers for the class was a young female professor who was interested in entropy, and more importantly, how we can access seemingly lost information. In other words, she wanted access to the unknown.

To that end, she was interested in gauge/gravity duality models like the one meant to explain the black hole “firewall” paradox, or the Anti-de Sitter space/conformal field theory (AdS/CFT) correspondence that uses a model of the universe where space-time has negative, hyperbolic curvature.

The geometry of 5D space-time in anti-de Sitter space resembles that of an M. C. Escher drawing, where fish wedge themselves together, end to end, tighter and tighter as we move away from the origin, the pattern radiating identically and infinitely as it approaches the edge.

Unbeknownst to me, a friend of that young professor had read the Times opinion article. The article not only mentioned that I had been teaching myself string theory, but also that I was enrolled at Arizona State University and taking graduate level courses. She asked the young professor if she would be interested in meeting me.

The young professor invited me to her office and told me how black holes are basically a massive manifestation of entropy, and the best laboratory by which to learn the true nature of information loss and how it might be reversed. We discussed the possibility of working on a research paper to help her codify the quantum component of her holographic duality models.

I immediately agreed. If there was anything in physics as difficult as understanding human social, religious and political dynamics, it was probably understanding the fundamental nature of space and time. Because the AdS/CFT model of spacetime was negatively curved, we could employ something called holographic quantum error correction to create a framework by which the information of a bulk entity (like a black hole) can be preserved at its boundary, even with some of its physical components (particles) becoming corrupted, or lost.

I spent the year wrestling with, and developing, quantum error correcting codes for a very specific kind of black hole. I learned that information has a way of protecting itself from decay through correlations. For instance, a single logical quantum bit (or “qubit”) of information can be represented, or preserved, by five stand-in, or physical, qubits. At a black hole’s event horizon, where entangled particles are pulled apart, information loss can be prevented as long as fewer than three of the five physical qubits are lost to the black hole interior. The original quantum information can be recalled by using a quantum code to reverse this “error”.
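As a minimal sketch of the structure behind this (assuming the standard [[5,1,3]] five-qubit code, the seed of the holographic constructions mentioned later, rather than the specific black-hole code from the research), one can verify in a few lines that the code’s four stabilizer generators commute, which is what makes them simultaneously measurable error checks:

```python
# Pauli strings as binary (x, z) vectors: X -> (1,0), Z -> (0,1), I -> (0,0).
# Two Pauli strings commute iff their symplectic product is 0 mod 2.

def to_symplectic(pauli):
    """Convert a Pauli string like 'XZZXI' to binary x- and z-vectors."""
    x = [1 if p in 'XY' else 0 for p in pauli]
    z = [1 if p in 'ZY' else 0 for p in pauli]
    return x, z

def commute(p, q):
    """Return True iff Pauli strings p and q commute."""
    (x1, z1), (x2, z2) = to_symplectic(p), to_symplectic(q)
    sym = sum(a * b for a, b in zip(x1, z2)) + sum(a * b for a, b in zip(z1, x2))
    return sym % 2 == 0

# Stabilizer generators of the [[5,1,3]] code: cyclic shifts of XZZXI.
generators = ['XZZXI', 'IXZZX', 'XIXZZ', 'ZXIXZ']

# A valid stabilizer group is abelian: every pair of generators commutes.
all_commute = all(commute(p, q) for p in generators for q in generators)
print(all_commute)  # True
```

Because this code has distance 3, any two of the five physical qubits can be erased and recovered, matching the “fewer than three” threshold above.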

By the end of my sophomore year I was nominated to represent Arizona State University at an inaugural event supporting undergraduate women in science. The purpose of the event was to help prepare promising women in physics for graduate school applications, as well as provide information on life as a graduate student. The event, called FUTURE of Physics, was to be held at Caltech.

I mentioned the nomination to Frank Wilczek and he excitedly told me that I must use the opportunity to meet Dr. John Preskill, who was at the forefront of quantum computing and quantum error correction. He reminded me that the best advice he could give anyone was to “find interesting minds and bother them.”

I spent two exciting days at Caltech with 32 other young women from all over the country on November 1st and 2nd of 2018. I was fortunate to meet John Preskill. And of course I introduced myself like any normal human being would, by asking him about the Shor factoring algorithm. I even got to attend a Wednesday group meeting with all of the current faculty and postdocs at IQIM. When I returned to ASU I sent an email to Dr. Preskill inquiring about potentially joining a short research project with his team.

I was extremely relieved when months later I received a response and an invitation to apply for the Summer Undergraduate Research Fellowship (SURF) at Caltech. Because Dr. Preskill’s recent work has been at the forefront of quantum error correction for quantum computing it was relatively straightforward to come up with a research proposal that aligned with the interests of my research adviser at ASU.

One of the major obstacles to efficient and widespread proliferation of quantum computers is the corruption of qubits, expensively held in very delicate low-energy states, by environmental interference and noise. People simply don’t, and should not, have confidence in practical, everyday use of quantum computers without reliable quantum error correction. The proposal was to create a proof that, if you’re starting with five physical qubits (representing a single logical qubit) and lose two of those qubits due to error, you can work backwards to recreate the original five qubits, and recover the lost logical qubit in the context of holographic error correcting codes. My application was accepted, and I made my way to Pasadena at the beginning of this summer.

The temperate climate, mountains, and lush neighborhoods were a welcome change, especially with the onslaught of relentless heat that was about to envelop Phoenix.

Even at a campus as small as Caltech, I felt like the smallest, most insignificant fish in a tiny, albeit prestigious, pond. But soon I was being connected to many like-minded, heavily motivated mathematicians and physicists, from all walks of life and from every corner of the Earth. Seasoned young postdocs like Grant Salton and Victor Albert introduced me to HaPPY tensors, a holographic tensor network model developed by Dr. Preskill and colleagues as a toy model of AdS/CFT. Under this highly accessible and world-class mentorship, and with essentially unlimited resources, I wrestled with HaPPY tensors all summer and successfully discovered a decoder that could recover the logical qubit from just three of the five physical qubits.

Example of tensor network causal and entanglement wedge reconstructions. From a blog post by Beni Yoshida on March 27th, 2015 on Quantum Frontiers.

This was the ultimate confidence booster. All the years of doubting myself and my ability, due to educating myself in a vacuum, lacking the critical feedback provided by real mentors, all disappeared.

Tomorrow

Now returning to ASU to finish my undergraduate education, I find myself still thinking about what’s next. I still have plans to expand my proof beyond five qubits, to a continuous-variable representation, and to write a general algorithm for an arbitrary N-layer tensor-network construction. My mentors at Caltech have graciously extended their support to this ongoing work. And I now dream of becoming a professor of physics at an elite institution, where I can continue to pursue the answers to life’s most confusing problems.

My days left in America are not up to me. I am applying for permanent asylum so I can continue to pursue my academic dreams, and to take a crack at some of the most difficult problems facing humanity, like accelerating the progress toward quantum computing. I know I can’t pursue those goals back in Afghanistan. At least, not yet. Back there, women like myself are expected to stay at home, prepare food, and clean the house for everybody else.

Little do they know how terrible I am at housework – and how much I love math.

# Yes, seasoned scientists do extraordinary science.

Imagine that you earned tenure and your field’s acclaim decades ago. Perhaps you received a Nobel Prize. Perhaps you’re directing an institute for science that you helped invent. Do you still do science? Does mentoring youngsters, advising the government, raising funds, disentangling logistics, presenting keynote addresses at conferences, chairing committees, and hosting visitors dominate the time you dedicate to science? Or do you dabble, attend seminars, and read, following progress without spearheading it?

People have asked whether my colleagues do science when weighed down with laurels. The end of August illustrates my answer.

At the end of August, I participated in the eighth Conference on Quantum Information and Quantum Control (CQIQC) at Toronto’s Fields Institute. CQIQC bestows laurels called “the John Stewart Bell Prize” on quantum-information scientists. John Stewart Bell revolutionized our understanding of entanglement, strong correlations that quantum particles can share and that power quantum computing. Aephraim Steinberg, vice-chair of the selection committee, bestowed this year’s award. The award, he emphasized, recognizes achievements accrued during the past six years. This year’s co-winners have been leading quantum information theory for decades. But the past six years earned the winners their prize.

Peter Zoller co-helms IQOQI in Innsbruck. (You can probably guess what the acronym stands for. Hint: The name contains “Quantum” and “Institute.”) Ignacio Cirac is a director of the Max Planck Institute of Quantum Optics near Munich. Both winners presented recent work about quantum many-body physics at the conference. You can watch videos of their talks here.

Peter discussed how a lab in Austria and a lab across the world can check whether they’ve prepared the same quantum state. One lab might have trapped ions, while the other has ultracold atoms. The experimentalists might not know which states they’ve prepared, and the experimentalists might have prepared the states at different times. Create multiple copies of the states, Peter recommended, measure the copies randomly, and play mathematical tricks to calculate correlations.
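As a toy illustration of the idea (my own sketch, not the protocol from Peter’s talk — real experiments involve many qubits and finite measurement statistics, whereas this single-qubit version uses exact outcome probabilities): if both labs rotate their copies by the *same* random unitary and compare outcome correlations, a fixed linear combination of those correlations estimates the state overlap $\mathrm{Tr}(\rho_1 \rho_2)$:

```python
import random

def matmul(A, B):
    """Multiply two 2x2 complex matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def haar_qubit_unitary(rng):
    """Sample a Haar-random 2x2 unitary (random SU(2) from a normalized quaternion)."""
    a, b, c, d = (rng.gauss(0, 1) for _ in range(4))
    n = (a * a + b * b + c * c + d * d) ** 0.5
    a, b, c, d = a / n, b / n, c / n, d / n
    return [[complex(a, b), complex(c, d)],
            [complex(-c, d), complex(a, -b)]]

def z_probs(rho, U):
    """Outcome probabilities of a Z-basis measurement after rotating rho by U."""
    Ud = [[U[j][i].conjugate() for j in range(2)] for i in range(2)]
    sigma = matmul(matmul(U, rho), Ud)
    return [sigma[0][0].real, sigma[1][1].real]

def overlap_estimate(rho1, rho2, trials=20000, seed=7):
    """Estimate Tr(rho1 rho2): both labs rotate by the SAME random U, then
    combine outcome correlations with weights 2 * (-2)^(-Hamming distance)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        U = haar_qubit_unitary(rng)
        p, q = z_probs(rho1, U), z_probs(rho2, U)
        total += p[0] * q[0] + p[1] * q[1] - 0.5 * (p[0] * q[1] + p[1] * q[0])
    return 2 * total / trials

ket0 = [[1, 0], [0, 0]]  # |0><0|
ket1 = [[0, 0], [0, 1]]  # |1><1|
same = overlap_estimate(ket0, ket0)
diff = overlap_estimate(ket0, ket1)
print(f"identical states: {same:.2f}, orthogonal states: {diff:.2f}")
```

Identical pure states give an overlap near 1, orthogonal states near 0 — so two labs can certify agreement without ever learning which state they prepared.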

Ignacio expounded upon how to simulate particle physics on a quantum computer formed from ultracold atoms trapped by lasers. For expert readers: Simulate matter fields with fermionic atoms and gauge fields with bosonic atoms. Give the optical lattice the field theory’s symmetries. Translate the field theory’s Lagrangian into Hamiltonian language using Kogut and Susskind’s prescription.

Even before August, I’d collected an arsenal of seasoned scientists who continue to revolutionize their fields. Frank Wilczek shared a physics Nobel Prize for theory undertaken during the 1970s. He and colleagues helped explain matter’s stability: They clarified how close-together quarks (subatomic particles) fail to attract each other, though quarks draw together when far apart. Why stop after cofounding one subfield of physics? Frank spawned another in 2012. He proposed the concept of a time crystal, which is like table salt, except extended across time instead of across space. Experimentalists realized a variation on Frank’s prediction in 2018, and time crystals have exploded across the scientific literature.1

Rudy Marcus is 96 years old. He received a chemistry Nobel Prize, for elucidating how electrons hop between molecules during reactions, in 1992. I took a nonequilibrium-statistical-mechanics course from Rudy four years ago. Ever since, whenever I’ve seen him, he’s asked for the news in quantum information theory. Rudy’s research group operates at Caltech, and you won’t find “Emeritus” in the title on his webpage.

My PhD supervisor, John Preskill, received tenure at Caltech for particle-physics research performed before 1990. You might expect the rest of his career to form an afterthought. But he helped establish quantum computing, starting in the mid-1990s. During the past few years, he co-midwifed the subfield of holographic quantum information theory, which concerns black holes, chaos, and the unification of quantum theory with general relativity. Watching a subfield emerge during my PhD left a mark like a tree on a bicyclist (or would have, if such a mark could uplift instead of injure). John hasn’t helped create subfields only by garnering resources and encouraging youngsters. Several papers by John and collaborators—about topological quantum matter, black holes, quantum error correction, and more—have transformed swaths of physics during the past 15 years. Nor does John stamp his name on many papers: Most publications by members of his group don’t list him as a coauthor.

Do my colleagues do science after laurels pile up on them? The answer sounds to me, in many cases, more like a roar than like a “yes.” Much science done by senior scientists inspires no less than the science that established them. Beyond their results, their enthusiasm inspires. Never mind receiving a Bell Prize. Here’s to working toward deserving a Bell Prize every six years.

With thanks to the Fields Institute, the University of Toronto, Daniel F. V. James, Aephraim Steinberg, and the rest of the conference committee for their invitation and hospitality.

You can find videos of all the conference’s talks here. My talk is shown here.

1To scientists, I recommend this Physics Today perspective on time crystals. Few articles have awed and inspired me during the past year as much as this review did.

# Quantum conflict resolution

If only my coauthors and I had quarreled.

I was working with Tony Bartolotta, a PhD student in theoretical physics at Caltech, and Jason Pollack, a postdoc in cosmology at the University of British Columbia. They acted as the souls of consideration. We missed out on dozens of opportunities to bicker—about the paper’s focus, who undertook which tasks, which journal to submit to, and more. Bickering would have spiced up the story behind our paper, because the paper concerns disagreement.

Quantum observables can disagree. Observables are measurable properties, such as position and momentum. Suppose that you’ve measured a quantum particle’s position and obtained an outcome $x$. If you measure the position immediately afterward, you’ll obtain $x$ again. Suppose that, instead of measuring the position again, you measure the momentum. All the possible outcomes have equal probabilities of obtaining. You can’t predict the outcome.

The particle’s position can have a well-defined value, or the momentum can have a well-defined value, but the observables can’t have well-defined values simultaneously. Furthermore, if you measure the position, you randomize the outcome of a momentum measurement. Position and momentum disagree.

How should we quantify the disagreement of two quantum observables, $\hat{A}$ and $\hat{B}$? The question splits physicists into two camps. Pure quantum information (QI) theorists use uncertainty relations, whereas condensed-matter and high-energy physicists prefer out-of-time-ordered correlators. Let’s meet the camps in turn.

Heisenberg intuited an uncertainty relation that Robertson formalized during the 1920s,

$\Delta \hat{A} \, \Delta \hat{B} \geq \frac{1}{2} \left| \langle [\hat{A}, \hat{B}] \rangle \right|$.

Imagine preparing a quantum state $| \psi \rangle$ and measuring $\hat{A}$, then repeating this protocol in many trials. Each trial has some probability $p_a$ of yielding the outcome $a$. Different trials will yield different $a$’s. We quantify the spread in $a$ values with the standard deviation $\Delta \hat{A} = \sqrt{ \langle \psi | \hat{A}^2 | \psi \rangle - \langle \psi | \hat{A} | \psi \rangle^2 }$. We define $\Delta \hat{B}$ analogously. For position and momentum, $[\hat{x}, \hat{p}] = i \hbar$, and the bound becomes $\hbar / 2$; $\hbar$ denotes Planck’s constant, a number that characterizes our universe as the electron’s mass does.

$[\hat{A}, \hat{B}]$ denotes the observables’ commutator. The numbers that we use in daily life commute: $7 \times 5 = 5 \times 7$. In quantum theory, observables such as $\hat{A}$ and $\hat{B}$ are represented by operators, which don’t necessarily commute. The commutator $[\hat{A}, \hat{B}] = \hat{A} \hat{B} - \hat{B} \hat{A}$ represents how little $\hat{A}$ and $\hat{B}$ resemble 7 and 5.

Robertson’s uncertainty relation means, “If you can predict an $\hat{A}$ measurement’s outcome precisely, you can’t predict a $\hat{B}$ measurement’s outcome precisely, and vice versa. The uncertainties must multiply to at least some number. The number depends on how much $\hat{A}$ fails to commute with $\hat{B}$.” The higher an uncertainty bound (the greater the inequality’s right-hand side), the more the operators disagree.
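A quick numerical sanity check (my own illustration, for the qubit observables $\hat{A} = \hat{X}$ and $\hat{B} = \hat{Z}$) confirms Robertson’s bound for randomly drawn states:

```python
import math, random

# Pauli operators as 2x2 matrices.
X = [[0, 1], [1, 0]]
Z = [[1, 0], [0, -1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expval(op, psi):
    """<psi| op |psi> for a two-component state vector psi."""
    return sum(psi[i].conjugate() * op[i][j] * psi[j]
               for i in range(2) for j in range(2))

def stdev(op, psi):
    """Standard deviation Delta(op) = sqrt(<op^2> - <op>^2)."""
    m2 = expval(matmul(op, op), psi).real
    m1 = expval(op, psi).real
    return math.sqrt(max(m2 - m1 * m1, 0.0))

def commutator(A, B):
    AB, BA = matmul(A, B), matmul(B, A)
    return [[AB[i][j] - BA[i][j] for j in range(2)] for i in range(2)]

rng = random.Random(0)
for _ in range(100):
    # Draw a random pure qubit state.
    amps = [complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(2)]
    norm = math.sqrt(sum(abs(a) ** 2 for a in amps))
    psi = [a / norm for a in amps]
    lhs = stdev(X, psi) * stdev(Z, psi)
    rhs = 0.5 * abs(expval(commutator(X, Z), psi))
    assert lhs >= rhs - 1e-12
print("Robertson's bound held in all 100 trials")
```

An eigenstate of $\hat{Y}$, such as $(|0\rangle + i|1\rangle)/\sqrt{2}$, saturates the inequality: both sides equal 1.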

Heisenberg and Robertson explored operator disagreement during the 1920s. They wouldn’t have seen eye to eye with today’s QI theorists. For instance, QI theorists consider how we can apply quantum phenomena, such as operator disagreement, to information processing. Information processing includes cryptography. Quantum cryptography benefits from operator disagreement: An eavesdropper must observe, or measure, a message. The eavesdropper’s measurement of one observable can “disturb” a disagreeing observable. The message’s sender and intended recipient can detect the disturbance and so detect the eavesdropper.

How efficiently can one perform an information-processing task? The answer usually depends on an entropy $H$, a property of quantum states and of probability distributions. Uncertainty relations cry out for recasting in terms of entropies. So QI theorists have devised entropic uncertainty relations, such as

$H (\hat{A}) + H( \hat{B} ) \geq - \log c. \qquad (^*)$

The entropy $H( \hat{A} )$ quantifies the difficulty of predicting the outcome $a$ of an $\hat{A}$ measurement. $H( \hat{B} )$ is defined analogously. $c$ is called the overlap. It quantifies your ability to predict what happens if you prepare your system with a well-defined $\hat{A}$ value, then measure $\hat{B}$. For further analysis, check out this paper. Entropic uncertainty relations have blossomed within QI theory over the past few years.
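As a concrete illustration (mine, not from the paper): for a qubit measured in the $\hat{Z}$ basis or the $\hat{X}$ basis, the overlap is $c = 1/2$, so Ineq. ${(^*)}$ says the two entropies (in bits, using base-2 logarithms) must total at least one bit:

```python
import cmath, math, random

def shannon(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 1e-15)

def measurement_entropies(theta, phi):
    """H(Z) and H(X) for the qubit state cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>."""
    a = math.cos(theta / 2)
    b = cmath.exp(1j * phi) * math.sin(theta / 2)
    p_z = [abs(a) ** 2, abs(b) ** 2]                  # Z-basis probabilities
    p_x = [abs(a + b) ** 2 / 2, abs(a - b) ** 2 / 2]  # X-basis: <+-|psi> = (a +- b)/sqrt(2)
    return shannon(p_z), shannon(p_x)

# For the mutually unbiased Z and X bases, the overlap is c = 1/2,
# so the entropic bound reads H(Z) + H(X) >= -log2(1/2) = 1 bit.
bound = 1.0
rng = random.Random(1)
for _ in range(100):
    h_z, h_x = measurement_entropies(rng.uniform(0, math.pi),
                                     rng.uniform(0, 2 * math.pi))
    assert h_z + h_x >= bound - 1e-9
print("entropic bound held in all 100 trials")
```

The state $|0\rangle$ saturates the bound: the $\hat{Z}$ outcome is certain ($H = 0$) while the $\hat{X}$ outcome is maximally uncertain ($H = 1$ bit).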

Pure QI theorists, we’ve seen, quantify operator disagreement with entropic uncertainty relations. Physicists at the intersection of condensed matter and high-energy physics prefer out-of-time-ordered correlators (OTOCs). I’ve blogged about OTOCs so many times, Quantum Frontiers regulars will be able to guess the next two paragraphs.

Consider a quantum many-body system, such as a chain of qubits. Imagine poking one end of the system, such as by flipping the first qubit upside-down. Let the operator $\hat{W}$ represent the poke. Suppose that the system evolves chaotically for a time $t$ afterward, the qubits interacting. Information about the poke spreads through many-body entanglement, or scrambles.

Imagine measuring an observable $\hat{V}$ of a few qubits far from the $\hat{W}$ qubits. A little information about $\hat{W}$ migrates into the $\hat{V}$ qubits. But measuring $\hat{V}$ reveals almost nothing about $\hat{W}$, because most of the information about $\hat{W}$ has spread across the system. $\hat{V}$ disagrees with $\hat{W}$, in a sense. Actually, $\hat{V}$ disagrees with $\hat{W}(t)$. The $(t)$ represents the time evolution.

The OTOC is a number, $F(t) = \langle \hat{W}^\dagger(t) \, \hat{V}^\dagger \, \hat{W}(t) \, \hat{V} \rangle$, whose smallness reflects how much $\hat{W}(t)$ disagrees with $\hat{V}$ at any instant $t$. At early times $t \gtrsim 0$, the operators agree, and the OTOC $\approx 1$. At late times, the operators disagree loads, and the OTOC $\approx 0$.

Different camps of physicists, we’ve seen, quantify operator disagreement with different measures: Today’s pure QI theorists use entropic uncertainty relations. Condensed-matter and high-energy physicists use OTOCs. Trust physicists to disagree about what “quantum operator disagreement” means.

I want peace on Earth. I conjectured, in 2016 or so, that one could reconcile the two notions of quantum operator disagreement. One must be able to prove an entropic uncertainty relation for scrambling, wouldn’t you think?

You might try substituting $\hat{W}(t)$ for the $\hat{A}$ in Ineq. ${(^*)}$, and $\hat{V}$ for the $\hat{B}$. You’d expect the uncertainty bound to tighten—the inequality’s right-hand side to grow—when the system scrambles. Scrambling—the condensed-matter and high-energy-physics notion of disagreement—would coincide with a high uncertainty bound—the pure-QI-theory notion of disagreement. The two notions of operator disagreement would agree. But the bound I’ve described doesn’t reflect scrambling. Nor do similar bounds that I tried constructing. I banged my head against the problem for about a year.

The sky brightened when Jason and Tony developed an interest in the conjecture. Their energy and conversation enabled us to prove an entropic uncertainty relation for scrambling, published this month.1 We tested the relation in computer simulations of a qubit chain. Our bound tightens when the system scrambles, as expected: The uncertainty relation reflects the same operator disagreement as the OTOC. We reconciled two notions of quantum operator disagreement.

As Quantum Frontiers regulars will anticipate, our uncertainty relation involves weak measurements and quasiprobability distributions: I’ve been studying their roles in scrambling over the past three years, with colleagues for whose collaborations I have the utmost gratitude. I’m grateful to have collaborated with Tony and Jason. Harmony helps when you’re tackling (quantum operator) disagreement—even if squabbling would spice up your paper’s backstory.

1Thanks to Communications Physics for publishing the paper. For pedagogical formatting, read the arXiv version.