# The spirit of relativity

One of the most immersive steampunk novels I’ve read winks at an experiment performed in a university I visited this month. The Watchmaker of Filigree Street, by Natasha Pulley, features a budding scientist named Grace Carrow. Grace attends Oxford as one of its few women students during the 1880s. To access the university’s Bodleian Library without an escort, she masquerades as male. The librarian grouses over her request.

“‘The American Journal of Science – whatever do you want that for?’” As the novel points out, “The only books more difficult to get hold of than little American journals were first copies of [Isaac Newton’s masterpiece] Principia, which were chained to the desks.”

As a practitioner of quantum steampunk, I relish slipping back to this stage of intellectual history. The United States was still an infant beside centuries-old European countries, which looked down on it as an intellectual—and partially a literal—wilderness.1 Yet potential was budding, as Grace realized. She was studying an American experiment that paved the path for Einstein’s special theory of relativity.

How does light travel? Most influences propagate through media. For instance, ocean waves propagate in water. Sound propagates in air. The Victorians surmised that light similarly travels through a medium, which they called the luminiferous aether. Nobody, however, had detected the aether.

Albert A. Michelson and Edward W. Morley squared up to the task in 1887. Michelson, brought up in a Prussian immigrant family, worked as a professor at the Case School of Applied Science in Cleveland, Ohio. Morley taught chemistry at Western Reserve University, which shared its campus with the recent upstart Case. The two schools later merged to form Case Western Reserve University, which I visited this month.

We can intuit Michelson and Morley’s experiment by imagining two passengers on a (steam-driven, if you please) locomotive: Audrey and Baxter. Say that Audrey walks straight across the aisle, from one window to another. In the same time interval, and at the same speed relative to the train, Baxter walks down the aisle, from row to row of seats. The train carries both passengers in the direction in which Baxter walks.

Baxter travels farther than Audrey, as the figures below show. Covering a greater distance in the same time, he travels more quickly.
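The geometry behind this claim is a right triangle, and a few lines of arithmetic bear it out. The numbers below are illustrative, not taken from any actual train:

```python
import math

# Hypothetical numbers: each passenger walks at 1 m/s relative to the
# train for 3 s, while the train moves at 20 m/s relative to the ground.
walk_speed = 1.0    # m/s, relative to the train
train_speed = 20.0  # m/s, relative to the ground
t = 3.0             # s

walk = walk_speed * t    # distance each passenger covers inside the train
carry = train_speed * t  # distance the train carries each passenger

# Audrey walks across the aisle, perpendicular to the train's motion,
# so her ground-frame path is the hypotenuse of a right triangle.
audrey = math.hypot(walk, carry)

# Baxter walks down the aisle, parallel to the motion, so distances add.
baxter = walk + carry

print(f"Audrey travels {audrey:.2f} m; Baxter travels {baxter:.2f} m")
```

Because a hypotenuse is always shorter than the sum of the two legs, Baxter covers more ground than Audrey whenever the train is moving, whatever numbers you plug in.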

Replace each passenger with a beam of light, and replace the train with the aether. (The aether, Michelson and Morley reasoned, was moving relative to their lab as a train moves relative to the countryside: the aether filled space, and the Earth was moving through space, so the Earth, and the lab with it, was moving through the aether.)

The scientists measured how quickly the “Audrey” beam of light traveled relative to the “Baxter” beam. The measurement relied on an apparatus that now bears the name of one of the experimentalists: the Michelson interferometer. To the scientists’ surprise, the Audrey beam traveled just as quickly as the Baxter beam. The aether didn’t carry either beam along as a train carries a passenger. Light can travel in a vacuum, without any need for a medium.
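The prediction Michelson and Morley set out to test can be made quantitative with the classical round-trip times of the two beams under an aether wind. The arm length and speeds below are representative values chosen for illustration, not the apparatus's exact specifications:

```python
import math

c = 3.0e8  # speed of light (m/s)
v = 3.0e4  # Earth's orbital speed, the presumed "aether wind" (m/s)
L = 11.0   # effective arm length of the interferometer (m), assumed here

# "Baxter" beam: down an arm parallel to the wind and back.
# Equals (2*L/c) / (1 - v**2/c**2).
t_parallel = L / (c - v) + L / (c + v)

# "Audrey" beam: across the wind and back.
# Equals (2*L/c) / sqrt(1 - v**2/c**2).
t_perpendicular = 2 * L / math.sqrt(c**2 - v**2)

delta_t = t_parallel - t_perpendicular
print(f"Predicted time difference: {delta_t:.2e} s")
```

The predicted difference is a few times 10^-16 seconds, far too small to clock directly but large enough to shift the interference fringes measurably. The experiment found no such shift.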

The American Physical Society, among other sources, calls Michelson and Morley’s collaboration “what might be regarded as the most famous failed experiment to date.” The experiment provided the first rigorous evidence that the aether doesn’t exist and that, no matter how you measure light’s speed, you’ll only ever observe one value for it (if you measure it accurately). Einstein’s special theory of relativity provided a theoretical underpinning for these observations in 1905. The theory provides predictions about two observers—such as Audrey and Baxter—who are moving relative to each other. As long as they aren’t accelerating, they agree about all physical laws, including the speed of light.

Morley garnered accolades across the rest of his decades-long appointment at Western Reserve University. Michelson quarreled with his university’s administration and eventually resettled at the University of Chicago. In 1907, he received the first Nobel Prize awarded to any American for physics. The citation highlighted “his optical precision instruments and the spectroscopic and metrological investigations carried out with their aid.”

Today, both scientists enjoy renown across Case Western Reserve University. Their names grace the sit-down restaurant in the multipurpose center, as well as a dormitory and a chemistry building. A fountain on the quad salutes their experiment. And stories about a symposium held in 1987—the experiment’s centennial—echo through the physics building.

But Michelson and Morley’s spirit most suffuses the population. During my visit, I had the privilege and pleasure of dining with members of WiPAC, the university’s Women in Physics and Astronomy Club. A more curious, energetic group, I’ve rarely seen. Grace Carrow would find kindred spirits there.

With thanks to Harsh Mathur (pictured above), Patricia Princehouse, and Glenn Starkman, for their hospitality, as well as to the Case Western Reserve Department of Physics, the Institute for the Science of Origins, and the Gundzik Endowment.

Aside: If you visit Cleveland, visit its art museum! As Quantum Frontiers regulars know, I have a soft spot for ancient near-Eastern and ancient Egyptian art. I was impressed by the Cleveland Museum of Art’s artifacts from the reign of pharaoh Amenhotep III and the museum’s reliefs of the Egyptian queen Nefertiti. Also, boasting a statue of Gudea (a ruler of the ancient city-state of Lagash) and a relief from the palace of Assyrian king Ashurnasirpal II, the museum is worth its ancient-near-Eastern salt.

1Not that Oxford enjoyed scientific renown during the Victorian era. As Cecil Rhodes—creator of the Rhodes Scholarship—opined then, “Wherever you turn your eye—except in science—an Oxford man is at the top of the tree.”

# Rocks that roll

In Terry Pratchett’s fantasy novel Soul Music, rock ’n roll arrives in Ankh-Morpork. Ankh-Morpork resembles the London of yesteryear—teeming with heroes and cutthroats, palaces and squalor—but also houses vampires, golems, wizards, and a sentient suitcase. Against this backdrop, a young harpist stumbles upon a mysterious guitar. He forms a band with a dwarf and with a troll who plays tuned rocks, after which the trio calls its style “Music with Rocks In.” The rest of the story consists of satire, drums, and rocks that roll.

The topic of rolling rocks sounds like it should elicit more yawns than an Elvis concert elicited screams. But rocks’ rolling helped Zackery Benson, who recently earned his physics PhD at the University of Maryland, win a National Research Council Fellowship. He and his advisor, Wolfgang Losert, converted me into a fan of granular flow.

What I’ve been studying recently. Kind of.

Grains make up materials throughout the galaxy, such as the substance of avalanches. Many granular materials undergo repeated forcing by their environments. For instance, the grains that form an asteroid suffer bombardment from particles flying through outer space. The gravel beneath train tracks is compressed whenever a train passes.

Often, a pattern characterizes the forces in a granular system’s environment. For instance, trains in a particular weight class may traverse some patch of gravel, and the trains may arrive with a particular frequency. Some granular systems come to encode information about those patterns in their microscopic configurations and large-scale properties. So granular flow—little rocks that roll—can impact materials science, engineering, geophysics, and thermodynamics.

Granular flow sounds so elementary, you might expect us to have known everything about it since long before the Beatles’ time. But we didn’t even know until recently how to measure rolling in granular flows.

Envision a grain as a tiny sphere, like a globe of the Earth. Scientists focused mostly on how far grains are translated through space in a flow, analogously to how far a globe travels across a desktop if flicked. Recently, scientists measured how far a grain rotates about one axis, like a globe fixed in a frame. Sans frame, though, a globe can spin about more than one axis—about three independent axes. Zack performed the first measurement of all the rotations and translations of all the particles in a granular flow.

Each grain was an acrylic bead about as wide as my pinky nail. Two holes were drilled into each bead, forming an X, for reasons I’ll explain.

Image credit: Benson et al., Phys. Rev. Lett. 129, 048001 (2022).

Zack dumped over 10,000 beads into a rectangular container. Then, he poured in a fluid that filled the spaces between the grains. Placing a weight atop the grains, he exerted a constant pressure on them. Zack would push one of the container’s walls inward, compressing the grains similarly to how a train compresses gravel. Then, he’d decompress the beads. He repeated this compression cycle many times.

Image credit: Benson et al., Phys. Rev. E 103, 062906 (2021).

Each cycle consisted of many steps: Zack would compress the beads a tiny amount, pause, snap pictures, and then compress a tiny amount more. During each pause, the camera activated a fluorescent dye in the fluid, which looked clear in the photographs. Lacking the fluorescent dye, the beads showed up as dark patches. Clear X’s cut through the dark patches, as dye filled the cavities drilled into the beads. From the X’s, Zack inferred every grain’s orientation. He inferred how every grain rotated by comparing the orientation in one snapshot with the orientation in the next snapshot.
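The inference step can be sketched in a few lines: store a grain's orientation in each snapshot as a rotation matrix (columns giving the grain's body axes in lab coordinates), and the rotation between snapshots follows by composing one orientation with the other's inverse. The matrices and the 10-degree turn below are a toy example, not the experiment's data:

```python
import numpy as np

def rotation_between(o1, o2):
    """Rotation matrix carrying orientation o1 to orientation o2."""
    return o2 @ o1.T  # a rotation matrix's inverse is its transpose

def rotation_angle(r):
    """Rotation angle (radians) extracted from a rotation matrix."""
    cos_theta = (np.trace(r) - 1.0) / 2.0
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Toy example: the grain starts aligned with the lab axes, then turns
# 10 degrees about the z-axis between snapshots.
theta = np.radians(10.0)
o1 = np.eye(3)
o2 = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])

r = rotation_between(o1, o2)
print(np.degrees(rotation_angle(r)))  # recovers the 10-degree turn
```

The same composition works for rotations about any axis, which is what lets all three rotational degrees of freedom be tracked at once.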

Image credit: Benson et al., Phys. Rev. Lett. 129, 048001 (2022).

Wolfgang’s lab had been trying for fifteen years to measure all the motions in a granular flow. The feat required experimental and computational skill. I appreciated the chance to play a minor role in analyzing the data. Physical Review Letters published our paper last month.

From Zack’s measurements, we learned about the unique roles played by rotations in granular flow. For instance, rotations dominate the motion in a granular system’s bulk, far from the container’s walls. Importantly, the bulk dissipates the most energy. Also, whereas translations are reversible—however far grains shift while compressed, they tend to shift oppositely while decompressed—rotations are not. Such irreversibility can contribute to materials’ aging.

In Soul Music, the spirit of rock ’n roll—conceived of as a force in its own right—offers the guitarist the opportunity to never age. He can live fast, die young, and enjoy immortality as a legend, for his guitar comes from a dusty little shop not entirely of Ankh-Morpork’s world. Such shops deal in fate and fortune, the author maintains. He also takes a dig at the River Ankh, which flows through the city of Ankh-Morpork. The Ankh’s waters hold so much garbage, excrement, midnight victims, and other muck that they scarcely count as waters:

And there was even an Ankh-Morpork legend, wasn’t there, about some old drum [ . . . ] that was supposed to bang itself if an enemy fleet was seen sailing up the Ankh? The legend had died out in recent centuries, partly because this was the Age of Reason and also because no enemy fleet could sail up the Ankh without a gang of men with shovels going in front.

Such a drum would qualify as magic easily, but don’t underestimate the sludge. As a granular-flow system, it’s more incredible than you might expect.

# Quantum connections

We were seated in the open-air back of a boat, motoring around the Stockholm archipelago. The Swedish colors fluttered above our heads; the occasional speedboat zipped past, rocking us in its wake; and wildflowers dotted the bank on either side. Suddenly, a wood-trimmed boat glided by, and the captain waved from his perch.

The gesture surprised me. If I were in a vehicle of the sort most familiar to me—a car—I wouldn’t wave to other drivers. In a tram, I wouldn’t wave to passengers on a parallel track. Granted, trams and cars are closed, whereas boats can be open-air. But even as a pedestrian in a downtown crossing, I wouldn’t wave to everyone I passed. Yet, as boat after boat pulled alongside us, we received salutation after salutation.

The outing marked the midpoint of the Quantum Connections summer school. Physicists Frank Wilczek, Antti Niemi, and colleagues coordinate the school, which draws students and lecturers from across the globe. Although sponsored by Stockholm University, the school takes place at a century-old villa whose name I wish I could pronounce: Högberga Gård. The villa nestles atop a cliff on an island in the archipelago. We ventured off the island after a week of lectures.

Charlie Marcus lectured about materials formed from superconductors and semiconductors; John Martinis, about superconducting qubits; Jianwei Pan, about quantum advantages; and others, about symmetries, particle statistics, and more. Feeling like an ant among giants, I lectured about quantum thermodynamics. Two other lectures linked quantum physics with gravity—and in a way you might not expect. I appreciated the opportunity to reconnect with the lecturer: Igor Pikovski.

Igor doesn’t know it, but he’s one of the reasons why I joined the Harvard-Smithsonian Institute for Theoretical Atomic, Molecular, and Optical Physics (ITAMP) as an ITAMP Postdoctoral Fellow in 2018. He’d held the fellowship beginning a few years before, and he’d earned a reputation for kindness and consideration. Also, his research struck me as some of the most fulfilling that one could undertake.

If you’ve heard about the intersection of quantum physics and gravity, you’ve probably heard of approaches other than Igor’s. For instance, physicists are trying to construct a theory of quantum gravity, which would describe black holes and the universe’s origin. Such a “theory of everything” would reduce to Einstein’s general theory of relativity when applied to planets and would reduce to quantum theory when applied to atoms. In another example, physicists leverage quantum technologies to observe properties of gravity. Such technologies enabled the observatory LIGO to register gravitational waves—ripples in space-time.

Igor and his colleagues pursue a different goal: to observe phenomena whose explanations depend on quantum theory and on gravity.

In his lectures, Igor illustrated with an experiment first performed in 1975. The experiment relies on what happens if you jump: You gain energy associated with resisting the Earth’s gravitational pull—gravitational potential energy. A quantum object’s energy determines how the object’s quantum state changes in time. The experimentalists applied this fact to a beam of neutrons.

They put the beam in a superposition of two locations: closer to the Earth’s surface and farther away. The closer component changed in time in one way, and the farther component changed another way. After a while, the scientists recombined the components. The two interfered with each other similarly to the waves created by two raindrops falling near each other on a puddle. The interference evidenced gravity’s effect on the neutrons’ quantum state.
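The size of the effect can be estimated with back-of-the-envelope numbers. A component at extra height picks up extra potential energy, and a quantum state's phase advances at a rate set by its energy over Planck's reduced constant. All the parameters below are illustrative, not the 1975 experiment's actual values:

```python
hbar = 1.055e-34  # reduced Planck constant (J s)
m = 1.675e-27     # neutron mass (kg)
g = 9.81          # gravitational acceleration (m/s^2)
dh = 0.02         # assumed height difference between the two paths (m)
t = 1.0e-4        # assumed time spent at different heights (s)

# The higher component carries extra energy m*g*dh, so its phase
# advances by (energy * time / hbar) relative to the lower component.
delta_phase = m * g * dh * t / hbar
print(f"Relative phase: {delta_phase:.1f} rad")
```

Even with centimeter-scale height differences and fractions of a millisecond, the relative phase amounts to many radians, which is why the interference pattern shifts visibly with the apparatus's orientation.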

The experimentalists approximated gravity as dominated by the Earth alone. But other masses can influence the gravitational field noticeably. What if you put a mass in a superposition of different locations? What would happen to space-time?

Or imagine two quantum particles too far apart to interact with each other significantly. Could a gravitational field entangle the particles by carrying quantum correlations from one to the other?

Physicists including Igor ponder these questions…and then ponder how experimentalists could test their predictions. The more an object influences gravity, the more massive the object tends to be, and the more easily the object tends to decohere—to spill the quantum information that it holds into its surroundings.

The “gravity-quantum interface,” as Igor entitled his lectures, epitomizes what I hoped to study in college, as a high-school student entranced by physics, math, and philosophy. What’s more curious and puzzling than superpositions, entanglement, and space-time? What’s more fundamental than quantum theory and gravity? Little wonder that connecting them inspires wonder.

But we humans are suckers for connections. I appreciated the opportunity to reconnect with a colleague during the summer school. Boaters on the Stockholm archipelago waved to our cohort as they passed. And who knows—gravitational influences may even have rippled between the boats, entangling us a little.

With thanks to the summer-school organizers, including Pouya Peighami and Elizabeth Yang, for their invitation and hospitality.

# The power of being able to say “I can explain that”

Caltech condensed-matter theorist Gil Refael explained his scientific raison d’être early in my grad-school career: “What really gets me going is seeing a plot [of experimental data] and being able to say, ‘I can explain that.’” The quote has stuck with me almost word for word. When I heard it, I was working deep in abstract quantum information theory and thermodynamics, proving theorems about thought experiments. Embedding myself in pure ideas has always held an aura of romance for me, so I nodded along without seconding Gil’s view.

Roughly nine years later, I concede his point.

The revelation walloped me last month, as I was polishing a paper with experimental collaborators. Members of the Institute for Quantum Optics and Quantum Information (IQOQI) in Innsbruck, Austria—Florian Kranzl, Manoj Joshi, and Christian Roos—had performed an experiment in trapped-ion guru Rainer Blatt’s lab. Their work realized an experimental proposal that I’d designed with fellow theorists near the beginning of my postdoc stint. We aimed to observe signatures of particularly quantum thermalization.

Throughout the universe, small systems exchange stuff with their environments. For instance, the Earth exchanges heat and light with the rest of the solar system. After exchanging stuff for long enough, the small system equilibrates with the environment: Large-scale properties of the small system (such as its volume and energy) remain fairly constant; and as much stuff enters the small system as leaves, on average. The Earth remains far from equilibrium, which is why we aren’t dead yet.

In many cases, in equilibrium, the small system shares properties of the environment, such as the environment’s temperature. In these cases, we say that the small system has thermalized and, if it’s quantum, has reached a thermal state.

The stuff exchanged can consist of energy, particles, electric charge, and more. Unlike classical planets, quantum systems can exchange things that participate in quantum uncertainty relations (experts: that fail to commute). Quantum uncertainty mucks up derivations of the thermal state’s mathematical form. Some of us quantum thermodynamicists discovered the mucking up—and identified exchanges of quantum-uncertain things as particularly nonclassical thermodynamics—only a few years ago. We reworked conventional thermodynamic arguments to accommodate this quantum uncertainty. The small system, we concluded, likely equilibrates to near a thermal state whose mathematical form depends on the quantum-uncertain stuff—what we termed a non-Abelian thermal state. I wanted to see this equilibration in the lab. So I proposed an experiment with theory collaborators; and Manoj, Florian, and Christian took a risk on us.

The experimentalists arrayed between six and fifteen ions in a line. Two ions formed the small system, and the rest formed the quantum environment. The ions exchanged the $x$-, $y$-, and $z$-components of their spin angular momentum—stuff that participates in quantum uncertainty relations. The ions began with a fairly well-defined amount of each spin component, as described in another blog post. The ions exchanged stuff for a while, and then the experimentalists measured the small system’s quantum state.

The small system equilibrated to near the non-Abelian thermal state, we found. No conventional thermal state modeled the results as accurately. Score!

My postdoc and numerical-simulation wizard Aleks Lasek modeled the experiment on his computer. The small system, he found, remained farther from the non-Abelian thermal state in his simulation than in the experiment. Aleks plotted the small system’s distance to the non-Abelian thermal state against the ion chain’s length. The points produced experimentally sat lower down than the points produced numerically. Why?

I think I can explain that, I said. The two ions exchange stuff with the rest of the ions, which serve as a quantum environment. But the two ions exchange stuff also with the wider world, such as stray electromagnetic fields. The latter exchanges may push the small system farther toward equilibrium than the extra ions alone do.

Fortunately for the development of my explanatory skills, collaborators prodded me to hone my argument. The wider world, they pointed out, effectively has a very high temperature—an infinite temperature.1 Equilibrating with that environment, the two ions would acquire an infinite temperature themselves. The two ions would approach an infinite-temperature thermal state, which differs from the non-Abelian thermal state we aimed to observe.

Fair, I said. But the extra ions probably have a fairly high temperature themselves. So the non-Abelian thermal state is probably close to the infinite-temperature thermal state. Analogously, if someone cooks goulash similarly to his father, and the father cooks goulash similarly to his grandfather, then the youngest chef cooks goulash similarly to his grandfather. If the wider world pushes the two ions to equilibrate to infinite temperature, then, because the infinite-temperature state lies near the non-Abelian thermal state, the wider world pushes the two ions to equilibrate to near the non-Abelian thermal state.

I plugged numbers into a few equations to check that the extra ions do have a high temperature. (Perhaps I should have done so before proposing the argument above, but my collaborators were kind enough not to call me out.)

Aleks hammered the nail into the problem’s coffin by incorporating into his simulations the two ions’ interaction with an infinite-temperature wider world. His numerical data points dropped to near the experimental data points. The new plot supported my story.

I can explain that! Aleks’s results buoyed me the whole next day; I found myself smiling at random times throughout the afternoon. Not that I’d explained a grand mystery, like the unexpected hiss heard by Arno Penzias and Robert Wilson when they turned on a powerful antenna in 1964. The hiss turned out to come from the cosmic microwave background (CMB), a collection of photons that fill the visible universe. The CMB provided evidence for the then-controversial Big Bang theory of the universe’s origin. Discovering the CMB earned Penzias and Wilson a Nobel Prize. If the noise caused by the CMB was music to cosmologists’ ears, the noise in our experiment is the quiet wailing of a shy banshee. But it’s our experiment’s noise, and we understand it now.

The experience hasn’t weaned me off the romance of proving theorems about thought experiments. Theorems about thermodynamic quantum uncertainty inspired the experiment that yielded the plot that confused us. But I now second Gil’s sentiment. In the throes of an experiment, “I can explain that” can feel like a battle cry.

1Experts: The wider world effectively has an infinite temperature because (i) the dominant decoherence is dephasing relative to the $\sigma_z$ product eigenbasis and (ii) the experimentalists rotate their qubits often, to simulate a rotationally invariant Hamiltonian evolution. So the qubits effectively undergo dephasing relative to the $\sigma_x$, $\sigma_y$, and $\sigma_z$ eigenbases.

# Up we go! or From abstract theory to experimental proposal

Mr. Mole is trapped indoors, alone. Spring is awakening outside, but he’s confined to his burrow. Birds are twittering, and rabbits are chattering, but he has only himself for company.

Sound familiar?

Spring—crocuses, daffodils, and hyacinths budding; leaves unfurling; and birds warbling—burst upon Cambridge, Massachusetts, last month. The city’s shutdown vied with the season’s vivaciousness. I relieved the tension by rereading The Wind in the Willows, which I’ve read every spring since 2017.

Project Gutenberg offers free access to Kenneth Grahame’s 1908 novel. He wrote the book for children, but never mind that. Many masterpieces of literature happen to have been written for children.

One line in the novel demanded, last year, that I memorize it. On page one, Mole is cleaning his house beneath the Earth’s surface. He’s been dusting and whitewashing for hours when the spring calls to him. Life is pulsating on the ground and in the air above him, and he can’t resist joining the party. Mole throws down his cleaning supplies and tunnels upward through the soil: “he scraped and scratched and scrabbled and scrooged, and then he scrooged again and scrabbled and scratched and scraped.”

The quotation appealed to me not only because of its alliteration and chiasmus. Mole’s journey reminded me of research.

Take a paper that I published last month with Michael Beverland of Microsoft Research and Amir Kalev of the Joint Center for Quantum Information and Computer Science (now of the Information Sciences Institute at the University of Southern California). We translated a discovery from the abstract, mathematical language of quantum-information-theoretic thermodynamics into an experimental proposal. We had to scrabble, but we kept on scrooging.

Over four years ago, other collaborators and I uncovered a thermodynamics problem, as did two other groups at the same time. Thermodynamicists often consider small systems that interact with large environments, like a magnolia flower releasing its perfume into the air. The two systems—magnolia flower and air—exchange things, such as energy and scent particles. The total amount of energy in the flower and the air remains constant, as does the total number of perfume particles. So we call the energy and the perfume-particle number conserved quantities.

We represent quantum conserved quantities with matrices $Q_1$ and $Q_2$. We nearly always assume that, in this thermodynamic problem, those matrices commute with each other: $Q_1 Q_2 = Q_2 Q_1$. Almost no one mentions this assumption; we make it without realizing. Eliminating this assumption invalidates a derivation of the state reached by the small system after a long time. But why assume that the matrices commute? Noncommutation typifies quantum physics and underlies quantum error correction and quantum cryptography.

What if the little system exchanges with the large system thermodynamic quantities represented by matrices that don’t commute with each other?

Colleagues and I began answering this question, four years ago. The small system, we argued, thermalizes to near a quantum state that contains noncommuting matrices. We termed that state, $e^{ - \sum_\alpha \beta_\alpha Q_\alpha } / Z$, the non-Abelian thermal state. The $Q_\alpha$’s represent conserved quantities, and the $\beta_\alpha$’s resemble temperatures. The real number $Z$ ensures that, if you measure any property of the state, you’ll obtain some outcome. Our arguments relied on abstract mathematics, resource theories, and more quantum information theory.
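For concreteness, here is a minimal single-qubit sketch of that mathematical form, with the conserved quantities taken to be the three spin components (Pauli matrices over two) and with made-up values for the $\beta_\alpha$'s:

```python
import numpy as np

# Spin components of a single qubit (Pauli matrices / 2).
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
betas = [0.3, 0.5, 0.7]  # illustrative inverse "temperatures"

# Exponent of the non-Abelian thermal state: -sum_alpha beta_alpha Q_alpha.
exponent = -sum(b * q for b, q in zip(betas, (sx, sy, sz)))

# Matrix exponential of a Hermitian matrix via its eigendecomposition.
vals, vecs = np.linalg.eigh(exponent)
unnormalized = vecs @ np.diag(np.exp(vals)) @ vecs.conj().T
Z = np.trace(unnormalized).real  # normalization constant
rho = unnormalized / Z

print(np.trace(rho).real)       # ~1.0: probabilities sum to one
print(np.linalg.eigvalsh(rho))  # both eigenvalues positive
```

The normalization by $Z$ is what guarantees a unit trace, the statement that measuring any property of the state yields some outcome with total probability one.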

Over the past four years, noncommuting conserved quantities have propagated across quantum-information-theoretic thermodynamics.1 Watching the idea take root has been exhilarating, but the quantum information theory didn’t satisfy me. I wanted to see a real physical system thermalize to near the non-Abelian thermal state.

Michael and Amir joined the mission to propose an experiment. We kept nosing toward a solution, then dislodging a rock that would shower dirt on us and block our path. But we scrabbled onward.

Imagine a line of ions trapped by lasers. Each ion contains the physical manifestation of a qubit—a quantum two-level system, the basic unit of quantum information. You can think of a qubit as having a quantum analogue of angular momentum, called spin. The spin has three components, one per direction of space. These spin components are represented by matrices $Q_x = S_x$, $Q_y = S_y$, and $Q_z = S_z$ that don’t commute with each other.

A couple of qubits can form the small system, analogous to the magnolia flower. The rest of the qubits form the large system, analogous to the air. I constructed a Hamiltonian—a matrix that dictates how the qubits evolve—that transfers quanta of all the spin’s components between the small system and the large. (Experts: The Heisenberg Hamiltonian transfers quanta of all the spin components between two qubits while conserving $S_{x, y, z}^{\rm tot}$.)
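The conservation claim in the parenthetical can be checked numerically on a short chain. The sketch below, assuming an illustrative four-qubit chain, builds the nearest-neighbor Heisenberg Hamiltonian and verifies that it commutes with every total spin component:

```python
import numpy as np

n = 4  # illustrative chain length
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
eye = np.eye(2, dtype=complex)

def embed(op, site):
    """Place a single-qubit operator at `site` in an n-qubit chain."""
    out = np.array([[1.0 + 0j]])
    for i in range(n):
        out = np.kron(out, op if i == site else eye)
    return out

# Heisenberg Hamiltonian: nearest-neighbor S.S couplings along the chain.
H = sum(embed(s, i) @ embed(s, i + 1)
        for i in range(n - 1) for s in (sx, sy, sz))

# Each total spin component commutes with H, so all three are conserved.
for s in (sx, sy, sz):
    total = sum(embed(s, i) for i in range(n))
    commutator = H @ total - total @ H
    print(np.allclose(commutator, 0))  # True for each component
```

The three total spin components are conserved simultaneously even though they fail to commute with one another, which is exactly the noncommuting-charges situation the experiment targets.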

The Hamiltonian led to our first scrape: I constructed an integrable Hamiltonian, by accident. Integrable Hamiltonians can’t thermalize systems. A system thermalizes by losing information about its initial conditions, evolving to a state with an exponential form, such as $e^{ - \sum_\alpha \beta_\alpha Q_\alpha } / Z$. We clawed at the dirt and uncovered a solution: My Hamiltonian coupled together nearest-neighbor qubits. If the Hamiltonian coupled also next-nearest-neighbor qubits, or if the ions formed a 2D or 3D array, the Hamiltonian would be nonintegrable.

We had to scratch at every stage—while formulating the setup, preparation procedure, evolution, measurement, and prediction. But we managed; Physical Review E published our paper last month. We showed how a quantum system can evolve to the non-Abelian thermal state. Trapped ions, ultracold atoms, and quantum dots can realize our experimental proposal. We imported noncommuting conserved quantities in thermodynamics from quantum information theory to condensed matter and atomic, molecular, and optical physics.

As Grahame wrote, the Mole kept “working busily with his little paws and muttering to himself, ‘Up we go! Up we go!’ till at last, pop! his snout came out into the sunlight and he found himself rolling in the warm grass of a great meadow.”

1See our latest paper’s introduction for references. https://journals.aps.org/pre/abstract/10.1103/PhysRevE.101.042117

# A new possibility for quantum networks

It has been roughly a year since Dr Jon Kindem and I finished at Caltech (JK graduating with his PhD and I – JB – graduating from my postdoc to take up a junior faculty position at the University of Sydney). During our three-and-a-half-year overlap in the IQIM we often told each other that we should write something for Quantum Frontiers. As two of the authors of a paper reporting a recent breakthrough for rare-earth ion spin qubits (Nature, 2020), it was now or never. Here we go…

Throughout 2019, telecommunication companies began deploying 5th generation (5G) network infrastructure to allow our wireless communication to be faster, more reliable, and able to cope with greater capacity. This rollout of 5G technology promises to support up to 10x the number of devices operating at speeds 10x faster than what is possible with 4th generation (4G) networks. If you stop and think about the new opportunities 4G networks unlocked for working, shopping, connecting, and more, it is easy to see why some people are excited about the new world 5G networks might offer.

Classical networks like 5G and fiber optic networks (the backbone of the internet) share classical information: streams of bits (zeros and ones) that encode our conversations, tweets, music, podcasts, videos and anything else we communicate through our digital devices. Every improvement in the network hardware (for example an optical switch with less loss or a faster signal router) contributes to big changes in speed and capacity. The bottom line is that with enough advances, the network evolves to the point where things that were previously impossible (like downloading a movie in the late 90s) become instantaneous.

Alongside the hype and advertising around 5G networks, we are part of the world-wide effort to develop a fundamentally different network (with a little less advertising, but similar amounts of hype). Rather than being a bigger, better version of 5G, this new network is trying to build a quantum internet: a set of technologies that will allow us to connect and share information at the quantum level. For an insight into the quantum internet origin story, read this post about the pioneering experiments that took place at Caltech in Prof. Jeff Kimble’s group.

Quantum technologies operate using the counter-intuitive phenomena of quantum mechanics like superposition and entanglement. Quantum networks need to distribute this superposition and entanglement between different locations. This is a much harder task than distributing bits in a regular network because quantum information is extremely susceptible to loss and noise. If realized, this quantum internet could enable powerful quantum computing clusters, and create networks of quantum sensors that measure infinitesimally small fluctuations in their environment.

At this point it is worth asking the question:

Does the world really need a quantum internet?

This is an important question because a quantum internet is unlikely to improve any of the most common uses for the classical internet (internet facts and most popular searches).

We think there are at least three reasons why a quantum network is important:

1. To build better quantum computers. The quantum internet will effectively transform small, isolated quantum processors into one much larger, more powerful computer. This could be a big boost in the race to scale-up quantum computing.
2. To build quantum-encrypted communication networks. The ability of quantum technology to make or break encryption is one of the earliest reasons why quantum technology was funded. A fully-fledged quantum computer should be very efficient at hacking commonly used encryption protocols, while ideal quantum encryption provides the basis for communications secured by the fundamental properties of physics.
3. To push the boundaries of quantum physics and measurement sensitivity by increasing the length scale and complexity of entangled systems. The quantum internet can help turn thought experiments into real experiments.
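To make the second motivation concrete: quantum encryption is usually introduced through the BB84 key-distribution protocol. The sketch below is a toy model written for illustration (the function name and parameters are my own, and no eavesdropper, noise, or error correction is modeled); it simulates only the classical basis-sifting step.

```python
import random

def bb84_sift(n_bits=1000, seed=0):
    """Toy BB84: Alice sends bits encoded in random bases (0 = Z, 1 = X);
    Bob measures in random bases; sifting keeps positions where bases match.
    No eavesdropper, noise, or error correction is modeled."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]
    # Matching bases: Bob reads Alice's bit exactly.
    # Mismatched bases: quantum mechanics makes his outcome a fair coin flip.
    bob_bits = [bit if ab == bb else rng.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    key_a = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

key_a, key_b = bb84_sift()
assert key_a == key_b  # without an eavesdropper, the sifted keys agree
print(len(key_a))      # roughly half of the 1000 transmitted bits survive sifting
```

An eavesdropper who measured in random bases would corrupt about a quarter of the sifted bits, which Alice and Bob can detect by publicly comparing a sample — that detectability is the “secured by the fundamental properties of physics” feature mentioned above.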

The next question is: How do we build a quantum internet?

The starting point for most long-distance quantum network strategies is to base them on the state-of-the-art technology for current classical networks: sending information using light. (But that doesn’t rule out microwave networks for local area networks, as recent work from ETH Zurich has shown).

The technology that drives quantum networks is a set of interfaces that connect matter systems (like atoms) to photons at a quantum level. These interfaces need to efficiently exchange quantum information between matter and light, and the matter part needs to be able to store the information for a time that is much longer than the time it takes for the light to get to its destination in the network. We also need to be able to entangle the quantum matter systems to connect network links, and to process quantum information for error correction. This is a significant challenge that requires novel materials and unparalleled control of light to ultimately succeed.

State-of-the-art quantum networks are still elementary links compared to the complexity and scale of modern telecommunication. One of the most advanced platforms that has demonstrated a quantum network link consists of two atomic defects in diamonds separated by 1.3 km. The defects act as the quantum light-matter interface allowing quantum information to be shared between the two remote devices. But these defects in diamond currently have limitations that prohibit the expansion of such a network. The central challenge is finding defects/emitters that are stable and robust to environmental fluctuations, while simultaneously efficiently connecting with light. While these emitters don’t have to be in solids, the allure of a scalable solid-state fabrication process akin to today’s semiconductor industry for integrated circuits is very appealing. This has motivated the research and development of a range of quantum light-matter interfaces in solids (for example, see recent work by Harvard researchers) with the goal of meeting the simultaneous goals of efficiency and stability.

At Caltech, we were part of Prof. Andrei Faraon’s research group, which put forward an appealing alternative to other solid-state technologies. The team uses rare-earth atoms embedded in crystals commonly used for lasers. JK joined as the group’s 3rd graduate student in 2013, while I joined as a postdoc in 2016.

Rare-earth atoms have long been of interest for quantum technologies such as quantum memories for light because they are very stable and are excellent at preserving quantum information. But compared to other emitters, they only interact very weakly with light, which means that one usually needs large crystals with billions of atoms all working in harmony to make useful quantum interfaces. To overcome this problem, research in the Faraon group pioneered coupling these ions to nanoscale optical cavities like these ones:

These microscopic Toblerone-like structures are fabricated directly in the crystal that plays host to the rare-earth atoms. The periodic patterning acts like two mirrors that form an optical cavity to confine light, which enhances the connection between the light and the rare-earth atoms. In 2017, our group showed that the improved optical interaction in these cavities can be used to shrink optical quantum memories by orders of magnitude compared with previous demonstrations, and to manufacture them on-chip.
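The enhancement such a cavity provides can be estimated from the textbook Purcell factor, $F = \frac{3}{4\pi^2}\left(\frac{\lambda}{n}\right)^3 \frac{Q}{V}$. The back-of-the-envelope sketch below uses illustrative assumptions typical of photonic-crystal cavities, not numbers from the group’s papers.

```python
import math

def purcell_factor(Q, V_in_cubic_wavelengths):
    """Purcell factor F = (3 / 4 pi^2) * (lambda/n)^3 * Q / V.  With the mode
    volume V expressed in units of (lambda/n)^3, the wavelength cancels."""
    return 3.0 * Q / (4.0 * math.pi**2 * V_in_cubic_wavelengths)

# Illustrative assumptions (not values from the papers): Q ~ 10^4 and a
# mode volume near one cubic wavelength, typical of photonic-crystal cavities.
print(round(purcell_factor(Q=1e4, V_in_cubic_wavelengths=1.0)))  # → 760
```

A several-hundred-fold enhancement of the emission rate is what turns a weakly emitting rare-earth atom into a usable quantum light-matter interface.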

We have used this nanophotonic platform to open up new avenues for quantum networks based on single rare-earth atoms, a task that previously was exceptionally challenging because these atoms have very low brightness. We have worked with both neodymium and ytterbium atoms embedded in a commercially available laser crystal.

Ytterbium looks particularly promising. Working with Prof. Rufus Cone’s group at Montana State University, we showed that these ytterbium atoms absorb and emit light better than most other rare-earth atoms and that they can store quantum information long enough for extended networks (>10 ms) when cooled down to a few Kelvin (-272 degrees Celsius) [Kindem et al., Physical Review B, 98, 024404 (2018) – link to arXiv version].

By using the nanocavity to improve the brightness of these ytterbium atoms, we have now been able to identify and investigate their properties at the single atom level. We can precisely control the quantum state of the single atoms and measure them with high fidelity – both prerequisites for using these atoms in quantum information technologies. When combined with the long quantum information storage times, our work demonstrates important steps to using this system in a quantum network.

The next milestone is forming an optical link between two individual rare-earth atoms to build an elementary quantum network. This goal is in our sights and we are already working on optimizing the light-matter interface stability and efficiency. A more ambitious milestone is to provide interconnects for other types of qubits – such as superconducting qubits – to join the network. This requires a quantum transducer to convert between microwave signals and light. Rare-earth atoms are promising for transducer technologies (see recent work from the Faraon group), as are a number of other hybrid quantum systems (for example, optomechanical devices like the ones developed in the Painter group at Caltech).

It took roughly 50 years from the first message sent over ARPANET to the rollout of 5G technology.

So, when are we going to see the quantum internet?

The technology and expertise needed to build quantum links between cities are developing rapidly with impressive progress made even between 2018 and 2020. Basic quantum network capabilities will likely be up and running in the next decade, which will be an exciting time for breakthroughs in fundamental and applied quantum science. Using single rare-earth atoms is relatively new, but this technology is also advancing quickly (for example, our ytterbium material was largely unstudied just three years ago). Importantly, the discovery of new materials will continue to be important to push quantum technologies forward.

You can read more about this work in this summary article and this synopsis written by lead author JK (Caltech PhD 2019), or dive into the full paper published in Nature.

J. M. Kindem, A. Ruskuc, J. G. Bartholomew, J. Rochman, Y.-Q. Huan, and A. Faraon. Control and single-shot readout of an ion embedded in a nanophotonic cavity. Nature (2020).

Now is an especially exciting time for our field with the Thompson Lab at Princeton publishing a related paper on single rare-earth atom quantum state detection, in their case using erbium. Check out their article here.

# Sense, sensibility, and superconductors

Jonathan Monroe disagreed with his PhD supervisor—with respect. They needed to measure a superconducting qubit, a tiny circuit in which current can flow forever. The qubit emits light, which carries information about the qubit’s state. Jonathan and his supervisor, Kater Murch, intensify the light using an amplifier. They’d fabricated many amplifiers, but none had worked. Jonathan suggested changing their strategy—with a politeness to which Emily Post couldn’t have objected. Kater suggested repeating the protocol they’d performed many times.

“That’s the definition of insanity,” Kater admitted, “but I think experiment needs to involve some of that.”

I watched the exchange via Skype, with more interest than I’d have watched the Oscars with. Someday, I hope, I’ll be able to weigh in on such a debate, despite working as a theorist. Someday, I’ll have partnered with enough experimentalists to develop insight.

I’m partnering with Jonathan and Kater on an experiment that coauthors and I proposed in a paper blogged about here. The experiment centers on an uncertainty relation, an inequality of the sort immortalized by Werner Heisenberg in 1927. Uncertainty relations imply that, if you measure a quantum particle’s position, the particle’s momentum ceases to have a well-defined value. If you measure the momentum, the particle ceases to have a well-defined position. Our uncertainty relation involves weak measurements. Weakly measuring a particle’s position doesn’t disturb the momentum much and vice versa. We can interpret the uncertainty in information-processing terms, because we cast the inequality in terms of entropies. Entropies, described here, are functions that quantify how efficiently we can process information, such as by compressing data. Jonathan and Kater are checking our inequality, and exploring its implications, with a superconducting qubit.
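The entropy framing can be illustrated with something simpler than our weak-measurement inequality: the textbook Maassen–Uffink relation. For a qubit measured in the mutually unbiased $Z$ and $X$ bases, the measurement entropies obey $H(Z) + H(X) \geq 1$ bit for every state. A quick numerical check (illustrative only; the function names are my own):

```python
import math

def shannon(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def entropies_for_qubit(theta):
    """Entropies of Z- and X-basis measurements of
    |psi> = cos(theta)|0> + sin(theta)|1>."""
    pz = [math.cos(theta)**2, math.sin(theta)**2]
    amp_plus  = (math.cos(theta) + math.sin(theta)) / math.sqrt(2)
    amp_minus = (math.cos(theta) - math.sin(theta)) / math.sqrt(2)
    px = [amp_plus**2, amp_minus**2]
    return shannon(pz), shannon(px)

# Maassen-Uffink bound for mutually unbiased qubit bases: H(Z) + H(X) >= 1 bit.
# A state sharp in one basis (theta = 0 or pi/4) maximizes uncertainty in the other.
for theta in [0.0, 0.3, math.pi / 4]:
    hz, hx = entropies_for_qubit(theta)
    assert hz + hx >= 1.0 - 1e-9
```

Sharpening your knowledge of one observable (driving its entropy to zero) forces the conjugate observable’s entropy up — the information-processing reading of Heisenberg’s idea.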

I had too little experience to side with Jonathan or with Kater. So I watched, and I contemplated how their opinions would sound if expressed about theory. Do I try one strategy again and again, hoping to change my results without changing my approach?

At the Perimeter Institute for Theoretical Physics, master’s students had to swallow half-a-year of course material in weeks. I questioned whether I’d ever understand some of the material. But some of that material resurfaced during my PhD. Again, I attended lectures about Einstein’s theory of general relativity. Again, I worked problems about observers in free-fall. Again, I calculated covariant derivatives. The material sank in. I decided never to question, again, whether I could understand a concept. I might not understand a concept today, or tomorrow, or next week. But if I dedicate enough time and effort, I chose to believe, I’ll learn.

My decision rested on experience and on classes, taught by educational psychologists, that I’d taken in college. I’d studied how brains change during learning and how breaks enhance the changes. Sense, I thought, underlay my decision—though expecting outcomes to change, while strategies remain static, sounds insane.

Does sense underlie Kater’s suggestion, likened to insanity, to keep fabricating amplifiers as before? He’s expressed cynicism many times during our collaboration: Experiment needs to involve some insanity. The experiment probably won’t work for a long time. Plenty more things will likely break.

Jonathan and I agree with him. Experiments have a reputation for breaking, and Kater has a reputation for knowing experiments. Yet Jonathan—with professionalism and politeness—remains optimistic that other methods will prevail, that we’ll meet our goals early. I hope that Jonathan remains optimistic, and I fancy that Kater hopes, too. He prophesies gloom with a quarter of a smile, and his record speaks against him: A few months ago, I met a theorist who’d collaborated with Kater years before. The theorist marveled at the speed with which Kater had operated. A theorist would propose an experiment, and boom—the proposal would work.

Perhaps luck smiled upon the implementation. But luck dovetails with the sense that underlies Kater’s opinion: Experiments involve factors that you can’t control. Implement a protocol once, and it might fail because the temperature has risen too high. Implement the protocol again, and it might fail because a truck drove by your building, vibrating the tabletop. Implement the protocol again, and it might fail because you bumped into a knob. Implement the protocol a fourth time, and it might succeed. If you repeat a protocol many times, your environment might change, changing your results.

Sense underlies also Jonathan’s objections to Kater’s opinions. We boost our chances of succeeding if we keep trying. We derive energy to keep trying from creativity and optimism. So rebelling against our PhD supervisors’ sense is sensible. I wondered, watching the Skype conversation, whether Kater the student had objected to prophesies of doom as Jonathan did. Kater exudes the soberness of a tenured professor but the irreverence of a Californian who wears his hair slightly long and who tattooed his wedding band on. Science thrives on the soberness and the irreverence.

Who won Jonathan and Kater’s argument? Both, I think. Last week, they reported having fabricated amplifiers that work. The lab followed a protocol similar to their old one, but with more conscientiousness.

I’m looking forward to watching who wins the debate about how long the rest of the experiment takes. Either way, check out Jonathan’s talk about our experiment if you attend the American Physical Society’s March Meeting. Jonathan will speak on Thursday, March 5, at 12:03, in room 106. Also, keep an eye out for our paper—which will debut once Jonathan coaxes the amplifier into synching with his qubit.

# Breaking up the band structure

Note from the editor: During the summer of 2019, a group of thirteen undergraduate students from Caltech and universities around the world spent 10 weeks on campus performing research in experimental quantum physics. Below, Aiden Cullo, a student from Binghamton University in New York, shares his experience working in Professor Yeh’s lab. The program, termed QuantumSURF, will run again during the summer of 2020.

This summer, I worked in Nai-Chang Yeh’s experimental condensed matter lab. The aim of my project was to observe the effects of a magnetic field on our topological insulator (TI) sample, ${(BiSb)}_2{Te}_3$. The motivation behind this project was to examine more closely the transformation between a topological insulator and a state exhibiting the anomalous Hall effect (AHE).

Both states of matter have garnered a good deal of interest in condensed matter research because of their interesting transport properties, among other things. TIs have gained popularity due to their applications in electronics (spintronics), superconductivity, and quantum computation. TIs are peculiar in that they simultaneously have insulating bulk states and conducting surface states. Due to time-reversal symmetry (TRS) and spin-momentum locking, these surface states have a very symmetric hourglass-like gapless energy band structure (Dirac cone).

The focus of our particular study was the effects of “c-plane” magnetization of our TI’s surface state. Theory predicts TRS and spin-momentum locking will be broken, resulting in a gapped spectrum with a single connection between the valence and conduction bands. This gapping has been theorized and shown experimentally in Chromium (Cr)-doped ${(BiSb)}_2{Te}_3$ and numerous other TIs with similar make-up.

In 2014, Nai-Chang Yeh’s group showed that Cr-doped ${Bi}_2{Se}_3$ exhibits this gap opening due to the surface state of ${Bi}_2{Se}_3$ interacting via the proximity effect with a ferromagnet. Our contention is that a similar material, Cr-doped ${(BiSb)}_2{Te}_3$, exhibits a similar effect, but more homogeneously because of reduced structural strain between atoms. Specifically, at temperatures below the Curie temperature (Tc), we expect to see a gap in the energy band and an overall increase in the gap magnitude. In short, the main goal of my summer project was to observe the gapping of our TI’s energy band.

Overall, my summer project entailed a combination of reading papers/textbooks and hands-on experimental work. It was difficult to understand fully the theory behind my project in such a short amount of time, but even with a cursory knowledge of topological insulators, I was able to provide a meaningful analysis/interpretation of our data.

Additionally, my experiment relied heavily on external factors such as our supplier for liquid helium, argon gas, etc. As a result, our progress was slowed if an order was delayed or not placed far enough in advance. Most of the issues we encountered were not related to the abstract theory of the materials/machinery, but rather problems with less complex mechanisms such as wiring, insulation, and temperature regulation.

While I expected to spend a good deal of time troubleshooting, I severely underestimated the amount of time that would be spent dealing with quotidian problems such as configuring software or etching STM tips. Working on a machine as powerful as an STM was frustrating at times, but also very rewarding as eventually we were able to collect a large amount of data on our samples.

An important (and extremely difficult) part of our analysis of STM data was determining whether patterns/features in our data set were artifacts or genuine phenomena, or a combination. I was fortunate enough to be surrounded by other researchers that helped me sift through the volumes of data and identify traits of our samples. Reflecting on my SURF, I believe it was a positive experience as it not only taught me a great deal about research, but also, more importantly, closely mimicked the experience of graduate school.

# “A theorist I can actually talk with”

Haunted mansions have ghosts, football teams have mascots, and labs have in-house theorists. I found myself posing as a lab’s theorist at Caltech. The gig began when Oskar Painter, a Caltech experimentalist, emailed that he’d read my first paper about quantum chaos. Would I discuss the paper with the group?

Oskar’s lab was building superconducting qubits, tiny circuits in which charge can flow forever. The lab aimed to control scores of qubits, to develop a quantum many-body system. Entanglement—strong correlations that quantum systems can sustain and everyday systems can’t—would spread throughout the qubits. The system could realize phases of matter—like many-particle quantum chaos—off-limits to most materials.

How could Oskar’s lab characterize the entanglement, the entanglement’s spread, and the phases? Expert readers will suggest measuring an entropy, a gauge of how much information this part of the system holds about that part. But experimentalists have had trouble measuring entropies. Besides, one measurement can’t capture many-body entanglement; such entanglement involves too many intricacies. Oskar was searching for arrows to add to his lab’s measurement quiver.

In-house theorist?

I’d proposed a protocol for measuring a characterization of many-body entanglement, quantum chaos, and thermalization—a property called “the out-of-time-ordered correlator.” The protocol appealed to Oskar. But practicalities limit quantum many-body experiments: The more qubits your system contains, the more the system can contact its environment, like stray particles. The stronger the interactions, the more the environment entangles with the qubits, and the less the qubits entangle with each other. Quantum information leaks from the qubits into their surroundings; what happens in Vegas doesn’t stay in Vegas. Would imperfections mar my protocol?

I didn’t know. But I knew someone who could help us find out.

Justin Dressel works at Chapman University as a physics professor. He’s received the highest praise that I’ve heard any experimentalist give a theorist: “He’s a theorist I can actually talk to.” With other collaborators, Justin and I simplified my scheme for measuring out-of-time-ordered correlators. Justin knew what superconducting-qubit experimentalists could achieve, and he’d been helping them reach for more.

How about, I asked Justin, we simulate our protocol on a computer? We’d code up virtual superconducting qubits, program in interactions with the environment, run our measurement scheme, and assess the results’ noisiness. Justin had the tools to simulate the qubits, but he lacked the time.

Know any postdocs or students who’d take an interest? I asked.

Chapman University’s former science center. Don’t you wish you spent winters in California?

José Raúl González Alonso has a smile like a welcome sign and a coffee cup glued to one hand. He was moving to Chapman University to work as a Grand Challenges Postdoctoral Fellow. José had built simulations, and he jumped at the chance to study quantum chaos.

José confirmed Oskar’s fear and other simulators’ findings: The environment threatens measurements of the out-of-time-ordered correlator. Suppose that you measure this correlator at each of many instants, you plot the correlator against time, and you see the correlator drop. If you’ve isolated your qubits from their environment, you can expect them to carry many-body entanglement. Golden. But the correlator can drop if, instead, the environment is harassing your qubits. You can misdiagnose leaking as many-body entanglement.

Our triumvirate identified a solution. Justin and I had discovered another characterization of quantum chaos and many-body entanglement: a quasiprobability, a quantum generalization of a probability.

The quasiprobability contains more information about the entanglement than the out-of-time-ordered-correlator does. José simulated measurements of the quasiprobability. The quasiprobability, he found, behaves one way when the qubits entangle independently of their environment and behaves another way when the qubits leak. You can measure the quasiprobability to decide whether to trust your out-of-time-ordered-correlator measurement or to isolate your qubits better. The quasiprobability enables us to avoid false positives.

Physical Review Letters published our paper last month. Working with Justin and José deepened my appetite for translating between the abstract and the concrete, for proving abstractions as a theorist’s theorist and realizing them experimentally as a lab’s theorist. Maybe, someday, I’ll earn the tag “a theorist I can actually talk with” from an experimentalist. For now, at least I serve better than a football-team mascot.

# Humans can intuit quantum physics.

One evening this January, audience members packed into a lecture hall in MIT’s physics building. Undergraduates, members of the public, faculty members, and other scholars came to watch a film premiere and a panel discussion. NOVA had produced the film, “Einstein’s Quantum Riddle,” which stars entanglement. Entanglement is a relationship between quantum systems such as electrons. Measuring two entangled electrons yields two outcomes, analogous to the numbers that face upward after you roll two dice. The quantum measurements’ outcomes can exhibit correlations stronger than any measurements of any classical, or nonquantum, systems can. Which die faces point upward can share only so much correlation, even if the dice hit each other.

Dice feature in the film’s explanations of entanglement. So does a variation on the shell game, in which one hides a ball under one of three cups, shuffles the cups, and challenges viewers to guess which cup is hiding the ball. The film derives its drama from the Cosmic Bell test. Bell tests are experiments crafted to show that classical physics can’t describe entanglement. Scientists recently enhanced Bell tests using light from quasars—ancient, bright, faraway galaxies. Mix astrophysics with quantum physics, and an edgy, pulsing soundtrack follows.
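The strength of those correlations is usually quantified with the CHSH combination of correlators: any classical (local-hidden-variable) model obeys $|S| \leq 2$, while entangled qubits reach $2\sqrt{2}$. A quick check below uses the standard singlet-state correlation $E(x, y) = -\cos(x - y)$, a textbook result rather than anything specific to the Cosmic Bell test.

```python
import math

def chsh_quantum(a, ap, b, bp):
    """CHSH value |S| for a singlet state, where the correlation between
    Alice's measurement angle x and Bob's angle y is E(x, y) = -cos(x - y)."""
    E = lambda x, y: -math.cos(x - y)
    return abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))

# Optimal angles: Alice measures at 0 and pi/2, Bob at pi/4 and 3*pi/4.
# The result, 2*sqrt(2) ~ 2.828, exceeds the classical limit of 2
# (Tsirelson's bound: no quantum state can do better).
S = chsh_quantum(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(S)
```

A Bell test is, at heart, an experiment that measures $S$ and checks whether it lands above 2.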

The Cosmic Bell test grew from a proposal by physicists at MIT and the University of Chicago. The coauthors include David Kaiser, a historian of science and a physicist on MIT’s faculty. Dave co-organized the premiere and the panel discussion that followed. The panel featured Dave; Paola Cappellaro, an MIT quantum experimentalist; Alan Guth, an MIT cosmologist who contributed to the Bell test; Calvin Leung, an MIT PhD student who contributed; Chris Schmidt, the film’s producer; and me. Brindha Muniappan, the Director of Education and Public Programs at the MIT Museum, moderated the discussion.

I think that the other panelists were laughing with me.

Brindha asked what challenges I face when explaining quantum physics, such as on this blog. Quantum theory wears the labels “weird,” “counterintuitive,” and “bizarre” in journalism, interviews, blogs, and films. But the thorn in my communicational side reflects quantum “weirdness” less than it reflects humanity’s self-limitation: Many people believe that we can’t grasp quantum physics. They shut down before asking me to explain.

Examples include a friend and Quantum Frontiers follower who asks, year after year, for books about quantum physics. I suggest literature—much by Dave Kaiser—he reads some, and we discuss his impressions. He’s learning, he harbors enough curiosity to have maintained this routine for years, and he has technical experience as a programmer. But he’s demurred, several times, along the lines of “But…I don’t know. I don’t think I’ll ever understand it. Humans can’t understand quantum physics, can we? It’s too weird.”

Quantum physics defies many expectations sourced from classical physics. Classical physics governs how basketballs arch, how paint dries, how sunlight slants through your window, and other everyday experiences. Yet we can gain intuition about quantum physics. If we couldn’t, how could we solve problems and accomplish research? Physicists often begin solving problems by trying to guess the answer from intuition. We reason our way toward a guess by stripping away complications, constructing toy models, and telling stories. We tell stories about particles hopping from site to site on lattices, particles trapped in wells, and arrows flipping upward and downward. These stories don’t capture all of quantum physics, but they capture the essentials. After grasping the essentials, we translate them into math, check how far our guesses lie from truth, and correct our understanding. Intuition about quantum physics forms the compass that guides problem solving.
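As a minimal instance of those hopping stories: a particle tunneling between two lattice sites with amplitude $J$ oscillates coherently, so the probability of finding it on its starting site is $\cos^2(Jt)$, a standard two-level-system result (the function name below is my own).

```python
import math

def prob_on_start_site(J, t):
    """Probability that a particle hopping between two lattice sites
    with amplitude J is found on its starting site at time t."""
    return math.cos(J * t) ** 2

# The particle starts on site 0 (p = 1), hops fully to site 1 (p ~ 0)
# at t = pi/(2J), then returns (p = 1) at t = pi/J.
J = 1.0
for t in [0.0, math.pi / (2 * J), math.pi / J]:
    print(round(prob_on_start_site(J, t), 6))
```

Chains of many sites, particles in wells, and flipping arrows build on exactly this kind of toy model.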

Growing able to construct, use, and mathematize such stories requires work. You won’t come to understand quantum theory by watching NOVA films, though films can prime you for study. You can gain a facility with quantum theory through classes, problem sets, testing, research, seminars, and further processing. You might not have the time or inclination to. Even if you have, you might not come to understand why quantum theory describes our universe: Science can’t necessarily answer all “why” questions. But you can grasp what quantum theory implies about our universe.

People grasp physics arguably more exotic than quantum theory, without exciting the disbelief excited by a grasp of quantum theory. Consider the Voyager spacecraft launched in 1977. Voyager has survived solar winds and -452º F weather, imaged planets, and entered interstellar space. Classical physics—the physics of how basketballs arch—describes much of Voyager’s experience. But even if you’ve shot baskets, how much intuition do you have about interstellar space? I know physicists who claim to have more intuition about quantum physics than about much of classical physics. When astrophysicists discuss Voyager and interstellar space, moreover, listeners don’t fret that comprehension lies beyond them. No one need fret when quantum physicists discuss the electrons in us.

Fretting might not occur to future generations: Outreach teams are introducing kids to quantum physics through games and videos. Caltech’s Institute for Quantum Information and Matter has partnered with Google to produce qCraft, a quantum variation on Minecraft, and with the University of Southern California on quantum chess. In 2017, the American Physical Society’s largest annual conference featured a session called “Gamification and other Novel Approaches in Quantum Physics Outreach.” Such outreach exposes kids to quantum terminology and concepts early. Quantum theory becomes a playground to explore, rather than a source of intimidation. Players will grow up primed to think about quantum-mechanics courses not “Will my grade-point average survive this semester?” but “Ah, so this is the math under the hood of entanglement.”

Sociology, not physics, leads people to think quantum physics weird. But quantum theory defies classical expectations less than it could. Measurement outcomes could share correlations stronger than the correlations sourced by entanglement. How strong could the correlations grow? How else could physics depart farther from classical physics than quantum physics does? Imagine the worlds governed by all possible types of physics, called “generalized probabilistic theories” (GPTs). GPTs form a landscape in which quantum theory constitutes an island, on which classical physics constitutes a hill. Compared with the landscape’s outskirts, our quantum world looks tame.

GPTs fall under the research category of quantum foundations. Quantum foundations concerns why the math that describes quantum systems describes quantum systems, reformulations of quantum theory, how quantum theory differs from classical mechanics, how quantum theory could deviate but doesn’t, and what happens during measurements of quantum systems. Though questions about quantum foundations remain, they don’t block us from intuiting about quantum theory. A stable owner can sense when a horse has colic despite lacking a veterinary degree.

Moreover, quantum-foundations research has advanced over the past few decades. Collaborations and tools have helped: Theorists have been partnering with experimentalists, such as on the Cosmic Bell test and on studies of measurement. Information theory has engendered mathematical tools for quantifying entanglement and other quantum phenomena. Information theory has also firmed up an approach called “operationalism.” Operationalists emphasize preparation procedures, evolutions, and measurements. Focusing on actions and data concretizes arguments and facilitates comparisons with experiments. As quantum-foundations research has advanced, so have quantum information theory, quantum experiments, quantum technologies, and interdisciplinary cross-pollination. Twentieth-century quantum physicists didn’t imagine the community, perspectives, and knowledge that we’ve accrued. So don’t adopt 20th-century pessimism about understanding quantum theory. Einstein grasped much, but today’s scientific community grasps more. Richard Feynman said, “I think I can safely say that nobody understands quantum mechanics.” Feynman helped spur the quantum-information revolution; he died before its adolescence. Besides, Feynman understood plenty about quantum theory. Intuition jumps off the pages of his lecture notes and speeches.

Landscape beyond quantum theory

I’ve swum in oceans and lakes, studied how the moon generates tides, and canoed. But piloting a steamboat along the Mississippi would baffle me. I could learn, given time, instruction, and practice; so can you learn quantum theory. Don’t let “weirdness,” “bizarreness,” or “counterintuitiveness” intimidate you. Humans can intuit quantum physics.