About Nicole Yunger Halpern

I'm a theoretical physicist and an ITAMP Postdoctoral Fellow at the Harvard-Smithsonian Institute for Theoretical Atomic, Molecular, and Optical Physics (ITAMP). Before moving here, I completed a physics PhD at Caltech's Institute for Quantum Information and Matter, under John Preskill's auspices. I write one article per month for Quantum Frontiers. My research consists of what I call "quantum steampunk" (https://quantumfrontiers.com/2018/07/29/so-long-and-thanks-for-all-the-fourier-transforms/): I re-envision 19th-century thermodynamics with 21st-century quantum information theory, and I use the combination as a new lens through which to view various fields of science. As of September 2021, I'll be a physicist at the National Institute of Standards and Technology (NIST), a QuICS Fellow, a JQI affiliate, and an adjunct assistant professor at the Department of Physics and the Institute for Physical Science and Technology at the University of Maryland. For information about those acronyms, see my research group's webpage (https://quantumsteampunk.umiacs.io).

Cutting the quantum mustard

I had a relative to whom my parents referred, when I was little, as “that great-aunt of yours who walked into a glass door at your cousin’s birthday party.” I was a small child in a large family that mostly lived far away; little else distinguished this great-aunt from other relatives, in my experience. She’d intended to walk from my grandmother’s family room to the back patio. A glass door stood in the way, but she didn’t see it. So my great-aunt whammed into the glass; spent part of the party on the couch, nursing a nosebleed; and earned the epithet via which I identified her for years.

After growing up, I came to know this great-aunt as a kind, gentle woman who adored her family and was adored in return. After growing into a physicist, I came to appreciate her as one of my earliest instructors in necessary and sufficient conditions.

My great-aunt’s intended path satisfied one condition necessary for her to reach the patio: Nothing visible obstructed the path. But the path failed to satisfy a sufficient condition: The invisible obstruction—the glass door—had been neither slid nor swung open. Sufficient conditions, my great-aunt taught me, mustn’t be overlooked.

Her lesson underlies a paper I published this month, with coauthors from the Cambridge other than mine—Cambridge, England: David Arvidsson-Shukur and Jacob Chevalier Drori. The paper concerns, rather than patios and glass doors, quasiprobabilities, which I’ve blogged about many times [1,2,3,4,5,6,7]. Quasiprobabilities are quantum generalizations of probabilities. Probabilities describe everyday, classical phenomena, from Monopoly to March Madness to the weather in Massachusetts (and especially the weather in Massachusetts). Probabilities are real numbers (not dependent on the square-root of -1); they’re at least zero; and they compose in certain ways (the probability of sun or hail equals the probability of sun plus the probability of hail). Also, the probabilities that form a distribution, or a complete set, sum to one (if there’s a 70% chance of rain, there’s a 30% chance of no rain).

In contrast, quasiprobabilities can be negative and nonreal. We call such values nonclassical, as they’re unavailable to the probabilities that describe classical phenomena. Quasiprobabilities represent quantum states: Imagine some clump of particles in a quantum state described by some quasiprobability distribution. We can imagine measuring the clump however we please. We can calculate the possible outcomes’ probabilities from the quasiprobability distribution.

Not from my grandmother’s house, although I wouldn’t mind if it were.

My favorite quasiprobability is an obscure fellow unbeknownst even to most quantum physicists: the Kirkwood-Dirac distribution. John Kirkwood defined it in 1933, and Paul Dirac defined it independently in 1945. Then, quantum physicists forgot about it for decades. But the quasiprobability has undergone a renaissance over the past few years: Experimentalists have measured it to infer particles’ quantum states in a new way. Also, colleagues and I have generalized the quasiprobability and discovered applications of the generalization across quantum physics, from quantum chaos to metrology (the study of how we can best measure things) to quantum thermodynamics to the foundations of quantum theory.
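
For readers who like to tinker, a Kirkwood-Dirac distribution is simple to compute. Below is a minimal sketch in Python with NumPy; the qubit state and the two measurement bases are illustrative choices of mine, not taken from any experiment.

```python
import numpy as np

# Eigenbases of two incompatible observables: sigma_z and sigma_x
z_basis = [np.array([1, 0], complex), np.array([0, 1], complex)]
x_basis = [np.array([1, 1], complex) / np.sqrt(2),
           np.array([1, -1], complex) / np.sqrt(2)]

# An illustrative state that commutes with neither basis:
# the +1 eigenstate of sigma_y
psi = np.array([1, 1j], complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Kirkwood-Dirac quasiprobability: Q(a, b) = <b|a><a|rho|b>
Q = np.array([[(b.conj() @ a) * (a.conj() @ rho @ b)
               for b in x_basis] for a in z_basis])

print(Q)        # entries such as 0.25-0.25j are nonreal: nonclassical
print(Q.sum())  # yet the entries sum to 1, like a probability distribution
```

The nonreal entries signal nonclassicality; summing the distribution over either index recovers the ordinary outcome probabilities for the corresponding basis.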

In some applications, nonclassical quasiprobabilities enable a system to achieve a quantum advantage—to usefully behave in a manner impossible for classical systems. Examples include metrology: Imagine wanting to measure a parameter that characterizes some piece of equipment. You’ll perform many trials of an experiment. In each trial, you’ll prepare a system (for instance, a photon) in some quantum state, send it through the equipment, and measure one or more observables of the system. Say that you follow the protocol described in this blog post. A Kirkwood-Dirac quasiprobability distribution describes the experiment.1 From each trial, you’ll obtain information about the unknown parameter. How much information can you obtain, on average over trials? Potentially more information if some quasiprobabilities are negative than if none are. The quasiprobabilities can be negative only if the state and observables fail to commute with each other. So noncommutation—a hallmark of quantum physics—underlies exceptional metrological results, as shown by Kirkwood-Dirac quasiprobabilities.

Exceptional results are useful, and we might aim to design experiments that achieve them. We can do so by designing experiments described by nonclassical Kirkwood-Dirac quasiprobabilities. When can the quasiprobabilities become nonclassical? Whenever the relevant quantum state and observables fail to commute, the quantum community used to believe. This belief turns out to mirror the expectation that one could access my grandmother’s back patio from the family room whenever no visible barriers obstructed the path. As a lack of visible barriers was necessary for patio access, noncommutation is necessary for Kirkwood-Dirac nonclassicality. But noncommutation doesn’t suffice, according to my paper with David and Jacob. We identified a sufficient condition, sliding back the metaphorical glass door on Kirkwood-Dirac nonclassicality. The condition depends on simple properties of the system, state, and observables. (Experts: Examples include the Hilbert space’s dimensionality.) We also quantified and upper-bounded the amount of nonclassicality that a Kirkwood-Dirac quasiprobability can contain.

From an engineering perspective, our results can inform the design of experiments intended to achieve certain quantum advantages. From a foundational perspective, the results help illuminate the sources of certain quantum advantages. To achieve certain advantages, noncommutation doesn’t cut the mustard—but we now know a condition that does.

For another take on our paper, check out this news article in Physics Today.  

1Really, a generalized Kirkwood-Dirac quasiprobability. But that phrase contains a horrendous number of syllables, so I’ll elide the “generalized.”

Learning about learning

The autumn of my sophomore year of college was mildly hellish. I took the equivalent of three semester-long computer-science and physics courses, atop other classwork; co-led a public-speaking self-help group; and coordinated a celebrity visit to campus. I lived at my desk and in office hours, always declining my flatmates’ invitations to watch The West Wing.

Hard as I studied, my classmates enjoyed greater facility with the computer-science curriculum. They saw immediately how long an algorithm would run, while I hesitated and then computed the run time step by step. I felt behind. So I protested when my professor said, “You’re good at this.” 

I now see that we were focusing on different facets of learning. I rued my lack of intuition. My classmates had gained intuition by exploring computer science in high school, then slow-cooking their experiences on a mental back burner. Their long-term exposure to the material provided familiarity—the ability to recognize a new problem as belonging to a class they’d seen examples of. I was cooking course material in a mental microwave set on “high,” as a semester’s worth of material was crammed into ten weeks at my college.

My professor wasn’t measuring my intuition. He only saw that I knew how to compute an algorithm’s run time. I’d learned the material required of me—more than I realized, being distracted by what I hadn’t learned that difficult autumn.

We can learn a staggering amount when pushed far from our comfort zones—and not only we humans can. So can simple collections of particles.

Examples include a classical spin glass. A spin glass is a collection of particles that shares some properties with a magnet. Both a magnet and a spin glass consist of tiny mini-magnets called spins. Although I’ve blogged about quantum spins before, I’ll focus on classical spins here. We can imagine a classical spin as a little arrow that points upward or downward.  A bunch of spins can form a material. If the spins tend to point in the same direction, the material may be a magnet of the sort that’s sticking the faded photo of Fluffy to your fridge.

The spins may interact with each other, similarly to how electrons interact with each other. Not entirely similarly, though—electrons push each other away. In contrast, a spin may coax its neighbors into aligning or anti-aligning with it. Suppose that the interactions are random: Any given spin may force one neighbor into alignment, gently ask another neighbor to align, entreat a third neighbor to anti-align, and have nothing to say to neighbors four and five.

The spin glass can interact with the external world in two ways. First, we can stick the spins in a magnetic field, as by placing magnets above and below the glass. If aligned with the field, a spin has negative energy; and, if antialigned, positive energy. We can sculpt the field so that it varies across the spin glass. For instance, spin 1 can experience a strong upward-pointing field, while spin 2 experiences a weak downward-pointing field.

Second, say that the spins occupy a fixed-temperature environment, as I occupy a 74-degree-Fahrenheit living room. The spins can exchange heat with the environment. If releasing heat to the environment, a spin flips from having positive energy to having negative—from antialigning with the field to aligning.

Let’s perform an experiment on the spins. First, we design a magnetic field using random numbers. Whether the field points upward or downward at any given spin is random, as is the strength of the field experienced by each spin. We sculpt three of these random fields and call the trio a drive.

Let’s randomly select a field from the drive and apply it to the spin glass for a while; again, randomly select a field from the drive and apply it; and continue many times. The energy absorbed by the spins from the fields spikes, then declines.

Now, let’s create another drive of three random fields. We’ll randomly pick a field from this drive and apply it; again, randomly pick a field from this drive and apply it; and so on. Again, the energy absorbed by the spins spikes, then tails off.

Here comes the punchline. Let’s return to applying the initial fields. The energy absorbed by the glass will spike—but not as high as before. The glass responds differently to a familiar drive than to a new drive. The spin glass recognizes the original drive—has learned the first fields’ “fingerprint.” This learning happens when the fields push the glass far from equilibrium,1 as I learned when pushed during my mildly hellish autumn.
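
The protocol above is concrete enough to sketch in code. Here is a toy Metropolis simulation in Python with NumPy—a cartoon of my own devising, not the model from the studies discussed: random couplings stand in for the glass, and the work absorbed at each field switch is recorded. Demonstrating memory convincingly takes larger systems and longer drives than this sketch runs.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64                                    # number of spins
J = rng.normal(scale=0.5, size=(n, n))
J = np.triu(J, 1)
J = J + J.T                               # random symmetric couplings, zero diagonal
T_env = 1.0                               # environment temperature

def energy(s, h):
    """Spin-glass energy in field h."""
    return float(-s @ J @ s / 2 - h @ s)

def relax(s, h, sweeps=20):
    """Exchange heat with the environment (Metropolis dynamics) under field h."""
    for _ in range(sweeps):
        for i in rng.permutation(n):
            dE = 2 * s[i] * (J[i] @ s + h[i])    # energy cost of flipping spin i
            if dE <= 0 or rng.random() < np.exp(-dE / T_env):
                s[i] = -s[i]
    return s

drive = [rng.normal(size=n) for _ in range(3)]   # a drive: three random fields

s = rng.choice([-1, 1], size=n)
h_old = np.zeros(n)
work = []                                        # energy absorbed at each field switch
for _ in range(30):
    h_new = drive[rng.integers(3)]
    work.append(energy(s, h_new) - energy(s, h_old))   # sudden switch changes the energy
    s = relax(s, h_new)                                # then the spins release heat
    h_old = h_new

print(work[0], work[-1])   # work done on the spins at the first and last switches
```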

So spin glasses learn drives that push them far from equilibrium. So do many other simple, classical, many-particle systems: polymers, viscous liquids, crumpled sheets of Mylar, and more. Researchers have predicted such learning and observed it experimentally. 

Scientists have detected many-particle learning by measuring thermodynamic observables. Examples include the energy absorbed by the spin glass—what thermodynamicists call work. But thermodynamics developed during the 1800s, to describe equilibrium systems, not to study learning. 

One study of learning—the study of machine learning—has boomed over the past two decades. As described by the MIT Technology Review, “[m]achine-learning algorithms use statistics to find patterns in massive amounts of data.” Users don’t tell the algorithms how to find those patterns.

xkcd.com/1838

It seems natural and fitting to use machine learning to learn about the learning by many-particle systems. That’s what I did with collaborators from the group of Jeremy England, a GlaxoSmithKline physicist who studies complex behaviors of many-particle systems. Weishun Zhong, Jacob Gold, Sarah Marzen, Jeremy, and I published our paper last month.

Using machine learning, we detected and measured many-particle learning more reliably and precisely than thermodynamic measures seem able to. Our technique works on multiple facets of learning, analogous to the intuition and the computational ability I encountered in my computer-science course. We illustrated our technique on a spin glass, but one can apply our approach to other systems, too. I’m exploring such applications with collaborators at the University of Maryland.

The project pushed me far from my equilibrium: I’d never worked with machine learning or many-body learning. But it’s amazing, what we can learn when pushed far from equilibrium. I first encountered this insight sophomore fall of college—and now, we can quantify it better than ever.

1Equilibrium is a quiet, restful state in which the glass’s large-scale properties change little. No net flow of anything—such as heat or particles—enters or leaves the system.

One if by land minus two if by sea, over the square-root of two

Happy National Poetry Month! The United States salutes word and whimsy in April, and Quantum Frontiers is continuing its tradition of celebrating. As a resident of Cambridge, Massachusetts and as a quantum information scientist, I have trouble avoiding the poem “Paul Revere’s Ride.” 

Henry Wadsworth Longfellow wrote the poem, as well as others in the American canon, during the 1800s. Longfellow taught at Harvard in Cambridge, and he lived a few blocks away from the university, in what’s now a national historic site. Across the street from the house, a bust of the poet gazes downward, as though lost in thought, in Longfellow Park. Longfellow wrote one of his most famous poems about an event staged a short drive from—and, arguably, partially in—Cambridge.

Longfellow Park

The event took place “on the eighteenth of April, in [Seventeen] Seventy-Five,” as related by the narrator of “Paul Revere’s Ride.” Revere was a Boston silversmith and a supporter of the American colonies’ independence from Britain. Revolutionaries anticipated that British troops would set out from Boston sometime during the spring. The British planned to seize revolutionaries’ weapons in the nearby town of Concord and to jail revolutionary leaders in Lexington. The troops departed Boston during the night of April 18th. 

Upon learning of their movements, sexton Robert Newman sent a signal from Boston’s Old North Church to Charlestown. Revere rode out from Charlestown, and the physician William Dawes rode out of Boston by land, to warn the people of Lexington and the surrounding areas. A line of artificial hoof prints, pressed into a sidewalk a few minutes from the Longfellow house, marks part of Dawes’s trail through Cambridge. The initial riders galvanized more riders, who stirred up colonial militias that resisted the troops’ advance. The Battles of Lexington and Concord ensued, initiating the Revolutionary War.

Longfellow took liberties with the facts he purported to relate. But “Paul Revere’s Ride” has blown the dust off history books for generations of schoolchildren. The reader shares Revere’s nervous excitement as he fidgets, awaiting Newman’s signal: 

Now he patted his horse’s side, 
Now gazed on the landscape far and near, 
Then impetuous stamped the earth, 
And turned and tightened his saddle-girth;
But mostly he watched with eager search 
The belfry-tower of the old North Church.

The moment the signal arrives, that excitement bursts its seams, and Revere leaps astride his horse. The reader comes to gallop with the silversmith through the night, the poem’s clip-clop-clip-clop rhythm evoking a horse’s hooves on cobblestones.

The author, outside Longfellow House, on the eighteenth of April in…Twenty Twenty.

Not only does “Paul Revere’s Ride” revitalize history, but it also offers a lesson in information theory. While laying plans, Revere instructs Newman: 

He said to his friend, “If the British march
By land or sea from the town to-night,
Hang a lantern aloft in the belfry-arch
Of the North-Church-tower, as a signal light.

Then comes one of the poem’s most famous lines: “One if by land, and two if by sea.” The British could have left Boston by foot or by boat, and Newman had to communicate which. Specifying one of two options, he related one bit, or one basic unit of information. Newman thereby exemplifies a cornerstone of information theory: the encoding of a bit of information—an abstraction—in a physical system that can be in one of two possible states—a light that shines from one or two lanterns.
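
Newman’s code is small enough to write down. Here is a toy sketch in Python; the route names and the dictionary are my own framing, not Longfellow’s.

```python
import math

# "One if by land, and two if by sea": one physical state per message
encode = {"land": 1, "sea": 2}    # lanterns hung in the belfry
decode = {lanterns: route for route, lanterns in encode.items()}

print(decode[2])                  # sea
print(math.log2(len(encode)))     # 1.0 -- choosing between two options conveys one bit
```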

Benjamin Schumacher and Michael Westmoreland point out the information-theoretic interpretation of Newman’s action in their quantum-information textbook. I used their textbook in my first quantum-information course, as a senior in college. Before reading the book, I’d never felt that I could explain what information is or how it can be quantified. Information is an abstraction and a Big Idea, like consciousness, life, and piety. But, Schumacher and Westmoreland demonstrated, most readers already grasp the basics of information theory; some readers even studied the basics while memorizing a poem in elementary school. So I doff my hat—or, since we’re discussing the 1700s, my mobcap—to the authors.

Reading poetry enriches us more than we realize. So read a poem this April. You can find Longfellow’s poem here or ride off wherever your fancy takes you.  

Life among the experimentalists

I used to catch lizards—brown anoles, as I learned to call them later—as a child. They were colored as their name suggests, were about as long as one of my hands, and resented my attention. But they frequented our back porch, and I had a butterfly net. So I’d catch lizards, with my brother or a friend, and watch them. They had throats that occasionally puffed out, exposing red skin, and tails that detached and wriggled of their own accord, to distract predators.

Some theorists might appreciate butterfly nets, I imagine, for catching experimentalists. Some of us theorists will end a paper or a talk with “…and these predictions are experimentally accessible.” A pause will follow the paper’s release or the talk, in hopes that a reader or an audience member will take up the challenge. Usually, none does, and the writer or speaker retires to the Great Deck Chair of Theory on the Back Patio of Science.

So I was startled when an anole, metaphorically speaking, volunteered a superconducting qubit for an experiment I’d proposed.

The experimentalist is one of the few people I can compare to a reptile without fear that he’ll take umbrage: Kater Murch, an associate professor of physics at Washington University in St. Louis. The most evocative description of Kater that I can offer appeared in an earlier blog post: “Kater exudes the soberness of a tenured professor but the irreverence of a Californian who wears his hair slightly long and who tattooed his wedding band on.”

Kater expressed interest in an uncertainty relation I’d proved with theory collaborators. According to some of the most famous uncertainty relations, a quantum particle can’t have a well-defined position and a well-defined momentum simultaneously. Measuring the position disturbs the momentum; any later momentum measurement outputs a completely random, or uncertain, number. We measure uncertainties with entropies: The greater an entropy, the greater our uncertainty. We can cast uncertainty relations in terms of entropies.
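
To make “casting uncertainty relations in terms of entropies” concrete, here is a numerical check in Python of a textbook entropic uncertainty relation—the Maassen-Uffink bound, not the chaos relation from our paper: for a qubit measured in the sigma_z or sigma_x eigenbasis, the two outcome entropies always sum to at least one bit.

```python
import numpy as np

def shannon(p):
    """Shannon entropy (in bits) of a probability vector."""
    p = p[p > 1e-15]
    return float(-(p * np.log2(p)).sum())

z_basis = np.eye(2, dtype=complex)                                 # sigma_z eigenvectors (rows)
x_basis = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # sigma_x eigenvectors (rows)

rng = np.random.default_rng(0)
worst = np.inf
for _ in range(10_000):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)   # a random pure qubit state
    psi /= np.linalg.norm(psi)
    H_total = (shannon(np.abs(z_basis.conj() @ psi) ** 2)
               + shannon(np.abs(x_basis.conj() @ psi) ** 2))
    worst = min(worst, H_total)

print(worst)   # never dips below 1 bit: H(Z) + H(X) >= -2*log2(max overlap) = 1
```

The one-bit bound is saturated by eigenstates of either observable, which are maximally uncertain in the other basis.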

I’d proved, with collaborators, an entropic uncertainty relation that describes chaos in many-particle quantum systems. Other collaborators and I had shown that weak measurements, which don’t disturb a quantum system much, characterize chaos. So you can check our uncertainty relation using weak measurements—as well as strong measurements, which do disturb quantum systems much. One can simplify our uncertainty relation—eliminate the chaos from the problem and even eliminate most of the particles. An entropic uncertainty relation for weak and strong measurements results.

Kater specializes in weak measurements, so he resolved to test our uncertainty relation. Physical Review Letters published the paper about our collaboration this month. Quantum measurements can not only create uncertainty, the paper shows, but also reduce it: Kater and his PhD student Jonathan Monroe used light to measure a superconducting qubit, a tiny circuit in which current can flow forever. The qubit had properties analogous to position and momentum (the spin’s z- and x-components). If the qubit started with a well-defined “position” (the z-component) and the “momentum” (the x-component) was measured, the outcome was highly random; the total uncertainty about the two measurements was large. But if the qubit started with a well-defined “position” (the z-component) and another property (the spin’s y-component) was measured before the “momentum” (the x-component) was measured strongly, the total uncertainty was lower. The extra measurement was designed not to disturb the qubit much. But the nudge prodded the qubit enough, rendering the later “momentum” measurement (the x measurement) more predictable. So not only can quantum measurements create uncertainty, but gentle quantum measurements can also reduce it.

I didn’t learn only physics from our experiment. When I’d catch a lizard, I’d tip it into a tank whose lid contained a magnifying lens, and I’d watch the lizard. I didn’t trap Kater and Jonathan under a magnifying glass, but I did observe their ways. Here’s what I learned about the species experimentalus quanticus.

1) They can run experiments remotely when a pandemic shuts down campus: A year ago, when universities closed and cities locked down, I feared that our project would grind to a halt. But Jonathan twiddled knobs and read dials via his computer, and Kater popped into the lab for the occasional fixer-upper. Jonathan even continued his experiment from another state, upon moving to Texas to join his parents. And here we theorists boast of being able to do our science almost anywhere.

2) They speak with one less layer of abstraction than I: We often discussed, for instance, the thing used to measure the qubit. I’d call the thing “the detector.” Jonathan would call it “the cavity mode,” referring to the light that interacts with the qubit, which sits in a box, or cavity. I’d say “poh-tay-toe”; they’d say “poh-tah-toe”; but I’m glad we didn’t call the whole thing off.

Fred Astaire: “Detector.”
Ginger Rogers: “Cavity mode.”

3) Experiments take longer than expected—even if you expect them to take longer than estimated: Kater and I hatched the plan for this project during June 2018. The experiment would take a few months, Kater estimated. It terminated last summer.

4) How they explain their data: Usually in terms of decoherence, the qubit’s leaking of quantum information into its environment. For instance, to check that the setup worked properly, Jonathan ran a simple test that ended with a measurement. (Experts: He prepared a \sigma_z eigenstate, performed a Hadamard gate, and measured \sigma_z.) The measurement should have had a 50% chance of yielding +1 and a 50% chance of yielding -1. But the -1 outcome dominated the trials. Why? Decoherence pushed the qubit toward -1. (Amplitude damping dominated the noise.)

5) Seeing one’s theoretical proposal turn into an experiment feels satisfying: Due to point (3), among other considerations, experiments aren’t cheap. The lab’s willingness to invest in the idea I’d developed with other theorists was heartening. Furthermore, the experiment pushed us to uncover more theory—for example, how tight the uncertainty bound could grow.
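
The sanity check in point (4) is easy to reproduce on paper or in code. Here is a sketch in Python with NumPy; the damping strength is an arbitrary choice of mine, and the labeling conventions are mine rather than the lab’s.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
psi = H @ np.array([1.0, 0.0])                 # |0> (a sigma_z eigenstate) -> |+>
p_ideal = np.abs(psi) ** 2
print(p_ideal)                                 # [0.5 0.5]: the noiseless 50/50 prediction

# Amplitude damping (decay toward |0> with probability gamma), as Kraus operators
gamma = 0.3
K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]])
K1 = np.array([[0, np.sqrt(gamma)], [0, 0]])
rho = np.outer(psi, psi.conj())
rho = K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

print(np.real(np.diag(rho)))                   # [0.65 0.35]: biased toward the ground state
```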

After getting to know an anole, I’d release it into our backyard and bid it adieu.1 So has Kater moved on to experimenting with topology, and Jonathan has progressed toward graduation. But more visitors are wriggling in the Butterfly Net of Theory-Experiment Collaboration. Stay tuned.

1Except for the anole I accidentally killed, by keeping it in the tank for too long. But let’s not talk about that.

Project Ant-Man

The craziest challenge I’ve undertaken hasn’t been skydiving; sailing the Amazon on a homemade raft; scaling Mt. Everest; or digging for artifacts atop a hill in a Middle Eastern desert, near midday, during high summer.1 The craziest challenge has been to study the possibility that quantum phenomena affect cognition significantly. 

Most physicists agree that quantum phenomena probably don’t affect cognition significantly. Cognition occurs in biological systems, which have high temperatures, many particles, and watery components. Such conditions quash entanglement (a relationship that quantum particles can share and that can produce correlations stronger than any producible by classical particles). 

Yet Matthew Fisher, a condensed-matter physicist, proposed a mechanism by which entanglement might enhance coordinated neuron firing. Phosphorus nuclei have spins (quantum properties similar to angular momentum) that might store quantum information for long times when in Posner molecules. These molecules may protect the information from decoherence (leaking quantum information to the environment), via mechanisms that Fisher described.

I can’t check how correct Fisher’s proposal is; I’m not a biochemist. But I’m a quantum information theorist. So I can identify how Posners could process quantum information if Fisher were correct. I undertook this task with my colleague Elizabeth Crosson, during my PhD.

Experimentalists have begun testing elements of Fisher’s proposal. What if, years down the road, they find that Posners exist in biofluids and protect quantum information for long times? We’ll need to test whether Posners can share entanglement. But detecting entanglement tends to require control finer than you can exert with a stirring rod. How could you check whether a beakerful of particles contains entanglement?

I asked that question of Adam Bene Watts, a PhD student at MIT, and John Wright, then an MIT postdoc and now an assistant professor in Texas. John gave our project its codename. At a meeting one day, he reported that he’d watched the film Avengers: Endgame. Had I seen it? he asked.

No, I replied. The only superhero movie I’d seen recently had been Ant-Man and the Wasp—and that because, according to the film’s scientific advisor, the movie riffed on research of mine. 

Go on, said John.

Spiros Michalakis, the Caltech mathematician in charge of this blog, served as the advisor. The film came out during my PhD; during a meeting of our research group, Spiros advised me to watch the movie. There was something in it “for you,” he said. “And you,” he added, turning to Elizabeth. I obeyed, to hear Laurence Fishburne’s character tell Ant-Man that another character had entangled with the Posner molecules in Ant-Man’s brain.2 

John insisted on calling our research Project Ant-Man.

John and Adam study Bell tests. Bell test sounds like a means of checking whether the collar worn by your cat still jingles. But the test owes its name to John Stewart Bell, a Northern Irish physicist who wrote a groundbreaking paper in 1964.

Say you’d like to check whether two particles share entanglement. You can run an experiment, described by Bell, on them. The experiment ends with a measurement of the particles. You repeat this experiment in many trials, using identical copies of the particles in subsequent trials. You accumulate many measurement outcomes, whose statistics you calculate. You plug those statistics into a formula concocted by Bell. If the result exceeds some number that Bell calculated, the particles shared entanglement.
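
For the curious, Bell’s recipe—in its common CHSH form, due to Clauser, Horne, Shimony, and Holt—can be checked numerically. Here is a sketch in Python: an idealized calculation of the measurement statistics, not a simulation of individual trials.

```python
import numpy as np

def spin_obs(theta):
    """Spin observable at angle theta in the x-z plane (eigenvalues +/-1)."""
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

def corr(ta, tb, state):
    """Correlation <A(ta) x B(tb)> in the given two-particle state."""
    return float(np.real(state.conj() @ np.kron(spin_obs(ta), spin_obs(tb)) @ state))

# A maximally entangled pair: the singlet (|01> - |10>)/sqrt(2)
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)

# Bell-CHSH combination at the angles that maximize the quantum value
a1, a2, b1, b2 = 0, np.pi / 2, np.pi / 4, -np.pi / 4
S = (corr(a1, b1, singlet) + corr(a1, b2, singlet)
     + corr(a2, b1, singlet) - corr(a2, b2, singlet))

print(abs(S))   # 2*sqrt(2) ~ 2.83, exceeding the classical bound of 2
```

A local-hidden-variable description caps |S| at 2; the singlet reaches 2√2, so its statistics certify entanglement.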

We needed a variation on Bell’s test. In our experiment, every trial would involve hordes of particles. The experimentalists—large, clumsy, classical beings that they are—couldn’t measure the particles individually. The experimentalists could record only aggregate properties, such as the intensity of the phosphorescence emitted by a test tube.

Adam, MIT physicist Aram Harrow, and I concocted such a Bell test, with help from John. Physical Review A published our paper this month—as a Letter and an Editor’s Suggestion, I’m delighted to report.

For experts: The trick was to make the Bell correlation function nonlinear in the state. We assumed that the particles shared mostly pairwise correlations, though our Bell inequality can accommodate small aberrations. Alas, no one can guarantee that particles share only mostly pairwise correlations. Violating our Bell inequality therefore doesn’t rule out hidden-variables theories. Under reasonable assumptions, though, a not-completely-paranoid experimentalist can check for entanglement using our test. 

One can run our macroscopic Bell test on photons, using present-day technology. But we’re more eager to use the test to characterize lesser-known entities. For instance, we sketched an application to Posner molecules. Detecting entanglement in chemical systems will require more thought, as well as many headaches for experimentalists. But our paper broaches the cask—which I hope to see flow in the next Ant-Man film. Due to debut in 2022, the movie has the subtitle Quantumania. Sounds almost as crazy as studying the possibility that quantum phenomena affect cognition.

1Of those options, I’ve undertaken only the last.

2In case of any confusion: We don’t know that anyone’s brain contains Posner molecules. The movie features speculative fiction.

Random walks

A college professor of mine proposed a restaurant venture to our class. He taught statistical mechanics, the physics of many-particle systems. Examples range from airplane fuel to ice cubes to primordial soup. Such systems contain 10^24 particles each—so many particles that we couldn’t track them all if we tried. We can gather only a little information about the particles, so their actions look random.

So does a drunkard’s walk. Imagine a college student who (outside of the pandemic) has stayed out an hour too late and accepted one too many red plastic cups. He’s arrived halfway down a sidewalk, where he’s clutching a lamppost, en route home. Each step has a 50% chance of carrying him leftward and a 50% chance of carrying him rightward. This scenario repeats itself every Friday. On average, five minutes after arriving at the lamppost, he’s back at the lamppost. But, if we wait for a time T, we have a decent chance of finding him a distance \sqrt{T} away. These characteristics typify a simple random walk.
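
The lamppost statistics are easy to simulate. A quick sketch in Python with NumPy; the step and walker counts are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(42)
T = 10_000      # steps per walk
walks = 5_000   # independent Friday nights

# Each step: leftward (-1) or rightward (+1) with equal probability
positions = rng.choice([-1, 1], size=(walks, T)).sum(axis=1)

print(positions.mean())                  # ~0: on average, back at the lamppost
print(np.sqrt((positions**2).mean()))    # ~sqrt(T) = 100: the typical distance
```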

Random walks crop up across statistical physics. For instance, consider a grain of pollen dropped onto a thin film of water. The water molecules buffet the grain, which random-walks across the film. Robert Brown observed this walk in 1827, so we call it Brownian motion. Or consider a magnet at room temperature. The magnet’s constituents don’t walk across the surface, but they orient themselves according to random-walk mathematics. And, in quantum many-particle systems, information can spread via a random walk. 

So, my statistical-mechanics professor said, someone should open a restaurant near MIT. Serve lo mein and Peking duck, and call the restaurant the Random Wok.

This is the professor who, years later, confronted another alumna and me at a snack buffet.

“You know what this is?” he asked, waving a pastry in front of us. We stared for a moment, concluded that the obvious answer wouldn’t suffice, and shook our heads.

“A brownie in motion!”

Not only pollen grains undergo Brownian motion, and not only drunkards undergo random walks. Many people random-walk to their careers, trying out and discarding alternatives en route. We may think that we know our destination, but we collide with a water molecule and change course.

Such is the thrust of Random Walks, a podcast to which I contributed an interview last month. Abhigyan Ray, an undergraduate in Mumbai, created the podcast. Courses, he thought, acquaint us only with the successes in science. Stereotypes cast scientists as lone geniuses working in closed offices and silent labs. He resolved to spotlight the collaborations, the wrong turns, the lessons learned the hard way—the random walks—of science. Interviewees range from a Microsoft researcher to a Harvard computer scientist to a neurobiology professor to a genomicist.

You can find my episode on Instagram, Apple Podcasts, Google Podcasts, and Spotify. We discuss the bridging of disciplines; the usefulness of a liberal-arts education in physics; Quantum Frontiers; and the delights of poking fun at my PhD advisor, fellow blogger and Institute for Quantum Information and Matter director John Preskill.

The Grand Tour of quantum thermodynamics

Young noblemen used to undertake a “Grand Tour” during the 1600s and 1700s. Many of the tourists hailed from England, though well-to-do compatriots traveled from Scandinavia, Germany, and the United States. The men had just graduated from university—in many cases, Oxford or Cambridge. They’d studied classical history, language, and literature; and now, they’d experience what they’d read. Tourists flocked to Rome, Venice, and Florence, as well as to Paris; optional additions included Naples, Switzerland, Germany, and the Netherlands.

Tutors accompanied the tourists, guiding their charges across Europe. The tutors rounded out the young men’s education, instructing them in art, music, architecture, and continental society. I felt like those tutors, this month and last.1

I’m the one in the awkward-looking pose on the left.

I was lecturing in a quantum-thermodynamics mini course, with fellow postdoctoral scholar Matteo Lostaglio. Gabriel Landi, a professor of theoretical physics at the University of São Paulo in Brazil, organized the course. It targeted early-stage graduate students, who’d mastered the core of physics and who wished to immerse in quantum thermodynamics. But the enrollment ranged from PhD and Masters students to undergraduates, postdocs, faculty members, and industry employees.

The course toured quantum thermodynamics similarly to how young noblemen toured Europe. I imagine quantum thermodynamics as a landscape—one inked on a parchment map, with blue whorls representing the sea and with a dragon breathing fire in one corner. Quantum thermodynamics encompasses many communities whose perspectives differ and who wield different mathematical and conceptual tools. These communities translate into city-states, principalities, republics, and other settlements on the map. The class couldn’t visit every city, just as Grand Tourists couldn’t. But tourists had a leg up on us in their time budgets: A Grand Tour lasted months or years, whereas we presented nine hour-and-a-half lectures.

Attendees in Stuttgart

Grand Tourists returned home with trinkets, books, paintings, and ancient artifacts. I like to imagine that the tutors, too, acquired souvenirs. Here are four of my favorite takeaways from the course:

1) Most captivating subfield that I waded into for the course: Thermodynamic uncertainty relations. Researchers have derived these inequalities using nonequilibrium statistical mechanics, a field that encompasses molecular motors, nanorobots, and single strands of DNA. Despite the name “uncertainty relations,” classical and quantum systems obey these inequalities.

Imagine a small system interacting with big systems that have different temperatures and different concentrations of particles. Energy and particles hop between the systems, dissipating entropy (\Sigma) and forming currents. The currents change in time, due to the probabilistic nature of statistical mechanics. 

How much does a current vary, relative to its average value, \langle J \rangle? We quantify this variation with the relative variance, {\rm var}(J) / \langle J \rangle^2. Say that you want a low-variance, predictable current. You’ll have to pay a high entropy cost: \frac{ {\rm var} (J) }{\langle J \rangle^2 } \geq  \frac{2 k_{\rm B} }{\Sigma}, wherein k_{\rm B} denotes Boltzmann’s constant. 

Thermodynamic uncertainty relations govern systems arbitrarily far from equilibrium. We know loads about systems at equilibrium, in which large-scale properties remain approximately constant and no net flows (such as flows of particles) enter or leave the system. We know much about systems close to equilibrium. The regime arbitrarily far from equilibrium is the Wild, Wild West of statistical mechanics. Proving anything about this regime tends to require assumptions and specific models, to say nothing of buckets of work. But thermodynamic uncertainty relations are general, governing classical and quantum systems from molecular motors to quantum dots.
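To see the inequality at work, here is a Monte Carlo sketch of my own devising (the toy model, function names, and parameter values are my assumptions, not material from the course): a continuous-time biased random walk with rightward jump rate k_plus and leftward rate k_minus, in units where Boltzmann’s constant equals 1. The net current’s relative variance should sit above the bound 2/\Sigma.

```python
import math
import random

def tur_check(k_plus, k_minus, T, num_trials, seed=1):
    """Estimate the relative variance of the net current in a continuous-time
    biased random walk and return it alongside the thermodynamic-uncertainty-
    relation bound 2 k_B / Sigma (in units where k_B = 1)."""
    rng = random.Random(seed)

    def poisson(mean):
        # Knuth's method; adequate for the modest means used here.
        limit, k, p = math.exp(-mean), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    # In a time T, the rightward and leftward jump counts are independent
    # Poisson random variables with means k_plus*T and k_minus*T.
    currents = [poisson(k_plus * T) - poisson(k_minus * T)
                for _ in range(num_trials)]
    mean_j = sum(currents) / num_trials
    var_j = sum((j - mean_j) ** 2 for j in currents) / num_trials

    # Each net rightward jump dissipates entropy ln(k_plus/k_minus).
    sigma = mean_j * math.log(k_plus / k_minus)
    return var_j / mean_j**2, 2.0 / sigma

rel_var, bound = tur_check(k_plus=4.0, k_minus=1.0, T=25.0, num_trials=20000)
print(rel_var >= bound)  # the relative variance should respect the bound
```

One caveat worth flagging: this form of the bound is a continuous-time statement, and discrete-time dynamics can violate it, which is why the sketch samples Poisson jump counts rather than fixed-step walks.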

Multiple cats attended our mini course, according to the selfies we received.

2) Most unexpected question: During lecture one, I suggested readings that introduce quantum thermodynamics. The suggestions included two reviews and the article I wrote for Scientific American about quantum steampunk, my angle on quantum thermodynamics. The next day, a participant requested recommendations of steampunk novels. I’d prepared more for requests for justifications of the steps in my derivations. But I forwarded a suggestion given to me twice: The Difference Engine, by William Gibson and Bruce Sterling.

3) Most insightful observation: My fellow tutor—I mean lecturer—pointed out how quantum thermodynamics doesn’t and does diverge from classical thermodynamics. Quantum systems can’t break the second law of thermodynamics, as classical systems can’t. Quantum engines can’t operate more efficiently than Carnot’s engine. Erasing information costs work, regardless of whether the information-bearing degree of freedom is classical or quantum. So broad results about quantum thermodynamics coincide with broad results about classical thermodynamics. We can find discrepancies by focusing on specific physical systems, such as a spring that can be classical or quantum.  

4) Most staggering numbers: Unlike undertaking a Grand Tour, participating in the mini course cost nothing. We invited everyone across the world to join, and 420 participants from 48 countries enrolled. I learned of the final enrollment days before the course began, scrolling through the spreadsheet of participants. Motivated as I had been to double-check my lecture notes, the number spurred my determination like steel on a horse’s flanks.

The Grand Tour gave rise to travelogues and guidebooks read by tourists across the centuries: Mark Twain has entertained readers—partially at his own expense—since 1869 in the memoir The Innocents Abroad. British characters in the 1908 novel A Room with a View diverge in their views of Baedeker’s Handbook to Northern Italy. Our course material, and videos of the lectures, remain online and available to everyone for free. You’re welcome to pack your trunk, fetch your cloak, and join the trip.

A screenshot from the final lecture

1In addition to guiding their wards, tutors kept the young men out of trouble—and one can only imagine what trouble wealthy young men indulged in the year after college. I didn’t share that responsibility.

May you go from weakness to weakness

I used to eat lunch at the foundations-of-quantum-theory table. 

I was a Masters student at the Perimeter Institute for Theoretical Physics, where I undertook a research project during the spring term. The project squatted on the border between quantum information theory and quantum foundations, where my two mentors worked. Quantum foundations concerns how quantum physics differs from classical physics; which alternatives to quantum physics could govern our world but don’t; and those questions, such as about Schrödinger’s cat, that fascinate us when we first encounter quantum theory, that many advisors warn probably won’t land us jobs if we study them, and that most physicists argue about only over a beer in the evening.

I don’t drink beer, so I had to talk foundations over sandwiches around noon.

One of us would dream up what appeared to be a perpetual-motion machine; then the rest of us would figure out why it couldn’t exist. Satisfied that the second law of thermodynamics still reigned, we’d decamp for coffee. (Perpetual-motion machines belong to the foundations of thermodynamics, rather than the foundations of quantum theory, but we didn’t discriminate.) I felt, at that lunch table, an emotion that blesses a student finding her footing in research, outside her country of origin: belonging.

The quantum-foundations lunch table came to mind last month, when I learned that Britain’s Institute of Physics had selected me to receive its International Quantum Technology Emerging Researcher Award. I was very grateful for the designation, but I was incredulous: Me? Technology? But I began grad school at the quantum-foundations lunch table. Foundations is to technology as the philosophy of economics is to dragging a plow across a wheat field, at least stereotypically.

Worse, I drag plows from wheat field to barley field to oat field. I’m an interdisciplinarian who never belongs in the room I’ve joined. Among quantum information theorists, I’m the thermodynamicist, or that theorist who works with experimentalists; among experimentalists, I’m the theorist; among condensed-matter physicists, I’m the quantum information theorist; among high-energy physicists, I’m the quantum information theorist or the atomic-molecular-and-optical (AMO) physicist; and, among quantum thermodynamicists, I do condensed matter, AMO, high energy, and biophysics. I usually know less than everyone else in the room about the topic under discussion. An interdisciplinarian can leverage other fields’ tools to answer a given field’s questions and can discover questions. But she may sound, to those in any one room, as though she were born yesterday. As Kermit the Frog said, “It’s not easy being green.”

Grateful as I am, I’d rather not dwell on why the Institute of Physics chose my file; anyone interested can read the citation or watch the thank-you speech. But the decision turned out to involve foundations and interdisciplinarity. So I’m dedicating this article to two sources of inspiration: an organization that’s blossomed by crossing fields and an individual who’s driven technology by studying fundamentals.

Britain’s Institute of Physics has a counterpart in the American Physical Society. The latter has divisions, each dedicated to some subfield of physics. If you belong to the society and share an interest in one of those subfields, you can join that division, attend its conferences, and receive its newsletters. I learned about the Division of Soft Matter from this article, which I wish I could quote almost in full. This division’s members study “a staggering variety of materials from the everyday to the exotic, including polymers such as plastics, rubbers, textiles, and biological materials like nucleic acids and proteins; colloids, a suspension of solid particles such as fogs, smokes, foams, gels, and emulsions; liquid crystals like those found in electronic displays; [ . . . ] and granular materials.” Members belong to physics, chemistry, biology, engineering, and geochemistry.

Despite, or perhaps because of, its interdisciplinarity, the division has thrived. The group grew from a protodivision (a “topical group,” in the society’s terminology) to a division in five years—at “an unprecedented pace.” Intellectual diversity has complemented sociological diversity: The division “ranks among the top [American Physical Society] units in terms of female membership.” The division’s chair observes a close partnership between theory and experiment in what he calls “a vibrant young field.”

And some division members study oobleck. Wouldn’t you like to have an excuse to say “oobleck” every day?

The second source of inspiration lives, like the Institute of Physics, in Britain. David Deutsch belongs at the quantum-foundations table more than I. A theoretical physicist at Oxford, David cofounded the field of quantum computing. He explained why to me in a fusion of poetry and the pedestrian: He was “fixing the roof” of quantum theory. As a graduate student, David wanted to understand quantum foundations—what happens during a measurement—but concluded that quantum theory has too many holes. The roof was leaking through those holes, so he determined to fix them. He studied how information transformed during quantum processes, married quantum theory with computer science, and formalized what quantum computers could and couldn’t accomplish. Which—years down the road, fused with others’ contributions—galvanized experimentalists to harness ions and atoms, improve lasers and refrigerators, and build quantum computers and quantum cryptography networks. 

David is a theorist and arguably a philosopher. But he’d have swept the Institute of Physics’s playing field, could he have qualified as an “emerging researcher” this autumn (David began designing quantum algorithms during the 1980s).

I returned to the Perimeter Institute during the spring term of 2019. I ate lunch at the quantum-foundations table, and I felt that I still belonged. I feel so still. But I’ve eaten lunch at other tables by now, and I feel that I belong at them, too. I’m grateful if the habit has been useful.

Congratulations to Hannes Bernien, who won the institute’s International Quantum Technology Young Scientist Award, and to the “highly commended” candidates, whom you can find here!

Seven reasons why I chose to do science in the government

When I was in college, people asked me what I wanted to do with my life. I’d answer, “I want to be of use and to learn always.” The question resurfaced in grad school and at the beginning of my postdoc. I answered that I wanted to do extraordinary science that I’d steer. Academia attracted me most, but I wouldn’t discount alternatives.

Last spring, I accepted an offer to build my research group as a member of NIST, the National Institute of Standards and Technology in the U.S. government. My group will be headquartered on the University of Maryland campus, nestled amongst quantum and interdisciplinary institutes. I’m grateful to be joining NIST, and I’m surprised. I never envisioned myself working for the government. I could have accepted an assistant professorship (and I was extremely grateful for the offers), but NIST swept me off my feet. Here are seven reasons why, for other early-career researchers contemplating possibilities.

1) The science. One event illustrates this reason: The notice of my job offer came from NIST Maryland’s friendly neighborhood Nobel laureate. NIST and the university invested in quantum science years before everyone and her uncle began scrambling to create a quantum institute. That investment has flowered, including in reason (2).

2) The research environment. I wouldn’t say that I have a love affair with the University of Maryland. But I’ve found myself visiting every few years (sometimes blogging about the experience). Why? Much of the quantum community passes through Maryland. Seminars fill the week, visitors fill many offices, and conferences happen once or twice a year. Theorists and experimentalists mingle over lunch and collaborate. 

The university shares two quantum institutes with NIST: QuICS (the Joint Center for Quantum Information and Computer Science) and the JQI (the Joint Quantum Institute). My group will be based at the former and affiliated with the latter. We’ll also belong to IPST (the university’s Institute for Physical Science and Technology), a hub for interdisciplinarity and thermodynamics. When visiting a university, I ask how much researchers collaborate across department lines. I usually hear an answer along the lines of “We value interdisciplinarity, and we wish that we had more of it, but we don’t have much.” Few universities ingrain interdisciplinarity into their bones by dedicating institutes to it.

Maryland’s quantum community and thermodynamics communities bustle and produce. They grant NIST researchers an academic environment, independence to shape their research paths, and the freedom to participate in the broader scientific community. If weary of the three institutes mentioned above, one can explore the university’s Quantum Technology Center and Condensed-Matter-Theory Center.

3) The people. The first Maryland quantum researcher I met was the friendly neighborhood Nobel laureate, Bill Phillips. Bill was presenting a keynote address at Dartmouth College’s physics department, where I’d earned my Bachelors. Bill said that he’d attended a small liberal-arts college before pursuing his PhD at MIT. During the question-and-answer session, I welcomed him back to a small liberal-arts college. How, I asked, had he benefited from the liberal arts? Juniata College, Bill said, had made him a good person. MIT had helped make him a good scientist. Since then, I’ve kept in occasional contact with Bill, we’ve attended talks of each other’s, and I’ve watched him exhibit the most curiosity I’ve seen in almost anyone. What more could one wish for in a colleague?

An equality used across thermodynamics bears Chris Jarzynski’s last name, but he never calls the equality what everyone else does. I benefited from Chris’s mentorship during my PhD, despite our working on opposite sides of the country. His awards include not only membership in the National Academy of Sciences, but also an Outstanding Referee designation, for reviewing so many journal submissions in service to the scientific community. Chris calls IPST, the university’s interdisciplinary and thermodynamic institute, his intellectual home. That recommendation suffices for me.
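For readers wondering which equality: it’s the Jarzynski equality, \langle e^{-\beta W} \rangle = e^{-\beta \Delta F}, wherein W denotes the work performed on a system driven arbitrarily far from equilibrium, \Delta F denotes the equilibrium free-energy difference between the drive’s start and end points, \beta denotes the inverse temperature, and the angle brackets denote an average over many trials of the drive.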

I’ve looked up to Alexey Gorshkov since beginning my PhD. I keep an eye out for Mohammad Hafezi’s and Pratyush Tiwary’s papers. A quantum researcher couldn’t ignore Chris Monroe’s papers if she tried. Postdoctoral and graduate fellowships stock the community with energetic young researchers. Three energetic researchers are joining QuICS as senior Fellows around the time I am. I’ll spare you the rest of my sources of inspiration.

4) The teaching. Most faculty members at R1 research universities teach two to three courses per year. NIST members can teach once every other year. I value teaching and appreciate how teaching benefits not only students, but also instructors. I respect teachers and remain grateful for their influence. I’m grateful to have received reports that I teach well. Because I’ve acquired some skill at communicating, people tend to assume that I adore teaching. I adore presenting talks, but I don’t feel a calling to teach. Mentors have exhorted me to pursue what excites me most and what only I can accomplish. I feel called to do research and to mentor younger researchers. 

Furthermore, if I had to teach much, I wouldn’t have time for writing anything other than papers or grants, such as blog posts. Some of you readers have astonished me with accounts of what my writing means to you. You’ve approached me at conferences, buttonholed me after seminars, and emailed. I’m grateful (as I keep saying, but I mean what I say) for the opportunity to touch lives across the world. I hope to inspire students to take quantum, information-theory, and thermodynamics courses (including the quantum-thermodynamics course that I’d like to teach occasionally). Instructors teach quantum courses throughout the world. No one else writes about Egyptian sarcophagi and the second law of thermodynamics, to my knowledge, or the Russian writer Alexander Pushkin and reproductive science. Perhaps no one should. But, since no one else does, I have to.1

5) The funding. Faculty members complain that they do little apart from applying for grants. Grants fund students, postdocs, travel, summer salaries, equipment, visitors, and workshops. NIST provides principal investigators with research funding every year. Not all the funding that some groups need, but enough to free up time to undertake the research that principal investigators love.

6) The lack of tenure stress. Many junior faculty members fear that they won’t achieve tenure. The fear pushes them away from taking risks in their research programs. This month, I embarked upon a risk that I know I should take but that, had I been facing an assistant professorship, would have given me pause.

7) The acronyms. Above, I introduced NIST (the National Institute of Standards and Technology), UMD (the University of Maryland), QuICS (the Joint Center for Quantum Information and Computer Science), the JQI (the Joint Quantum Institute), and IPST (the Institute for Physical Science and Technology). I’ll also have an affiliation with UMIACS (the University of Maryland Institute for Advanced Computer Science). Where else can one acquire six acronyms? I adore collecting affiliations, which force me to cross intellectual borders. I also enjoy the opportunity to laugh at my CV.

I’ve deferred joining NIST until summer 2021, to complete my postdoctoral fellowship at the Harvard-Smithsonian Institute for Theoretical Atomic, Molecular, and Optical Physics (an organization that needs its acronym, ITAMP, as much as “the Joint Center for Quantum Information and Computer Science” does). After then, please stop by. If you’d like to join my group, please email: I’m accepting applications for PhD and postdoctoral positions this fall. See you in Maryland next year.

1Also, blogging benefits my research. I’ll leave the explanation for another post.

I credit my husband with the Nesquick-NIST/QuICS parallel.

Love in the time of thermo

An 81-year-old medical doctor has fallen off a ladder in his house. His pet bird hopped out of his reach, from branch to branch of a tree on the patio. The doctor followed via ladder and slipped. His servants cluster around him, the clamor grows, and he longs for his wife to join him before he dies. She arrives at last. He gazes at her face; utters, “Only God knows how much I loved you”; and expires.

I set the book down on my lap and looked up. I was nestled in a wicker chair outside the Huntington Art Gallery in San Marino, California. Busts of long-dead Romans kept me company. The lawn in front of me unfurled below a sky that—unusually for San Marino—was partially obscured by clouds. My final summer at Caltech was unfurling. I’d walked to the Huntington, one weekend afternoon, with a novel from Caltech’s English library.1

What a novel.

You may have encountered the phrase “love in the time of corona.” Several times. Per week. Throughout the past six months. Love in the Time of Cholera predates the meme by 35 years. Nobel laureate Gabriel García Márquez captured the inhabitants, beliefs, architecture, mores, and spirit of a Colombian city around the turn of the 20th century. His work transcends its setting, spanning love, death, life, obsession, integrity, redemption, and eternity. A thermodynamicist couldn’t ask for more-fitting reading.

Love in the Time of Cholera centers on a love triangle. Fermina Daza, the only child of a wealthy man, excels in her studies. She holds herself with poise and self-assurance, and she spits fire whenever others try to control her. The girl dazzles Florentino Ariza, a poet, who restructures his life around his desire for her. Fermina Daza’s pride impresses Dr. Juvenal Urbino, a doctor renowned for exterminating a cholera epidemic. After rejecting both men, Fermina Daza marries Dr. Juvenal Urbino. The two personalities clash, and one betrays the other, but they cling together across the decades. Florentino Ariza retains his obsession with Fermina Daza, despite having countless affairs. Dr. Juvenal Urbino dies by ladder, whereupon Florentino Ariza swoops in to win Fermina Daza over. Throughout the book, characters mistake symptoms of love for symptoms of cholera; and lovers block out the world by claiming to have cholera and self-quarantining.

As a thermodynamicist, I see the second law of thermodynamics in every chapter. The second law implies that time marches only forward, order decays, and randomness scatters information to the wind. García Márquez depicts his characters aging, aging more, and aging more. Many characters die. Florentino Ariza’s mother loses her memory to dementia or Alzheimer’s disease. A pawnbroker, she buys jewels from the elite whose fortunes have eroded. Forgetting the jewels’ value one day, she mistakes them for candies and distributes them to children.

The second law bites most, to me, in the doctor’s final words, “Only God knows how much I loved you.” Later, the widow Fermina Daza sighs, “It is incredible how one can be happy for so many years in the midst of so many squabbles, so many problems, damn it, and not really know if it was love or not.” She doesn’t know how much her husband loved her, especially in light of the betrayal that rocked the couple and a rumor of another betrayal. Her husband could have affirmed his love with his dying breath, but he refused: He might have loved her with all his heart, and he might not have loved her; he kept the truth a secret to all but God. No one can retrieve the information after he dies.2 

Love in the Time of Cholera—and thermodynamics—must sound like a mouthful of horseradish. But each offers nourishment, an appetizer and an entrée. According to the first law of thermodynamics, the amount of energy in every closed, isolated system remains constant: Physics preserves something. Florentino Ariza preserves his love for decades, despite Fermina Daza’s marrying another man, despite her aging.

The latter preservation can last only so long in the story: Florentino Ariza, being mortal, will die. He claims that his love will last “forever,” but he won’t last forever. At the end of the novel, he sails between two harbors—back and forth, back and forth—refusing to finish crossing a River Styx. I see this sailing as prethermalization: A few quantum systems resist thermalizing, or flowing to the physics analogue of death, for a while. But they succumb later. Florentino Ariza can’t evade the far bank forever, just as the second law of thermodynamics forbids his boat from functioning as a perpetuum mobile.

Though mortal within his story, Florentino Ariza survives as a book character. The book survives. García Márquez wrote about a country I’d never visited, and an era decades before my birth, 33 years before I checked his book out of the library. But the book dazzled me. It pulsed with the vibrancy, color, emotion, and intellect—with the fullness—of life. The book gained another life when the coronavirus hit. Thermodynamics dictates that people age and die, but the laws of thermodynamics remain.3 I hope and trust—with the caveat about humanity’s not destroying itself—that Love in the Time of Cholera will pulse in 350 years.

What’s not to love?

1Yes, Caltech has an English library. I found gems in it, and the librarians ordered more when I inquired about books they didn’t have. I commend it to everyone who has access.

2I googled “Only God knows how much I loved you” and was startled to see the line depicted as a hallmark of romance. Please tell your romantic partners how much you love them; don’t make them guess till the ends of their lives.

3Lee Smolin has proposed that the laws of physics change. If they do, the change seems to have to obey metalaws that remain constant.