Many people ask why I became a theoretical physicist. The answer runs through philosophy—which I thought, for years, I’d left behind in college.
My formal relationship with philosophy originated with Mr. Bohrer. My high school classified him as a religion teacher, but he co-opted our junior-year religion course into a philosophy course. He introduced us to Plato’s cave, metaphysics, and the pursuit of the essence beneath the skin of appearance. The essence of reality overlaps with quantum theory and relativity, which fascinated him. Not that he understood them, he’d hasten to clarify. But he passed along that fascination to me. I’d always loved dealing in abstract ideas, so the notion of studying the nature of the universe attracted me. A friend and I joked about growing up to be philosophers and—on account of not being able to find jobs—living in cardboard boxes next to each other.
After graduating from high school, I searched for more of the same in Dartmouth College’s philosophy department. I began with two prerequisites for the philosophy major: Moral Philosophy and Informal Logic. I adored those courses, but I adored all my courses.
As a sophomore, I embarked upon an upper-level philosophy course: philosophy of mind. I was one of the course’s youngest students, but the professor assured me that I’d accumulated enough background information in science and philosophy classes. Yet he and the older students threw around technical terms, such as qualia, that I’d never heard of. Those terms resurfaced in the assigned reading, again without definitions. I struggled to follow the conversation.
Meanwhile, I’d been cycling through the sciences. I’d taken my high school’s highest-level physics course, senior year—AP Physics C: Mechanics and Electromagnetism. So, upon enrolling in college, I made the rounds of biology, chemistry, and computer science. I cycled back to physics at the beginning of sophomore year, taking Modern Physics I in parallel with Informal Logic. The physics professor, Miles Blencowe, told me, “I want to see physics in your major.” I did, too, I assured him. But I wanted to see most subjects in my major.
Miles, together with department chair Jay Lawrence, helped me incorporate multiple subjects into a physics-centric program. The major, called “Physics Modified,” stood halfway between the physics major and the create-your-own major offered at some American liberal-arts colleges. The program began with heaps of prerequisite courses across multiple departments. Then, I chose upper-level physics courses, a math course, two history courses, and a philosophy course. I could scarcely believe that I’d planted myself in a physics department; although I’d loved physics since my first course in it, I loved all subjects, and nobody in my family did anything close to physics. But my major would provide a well-rounded view of the subject.
From shortly after I declared my Physics Modified major. Photo from outside the National Academy of Sciences headquarters in Washington, DC.
The major’s philosophy course was an independent study on quantum theory. In one project, I dissected the “EPR paper” published by Einstein, Podolsky, and Rosen (EPR) in 1935. It introduced the paradox that now underlies our understanding of entanglement. But who reads the EPR paper in physics courses nowadays? I appreciated having the space to grapple with the original text. Still, I wanted to understand the paper more deeply; the philosophy course pushed me toward upper-level physics classes.
What I thought of as my last chance at philosophy evaporated during my senior spring. I wanted to apply to graduate programs soon, but I hadn’t decided which subject to pursue. The philosophy and history of physics remained on the table. A history-of-physics course, taught by cosmologist Marcelo Gleiser, settled the matter. I worked my rear off in that course, and I learned loads—but I already knew some of the material from physics courses. Moreover, I knew the material more deeply than the level at which the course covered it. I couldn’t stand the thought of understanding the rest of physics only at this surface level. So I resolved to burrow into physics in graduate school.
Appropriately, Marcelo published a book with a philosopher (and an astrophysicist) this March.
Burrow I did: after a stint in condensed-matter research, I submerged up to my eyeballs in quantum field theory and differential geometry at the Perimeter Scholars International master’s program. My research there bridged quantum information theory and quantum foundations. I appreciated the balance of fundamental thinking and possible applications to quantum-information-processing technologies. The rigorous mathematical style (lemma-theorem-corollary-lemma-theorem-corollary) appealed to my penchant for abstract thinking. Eating lunch with the Perimeter Institute’s quantum-foundations group, I felt at home.
Craving more research at the intersection of quantum thermodynamics and information theory, I enrolled at Caltech for my PhD. As I’d scarcely believed that I’d committed myself to my college’s physics department, I could scarcely believe that I was enrolling in a tech school. I was such a child of the liberal arts! But the liberal arts include the sciences, and I ended up wrapping Caltech’s hardcore vibe around myself like a favorite denim jacket.
Caltech kindled interests in condensed matter; atomic, molecular, and optical physics; and even high-energy physics. Theorists at Caltech thought not only abstractly, but also about physical platforms; so I started to, as well. I began collaborating with experimentalists as a postdoc, and I’m now working with as many labs as I can interface with at once. I’ve collaborated on experiments performed with superconducting qubits, photons, trapped ions, and jammed grains. Developing an abstract idea, then nursing it from mathematics to reality, satisfies me. I’m even trying to redirect quantum thermodynamics from foundational insights to practical applications.
At the University of Toronto in 2022, with my experimental collaborator Batuhan Yılmaz—and a real optics table!
So I did a double-take upon receiving an invitation to present a named lecture at the University of Pittsburgh Center for Philosophy of Science. Even I, despite not being a philosopher, had heard of the cachet of Pitt’s philosophy-of-science program. Why on Earth had I received the invitation? I felt the same incredulity as when I’d handed my heart to Dartmouth’s physics department and then to a tech school. But now, instead of laughing at the image of myself as a physicist, I couldn’t see past it.
Why had I received that invitation? I did a triple-take. At Perimeter, I’d begun undertaking research on resource theories—simple, information-theoretic models for situations in which constraints restrict the operations one can perform. Hardly anyone worked on resource theories then, although they form a popular field now. Philosophers like them, and I’ve worked with multiple classes of resource theories by now.
More recently, I’ve worked with contextuality, a feature that distinguishes quantum theory from classical theories. And I’ve even coauthored papers about closed timelike curves (CTCs), hypothetical worldlines that travel backward in time. CTCs are consistent with general relativity, but we don’t know whether they exist in reality. Regardless, one can simulate CTCs, using entanglement. Collaborators and I applied CTC simulations to metrology—to protocols for measuring quantities precisely. So we kept a foot in practicality and a foot in foundations.
Perhaps the idea of presenting a named lecture on the philosophy of science wasn’t hopelessly bonkers. All right, then. I’d present it.
Presenting at the Center for Philosophy of Science
This March, I presented an ALS Lecture (an Annual Lecture Series Lecture, redundantly) entitled “Field notes on the second law of quantum thermodynamics from a quantum physicist.” Scientists formulated the second law in the early 1800s. It helps us understand why time appears to flow in only one direction. I described three enhancements of that understanding, which have grown from quantum thermodynamics and nonequilibrium statistical mechanics: resource-theory results, fluctuation theorems, and thermodynamic applications of entanglement. I also enjoyed talking with Center faculty and graduate students during the afternoon and evening. Then—being a child of the liberal arts—I stayed in Pittsburgh for half the following Saturday to visit the Carnegie Museum of Art.
With a copy of a statue of the goddess Sekhmet. She lives in the Carnegie Museum of Natural History, which shares a building with the art museum, from which I detoured to see the natural-history museum’s ancient-Egypt area (as Quantum Frontiers regulars won’t be surprised to hear).
Don’t get me wrong: I’m a physicist, not a philosopher. I don’t have the training to undertake philosophy, and I have enough work to do in pursuit of my physics goals. But my high-school self would approve—that self is still me.
Even if you don’t recognize the name, you probably recognize the saguaro cactus. It’s the archetype of the cactus, a column from which protrude arms bent at right angles like elbows. As my husband pointed out, the cactus emoji is a saguaro: 🌵. In Tucson, Arizona, even the airport has a saguaro crop sufficient for staging a Western short film. I didn’t have a film to shoot, but the garden set the stage for another adventure: the ITAMP winter school on quantum thermodynamics.
Tucson airport
ITAMP is the Institute for Theoretical Atomic, Molecular, and Optical Physics (the Optical is silent). Harvard University and the Smithsonian Institution share ITAMP, where I worked as a postdoc. ITAMP hosted the first quantum-thermodynamics conference to take place on US soil, in 2017. Also, ITAMP hosts a winter school in Arizona every February. (If you lived in the Boston area, you might want to escape to the southwest then, too.) The winter school’s topic varies from year to year.
How about a winter school on quantum thermodynamics? ITAMP’s director, Hossein Sadeghpour, asked me when I visited Cambridge, Massachusetts last spring.
Let’s do it, I said.
Lecturers came from near and far. Kanu Sinha, of the University of Arizona, spoke about how electric charges fluctuate in the quantum vacuum. Fluctuations feature also in extensions of the second law of thermodynamics, which helps explain why time flows in only one direction. Gabriel Landi, from the University of Rochester, lectured about these fluctuation relations. ITAMP Postdoctoral Fellow Ceren Dag explained why many-particle quantum systems register time’s arrow. Ferdinand Schmidt-Kaler described the many-particle quantum systems—the trapped ions—in his lab at the University of Mainz.
Ronnie Kosloff, of Hebrew University in Jerusalem, lectured about quantum engines. Nelly Ng, an Assistant Professor at Nanyang Technological University, has featured on Quantum Frontiers at least three times. She described resource theories—information-theoretic models—for thermodynamics. Information and energy both serve as resources in thermodynamics and computation, I explained in my lectures.
The 2024 ITAMP winter school
The winter school took place at the conference center adjacent to Biosphere 2. Biosphere 2 is an enclosure that contains several miniature climate zones, including a coastal fog desert, a rainforest, and an ocean. You might have heard of Biosphere 2 due to two experiments staged there during the 1990s: in each experiment, a group of people was sealed in the enclosure. The experimentalists harvested their own food and weren’t supposed to receive any matter from outside. The first experiment lasted for two years. The group, though, ran out of oxygen, which a support crew pumped in. Research at Biosphere 2 contributes to our understanding of ecosystems and space colonization.
Fascinating as the landscape inside Biosphere 2 is, so is the landscape outside. The winter school included an afternoon hike, and my husband and I explored the territory around the enclosure.
Did you see any snakes? my best friend asked after I returned home.
No, I said. But we were chased by a vicious beast.
On our first afternoon, my husband and I followed an overgrown path away from the biosphere to an almost deserted-looking cluster of buildings. We eventually encountered what looked like a warehouse from which noises were emanating. Outside hung a sign with which I resonated.
Scientists, I thought. Indeed, a researcher emerged from the warehouse and described his work to us. His group was preparing to seal off a building where they were simulating a Martian environment. He also warned us about the territory we were about to enter, especially the creature that roosted there. We were too curious to retreat, though, so we set off into a ghost town.
At least, that’s what the other winter-school participants called the area, later in the week—a ghost town. My husband and I had already surveyed the administrative offices, conference center, and other buildings used by biosphere personnel today. Personnel in the 1980s used a different set of buildings. I don’t know why one site gave way to the other. But the old buildings survive—as what passes for ancient ruins to many Americans.
Weeds have grown up in the cracks in an old parking lot’s tarmac. A sign outside one door says, “Classroom”; below it is a sign that must not have been correct in decades: “Class in progress.” Through the glass doors of the old visitors’ center, we glimpsed cushioned benches and what appeared to be a diorama exhibit; outside, feathers and bird droppings covered the ground. I searched for a tumbleweed emoji, to illustrate the atmosphere, but found only a tumbler one: 🥃.
After exploring, my husband and I rested in the shade of an empty building, drank some of the water we’d brought, and turned around. We began retracing our steps past the defunct visitors’ center. Suddenly, a monstrous Presence loomed on our right.
I can’t tell you how large it was; I only glimpsed it before turning and firmly not running away. But the Presence loomed. And it confirmed what I’d guessed upon finding the feathers and droppings earlier: the old visitors’ center now served as the Lair of the Beast.
The Mars researcher had warned us about the aggressive male turkey who ruled the ghost town. The turkey, the researcher had said, hated men—especially men wearing blue. My husband, naturally, was wearing a blue shirt. You might be able to outrun him, the researcher added pensively.
My husband zipped up his black jacket over the blue shirt. I advised him to walk confidently and not too quickly. Hikes in bear country, as well as summers at Busch Gardens Zoo Camp, gave me the impression that we mustn’t run; the turkey would probably chase us, get riled up, and excite himself to violence. So we walked, and the monstrous turkey escorted us. For surprisingly and frighteningly many minutes.
The turkey kept scolding us in monosyllabic squawks, which sounded increasingly close to the back of my head. I didn’t turn around to look, but he sounded inches away. I occasionally responded in the soothing voice I was taught to use on horses. But my husband and I marched increasingly quickly.
We left the old visitors’ center, curved around, and climbed most of a hill before ceasing to threaten the turkey—or before he ceased to threaten us. He squawked a final warning and fell back. My husband and I found ourselves amid the guest houses of workshops past, shaky but unmolested. Not that the turkey wreaks much violence, according to the Mars researcher: at most, he beats his wings against people and scratches up their cars (especially blue ones). But we were relieved to return to civilization.
Afternoon hike at Catalina State Park, a drive away from Biosphere 2. (Yes, that’s a KITP hat.)
The ITAMP winter school reminded me of Roughing It, a Mark Twain book I finished this year. Twain chronicled the adventures he’d experienced out West during the 1860s. The Gold Rush, he wrote, attracted the top young men of all nations. The quantum-technologies gold rush has been attracting the top young people of all nations, and the winter school evidenced their eagerness. Yet the winter school also evidenced how many women have risen to the top: 10 of the 24 registrants were women, as were four of the seven lecturers.1
The winter-school participants in the shuttle I rode from the Tucson airport to Biosphere 2
We’ll see to what extent the quantum-technologies gold rush plays out like Mark Twain’s. Ours at least involves a ghost town and ferocious southwestern critters.
1For reference, when I applied to graduate programs, I was told that approximately 20% of physics PhD students nationwide were women. The percentage of women drops as one progresses up the academic chain to postdocs and then to faculty members. And primarily PhD students and postdocs registered for the winter school.
My husband taught me how to pronounce the name of the city where I’d be presenting a talk late last July: Aveiro, Portugal. Having studied Spanish, I pronounced the name as Ah-VEH-roh, with a v partway to a hard b. But my husband had studied Portuguese, so he recommended Ah-VAI-roo.
His accuracy impressed me when I heard the name pronounced by the organizer of the conference I was participating in—Theory of Quantum Computation, or TQC. Lídia del Rio grew up in Portugal and studied at the University of Aveiro, so I bow to her in matters of Portuguese pronunciation. I bow to her also for organizing one of the world’s largest annual quantum-computation conferences (with substantial help—fellow quantum physicist Nuriya Nurgalieva shared the burden). But Lídia cofounded Quantum, a journal that’s risen from a Gedankenexperiment to a go-to venue in six years. So she gives the impression of being able to manage anything.
Aveiro architecture
Watching Lídia open TQC gave me pause. I met her in 2013, the summer before beginning my PhD at Caltech. She was pursuing her PhD at ETH Zürich, which I was visiting. Lídia took me dancing at an Argentine-tango studio one evening. Now, she’d invited me to speak at an international conference that she was coordinating.
Lídia and me in Zürich as PhD students
Lídia opening TQC
Not only Lídia gave me pause; so did the three other invited speakers. Every one of them, I’d met when each of us was a grad student or a postdoc.
Richard Küng described classical shadows, a technique for extracting information about quantum states via measurements. Suppose we wish to infer diverse properties of a quantum state ρ (diverse observables’ expectation values). We have to measure many copies of ρ—some large number of copies. The community expected that number to grow exponentially with the system’s size—for instance, with the number of qubits in a quantum computer’s register. We can get away with far fewer, Richard and collaborators showed, by randomizing our measurements.
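To make the randomized-measurement idea concrete, here’s a minimal numerical sketch, assuming the common variant of classical shadows built from random single-qubit Pauli measurements. The two-qubit Bell-state example and the numpy simulation are my own illustration, not material from Richard’s talk.

```python
# Toy classical-shadows estimate of <Z⊗Z> for a Bell pair,
# assuming random single-qubit Pauli-basis measurements (one standard variant).
import numpy as np

rng = np.random.default_rng(0)
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [X, Y, Z]

n = 2
phi = np.zeros(2**n, dtype=complex)
phi[0] = phi[-1] = 1 / np.sqrt(2)          # Bell state (|00> + |11>)/sqrt(2)
rho = np.outer(phi, phi.conj())

def one_shadow():
    """Measure each qubit in a random Pauli basis; return one shadow estimator of rho."""
    bases = [np.linalg.eigh(paulis[b])[1] for b in rng.integers(0, 3, size=n)]
    outcomes = list(np.ndindex(*(2,) * n))
    kets = []
    for idx in outcomes:                    # build the product-basis vectors
        ket = np.array([1.0 + 0j])
        for q, i in enumerate(idx):
            ket = np.kron(ket, bases[q][:, i])
        kets.append(ket)
    probs = np.array([np.real(k.conj() @ rho @ k) for k in kets])
    idx = outcomes[rng.choice(len(outcomes), p=probs / probs.sum())]
    # Invert the measurement channel qubit by qubit: |s><s|  ->  3|s><s| - I
    shadow = np.array([[1.0 + 0j]])
    for q, i in enumerate(idx):
        v = bases[q][:, i].reshape(2, 1)
        shadow = np.kron(shadow, 3 * (v @ v.conj().T) - I2)
    return shadow

ZZ = np.kron(Z, Z)
estimates = [np.real(np.trace(ZZ @ one_shadow())) for _ in range(3000)]
print("shadow estimate:", np.mean(estimates), "  exact:", np.real(np.trace(ZZ @ rho)))
```

The payoff in the full protocol is that the number of shadows needed grows only logarithmically with the number of local observables being estimated, rather than exponentially with the system’s size.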
Richard postdocked at Caltech while I was a grad student there. Two properties of his stand out in my memory: his describing, during group meetings, the math he’d been exploring and the Austrian accent in which he described that math.
Did this restaurant’s owners realize that quantum physicists were descending on their city? I have no idea.
Also while I was a grad student, Daniel Stilck França visited Caltech. Daniel’s TQC talk conveyed skepticism about whether near-term quantum computers can beat classical computers in optimization problems. Near-term quantum computers are NISQ (noisy, intermediate-scale quantum) devices. Daniel studied how noise (particularly, local depolarizing noise) propagates through NISQ circuits. Imagine a quantum computer suffering from a 1% noise error. The quantum computer loses its advantage over classical competitors after 10 layers of gates, Daniel concluded. Nor does he expect error mitigation—a bandaid en route to the sutures of quantum error correction—to help much.
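A back-of-the-envelope illustration of why so few layers suffice to wash out the advantage (my own arithmetic, not Daniel’s actual analysis): suppose each of n qubits suffers depolarizing noise with probability p per circuit layer. The probability that no error strikes anywhere during d layers is roughly (1 − p)^(nd). For p = 0.01, n = 50, and d = 10, that’s 0.99^500 ≈ 0.007, so less than one percent of the signal survives, and the remainder looks like noise that a classical computer can mimic cheaply.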
From a 2021 Quantum Frontiers post of mine. I was tickled to see that TQC’s organizers used the photo from my 2021 post as Adam’s speaker photo.
Adam, the remaining invited speaker, distinguished what we can compute using simple quantum circuits but not using simple classical ones. His results fall under the heading of complexity theory, about which one can rarely prove anything. Complexity theorists cling to their jobs by assuming conjectures widely expected to be true. Atop the assumptions, or conditions, they construct “conditional” proofs. Adam proved unconditional claims in complexity theory, thanks to the simplicity of the circuits he compared.
In my estimation, the talks conveyed cautious optimism: according to Adam, we can prove modest claims unconditionally in complexity theory. According to Richard, we can spare ourselves trials while measuring certain properties of quantum systems. Even Daniel’s talk inspired more optimism than he intended: a few years ago, the community couldn’t predict how noisy short-depth quantum circuits could perform. So his defeatism, rooted in evidence, marks an advance.
Aveiro nurtures optimism, I expect most visitors would agree. Sunshine drenches the city, and the canals sparkle—literally sparkle, as though devised by Elsa at a higher temperature than usual. Fresh fruit seems to wend its way into every meal.1 Art nouveau flowers scale the architecture, and fanciful designs pattern the tiled sidewalks.
What’s more, quantum information theorists of my generation were making good. Three riveted me in their talks, and another co-orchestrated one of the world’s largest quantum-computation gatherings. To think that she’d taken me dancing years before ascending to the global stage.
My husband and I made do, during our visit, by cobbling together our Spanish, his Portuguese, and occasional English. Could I hold a conversation with the Portuguese I gleaned? As adroitly as a NISQ circuit could beat a classical computer. But perhaps we’ll return to Portugal, and experimentalists are doubling down on quantum error correction. I remain cautiously optimistic.
1As do eggs, I was intrigued to discover. Enjoyed a hardboiled egg at breakfast? Have a fried egg on your hamburger at lunch. And another on your steak at dinner. And candied egg yolks for dessert.
This article takes its title from a book by former US Poet Laureate Billy Collins. The title alludes to a song in the musical My Fair Lady, “The Rain in Spain.” The song has grown so famous that I don’t think twice upon hearing the name. “The rain in Portugal” did lead me to think twice—and so did TQC.
With thanks to Lídia and Nuriya for their hospitality. You can submit to TQC2024 here.
The most ingenious invention to surprise me at CERN was a box of chocolates. CERN is a multinational particle-physics collaboration. Based in Geneva, CERN is famous for having “the world’s largest and most powerful accelerator,” according to its website. So a physicist will take for granted its colossal magnets, subatomic finesse, and petabytes of experimental data.
But I wasn’t expecting the chocolates.
In the main cafeteria, beside the cash registers, stood stacks of Toblerone. Sweet-tooth owners worldwide recognize the yellow triangular prisms stamped with Toblerone’s red logo. But I’d never seen such a prism emblazoned with CERN’s name. Scientists visit CERN from across the globe, and probably many return with Swiss-chocolate souvenirs. What better way to promulgate CERN’s influence than by coupling Switzerland’s scientific might with its culinary?1
I visited CERN last November for Sparks!, an annual public-outreach event. The evening’s speakers and performers offer perspectives on a scientific topic relevant to CERN. This year’s event highlighted quantum technologies. Physicist Sofia Vallecorsa described CERN’s Quantum Technology Initiative, and IBM philosopher Mira Wolf-Bauwens discussed ethical implications of quantum technologies. (Yes, you read that correctly: “IBM philosopher.”) Dancers Wenchi Su and I-Fang Lin presented an audiovisual performance, Rachel Maze elucidated government policies, and I spoke about quantum steampunk.
Around Sparks!, I played the physicist tourist: presented an academic talk, descended to an underground detector site, and shot the scientific breeze with members of the Quantum Technology Initiative. (What, don’t you present academic talks while touristing?) I’d never visited CERN before, but much of it felt eerily familiar.
A theoretical-physics student studies particle physics and quantum field theory (the mathematical framework behind particle physics) en route to a PhD. CERN scientists accelerate particles to high speeds, smash them together, and analyze the resulting debris. The higher the particles’ initial energies, the smaller the debris’s components, and the more elementary the physics we can infer. CERN made international headlines in 2012 for observing evidence of the Higgs boson, the particle that endows other particles with masses. As a scientist noted during my visit, one can infer CERN’s impact from how even Auto World (if I recall correctly) covered the Higgs discovery. Friends of mine process data generated by CERN, and faculty I met at Caltech helped design CERN experiments. When I mentioned to a colleague that I’d be flying to Geneva, they responded, “Oh, are you visiting CERN?” All told, a physicist can avoid CERN as easily as one can avoid the Panama Canal en route from the Atlantic Ocean to the Pacific through Central America. So, although I’d never visited, CERN felt almost like a former stomping ground. It was the details that surprised me.
Familiar book, new (CERN) bookstore.
Take the underground caverns. CERN experiments take place deep underground, where too few cosmic rays reach to muck with observations much. I visited the LHCb experiment, which spotlights a particle called the “beauty quark” in Europe and the less complimentary “bottom quark” in the US. LHCb is the first experiment that I learned has its own X/Twitter account. Colloquia (weekly departmental talks at my universities) had prepared me for the 100-meter descent underground, for the hard hats we’d have to wear, and for the detector many times larger than I.
A photo of the type bandied about in particle-physics classes
A less famous hard-hat photo, showing a retired detector’s size.
But I hadn’t anticipated the bright, single-tone colors. Between the hard hats and experimental components, I felt as though I were inside the Google logo.
Or take CERN’s campus. I wandered around it for a while before a feeling of nostalgia brought me up short: I was feeling lost in precisely the same way in which I’d felt lost countless times at MIT. Numbers, rather than names, label both MIT’s and CERN’s buildings. Somebody must have chosen which number goes where by throwing darts at a map while blindfolded. Part of CERN’s hostel, building 39, neighbors buildings 222 and 577. I shouldn’t wonder to discover, someday, that the CERN building I’m searching for has wandered off to MIT.
Part of the CERN map. Can you explain it?
Between the buildings wend streets named after famous particle physicists. I nodded greetings to Einstein, Maxwell, Democritus (or Démocrite, as the French Swiss write), and Coulomb. But I hadn’t anticipated how much civil engineers venerate particle physicists. So many physicists did CERN’s designers stuff into walkways that the campus ran out of streets and had to recycle them. Route W. F. Weisskopf turns into Route R. P. Feynman at a…well, at nothing notable—not a fork or even a spoon. I applaud the enthusiasm for history; CERN just achieves feats in navigability that even MIT hasn’t.
The familiar mingled with the unfamiliar even in the crowd on campus. I was expecting to recognize only the personnel I’d coordinated with electronically. But three faces surprised me at my academic talk. I’d met those three physicists through different channels—a summer school in Malta, Harvard collaborators, and the University of Maryland—at different times over the years. But they happened to be visiting CERN at the same time as I, despite their not participating in Sparks! I’m half-reminded of the book Roughing It, which describes how Mark Twain traveled the American West via stagecoach during the 1860s. He ran into a long-lost friend “on top of the Rocky Mountains thousands of miles from home.” Exchange “on top of the Rockies” for “near the Alps” and “thousands of miles” for “even more thousands of miles.”
CERN unites physicists. We learn about its discoveries in classes, we collaborate on its research or have friends who do, we see pictures of its detectors in colloquia, and we link to its science-communication pages in blog posts. We respect CERN, and I hope we can be forgiven for fondly poking a little fun at it. So successfully has CERN spread its influence, I felt a sense of recognition upon arriving.
I didn’t buy any CERN Toblerones. But I arrived home with 4.5 pounds of other chocolates, which I distributed to family and friends, the thermodynamics lunch group I run at the University of Maryland, and—perhaps most importantly—my research group. I’ll take a leaf out of CERN’s book: to hook students on fundamental physics, start early, and don’t stint on the sweets.
With thanks to Claudia Marcelloni, Alberto Di Meglio, Michael Doser, Antonella Del Rosso, Anastasiia Lazuka, Salome Rohr, Lydia Piper, and Paulina Birtwistle for inviting me to, and hosting me at, CERN.
1After returning home, I learned that an external company runs CERN’s cafeterias and that the company orders and sells the Toblerones. Still, the idea is brilliant.
When my brother and I were little, we sometimes played video games on weekend mornings, before our parents woke up. We owned a 3DO console, which ran the game Gex. Gex is named after its main character, a gecko. Stepping into Gex’s shoes—or toe pads—a player can clamber up walls and across ceilings.
I learned this month how geckos clamber, at the 125th Statistical Mechanics Conference at Rutgers University. (For those unfamiliar with the field: statistical mechanics is a sibling of thermodynamics, the study of energy.) Joel Lebowitz, a legendary mathematical physicist and nonagenarian, has organized the conference for decades. This iteration included a talk by Kanupriya (Kanu) Sinha, an assistant professor at the University of Arizona.
Kanu studies open quantum systems, or quantum systems that interact with environments. She often studies a particle that can be polarized. Such a particle carries an electric charge, which can be distributed unevenly across the particle. Examples include a water molecule. As encoded in its chemical symbol, H2O, a water molecule consists of two hydrogen atoms and one oxygen atom. The oxygen attracts the molecule’s electrons more strongly than the hydrogen atoms do. So the molecule’s oxygen end carries a negative charge, and the hydrogen ends carry positive charges.1
The red area represents the oxygen, and the gray areas represent the hydrogen atoms. Image from the American Chemical Society.
When certain quantum particles are polarized, we can control their positions using lasers. After all, a laser consists of light—an electromagnetic field—and electric fields influence electrically charged particles’ movements. This control enables optical tweezers—laser beams that can place certain polarizable atoms wherever an experimentalist wishes. Such atoms can form a quantum computer, as John Preskill wrote in a blog post on Quantum Frontiers earlier this month.
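For the curious, the standard textbook expression behind this control (a generic formula, not specific to any particular tweezer setup): a laser field of amplitude E₀ induces a dipole in an atom of polarizability α(ω), producing a time-averaged potential U ≈ −(1/4) α(ω) E₀². The atom is drawn toward the beam’s intensity maximum when α(ω) > 0 (laser tuned below the atomic resonance) and pushed away when α(ω) < 0.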
Instead of placing polarizable atoms in an array that will perform a quantum computation, you can place the atoms in an outline of the Eiffel Tower. Image from Antoine Browaeys’s lab.
A tweezered atom’s environment consists not only of a laser, but also everything else around, including dust particles. Undesirable interactions with the environment deplete an atom of its quantum properties. Quantum information stored in the atom leaks into the environment, threatening a quantum computer’s integrity. Hence the need for researchers such as Kanu, who study open quantum systems.
Kanu illustrated the importance of polarizable particles in environments, in her talk, through geckos. A gecko’s toe pads contain tiny hairs that polarize temporarily. The electric charges therein can be attracted to electric charges in a wall. We call this attraction the van der Waals force. So Gex can clamber around for a reason related to why certain atoms suit quantum computing.
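To put a formula to Gex’s grip (a textbook scaling, not a result from Kanu’s talk): the van der Waals attraction between two neutral, polarizable particles falls off with their separation r as V(r) ≈ −C₆/r⁶, and the corresponding atom–surface attraction falls off as −C₃/r³ in the nonretarded regime. The force is feeble at everyday distances, which is why a gecko needs a huge number of nanoscale hairs, each hugging the wall at nanometer range, to add up to a useful grip.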
Kanu explaining how geckos stick.
Winter break offers prime opportunities for kicking back with one’s siblings. Even if you don’t play Gex (and I doubt whether you do), behind your game of choice may lie more physics than expected.
1Water molecules are polarized permanently, whereas Kanu studies particles that polarize temporarily.
Mid-afternoon, one Saturday late in September, I forgot where I was. I forgot that I was visiting Seattle for the second time; I forgot that I’d just finished co-organizing a workshop partially about nuclear physics for the first time. I’d arrived at a crowded doorway in the Chihuly Garden and Glass museum, and a froth of blue was towering above the onlookers in front of me. Glass tentacles, ranging from ultramarine through turquoise to clear, extended from the froth. Golden conch shells, starfish, and mollusks rode the waves below. The vision drove everything else from my mind for an instant.
Much had been weighing on my mind that week. The previous day had marked the end of a workshop hosted by the InQubator for Quantum Simulation (IQuS, pronounced eye-KWISS) at the University of Washington. I’d co-organized the workshop with IQuS member Niklas Mueller, NIST physicist Alexey Gorshkov, and nuclear theorist Raju Venugopalan (although Niklas deserves most of the credit). We’d entitled the workshop “Thermalization, from Cold Atoms to Hot Quantum Chromodynamics.” Quantum chromodynamics describes the strong force that binds together a nucleus’s constituents, so I call the workshop “Journey to the Center of the Atom” to myself.
We aimed to unite researchers studying thermal properties of quantum many-body systems from disparate perspectives. Theorists and experimentalists came; and quantum information scientists and nuclear physicists; and quantum thermodynamicists and many-body physicists; and atomic, molecular, and optical physicists. Everyone cared about entanglement, equilibration, and what else happens when many quantum particles crowd together and interact.
We quantum physicists crowded together and interacted from morning till evening. We presented findings to each other, questioned each other, coagulated in the hallways, drank tea together, and cobbled together possible projects. The week electrified us like a chilly ocean wave but also wearied me like an undertow. Other work called for attention, and I’d be presenting four more talks at four more workshops and campus visits over the next three weeks. The day after the workshop, I worked in my hotel half the morning and then locked away my laptop. I needed refreshment, and little refreshes like art.
Strongly interacting physicists
Chihuly Garden and Glass, in downtown Seattle, succeeded beyond my dreams: the museum drew me into somebody else’s dreams. Dale Chihuly grew up in Washington state during the mid-twentieth century. He studied interior design and sculpture before winning a Fulbright Fellowship to learn glass-blowing techniques in Murano, Italy. After that, Chihuly transformed the world. I’ve encountered glass sculptures of his in Pittsburgh; Florida; Boston; Jerusalem; Washington, DC; and now Seattle—and his reach dwarfs my travels.
Chihuly chandelier at the Renwick Gallery in Washington, DC
After the first few encounters, I began recognizing sculptures as Chihuly’s before checking their name plates. Every work by his team reflects his style. Tentacles, bulbs, gourds, spheres, and bowls evidence what I never expected glass to do but what, having now seen it, I’m glad it does.
This sentiment struck home a couple of galleries beyond the Seaforms. The exhibit Mille Fiori drew inspiration from the garden cultivated by Chihuly’s mother. The name means A Thousand Flowers, although I spied fewer flowers than what resembled grass, toadstools, and palm fronds. Visitors feel like grasshoppers amongst the red, green, and purple stalks that dwarfed some of us. The narrator of Jules Verne’s Journey to the Center of the Earth must have felt similarly, encountering mastodons and dinosaurs underground. I encircled the garden before registering how much my mind had lightened. Responsibilities and cares felt miles away—or, to a grasshopper, backyards away. Wonder does wonders.
Mille Fiori
Near the end of the path around the museum, a theater plays documentaries about Chihuly’s projects. The documentaries include interviews with the artist, and several quotes reminded me of the science I’d been trained to seek out: “I really wanted to take glass to its glorious height,” Chihuly said, “you know, really make something special.” “Things—pieces got bigger, pieces got taller, pieces got wider.” He felt driven to push art forms as far as the glass would permit his team. Similarly, my PhD advisor John Preskill encouraged me to “think big.” What physics is worth doing—what would create an impact?
How did a boy from Tacoma, Washington impact not only fellow blown-glass artists—not only artists—not only an exhibition here and there in his home country—but experiences across the globe, including that of a physicist one weekend in September?
One idea from the IQuS workshop caught my eye. Some particle colliders accelerate heavy ions to high energies and then smash the ions together. Examples include lead and gold ions studied at CERN in Geneva. After a collision, the matter expands and cools. Nuclear physicists don’t understand how the matter cools; models predict cooling times longer than those observed. This mismatch has persisted across decades of experiments. The post-collision matter evades attempts at computer simulation; it’s literally a hot mess. Can recent advances in many-body physics help?
The exhibit Persian Ceiling at Chihuly Garden and Glass. Doesn’t it look like it could double as an artist’s rendering of a heavy-ion collision?
Martin Savage, the director of IQuS, hopes so. He hopes that IQuS will impact nuclear physics across the globe. Every university and its uncle boasts a quantum institute nowadays, but IQuS seems to me to have carved out a niche for itself. IQuS has grown up in the bosom of the Institute for Nuclear Theory at the University of Washington, which has guided nuclear theory for decades. IQuS is smashing that history together with the future of quantum simulators. IQuS doesn’t strike me as just another glass bowl in the kitchen of quantum science. A bowl worthy of Chihuly? I don’t know, but I’d like to hope so.
I left Chihuly Garden and Glass with respect for the past week and energy for the week ahead. Whether you find it in physics or in glass or in both—or in plunging into a dormant Icelandic volcano in search of the Earth’s core—I recommend the occasional dose of awe.
Participants in the final week of the workshop
With thanks to Martin Savage, IQuS, and the University of Washington for their hospitality.
The origin of life appears to share little with quantum computation, apart from the difficulty of achieving it and its potential for clickbait. Yet similar notions of complexity have recently garnered attention in both fields. Each topic’s researchers expect only special systems to generate high values of such complexity, or complexity at high rates: organisms, in one community, and quantum computers (and perhaps black holes), in the other.
Each community appears fairly unaware of its counterpart. This article is intended to introduce the two. Below, I review assembly theory from origin-of-life studies, followed by quantum complexity. I’ll then compare and contrast the two concepts. Finally, I’ll suggest that origin-of-life scientists can quantize assembly theory using quantum complexity. The idea is a bit crazy, but, well, so what?
Assembly theory in origin-of-life studies
Imagine discovering evidence of extraterrestrial life. How could you tell that you’d found it? You’d have detected a bunch of matter—a bunch of particles, perhaps molecules. What about those particles could evidence life?
This question motivated Sara Imari Walker and Lee Cronin to develop assembly theory. (Most of my assembly-theory knowledge comes from Sara, about whom I wrote this blog post years ago and with whom I share a mentor.) Assembly theory governs physical objects, from proteins to self-driving cars.
Imagine assembling a protein from its constituent atoms. First, you’d bind two atoms together. Then, you might bind another two atoms together. Eventually, you’d bind two pairs together. Your sequence of steps would form an algorithm for assembling the protein. Many algorithms can generate the same protein. One algorithm has the least number of steps. That number is called the protein’s assembly number.
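To make the definition concrete, here’s a toy sketch in which a character string stands in for a molecule and “binding” means concatenating two pieces already in hand (pieces may be reused). The brute-force search below is my own illustration of the definition, not the procedure Sara’s and Lee’s groups apply to real molecules.

```python
# Toy assembly number of a string: the fewest concatenation ("binding") steps
# needed to build the target, reusing previously built pieces. Brute-force BFS;
# fine for short strings, hopeless for laptops.
from collections import deque
from itertools import product

def assembly_number(target: str) -> int:
    basics = set(target)                     # single characters come for free
    start = frozenset()                      # no compound pieces built yet
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        built, steps = queue.popleft()
        if target in built:
            return steps
        pool = basics | built
        for a, b in product(pool, repeat=2):
            piece = a + b
            # Only substrings of the target can appear in a minimal construction.
            if piece in target and piece not in built:
                new = frozenset(built | {piece})
                if new not in seen:
                    seen.add(new)
                    queue.append((new, steps + 1))
    return 0                                 # target was a single character

print(assembly_number("abcabc"))    # 3: a+b -> ab, ab+c -> abc, abc+abc -> abcabc
print(assembly_number("aaaaaaaa"))  # 3: a+a -> aa, aa+aa -> aaaa, aaaa+aaaa -> aaaaaaaa
```

The reuse of previously built pieces is what keeps assembly numbers low for repetitive objects: eight copies of one unit need only three joins, the same as “abcabc.”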
Different natural processes tend to create objects that have different assembly numbers. Stars form low-assembly-number objects by fusing two hydrogen atoms together into helium. Similarly, random processes have high probabilities of forming low-assembly-number objects. For example, geological upheavals can bring a shard of iron near a lodestone. The iron will stick to the magnetized stone, forming a two-component object.
My laptop has an enormous assembly number. Why can such an object exist? Because of information, Sara and Lee emphasize. Human beings amassed information about materials science, Boolean logic, the principles of engineering, and more. That information—which exists only because organisms exist—helped engender my laptop.
If any object has a high enough assembly number, Sara and Lee posit, that object evidences life. Absent life, natural processes have too low a probability of randomly throwing together molecules into the shape of a computer. How high is “high enough”? Approximately fifteen, experiments by Lee’s group suggest. (Why do those experiments point to the number fifteen? Sara’s group is working on a theory for predicting the number.)
In summary, assembly number quantifies complexity in origin-of-life studies, according to Sara and Lee. The researchers propose that only living beings create high-assembly-number objects.
Quantum complexity in quantum computation
Quantum complexity defines a stage in the equilibration of many-particle quantum systems. Consider a clump of quantum particles isolated from its environment. The clump will be in a pure quantum state |ψ(0)⟩ at a time t = 0. The particles will interact, evolving the clump’s state as a function |ψ(t)⟩ of the time t.
Quantum many-body equilibration is more complicated than the equilibration undergone by your afternoon pick-me-up as it cools.
The interactions will equilibrate the clump internally. One stage of equilibration centers on local observables O. They’ll come to have expectation values ⟨ψ(t)|O|ψ(t)⟩ approximately equal to thermal expectation values Tr(O ρ_th), for a thermal state ρ_th of the clump. During another stage of equilibration, the particles correlate through many-body entanglement.
The longest known stage centers on the quantum complexity of |ψ(t)⟩. The quantum complexity is the minimal number of basic operations needed to prepare |ψ(t)⟩ from a simple initial state. We can define “basic operations” in many ways. Examples include quantum logic gates that act on two particles. Another example is an evolution for one time step under a Hamiltonian that couples together at most k particles, for some k independent of the total particle number N. Similarly, we can define “a simple initial state” in many ways. We could count as simple only the N-fold tensor product |φ⟩ ⊗ |φ⟩ ⊗ ⋯ ⊗ |φ⟩ of our favorite single-particle state |φ⟩. Or we could call any N-fold tensor product simple, or any state that contains at-most-two-body entanglement, and so on. These choices don’t affect the quantum complexity’s qualitative behavior, according to string theorists Adam Brown and Lenny Susskind.
How quickly can the quantum complexity of |ψ(t)⟩ grow? Fast growth stems from many-body interactions, long-range interactions, and random coherent evolutions. (Random unitary circuits exemplify random coherent evolutions: each gate is chosen according to the Haar measure, which we can view roughly as uniformly random.) At most, quantum complexity can grow linearly in time. Random unitary circuits achieve this rate. Black holes may; they scramble information quickly. The greatest possible complexity of any N-particle state scales exponentially in N, according to a counting argument.
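Here’s a rough version of that counting argument, assuming (my choice, for concreteness) a finite gate set of g two-qubit gates acting on N qubits. Circuits containing at most G gates can prepare at most roughly [g N(N − 1)/2]^G distinct states, whereas covering the set of all N-qubit pure states to any fixed precision requires roughly e^(c 2^N) states, for some constant c. Matching the two counts forces G ≳ c 2^N / log[g N(N − 1)/2]: preparing a generic state demands a number of gates exponential in N.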
A highly complex state looks simple from one perspective and complicated from another. Human scientists can easily measure only local observables O. Such observables’ expectation values tend to look thermal in highly complex states, ⟨ψ(t)|O|ψ(t)⟩ ≈ Tr(O ρ_th), as implied above. The thermal state has the greatest von Neumann entropy, S(ρ) = −Tr(ρ log ρ), of any quantum state that obeys the same linear constraints as |ψ(t)⟩ (such as having the same energy expectation value). Probed through simple, local observables O, highly complex states look highly entropic—highly random—similarly to a flipped coin.
Yet complex states differ from flipped coins significantly, as revealed by subtler analyses. An example underlies the quantum-supremacy experiment published by Google’s quantum-computing group in 2019. Experimentalists initialized 53 qubits (quantum two-level systems) in a tensor product. The state underwent many gates, which prepared a highly complex state. Then, the experimentalists measured the z-component of each qubit’s spin, randomly obtaining a -1 or a 1. One trial yielded a 53-bit string. The experimentalists repeated this process many times, using the same gates in each trial. From all the trials’ bit strings, the group inferred the probability p(s) of obtaining a given string s in the next trial. The distribution p resembles the uniformly random distribution…but differs from it subtly, as revealed by a cross-entropy analysis. Classical computers can’t easily generate p; hence the Google group’s claiming to have achieved quantum supremacy/advantage. Quantum complexity differs from simple randomness, that difference is difficult to detect, and the difference can evidence quantum computers’ power.
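For a flavor of the cross-entropy analysis, here’s a minimal sketch of the linear cross-entropy benchmark, in the spirit of (though far simpler than) the analysis Google performed. The eight-qubit Haar-random state, the sample sizes, and the numpy simulation are my own stand-ins.

```python
# Linear cross-entropy benchmark: F = 2^n * <p_ideal(observed string)> - 1.
# F lands near 1 when sampling from the ideal distribution and near 0 for uniform noise.
import numpy as np

rng = np.random.default_rng(1)
n = 8
dim = 2**n

# Stand-in for a highly complex state's output distribution: a Haar-random pure state
psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)
p_ideal = np.abs(psi) ** 2

def linear_xeb(samples):
    """Estimate the linear cross-entropy fidelity from sampled bit strings (as integers)."""
    return dim * np.mean(p_ideal[samples]) - 1

shots = 100_000
ideal_samples = rng.choice(dim, size=shots, p=p_ideal)   # noiseless sampler
noisy_samples = rng.integers(0, dim, size=shots)         # uniformly random bit strings

print("ideal sampler: ", round(linear_xeb(ideal_samples), 3))
print("uniform noise: ", round(linear_xeb(noisy_samples), 3))
```

A noisy device interpolates between the two extremes, and the benchmark’s sensitivity to that interpolation is what made the supremacy claim quantitative.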
A fridge that holds one of Google’s quantum computers.
Comparison and contrast
Assembly number and quantum complexity resemble each other as follows:
Each function quantifies the fewest basic operations needed to prepare something.
Only special systems (organisms) can generate high assembly numbers, according to Sara and Lee. Similarly, only special systems (such as quantum computers and perhaps black holes) can generate high complexity quickly, quantum physicists expect.
Assembly number may distinguish products of life from products of abiotic systems. Similarly, quantum complexity helps distinguish quantum computers’ computational power from classical computers’.
High-assembly-number objects are highly structured (think of my laptop). Similarly, high-complexity quantum states are highly structured in the sense of having much many-body entanglement.
Organisms generate high assembly numbers, using information. Similarly, using information, organisms have created quantum computers, which can generate quantum complexity quickly.
Assembly number and quantum complexity differ as follows:
Classical objects have assembly numbers, whereas quantum states have quantum complexities.
In the absence of life, random natural processes have low probabilities of producing high-assembly-number objects. That is, randomness appears to keep assembly numbers low. In contrast, randomness can help quantum complexity grow quickly.
Highly complex quantum states look very random, according to simple, local probes. High-assembly-number objects do not.
Only organisms generate high assembly numbers, according to Sara and Lee. In contrast, abiotic black holes may generate quantum complexity quickly.
Another feature shared by assembly-number studies and quantum computation merits its own paragraph: the importance of robustness. Suppose that multiple copies of a high-assembly-number (or moderate-assembly-number) object exist. Not only does my laptop exist, for example, but so do many other laptops. To Sara, such multiplicity signals the existence of some stable mechanism for creating that object. The multiplicity may provide extra evidence for life (including life that’s discovered manufacturing), as opposed to an unlikely sequence of random forces. Similarly, quantum computing—the preparation of highly complex states—requires stability. Decoherence threatens quantum states, necessitating quantum error correction. Quantum error correction differs from Sara’s stable production mechanism, but both evidence the importance of robustness to their respective fields.
A modest proposal
One can generalize assembly number to quantum states, using quantum complexity. Imagine finding a clump of atoms while searching for extraterrestrial life. The atoms need not have formed molecules, so the clump can have a low classical assembly number. However, the clump can be in a highly complex quantum state. We could detect the state’s complexity only (as far as I know) using many copies of the state, so imagine finding many clumps of atoms. Preparing highly complex quantum states requires special conditions, such as a quantum computer. The clump might therefore evidence organisms who’ve discovered quantum physics. Using quantum complexity, one might extend the assembly number to identify quantum states that may evidence life. However, quantum complexity, or a high rate of complexity generation, alone may not evidence life—for example, if achievable by black holes. Fortunately, a black hole seems unlikely to generate many identical copies of a highly complex quantum state. So we seem to have a low probability of mistakenly attributing a highly complex quantum state, sourced by a black hole, to organisms (atop our low probability of detecting any complex quantum state prepared by anyone other than us).
Would I expect a quantum assembly number to greatly improve humanity’s search for extraterrestrial life? I’m no astrobiology expert (NASA videos notwithstanding), but I’d expect probably not. Still, astrobiology requires chemistry, which requires quantum physics. Quantum complexity seems likely to find applications in the assembly-number sphere. Besides, doesn’t juxtaposing the search for extraterrestrial life and the understanding of life’s origins with quantum computing sound like fun? And a sense of fun distinguishes certain living beings from inanimate matter about as straightforwardly as assembly number does.
With thanks to Jim Al-Khalili, Paul Davies, the From Physics to Life collaboration, and UCLA for hosting me at the workshop that spurred this article.
This July, I came upon a museum called the Haus der Musik in one of Vienna’s former palaces. The museum contains a room dedicated to Johann Strauss II, king of the waltz. The room, dimly lit, resembles a twilit gazebo. I could almost believe that a hidden orchestra was playing the rendition of “The Blue Danube” that filled the room. Glass cases displayed dance cards and accessories that dancers would bring to a nineteenth-century ball.
A ball. Who hasn’t read about one in a novel or seen one in a film? A throng of youngsters and their chaperones, rustling in silk. The glint of candles, the vigor of movement, the thrill of interaction, the anxiety of establishing one’s place in society.
Another throng gathered a short walk from the Haus der Musik this summer. The Vienna University of Technology hosted the conference Quantum Thermodynamics (QTD) in the heart of the city. Don’t tell the other annual conferences, but QTD is my favorite. It spotlights the breed of quantum thermodynamics that’s surged throughout the past decade—the breed saturated with quantum information theory. I began attending QTD as a PhD student, and the conference shifts from city to city from year to year. I reveled in returning in person for the first time since the pandemic began.
Yet this QTD felt different. First, instead of being a PhD student, I brought a PhD student of my own. Second, granted, I enjoyed catching up with colleagues-cum-friends as much as ever. I especially relished seeing the “classmates” who belonged to my academic generation. Yet we were now congratulating each other on having founded research groups, and we were commiserating about the workload of primary investigators.
Third, I found myself a panelist in the annual discussion traditionally called “Quo vadis, quantum thermodynamics?” The panel presented bird’s-eye views on quantum thermodynamics, analyzing trends and opining on the direction our field was taking (or should take).1 Fourth, at the end of the conference, almost the last sentence spoken into any microphone was “See you in Maryland next year.” Colleagues and I will host QTD 2024.
One of my dearest quantum-thermodynamic “classmates,” Nelly Ng, participated in the panel discussion, too. We met as students (see these two blog posts), and she’s now an assistant professor at Nanyang Technological University. Photo credit: Jakub Czartowski.
The day after QTD ended, I boarded an Austrian Airlines flight. Waltzes composed by Strauss played over the loudspeakers. They flipped a switch in my mind: I’d come of age, I thought. I’d attended QTD 2017 as a debutante, presenting my first invited talk at the conference series. I’d danced through QTD 2018 in Santa Barbara, as well as the online iterations held during the pandemic. I’d reveled in the vigor of scientific argumentation, the thrill of learning, the glint of slides shining on projector screens (not really). Now, I was beginning to shoulder responsibilities like a ballgown-wearing chaperone.
As I came of age, so did QTD. The conference series budded around the time I started grad school and embarked upon quantum-thermodynamics research. In 2017, approximately 80 participants attended QTD. This year, 250 people registered to attend in person, and others attended online. Two hundred fifty! Quantum thermodynamics scarcely existed as a field of research fifteen years ago.
I’ve heard that organizers of another annual conference, Quantum Information Processing (QIP), reacted similarly to a 250-person registration list some years ago. Aram Harrow, a professor and quantum information theorist at MIT, has shared stories about co-organizing the first QIPs. As a PhD student, he’d sat in his advisor’s office, taking notes, while the local quantum-information theorists chose submissions to highlight. Nowadays, a small army of reviewers and subreviewers processes the hordes of submissions. And, from what I heard about this year’s attendance, navigating the QIP crowd is almost like navigating a Disney theme park on a holiday.
Will QTD continue to grow like QIP? Would such growth strengthen or fracture the community? Perhaps we’ll discuss those questions at a “Quo vadis?” session in Maryland next year. But I, at least, hope to continue always to grow—and to dance.2
Ludwig Boltzmann, a granddaddy of thermodynamics, worked in Vienna. I’ve waited for years to make a pilgrimage.
1My opinion: Now that quantum thermodynamics has showered us with fundamental insights, we should apply it in practical applications. How? Collaborators and I suggest one path here.
2I confess to having danced the waltz step (gleaned during my 14 years of ballet training) around that Strauss room in the Haus der Musik. I didn’t waltz around the conference auditorium, though.
Editor’s note: Since 2015, the Simons Foundation has supported the “It from Qubit” collaboration, a group of scientists drawing on ideas from quantum information theory to address deep issues in fundamental physics. The collaboration held its “Last Hurrah” event at Perimeter Institute last week. Here is a transcript of remarks by John Preskill at the conference dinner.
It from Qubit 2023 at Perimeter Institute
This meeting is forward-looking, as it should be, but it’s fun to look back as well, to assess and appreciate the progress we’ve made. So my remarks may meander back and forth through the years. Settle back — this may take a while.
We proposed the It from Qubit collaboration in March 2015, in the wake of several years of remarkable progress. Interestingly, that progress was largely provoked by an idea that most of us think is wrong: Black hole firewalls. Wrong perhaps, but challenging to grapple with.
This challenge accelerated a synthesis of quantum computing, quantum field theory, quantum matter, and quantum gravity as well. By 2015, we were already appreciating the relevance to quantum gravity of concepts like quantum error correction, quantum computational complexity, and quantum chaos. It was natural to assemble a collaboration in which computer scientists and information theorists would participate along with high-energy physicists.
We built our proposal around some deep questions where further progress seemed imminent, such as these:
Does spacetime emerge from entanglement? Do black holes have interiors? What is the information-theoretical structure of quantum field theory? Can quantum computers simulate all physical phenomena?
On April 30, 2015, we presented our vision to the Simons Foundation. Patrick [Hayden] and Matt [Headrick] led the presentation, with Juan [Maldacena], Lenny [Susskind], and me tagging along. We all shared at that time a sense of great excitement; that feeling must have been infectious, because It from Qubit was successfully launched.
Some It from Qubit investigators at a 2015 meeting.
Since then, ideas we talked about in 2015 have continued to mature, to ripen. Now our common language includes ideas like islands and quantum extremal surfaces, traversable wormholes, modular flow, the SYK model, quantum gravity in the lab, nonisometric codes, the breakdown of effective field theory when quantum complexity is high, and emergent geometry described by von Neumann algebras. In parallel, we’ve seen a surge of interest in quantum dynamics in condensed matter, focused on issues like how entanglement spreads, and how chaotic systems thermalize — progress driven in part by experimental advances in quantum simulators, both circuit-based and analog.
Why did we call ourselves “It from Qubit”? Patrick explained that in our presentation with a quote from John Wheeler in 1990. Wheeler said,
“It from bit” symbolizes the idea that every item of the physical world has at bottom—a very deep bottom, in most instances — an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes-or-no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe.
As is often the case with Wheeler, you’re not quite sure what he’s getting at. But you can glean that Wheeler envisioned that progress in fundamental physics would be hastened by bringing in ideas from information theory. So we updated Wheeler’s vision by changing “it from bit” to “it from qubit.”
As you may know, Richard Feynman had been Wheeler’s student, and he once said this about Wheeler: “Some people think Wheeler’s gotten crazy in his later years, but he’s always been crazy.” So you can imagine how flattered I was when Graeme Smith said the exact same thing about me.
During the 1972-73 academic year, I took a full-year undergraduate course from Wheeler at Princeton that covered everything in physics, so I have a lot of Wheeler stories. I’ll just tell one, which will give you some feel for his teaching style. One day, Wheeler arrives in class dressed immaculately in a suit and tie, as always, and he says: “Everyone take out a sheet of paper, and write down all the equations of physics – don’t leave anything out.” We dutifully start writing equations. The Schrödinger equation, Newton’s laws, Maxwell’s equations, the definition of entropy and the laws of thermodynamics, Navier-Stokes … we had learned a lot. Wheeler collects all the papers, and puts them in a stack on a table at the front of the classroom. He gestures toward the stack and says imploringly “Fly!” [Long pause.] Nothing happens. He tries again, even louder this time: “Fly!” [Long pause.] Nothing happens. Then Wheeler concludes: “On good authority, this stack of papers contains all the equations of physics. But it doesn’t fly. Yet, the universe flies. Something must be missing.”
Channeling Wheeler at the banquet, I implore my equations to fly. Photo by Jonathan Oppenheim.
He was an odd man, but inspiring. And not just odd, but also old. We were 19 and could hardly believe he was still alive — after all, he had worked with Bohr on nuclear fission in the 1930s! He was 61. I’m wiser now, and know that’s not really so old.
Now let’s skip ahead to 1998. Just last week, Strings 2023 happened right here at PI. So it’s fitting to mention that a pivotal Strings meeting occurred 25 years ago, Strings 1998 in Santa Barbara. The participants were in a celebratory mood, so much so that Jeff Harvey led hundreds of physicists in a night of song and dance. It went like this [singing to the tune of “The Macarena”]:
You start with the brane and the brane is BPS.
Then you go near the brane and the space is AdS.
Who knows what it means? I don’t, I confess.
Ehhhh! Maldacena!
You can’t blame them for wanting to celebrate. Admittedly I wasn’t there, so how did I know that hundreds of physicists were singing and dancing? I read about it in the New York Times!
It was significant that by 1998, the Strings meetings had already been held annually for 10 years. You might wonder how that came about. Let’s go back to 1984. Those of you who are too young to remember might not realize that in the late 70s and early 80s string theory was in eclipse. It had initially been proposed as a model of hadrons, but after the discovery of asymptotic freedom in 1973, quantum chromodynamics became accepted as the preferred theory of the strong interactions. (Maybe the QCD string will make a comeback someday – we’ll see.) The community pushing string theory forward shrank to a handful of people around the world. That changed very abruptly in August 1984. I tried to capture that sudden change in a poem I wrote for John Schwarz’s 60th birthday in 2001. I’ll read it — think of this as a history lesson.
Thirty years ago or more
John saw what physics had in store.
He had a vision of a string
And focused on that one big thing.

But then in nineteen-seven-three
Most physicists had to agree
That hadrons blasted to debris
Were well described by QCD.

The string, it seemed, by then was dead.
But John said: “It’s space-time instead!
The string can be revived again.
Give masses twenty powers of ten!

Then Dr. Green and Dr. Black,
Writing papers by the stack,
Made One, Two-A, and Two-B glisten.
Why is it none of us would listen?

We said, “Who cares if super tricks
Bring D to ten from twenty-six?
Your theory must have fatal flaws.
Anomalies will doom your cause.”

If you weren’t there you couldn’t know
The impact of that mighty blow:
“The Green-Schwarz theory could be true —
It works for S-O-thirty-two!”

Then strings of course became the rage
And young folks of a certain age
Could not resist their siren call:
One theory that explains it all.

Because he never would give in,
Pursued his dream with discipline,
John Schwarz has been a hero to me.
So … please don’t spell it with a “t”!
And 39 years after the revolutionary events of 1984, the intellectual feast launched by string theory still thrives.
In the late 1980s and early 1990s, many high-energy physicists got interested in the black hole information problem. Of course, the problem was 15 years old by then; it arose when Hawking radiation was discovered, as Hawking himself pointed out shortly thereafter. But many of us were drawn to this problem while we waited for the Superconducting Super Collider to turn on. As I have sometimes done when I wanted to learn something, in 1990 I taught a course on quantum field theory in curved spacetime, the main purpose of which was to explain the origin of Hawking radiation, and then for a few years I tried to understand whether information can escape from black holes and if so how, as did many others in those days. That led to a 1992 Aspen program co-organized by Andy Strominger and me on “Quantum Aspects of Black Holes.” Various luminaries were there, among them Hawking, Susskind, Sidney Coleman, Kip Thorne, Don Page, and others. Andy and I were asked to nominate someone from our program to give the Aspen Center colloquium, so of course we chose Lenny, and he gave an engaging talk on “The Puzzle of Black Hole Evaporation.”
At the end of the talk, Lenny reported on discussions he’d had with various physicists he respected about the information problem, and he summarized their views. Of course, Hawking said information is lost. ‘t Hooft said that the S-matrix must be unitary for profound reasons we needed to understand. Polchinski said in 1992 that information is lost and there is no way to retrieve it. Yakir Aharonov said that the information resides in a stable Planck-sized black hole remnant. Sidney Coleman said a black hole is a lump of coal — that was the code in 1992 for what we now call the central dogma of black hole physics, that as seen from the outside a black hole is a conventional quantum system. And – remember this was Lenny’s account of what he claimed people had told him – Frank Wilczek said this is a technical problem, I’ll soon have it solved, while Ed Witten said he did not find the problem interesting.
We talked a lot that summer about the no-cloning principle, and our discomfort with the notion that the quantum information encoded in an infalling encyclopedia could be in two places at once on the same time slice, seen inside the black hole by infalling observers and seen outside the black hole by observers who peruse the Hawking radiation. That potential for cloning shook the faith of the self-appointed defenders of unitarity. Andy and I wrote a report at the end of the workshop with a pessimistic tone:
There is an emerging consensus among the participants that Hawking is essentially right – that the information loss paradox portends a true revolution in fundamental physics. If so, then one must go further, and develop a sensible “phenomenological” theory of information loss. One must reconcile the fact of information loss with established principles of physics, such as locality and energy conservation. We expect that many people, stimulated by their participation in the workshop, will now focus attention on this challenge.
There was another memorable event a year later, in June 1993, a conference at the ITP in Santa Barbara (there was no “K” back then), also called “Quantum Aspects of Black Holes.” Among those attending were Susskind, Gibbons, Polchinski, Thorne, Wald, Israel, Bekenstein, and many others. By then our mood was brightening. Rather pointedly, Lenny said to me that week: “Why is this meeting so much better than the one you organized last year?” And I replied, “Because now you think you know the answer!”
That week we talked about “black hole complementarity,” our hope that quantum information being available both inside and outside the horizon could be somehow consistent with the linearity of quantum theory. Complementarity then was a less radical, less wildly nonlocal idea than it became later on. We envisioned that information in an infalling body could stick to the stretched horizon, but not, as I recall, that the black hole interior would be somehow encoded in Hawking radiation emitted long ago — that came later. But anyway, we felt encouraged.
Joe Polchinski organized a poll of the participants, where one could choose among four options.
Information is lost (unitarity violated)
Information escapes (causality violated)
Planck-scale black hole remnants
None of the above
The poll results favored unitarity over information loss by a 60-40 margin. Perhaps not coincidentally, the participants self-identified as 60% high energy physicists and 40% relativists.
The following summer in June 1994, there was a program called Geometry and Gravity at the Newton Institute in Cambridge. Hawking, Gibbons, Susskind, Strominger, Harvey, Sorkin, and (Herman) Verlinde were among the participants. I had more discussions with Lenny that month than any time before or since. I recall sending an email to Paul Ginsparg after one such long discussion in which I said, “When I hear Lenny Susskind speak, I truly believe that information can come out of a black hole.” Secretly, though, having learned about Shor’s algorithm shortly before that program began, I was spending my evenings struggling to understand Shor’s paper. After Cambridge, Lenny visited ‘t Hooft in Utrecht, and returned to Stanford all charged up to write his paper on “The world as a hologram,” in which he credits ‘t Hooft with the idea that “the world is in a sense two-dimensional.”
Important things happened in the next few years: D-branes, counting of black hole microstates, M-theory, and AdS/CFT. But I’ll skip ahead to the most memorable of my visits to Perimeter Institute. (Of course, I always like coming here, because in Canada you use the same electrical outlets we do …)
In June 2007, there was a month-long program at PI called “Taming the Quantum World.” I recall that Lucien Hardy objected to that title — he preferred “Let the Beast Loose” — which I guess is a different perspective on the same idea. I talked there about fault-tolerant quantum computing, but more importantly, I shared an office with Patrick Hayden. I already knew Patrick well — he had been a Caltech postdoc — but I was surprised and pleased that he was thinking about black holes. Patrick had already reached crucial insights concerning the behavior of a black hole that is profoundly entangled with its surroundings. That sparked intensive discussions resulting in a paper later that summer called “Black holes as mirrors.” In the acknowledgments you’ll find this passage:
We are grateful for the hospitality of the Perimeter Institute, where we had the good fortune to share an office, and JP thanks PH for letting him use the comfortable chair.
We intended for that paper to pique the interest of both the quantum information and quantum gravity communities, as it seemed to us that the time was ripe to widen the communication channel between the two. Since then, not only has that communication continued, but a deeper synthesis has occurred; most serious quantum gravity researchers are now well acquainted with the core concepts of quantum information science.
That John Schwarz poem I read earlier reminds me that I often used to write poems. I do it less often lately. Still, I feel that you are entitled to hear something that rhymes tonight. But I quickly noticed our field has many words that are quite hard to rhyme, like “chaos” and “dogma.” And perhaps the hardest of all: “Takayanagi.” So I decided to settle for some limericks — that’s easier for me than a full-fledged poem.
This first one captures how I felt when I first heard about AdS/CFT: excited but perplexed.
Spacetime is emergent they say.
But emergent in what sort of way?
It’s really quite cool,
The bulk has a dual!
I might understand that someday.
For a quantum information theorist, it was pleasing to learn later on that we can interpret the dictionary as an encoding map, such that the bulk degrees of freedom are protected when a portion of the boundary is erased.
Almheiri and Harlow and Dong
Said “you’re thinking about the map wrong.”
It’s really a code!
That’s the thing that they showed.
Should we have known that all along?
(It is easier to rhyme “Dong” than “Takayanagi”.) To see that connection one needed a good grasp of both AdS/CFT and quantum error-correcting codes. In 2014 few researchers knew both, but those guys did.
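To give a taste of what that encoding-map statement means in the simplest setting, here is a toy numerical check of my own (it assumes numpy; the three-qutrit code below is the standard textbook example used to make this point, not anything taken from these remarks): every single-qutrit reduced state is maximally mixed and independent of the logical state, so erasing any one qutrit destroys no encoded information.

import numpy as np

def ket(i, j, k):
    """Basis vector |i, j, k> of three qutrits, as a length-27 array."""
    v = np.zeros(27, dtype=complex)
    v[9 * i + 3 * j + k] = 1.0
    return v

# Logical basis of the three-qutrit code: one logical qutrit in three physical ones
logical = [
    (ket(0, 0, 0) + ket(1, 1, 1) + ket(2, 2, 2)) / np.sqrt(3),  # |0_L>
    (ket(0, 1, 2) + ket(1, 2, 0) + ket(2, 0, 1)) / np.sqrt(3),  # |1_L>
    (ket(0, 2, 1) + ket(1, 0, 2) + ket(2, 1, 0)) / np.sqrt(3),  # |2_L>
]

def single_qutrit_state(psi, keep):
    """Reduced density matrix of one qutrit (keep = 0, 1, or 2)."""
    t = np.moveaxis(psi.reshape(3, 3, 3), keep, 0).reshape(3, 9)
    return t @ t.conj().T

# Check every logical basis state plus a generic superposition
states = logical + [(logical[0] + 1j * logical[2]) / np.sqrt(2)]
for psi in states:
    for site in range(3):
        assert np.allclose(single_qutrit_state(psi, site), np.eye(3) / 3)
print("Each single qutrit is maximally mixed: losing one qutrit loses no logical info.")

The flip side of this check is that the logical qutrit can be recovered from any two of the three physical qutrits; that erasure-correction property is the toy version of what the holographic dictionary does for bulk degrees of freedom.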
For all our progress, we still don’t have a complete answer to a key question that inspired IFQ. What’s inside a black hole?
Information loss has been denied.
Locality’s been cast aside.
When the black hole is gone
What fell in’s been withdrawn.
I’d still like to know: what’s inside?
We’re also still lacking an alternative nonperturbative formulation of the bulk; we can only say it’s something that’s dual to the boundary. Until we can define both sides of the correspondence, the claim that two descriptions are equivalent, however inspiring, will remain unsatisfying.
Duality I can embrace.
Complexity, too, has its place.
That’s all a good show
But I still want to know:
What are the atoms of space?
The question, “What are the atoms of space?” is stolen from Joe Polchinski, who framed it to explain to a popular audience what we’re trying to answer. I miss Joe. He was a founding member of It from Qubit, an inspiring scientific leader, and still an inspiration for all of us today.
The IFQ Simons collaboration may fade away, but the quest that has engaged us these past 8 years goes on. IFQ is the continuation of a long struggle, which took on great urgency with Hawking’s formulation of the information loss puzzle nearly 50 years ago. Understanding quantum gravity and its implications is a huge challenge and a grand quest that humanity is obligated to pursue. And it’s fun and it’s exciting, and I sincerely believe that we’ve made remarkable progress in recent years, thanks in large part to you, the IFQ community. We are privileged to live at a time when truths about the nature of space and time are being unveiled. And we are privileged to be part of this community, with so many like-minded colleagues pulling in the same direction, sharing the joy of facing this challenge.
Where is it all going? Coming back to our pitch to the Simons Foundation in 2015, I was very struck by Juan’s presentation that day, and in particular his final slide. I liked it so much that I stole it and used it in my presentations for a while. Juan tried to explain what we’re doing by means of an analogy to biological science. How are the quantumists like the biologists?
Well, bulk quantum gravity is life. We all want to understand life. The boundary theory is chemistry, which underlies life. The quantum information theorists are chemists; they want to understand chemistry in detail. The quantum gravity theorists are biologists; they think chemistry is fine, if it can really help them to understand life. What we want is: molecular biology, the explanation for how life works in terms of the underlying chemistry. The black hole information problem is our fruit fly, the toy problem we need to solve before we’ll be ready to take on a much bigger challenge: finding the cure for cancer; that is, understanding the big bang.
How’s it going? We’ve made a lot of progress since 2015. We haven’t cured cancer. Not yet. But we’re having a lot of fun along the way there.
I’ll end with this hope, addressed especially to those who were not yet born when AdS/CFT was first proposed, or were still scampering around in your playpens. I’ll grant you a reprieve: you have another 8 years. By then: May you cure cancer!
So I propose this toast: To It from Qubit, to our colleagues and friends, to our quest, to curing cancer, to understanding the universe. I wish you all well. Cheers!
Mark Srednicki doesn’t look like a high priest. He’s a professor of physics at the University of California, Santa Barbara (UCSB); and you’ll sooner find him in khakis than in sacred vestments. Humor suits his round face better than channeling divine wrath would; and I’ve never heard him speak in tongues—although, when an idea excites him, his hands rise to shoulder height of their own accord, as though halfway toward a priestly blessing. Mark belongs less on a ziggurat than in front of a chalkboard. Nevertheless, he called himself a high priest.
Specifically, Mark jokingly called himself a high priest of the eigenstate thermalization hypothesis, a framework for understanding how quantum many-body systems thermalize internally. The eigenstate thermalization hypothesis has an unfortunate number of syllables, so I’ll call it the ETH. The ETH illuminates closed quantum many-body systems, such as a clump of ultracold atoms. The clump can begin in a pure product state $| \psi(0) \rangle$, then evolve under a chaotic1 Hamiltonian $H$. The time-$t$ state $| \psi(t) \rangle$ will remain pure; its von Neumann entropy will always vanish. Yet entropy grows according to the second law of thermodynamics. Breaking the second law amounts almost to enacting a miracle, according to physicists. Does the clump of atoms deserve consideration for sainthood?
No—although the clump’s state remains pure, a small subsystem’s state does not. A subsystem consists of, for example, a few atoms. They’ll entangle with the other atoms, which serve as an effective environment. The entanglement will mix the few atoms’ state, whose von Neumann entropy will grow.
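To make that concrete, here is a minimal numerical sketch of my own (it assumes numpy and scipy; the chain size, couplings, initial state, and two-spin subsystem are arbitrary illustrative choices, not anything specified in this post). It evolves a pure product state of a small spin chain under a nonintegrable Hamiltonian and watches a two-spin subsystem’s von Neumann entropy grow while the whole clump’s entropy stays at zero.

import numpy as np
from scipy.linalg import expm

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def op_on_site(op, site, n):
    """Tensor op acting on `site`, with identities on the other n-1 spins."""
    factors = [I2] * n
    factors[site] = op
    out = factors[0]
    for f in factors[1:]:
        out = np.kron(out, f)
    return out

n = 8  # a small "clump" of spins, standing in for the atoms
# Mixed-field Ising chain: chaotic (nonintegrable) for generic field values
H = sum(op_on_site(Z, j, n) @ op_on_site(Z, j + 1, n) for j in range(n - 1))
H = H + sum(0.9045 * op_on_site(X, j, n) + 0.8090 * op_on_site(Z, j, n) for j in range(n))

# Pure product (Neel-like) initial state |01010101>
up, down = np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)
psi0 = up
for j in range(1, n):
    psi0 = np.kron(psi0, up if j % 2 == 0 else down)

def von_neumann_entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

for t in [0.0, 0.5, 1.0, 2.0, 4.0]:
    psi_t = expm(-1j * H * t) @ psi0
    rho_full = np.outer(psi_t, psi_t.conj())            # global state: stays pure
    block = psi_t.reshape(4, 2 ** (n - 2))              # first two spins vs. the rest
    rho_sub = block @ block.conj().T                    # trace out the "environment"
    print(f"t={t:3.1f}  S(whole clump)={von_neumann_entropy(rho_full):.3f}"
          f"  S(two spins)={von_neumann_entropy(rho_sub):.3f}")

The whole clump’s entropy stays at (numerically) zero, while the two-spin entropy climbs as those spins entangle with the rest of the clump.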
The ETH predicts this growth. The ETH is an ansatz about $H$ and an operator $O$—say, an observable of the few-atom subsystem. We can represent $O$ as a matrix relative to the energy eigenbasis. The matrix elements have a certain structure, if $H$ and $O$ satisfy the ETH. Suppose that the operators do and that $H$ lacks degeneracies—that no two energy eigenvalues equal each other. We can prove that $O$ thermalizes: Imagine measuring the expectation value $\langle \psi(t) | O | \psi(t) \rangle$ at each of many instants $t$. Averaging over instants produces the time-averaged expectation value $\overline{ \langle O \rangle }$.
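For readers who want that structure spelled out (this post leaves it implicit), the ansatz is usually written as follows in the ETH literature; the notation here is my gloss, not the post’s:

$$ O_{mn} \;=\; O(\bar{E}) \, \delta_{mn} \;+\; e^{-S(\bar{E})/2} \, f(\bar{E}, \omega) \, R_{mn}, \qquad \bar{E} := \frac{E_m + E_n}{2}, \quad \omega := E_m - E_n, $$

where $O_{mn} := \langle m | O | n \rangle$ in the energy eigenbasis $H | m \rangle = E_m | m \rangle$, the functions $O(\bar{E})$ and $f(\bar{E}, \omega)$ vary smoothly, $S(\bar{E})$ denotes the thermodynamic entropy at energy $\bar{E}$, and the $R_{mn}$ are erratic numbers of order one. The exponential suppression of the off-diagonal elements is what lets time averages settle down to thermal values.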
Another average is the thermal average—the expectation value of $O$ in the appropriate thermal state. If $H$ conserves just itself,2 the appropriate thermal state is the canonical state, $\rho_{\rm can} := e^{-\beta H} / Z$. The average energy $\langle \psi(0) | H | \psi(0) \rangle$ defines the inverse temperature $\beta$, and the partition function $Z := {\rm Tr} \big( e^{-\beta H} \big)$ normalizes the state. Hence the thermal average is $\langle O \rangle_{\rm th} := {\rm Tr} ( O \rho_{\rm can} )$.
The time average approximately equals the thermal average, according to the ETH: $\overline{ \langle O \rangle } \approx \langle O \rangle_{\rm th}$. The correction is small in the total number of atoms. Through the lens of $O$, the atoms thermalize internally. Local observables tend to satisfy the ETH, and we can easily observe only local observables. We therefore usually observe thermalization, consistently with the second law of thermodynamics.
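And here is a rough check of that prediction, continuing the spin-chain sketch a few paragraphs up (it reuses H, psi0, op_on_site, X, and n defined there, and it also implements the $\beta$-matching described in the previous paragraph). The observable and the root-finding bracket are my own arbitrary choices.

import numpy as np
from scipy.optimize import brentq

evals, evecs = np.linalg.eigh(H)            # exact diagonalization of the chain above
c = evecs.conj().T @ psi0                   # overlaps <m|psi(0)>
O = op_on_site(X, n // 2, n)                # a local observable: X on the middle spin
O_diag = np.real(np.diag(evecs.conj().T @ O @ evecs))   # diagonal matrix elements O_mm

# Time average: absent degeneracies, the off-diagonal terms dephase away,
# leaving the "diagonal ensemble" value sum_m |c_m|^2 O_mm.
time_avg = float(np.sum(np.abs(c) ** 2 * O_diag))

# Thermal average: choose beta so the canonical energy matches <psi(0)|H|psi(0)>.
E0 = float(np.real(psi0.conj() @ H @ psi0))

def canonical_energy(beta):
    w = np.exp(-beta * (evals - evals.min()))   # shift the spectrum for numerical stability
    return float(np.sum(w * evals) / np.sum(w))

beta = brentq(lambda b: canonical_energy(b) - E0, -5.0, 5.0)   # bracket chosen by hand
w = np.exp(-beta * (evals - evals.min()))
thermal_avg = float(np.sum(w * O_diag) / np.sum(w))

print(f"time average    = {time_avg:.4f}")
print(f"thermal average = {thermal_avg:.4f}   (beta = {beta:.3f})")

For a chain this small, the two numbers agree only roughly; the ETH correction shrinks as the number of atoms grows.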
I agree that Mark Srednicki deserves the title high priest of the ETH. He and Joshua Deutsch independently dreamed up the ETH, in 1994 and 1991, respectively. Since numericists reexamined it in 2008, studies and applications of the ETH have exploded like a desert religion. Yet Mark had never encountered the question I posed about it in 2021. Next month’s blog post will share the good news about that question.
1Nonintegrable.
2Apart from trivial quantities, such as projectors onto eigenspaces of $H$.