So wrote Philip Pullman, author of The Golden Compass and its sequels. In the series, a girl wanders from the Oxford in another world to the Oxford in ours.
I’ve been honored to wander Oxford this fall. Visiting Oscar Dahlsten and Jon Barrett, I’ve been moonlighting in Vlatko Vedral’s QI group. We’re interweaving 21st-century knowledge about electrons and information with a Victorian fixation on energy and engines. This research program, quantum thermodynamics, should open a window onto our world.
To study our world from another angle, Oxford researchers are jostling the unreal. Oscar, Jon, Andrew Garner, and others are studying generalized probabilistic theories, or GPTs.
What’s a specific probabilistic theory, let alone a generalized one? In everyday, classical contexts, probabilities combine according to rules you know. Suppose you have a 90% chance of arriving at London’s Heathrow Airport at 7:30 AM next Sunday. Suppose that, if you arrive in Heathrow at 7:30 AM, you’ll have a 70% chance of catching the 8:05 AM bus to Oxford. You have a probability 0.9 * 0.7 = 0.63 of arriving in Heathrow at 7:30 and catching the 8:05 bus. Why 0.9 * 0.7? Why not 0.9^0.7, or 0.9/(2 * 0.7)? How might probabilities combine, GPT researchers ask, and why do they combine as they do?
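For the concretely minded, the classical rule above is just multiplication of a probability by a conditional probability. A minimal sketch, using the bus example’s numbers:

```python
# Classical chain rule: P(A and B) = P(A) * P(B given A).
# Numbers from the Heathrow-and-bus example.
p_arrive = 0.9            # chance of landing at Heathrow by 7:30 AM
p_bus_given_arrive = 0.7  # chance of catching the 8:05 bus, given arrival

p_both = p_arrive * p_bus_given_arrive
print(round(p_both, 2))  # 0.63
```

Any other combination rule, like 0.9/(2 * 0.7), would fail to describe everyday statistics; GPT researchers ask why.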
Not that, in GPTs, probabilities combine as in 0.9/(2 * 0.7). Consider that 0.9/(2 * 0.7) plucked from a daydream inspired by this City of Dreaming Spires. But probabilities do combine in ways we wouldn’t expect. By entangling two particles, separating them, and measuring one, you immediately change the probability that a measurement of Particle 2 yields some outcome. John Bell explored, and experimentalists have checked, statistics generated by entanglement. These statistics disobey the rules that govern Heathrow-and-bus statistics. Like entanglement statistics, the statistics of quantum phenomena such as discord, negative Wigner functions, and weak measurements defy classical expectations. Quantum theory and its contrast with classicality force us to reconsider probability.
Quantum theory, as the saying goes, is weird. “Old hat,” followers of this blog should scoff. “I knew that as a wee lass/lad in junior school.” Point taken, my well-informed (apparently northern-British) friends. But quantum theory could be weirder. As two of many GPTs, quantum and classical theories remind me of the patterns formed by bricks in Keble College. Keble is an Oxford institution near my office. Across the college’s mostly-red brick walls, yellow bricks zigzag, white bricks weave lattices, and black bricks polka-dot the white. Like quantum and classical theories, Keble’s brickwork is unique. It characterizes our world. But the polka-dots could have been mauve.
Rival GPTs include Boxworld, which encodes correlations stronger than entanglement. Boxworld’s name derives from the square that illustrates the phenomena it permits. Just as biologists illustrate evolution—who descended from whom, what has a spine, and so on—with a phylogenetic tree, GPT researchers illustrate probabilistic phenomena with shapes. In classical theory, phenomena form triangular pyramids; in quantum theory, spheres. Like geometry, the Second Law of Thermodynamics (information’s tendency to decay) might distinguish classical and quantum theories from their cousins. No Signaling (information’s inability to outrace light) unites the family.
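To make “stronger than entanglement” concrete, here is a sketch (an illustration, not from the discussion above) of the Popescu–Rohrlich box, Boxworld’s signature correlation. Its CHSH value hits the algebraic maximum of 4, beyond the classical bound of 2 and the quantum (Tsirelson) bound of 2√2:

```python
import math

# A Popescu-Rohrlich (PR) box: on inputs x, y in {0, 1}, the outputs
# a, b satisfy a XOR b = x AND y, each output pair equally likely.
def correlator(x, y):
    # E(x, y) = P(a = b) - P(a != b); outputs agree unless x = y = 1
    return -1 if (x and y) else 1

# CHSH combination: S = E(0,0) + E(0,1) + E(1,0) - E(1,1)
S = correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1)

print(S)                 # 4: the algebraic maximum, reached in Boxworld
print(2)                 # classical (local) bound
print(2 * math.sqrt(2))  # quantum (Tsirelson) bound, about 2.83
```

Boxworld correlations respect No Signaling yet overshoot what quantum theory allows, which is why they serve as a foil for our world.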
Why don’t cousins describe superconductors and the likelihood of your catching a bus? I don’t know. Other researchers have learned how probabilities could and do behave. I haven’t puzzled out even mundane manifestations of “could” and “do”—how palm trees crown Pasadena while spires cap England; how jeans dominate California while academic gowns crowd Oxford; and how I walk home, after a day crammed with quantum probabilities, on bricks older than my country.
Windows into other worlds, indeed.
With gratitude to Jon Barrett and Oscar Dahlsten for their hospitality and their explanations of GPTs; to Vlatko Vedral and his group for sharing their offices and good cheer; and to Andrew Garner for insights into GPTs and more.