
About Nicole Yunger Halpern

I’m a theoretical physicist at the Joint Center for Quantum Information and Computer Science in Maryland. My research group re-envisions 19th-century thermodynamics for the 21st century, using the mathematical toolkit of quantum information theory. We then apply quantum thermodynamics as a lens through which to view the rest of science. I call this research “quantum steampunk,” after the steampunk genre of art and literature that juxtaposes Victorian settings (à la thermodynamics) with futuristic technologies (à la quantum information). For more information, check out my book for the general public, Quantum Steampunk: The Physics of Yesterday’s Tomorrow. I earned my PhD at Caltech under John Preskill’s auspices; one of my life goals is to be the subject of one of his famous (if not Pulitzer-worthy) poems. Follow me on Twitter @nicoleyh11.

Clocking in at a Cambridge conference

Science evolves on Facebook.

On Facebook last fall, I posted about statistical mechanics. Statistical mechanics is the physics of hordes of particles. Hordes of molecules, for example, form the stench seeping from a clogged toilet. Hordes change in certain ways but not in the reverse ways, suggesting time points in a direction. Once a stink diffuses into the hall, it won’t regroup in the bathroom. The molecules’ locations distinguish past from future.

The post attracted a comment by Ian Durham, associate professor of physics at St. Anselm College. Minutes later, we were instant-messaging about infinitely long evolutions.* The next day, I sent Ian a paper draft. His reply made me jump more than a whiff of a toilet would. Would I discuss the paper at a conference he was co-organizing?

I almost replied, Are you sure?

Then I almost replied, Yes, please!

The conference, “Eddington and Wheeler: Information and Interaction,” unfolded this March at the University of Cambridge. Cambridge employed Sir Arthur Eddington, the astronomer whose 1919 observation of starlight during an eclipse catapulted Einstein’s general relativity to fame. Decades later, John Wheeler laid groundwork for quantum information. Though aware of Eddington’s observation, I hadn’t known he’d researched stat mech. I hadn’t known his opinions about time. Time owns a high-rise in my heart; see the fussiness with which I catalogue “last fall,” “minutes later,” and “the next day.” Conference-goers shared news about time in the Old Combination Room at Cambridge’s Trinity College. Against the room’s wig-filled portraits, our projector resembled a souvenir misplaced by a time traveler.


Trinity College, Cambridge.

Presenter one, Huw Price, argued that time has no arrow. It appears to in our universe: We remember the past and anticipate the future. Once a stench diffuses, it doesn’t regroup. The stench illustrates the Second Law of Thermodynamics, the assumption that entropy increases.

If “entropy” doesn’t ring a bell, never mind; we’ll dissect it in future articles. Suffice it to say that (1) thermodynamics is a branch of physics related to stat mech; (2) according to the Second Law of Thermodynamics, something called “entropy” increases; (3) entropy’s rise distinguishes the past from the future by associating the former with a low entropy and the latter with a large entropy; and (4) a stench’s diffusion illustrates the Second Law and time’s flow.

For every universe in which entropy increases (in which time flows in one direction), there exists a universe in which entropy decreases (in which time flows oppositely). So, said Huw Price, postulated the 19th-century stat-mech founder Ludwig Boltzmann. Why would universes pair up? For the same reason that, driving across a pothole, you not only fall, but also rise. Each fluctuation from equilibrium—from a flat road—involves an upward path and a downward. The upward path resembles a universe in which entropy increases; the downward, a universe in which entropy decreases. Every down pairs with an up. Averaged over universes, time has no arrow.

Friedel Weinert, presenter five, argued the opposite. Time has an arrow, he said, and not because of entropy.

Ariel Caticha discussed an impersonator of time. Using a cousin of MaxEnt, he derived an equation identical to Schrödinger’s. MaxEnt, short for “the Maximum Entropy Principle,” is a tool used in stat mech. Schrödinger’s Equation describes how quantum systems evolve. To draw from Schrödinger’s Equation predictions about electrons and atoms, physicists assume that features of reality resemble certain bits of math. We assume, for example, that the t in Schrödinger’s Equation represents time. A t appeared in Ariel’s twin of Schrödinger’s Equation. But Ariel didn’t assume what physicists usually assume. MaxEnt motivated his assumptions. Interpreting Ariel’s equation poses a challenge. If a variable acts like time and smells like time, does it represent time?**


A presenter uses the anachronistic projector. The head between screen and camera belongs to David Finkelstein, who helped develop the theory of general relativity checked by Eddington.

Like Ariel, Bill Wootters questioned time’s role in arguments. The co-creator of quantum teleportation wondered why one tenet of quantum physics has the form it has. Using quantum mechanics, we can’t predict certain experiments’ outcomes. We can predict probabilities—the chance that some experiment will yield Possible Outcome 1, the chance that the experiment will yield Possible Outcome 2, and so on. To calculate these probabilities, we square numbers. Why square? Why don’t the probabilities depend on cubes?
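The squaring works like this: a quantum state assigns each outcome a complex amplitude, and the outcome’s probability is the amplitude’s squared magnitude. A minimal sketch, using a made-up qubit state (the numbers are illustrative, not from any experiment):

```python
import math

# Born rule: each outcome's probability is the squared magnitude of its
# amplitude. The state below is invented for illustration.
amplitudes = [1 / math.sqrt(3), complex(0, math.sqrt(2 / 3))]

probabilities = [abs(a) ** 2 for a in amplitudes]

# Squared magnitudes of a normalized state sum to 1, as probabilities must.
assert abs(sum(probabilities) - 1) < 1e-12
print(probabilities)
```

Bill’s question was why nature uses the exponent 2 rather than some other rule.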

To explore this question, Bill told a story. Suppose some experimenter runs these experiments on Monday and those on Tuesday. When evaluating his story, Bill pointed out a hole: Replacing “Monday” and “Tuesday” with “eight o’clock” and “nine” wouldn’t change his conclusion. Which replacements wouldn’t change it, and which would? To what can we generalize those days? We couldn’t answer his questions on the Sunday he asked them.

Little of presentation twelve concerned time. Rüdiger Schack introduced QBism, an interpretation of quantum mechanics that sounds like “cubism.” Casting quantum physics in terms of experimenters’ actions, Rüdiger mentioned time. By the time of the mention, I couldn’t tell what anyone meant by “time.” Raising a hand, I asked for clarification.

“You are young,” Rüdiger said. “But you will grow old and die.”

The comment clanged like the slam of a door. It echoed when I followed Ian into Ascension Parish Burial Ground. On Cambridge’s outskirts, conference-goers visited Eddington’s headstone. We found Wittgenstein’s near an uneven footpath; near tangles of undergrowth, Nobel laureates’. After debating about time, we marked its footprints. Paths of glory lead but to the grave.


Here lies one whose name was writ in a conference title: Sir Arthur Eddington’s grave.

Paths touched by little glory, I learned, have perks. As Rüdiger noted, I was the greenest participant. As he had the manners not to note, I was the least distinguished and the most ignorant. Studenthood freed me to raise my hand, to request clarification, to lack opinions about time. Perhaps I’ll evolve opinions at some t, some Monday down the road. That Monday feels infinitely far off. These days, I’ll stick to evolving science—using that other boon of youth, Facebook.

*You know you’re a theoretical physicist (or a physicist-in-training) when you debate about processes that last till kingdom come.

** As long as the variable doesn’t smell like a clogged toilet.

For videos of the presentations—including the public lecture by best-selling author Neal Stephenson—stay tuned to http://informationandinteraction.wordpress.com. My presentation appears here.

With gratitude to Ian Durham and Dean Rickles for organizing “Information and Interaction” and for the opportunity to participate. With thanks to the other participants for sharing their ideas and time.

Tsar Nikita and His Scientists

Once upon a time, a Russian tsar named Nikita had forty daughters:

                Every one from top to toe
                Was a captivating creature,
                Perfect—but for one lost feature.

 
So wrote Alexander Pushkin, the 19th-century Shakespeare who revolutionized Russian literature. In a rhyme, Pushkin imagined forty princesses born without “that bit” “[b]etween their legs.” A courier scours the countryside for a witch who can help. By summoning the devil in the woods, she conjures what the princesses lack into a casket. The tsar parcels out the casket’s contents, and everyone rejoices.

“[N]onsense,” Pushkin calls the tale in its penultimate line. A “joke.”

The joke has, nearly two centuries later, become reality. Researchers have grown vaginas in a lab and implanted them into teenage girls. Thanks to a genetic defect, the girls suffered from Mayer-Rokitansky-Küster-Hauser (MRKH) syndrome: Their vaginas and uteruses had failed to grow to maturity or at all. A team at Wake Forest and in Mexico City took samples of the girls’ cells, grew more cells, and combined their harvest with vagina-shaped scaffolds. Early in the 2000s, surgeons implanted the artificial organs into the girls. The patients, the researchers reported in the journal The Lancet last week, function normally.

I don’t usually write about reproductive machinery. But the implants’ resonance with “Tsar Nikita” floored me. Scientists have implanted much of Pushkin’s plot into labs. The sexually deficient girls, the craftsperson, the replacement organs—all appear in “Tsar Nikita” as in The Lancet. In poetry as in science fiction, we read the future.

Though threads of Pushkin’s plot survive, society’s view of the specialist has progressed. “Deep [in] the dark woods” lives Pushkin’s witch. Upon summoning the devil, she locks her cure in a casket. Today’s vagina-implanters star in headlines. The Wall Street Journal highlighted the implants in its front section. Unless the patients’ health degrades, the researchers will likely list last week’s paper high on their CVs and websites.


Much as Dr. Atlántida Raya-Rivera, the paper’s lead author, differs from Pushkin’s witch, the visage of Pushkin’s magic wears the nose and eyebrows of science. When tsars or millennials need medical help, they seek knowledge-keepers: specialists, a fringe of society. Before summoning the devil, the witch “[l]ocked her door . . . Three days passed.” I hide away to calculate and study (though days alone might render me more like the protagonist in another Russian story, Chekhov’s “The Bet”). Just as the witch “stocked up coal,” some students stockpile Red Bull before hitting the library. Some habits, like the archetype of the wise woman, refuse to die.

From a Russian rhyme, the bones of “Tsar Nikita” have evolved into cutting-edge science. Pushkin and the implants highlight how attitudes toward knowledge have changed, offering a lens onto science in culture and onto science culture. No wonder readers call Pushkin “timeless.”

But what would he have rhymed with “Mayer-Rokitansky-Küster-Hauser”?

 

 

 

“Tsar Nikita” has many nuances—messages about censorship, for example—that I didn’t discuss. To the intrigued, I recommend The Queen of Spades: And selected works, translated by Anthony Briggs and published by Pushkin Press.

 

Oh, the Places You’ll Do Theoretical Physics!

I won’t run lab tests in a box.
I won’t run lab tests with a fox.
But I’ll prove theorems here or there.
Yes, I’ll prove theorems anywhere…

Physicists occupy two camps. Some—theorists—model the world using math. We try to predict experiments’ outcomes and to explain natural phenomena. Others—experimentalists—gather data using supermagnets, superconductors, the world’s coldest atoms, and other instruments deserving of superlatives. Experimentalists confirm that our theories deserve trashing or—for this we pray—might not model the world inaccurately.

Theorists, people say, can work anywhere. We need no million-dollar freezers. We need no multi-pound magnets.* We need paper, pencils, computers, and coffee. Though I would add “quiet,” colleagues would add “iPods.”

Theorists’ mobility reminds me of the book Green Eggs and Ham. Sam-I-am, the antagonist, drags the protagonist to spots as outlandish as our workplaces. Today marks the author’s birthday. Since Theodor Geisel stimulated imaginations, and since imagination drives physics, Quantum Frontiers is paying its respects. In honor of Oh, the Places You’ll Go!, I’m spotlighting places you can do theoretical physics. You judge whose appetite for exotica exceeds whose: Dr. Seuss’s or theorists’.


I’ve most looked out-of-place doing physics by a dirt road between sheep-populated meadows outside Lancaster, UK. Lancaster, the Wars of the Roses victor, is a city in northern England. The year after graduating from college, I worked at Lancaster University as a research assistant. I studied a crystal that resembles graphene, a material whose superlatives include “superstrong,” “supercapacitor,” and “superconductor.” From morning to evening, I’d submerge myself in math till it poured out my ears. Then I’d trek from “uni,” as Brits say, to the “city centre,” as they write.

The trek wound between trees; fields; and, because I was in England, puddles. Many evenings, a rose or a sunset would arrest me. Other evenings, physics would. I’d realize how to solve an equation, or that I should quit banging my head against one. Stepping off the road, I’d fish out a notebook and write. Amidst the puddles and lambs. Cyclists must have thought me the queerest sight since a cloudless sky.

A colleague loves doing theory in the sky. On planes, he explained, hardly anyone interrupts his calculations. And who minds interruptions by pretzels and coffee?

“A mathematician is a device for turning coffee into theorems,” some have said, and theoretical physicists live down the block from mathematicians in the neighborhood of science. Turn a Pasadena café upside-down and shake it, and out will fall theorists. Since Hemingway’s day, the romanticism has faded from the penning of novels in cafés. But many a theorist trumpets about an equation derived on a napkin.

Trumpeting filled my workplace in Oxford. One of Clarendon Lab’s few theorists, I neighbored lasers, circuits, and signs that read “DANGER! RADIATION.” Though radiation didn’t leak through our walls (I hope), what did leak contributed more to that office’s eccentricity than radiation would. As early as 9:10 AM, the experimentalists next door blasted “Born to Be Wild” and Animal House tunes. If you can concentrate over there, you can concentrate anywhere.

One paper I concentrated on had a Crumple-Horn Web-Footed Green-Bearded Schlottz of an acknowledgements section. In a physics paper’s last paragraph, one thanks funding agencies and colleagues for support and advice. “The authors would like to thank So-and-So for insightful comments,” papers read. This paper referenced a workplace: “[One coauthor] is grateful to the Half Moon Pub.” Colleagues of the coauthor confirmed the acknowledgement’s aptness.

Though I’ve dwelled on theorists’ physical locations, our minds roost elsewhere. Some loiter in atoms; others, in black holes; some, on four-dimensional surfaces; others, in hypothetical universes. I hobnob with particles in boxes. As Dr. Seuss whisks us to a Bazzim populated by Nazzim, theorists tell of function spaces populated by Rényi entropies.

The next time you see someone standing in a puddle, or in a ditch, or outside Buckingham Palace, scribbling equations, feel free to laugh. You might be seeing a theoretical physicist. You might be seeing me. To me, physics has relevance everywhere. Scribbling there and here should raise eyebrows no more than any setting in a Dr. Seuss book.

The author would like to thank this emporium of Seussoria. And Java & Co.

*We need for them to confirm that our theories deserve trashing, but we don’t need them with us. Just as, when considering quitting school to break into the movie business, you need for your mother to ask, “Are you sure that’s a good idea, dear?” but you don’t need for her to hang on your elbow. Except experimentalists don’t say “dear” when crushing theorists’ dreams.

Guns versus butter in quantum information

From my college’s computer-science club, I received a T-shirt that reads:

while(not_dead){

sleep--;

time--;

awesome++;

}

/*There’s a reason we can’t hang out with you…*/

The message is written in Java, a programming language. Even if you’ve never programmed, you likely catch the drift: CS majors are the bees’ knees because, at the expense of sleep and social lives, they code. I disagree with part of said drift: CS majors hung out with me despite being awesome.


The rest of the drift—you have to give some to get some—synopsizes the physics I encountered this fall. To understand tradeoffs, you needn’t study QI. But what trades off with what, according to QI, can surprise us.

The T-shirt haunted me at the University of Nottingham, where researchers are blending QI with Einstein’s theory of relativity. Relativity describes accelerations, gravity, and space-time’s curvature. In other sources, you can read about physicists’ attempts to unify relativity and quantum mechanics, the Romeo and Tybalt of modern physics, into a theory of quantum gravity. In this article, relativity tangos with quantum mechanics in relativistic quantum information (RQI). If I move my quantum computer, RQIers ask, how do I change its information processing? How does space-time’s curvature affect computation? How can motion affect measurements?

Answers to these questions involve tradeoffs.


Nottingham researchers kindly tolerating a seminar by me

For example, acceleration entangles particles. Decades ago, physicists learned that acceleration creates particles. Say you’re gazing into a vacuum—not empty space, but nearly empty space, the lowest-energy system that can exist. Zooming away on a rocket, I accelerate relative to you. From my perspective, more particles than you think—and higher-energy particles—surround us.

Have I created matter? Have I violated the Principle of Conservation of Energy (and Mass)? I created particles in a sense, but at the expense of rocket fuel. You have to give some to get some:

Fuel--;
Particles++;
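The particle creation described above comes with a temperature: the standard Unruh formula (a detail not spelled out in this post) assigns the accelerating observer a thermal bath at T = ħa / (2πc k_B). A sketch of the numbers, using SI constants:

```python
import math

# Unruh temperature: the thermal bath an observer with uniform
# acceleration sees in the vacuum. Constants in SI units.
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 299792458.0         # speed of light, m/s
k_B = 1.380649e-23      # Boltzmann constant, J/K

def unruh_temperature(acceleration):
    """Temperature, in kelvin, of the vacuum seen at this acceleration (m/s^2)."""
    return hbar * acceleration / (2 * math.pi * c * k_B)

# Even an enormous acceleration of 10^20 m/s^2 warms the vacuum to
# under half a kelvin. Hence the rocket fuel the effect costs.
print(unruh_temperature(1e20))
```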

The math that describes my particles relates to the math that describes entanglement.* Entanglement is a relationship between quantum systems. Say you entangle two particles, then separate them. If you measure one, you instantaneously affect the other, even if the other occupies another city.

Say we encode information in quantum particles stored in a box.** Just as you encode messages by writing letters, we write messages in the ink of quantum particles. Say the box zooms off on a rocket. Just as acceleration led me to see particles in a vacuum, acceleration entangles the particles in our box. Since entanglement facilitates computation, you can process information by shaking a box. And performing another few steps.

When an RQIer told me so, she might as well have added that space-time has 106 dimensions and the US would win the World Cup. Then my T-shirt came to mind. To get some, you have to give some. When you give something, you might get something. Giving fuel gets you entanglement. To prove that statement, I need to do and interpret math. Till I have time to,

Fuel--;
Entanglement++;

offers intuition.

After cropping up in Nottingham, my T-shirt reared its head (collar?) in physics problem after physics problem. By “consuming entanglement”—forfeiting that ability to affect the particle in another city—you can teleport quantum information.

Entanglement--;
Quantum teleportation++;

My research involves tradeoffs between information and energy. As the Hungarian physicist Leó Szilárd showed, you can exchange information for work. Say you learn which half of a box*** a particle occupies, and you trap the particle in that half. Upon freeing the particle—forfeiting your knowledge about its location—you can lift a weight, charge a battery, or otherwise store energy.

Information--;
Energy++;

If you expend energy, Rolf Landauer showed, you can gain knowledge.

Energy--;
Information++;
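The exchange rate in both trades is the same: at temperature T, one bit is worth k_B T ln 2 of energy, the bound Landauer formalized. A minimal sketch of the arithmetic (the function name is mine):

```python
import math

# Landauer/Szilard exchange rate: one bit of information trades for
# k_B * T * ln(2) joules of energy at temperature T.
k_B = 1.380649e-23  # Boltzmann constant, J/K

def energy_per_bit(temperature_kelvin):
    """Energy, in joules, that one bit buys (or costs) at this temperature."""
    return k_B * temperature_kelvin * math.log(2)

# At room temperature, a bit is worth roughly 3e-21 joules: tiny per
# bit, but modern computers erase a great many bits.
print(energy_per_bit(300))
```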

No wonder my computer-science friends joked about sleep deprivation. But information can energize. For fuel, I forage in the blending of fields like QI and relativity, and in physical intuitions like those encapsulated in the pseudo-Java above. Much as Szilárd’s physics enchants me, I’m glad that the pursuit of physics contradicts his conclusion:

while(not_dead){

Information++;

Energy++;

}

The code includes awesome++ implicitly.

*Bogoliubov transformations, to readers familiar with the term.

**In the fields in a cavity, to readers familiar with the terms.

***Physicists adore boxes, you might have noticed.

With thanks to Ivette Fuentes and the University of Nottingham for their hospitality and for their introduction to RQI.

Of sensors and science students

Click click.

Once the clasps unfastened, the tubular black case opened like a yard-long mussel. It might have held a bazooka, a collapsible pole tent, or enough shellfish for three plates of paella.

“This,” said Rob Young, “is the most efficient detector in the world” for certain types of light.

Jostling the unreal in Oxford

Oxford, where the real and the unreal jostle in the streets, where windows open into other worlds…

So wrote Philip Pullman, author of The Golden Compass and its sequels. In the series, a girl wanders from the Oxford in another world to the Oxford in ours.

I’ve been honored to wander Oxford this fall. Visiting Oscar Dahlsten and Jon Barrett, I’ve been moonlighting in Vlatko Vedral’s QI group. We’re interweaving 21st-century knowledge about electrons and information with a Victorian fixation on energy and engines. This research program, quantum thermodynamics, should open a window onto our world.

Radcliffe camera

A new world. At least, a world new to the author.

To study our world from another angle, Oxford researchers are jostling the unreal. Oscar, Jon, Andrew Garner, and others are studying generalized probabilistic theories, or GPTs.

What’s a specific probabilistic theory, let alone a generalized one? In everyday, classical contexts, probabilities combine according to rules you know. Suppose you have a 90% chance of arriving in London-Heathrow Airport at 7:30 AM next Sunday. Suppose that, if you arrive in Heathrow at 7:30 AM, you’ll have a 70% chance of catching the 8:05 AM bus to Oxford. You have a probability 0.9 * 0.7 = 0.63 of arriving in Heathrow at 7:30 and catching the 8:05 bus. Why 0.9 * 0.7? Why not 0.9^0.7, or 0.9/(2 * 0.7)? How might probabilities combine, GPT researchers ask, and why do they combine as they do?

Not that, in GPTs, probabilities combine as in 0.9/(2 * 0.7). Consider the 0.9/(2 * 0.7) plucked from a daydream inspired by this City of Dreaming Spires. But probabilities do combine in ways we wouldn’t expect. By entangling two particles, separating them, and measuring one, you immediately change the probability that a measurement of Particle 2 yields some outcome. John Bell explored, and experimentalists have checked, statistics generated by entanglement. These statistics disobey rules that govern Heathrow-and-bus statistics. As do entanglement statistics, so do effects of quantum phenomena like discord, negative Wigner functions, and weak measurements. Quantum theory and its contrast with classicality force us to reconsider probability.
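The statistics Bell explored can be sketched with the standard CHSH test (my choice of example; the post does not specify one). Classically, the combination S below cannot exceed 2 in magnitude; for two particles in the entangled singlet state, whose correlations obey E(x, y) = -cos(x - y), it reaches 2√2:

```python
import math

# CHSH test: correlations E(x, y) between measurements at angles x and y.
# For the entangled singlet state, E(x, y) = -cos(x - y).
def E(x, y):
    return -math.cos(x - y)

a, a2 = 0.0, math.pi / 2           # two measurement settings for particle 1
b, b2 = math.pi / 4, -math.pi / 4  # two settings for particle 2

# Any classical (local hidden variable) theory keeps |S| <= 2.
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(abs(S))  # 2 * sqrt(2), about 2.83: entanglement breaks the bound
```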

Polarizer: Rise of the Efficiency

How should a visitor to Zürich spend her weekend?

Launch this question at a Swiss lunchtable, and you split diners into two camps. To take advantage of Zürich, some say, visit Geneva, Lucerne, or another spot outside Zürich. Other locals suggest museums, the lake, and the 19th-century ETH building.


The 19th-century ETH building

ETH, short for a German name I’ve never pronounced, is the polytechnic from which Einstein graduated. The polytechnic houses a quantum-information (QI) theory group that’s pioneering ideas I’ve blogged about: single-shot information, epsilonification, and small-scale thermodynamics. While visiting the group this August, I triggered an avalanche of tourism advice. Caught between two camps, I chose Option Three: Contemplate polar codes.

Polar codes compress information into the smallest space possible. Imagine you write a message (say, a Zürich travel guide) and want to encode it in the fewest possible symbols (so it fits in my camera bag). The longer the message, the fewer encoding symbols you need per encoded symbol: The more punch each code letter can pack. As the message grows, the encoding-to-encoded ratio decreases. The lowest possible ratio is a number, represented by H, called the Shannon entropy.

So established Claude E. Shannon in 1948. But Shannon didn’t know how to code at efficiency H. Not for 51 years did we know.
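Shannon’s H can be computed directly from symbol frequencies. A sketch for toy messages (assuming, as the post does later, independently drawn characters):

```python
import math
from collections import Counter

# Shannon entropy H: the best achievable bits-per-symbol for a source
# whose symbols appear independently with these frequencies.
def shannon_entropy(message):
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("abab"))  # 1.0: a fair coin's worth per symbol
print(shannon_entropy("aaab"))  # about 0.81: skewed sources compress further
```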

I learned how, just before that weekend. ETH student David Sutter walked me through polar codes as though down Zürich’s Bahnhofstrasse.


The Bahnhofstrasse, one of Zürich’s trendiest streets, early on a Sunday.

Say you’re encoding n copies of a random variable. When I say, “random variable,” think, “character in the travel guide.” Just as each character is one of 26 letters, each variable has one of many possible values.

Suppose the variables are independent and identically distributed. Even if you know some variables’ values, you can’t guess others’. Cryptoquote players might object that we can infer unknown from known letters. For example, a three-letter word that begins with “th” likely ends with “e.” But our message lacks patterns.

Think of the variables as diners at my lunchtable. Asking how to fill a weekend in Zürich—splitting the diners—I resembled the polarizer.

The polarizer is a mathematical object that sounds like an Arnold Schwarzenegger film and acts on the variables. Just as some diners pointed me outside Zürich, the polarizer gives some variables one property. Just as other diners pointed me to spots within Zürich, the polarizer gives other variables another property. Just as I pointed myself at polar codes, the polarizer gives the remaining variables a third property.

These properties involve entropy. Entropy quantifies uncertainty about a variable’s value—about which of the 26 letters a character represents. Even if you know the early variables’ values, you can’t guess the later variables’. But we can guess some polarized variables’ values. Call the first polarized variable u1, the second u2, etc. If we can guess the value of some ui, that ui has low entropy. If we can’t guess the value, ui has high entropy. The Nicole-esque variables have entropies like the earnings of Terminator Salvation: noteworthy but not chart-topping.

To recap: We want to squeeze a message into the tiniest space possible. Even if we know early variables’ values, we can’t infer later variables’. Applying the polarizer, we split the variables into low-, high-, and middling-entropy flocks. We can guess the value of each low-entropy ui, if we know the foregoing u’s.
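A numerical hint at how the flocks form, using the binary erasure channel, the textbook warm-up for polar codes (my substitution, not this post’s setting). One polarizing step turns a channel that loses each symbol with probability z into a worse synthetic channel (2z - z²) and a better one (z²); iterating splits the channels into nearly-perfect and nearly-useless flocks, like the low- and high-entropy variables above:

```python
# One polarization step maps erasure probability z to a worse channel
# (2z - z^2) and a better one (z^2); the two average back to z.
def polarize(zs):
    out = []
    for z in zs:
        out.append(2 * z - z * z)  # worse synthetic channel
        out.append(z * z)          # better synthetic channel
    return out

zs = [0.5]
for _ in range(10):  # ten steps: 2**10 = 1024 synthetic channels
    zs = polarize(zs)

# Most channels have drifted toward 0 (guessable) or 1 (pure noise),
# while the average erasure probability stays exactly 0.5.
extreme = sum(1 for z in zs if z < 0.01 or z > 0.99)
print(extreme, len(zs), sum(zs) / len(zs))
```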

Almost finished!

In your camera-size travel guide, transcribe the high-entropy ui’s. These ui’s suggest the values of the low-entropy ui’s. When you want to decode the guide, guess the low-entropy ui’s. Then reverse the polarizer to reconstruct much of the original text.

The longer the original travel guide, the fewer errors you make while decoding, and the smaller the ratio of the encoded guide’s length to the original guide’s length. That ratio shrinks, as the guide’s length grows, to H. You’ve compressed a message maximally efficiently. As the Swiss say: Glückwünsche.

How does compression relate to QI? Quantum states form messages. Polar codes, ETH scientists have shown, compress quantum messages maximally efficiently. Researchers are exploring decoding strategies and relationships among (quantum) polar codes. With their help, Shannon-coded travel guides might fit not only in my camera bag, but also on the tip of my water bottle.

Should you need a Zürich travel guide, I recommend Grossmünster Church. Not only does the name fulfill your daily dose of umlauts. Not only did Ulrich Zwingli channel the Protestant Reformation into Switzerland there. Climbing a church tower affords a panorama of Zürich. After oohing over the hills and ahhing over the lake, you can shift your gaze toward ETH. The worldview being built there bewitches as much as the vista from any tower.


A tower with a view.

With gratitude to ETH’s QI-theory group (particularly to Renato Renner) for its hospitality. And for its travel advice. With gratitude to David Sutter for his explanations and patience.


The author and her neue Freunde.

The cost and yield of moving from (quantum) state to (quantum) state

The countdown had begun.

In ten days, I’d move from Florida, where I’d spent the summer with family, to Caltech. Unfolded boxes leaned against my dresser, and suitcases yawned on the floor. I was working on a paper. Even if I’d turned around from my desk, I wouldn’t have seen the stacked books and folded sheets. I’d have seen Lorenz curves, because I’d drawn Lorenz curves all week, and the curves seemed imprinted on my eyeballs.

Using Lorenz curves, we illustrate how much we know about a quantum state. Say you have an electron, you’ll measure it using a magnet, and you can’t predict any measurement’s outcome. Whether you orient the magnet up-and-down, left-to-right, etc., you haven’t a clue what number you’ll read out. We represent this electron’s state by a straight line from (0, 0) to (1, 1).

Uniform_state

Say you know the electron’s state. Say you know that, if you orient the magnet up-and-down, you’ll read out +1. This state, we call “pure.” We represent it by a tented curve.

Pure_state

The more you know about a state, the more the state’s Lorenz curve deviates from the straight line.

Arbitrary_state

If Curve A fails to dip below Curve B, we know at least as much about State A as about State B. We can transform State A into State B by manipulating and/or discarding information.

Conversion_yield_part_1_arrow
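That comparison of curves is majorization. A minimal sketch, treating each state as a probability distribution over measurement outcomes (a simplification of the quantum version):

```python
from itertools import accumulate

# Lorenz curve: cumulative probabilities, largest first. State A can be
# turned into State B (by scrambling or discarding information) when A's
# curve never dips below B's, i.e. when A "majorizes" B.
def lorenz_curve(probs):
    return list(accumulate(sorted(probs, reverse=True)))

def majorizes(p, q):
    return all(a >= b - 1e-12 for a, b in zip(lorenz_curve(p), lorenz_curve(q)))

pure = [1.0, 0.0, 0.0]     # fully known: the tented curve
mixed = [0.5, 0.3, 0.2]    # partially known
uniform = [1/3, 1/3, 1/3]  # fully unknown: the straight line

print(majorizes(pure, mixed))     # True: knowledge can be discarded
print(majorizes(uniform, mixed))  # False: ignorance can't be upgraded for free
```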

By the time I’d drawn those figures, I’d listed the items that needed packing. A coauthor had moved from North America to Europe during the same time. If he could hop continents without impeding the paper, I could hop states. I unzipped the suitcases, packed a box, and returned to my desk.

Say Curve A dips below Curve B. We know too little about State A to transform it into State B. But we might combine State A with a state we know lots about. The latter state, C, might be pure. We have so much information about A + C, the amalgam can turn into B.

Yet more conversion costs.

What’s the least amount of information we need about C to ensure that A + C can turn into B? That number, we call the “cost of transforming State A into State B.”

We call it that usually. But late in the evening, after I’d miscalculated two transformation costs and deleted four curves, days before my flight, I didn’t type the cost’s name into emails to coauthors. I typed “the cost of turning A into B” or “the cost of moving from state to state.”

The complementarity (not incompatibility) of reason and rhyme

Shortly after learning of the Institute for Quantum Information and Matter, I learned of its poetry.

I’d been eating lunch with a fellow QI student at the Perimeter Institute for Theoretical Physics. Perimeter’s faculty includes Daniel Gottesman, who earned his PhD at what became Caltech’s IQIM. Perhaps as Daniel passed our table, I wondered whether a liberal-arts enthusiast like me could fit in at Caltech.

“Have you seen Daniel Gottesman’s website?” my friend replied. “He’s written a sonnet.”

[Figure: a quill]

He could have written equations with that quill.

Digesting this news with my chicken wrap, I found the website after lunch. The sonnet concerned quantum error correction, the fixing of mistakes made during computations by quantum systems. After reading Daniel’s sonnet, I found John Preskill’s verses about Daniel. Then I found more verses of John’s.

To my Perimeter friend: You win. I’ll fit in, no doubt.

Exhibit A: the latest edition of The Quantum Times, the newsletter for the American Physical Society’s QI group. On page 10, my enthusiasm for QI bubbles over into verse. Don’t worry if you haven’t heard all the terms in the poem. Consider them guidebook entries, landmarks to visit during a Wikipedia trek.

If you know the jargon, listen to it with a newcomer’s ear. Does anyone other than me empathize with frustrated lattices? Or describe speeches accidentally as “monotonic” instead of as “monotonous”? Hearing jargon outside its natural habitat highlights how not to explain research to nonexperts. Examining names for mathematical objects can reveal properties that we never realized those objects had. Inviting us to poke fun at ourselves, the confrontation of jargon sprinkles whimsy onto the meringue of physics.

No matter your familiarity with physics or poetry: Enjoy. And fifty points if you persuade Physical Review Letters to publish this poem’s sequel.

Quantum information

By Nicole Yunger Halpern

If “CHSH” rings a bell,
you know QI’s fared, lately, well.
Such promise does this field portend!
In Neumark fashion, let’s extend
this quantum-information spring:
dilation, growth, this taking wing.

We span the space of physics types
from spin to hypersurface hype,
from neutron-beam experiment
to Bohm and Einstein’s discontent,
from records of a photon’s path
to algebra and other math
that’s more abstract and less applied—
of platforms’ details, purified.

We function as a refuge, too,
if lattices can frustrate you.
If gravity has got your goat,
momentum cutoffs cut your throat:
Forget regimes renormalized;
our states are (mostly) unit-sized.
Velocities stay mostly fixed;
results, at worst, look somewhat mixed.

Though factions I do not condone,
the action that most stirs my bones
is more a spook than Popov ghosts;¹
more at-a-distance, less quark-close.

This field’s a tot—cacophonous—
like cosine, not monotonous.
Cacophony enlivens thought:
We’ve learned from noise what discord’s not.

So take a chance on wave collapse;
enthuse about the CP maps;
in place of “part” and “piece,” say “bit”;
employ, as yardstick, Hilbert-Schmidt;
choose quantum as your nesting place,
of all the fields in physics space.

¹ With apologies to Ludvig Faddeev.

Steampunk quantum

A dark-haired man leans over a marble balustrade. In the ballroom below, his assistants tinker with animatronic elephants that trumpet and with potions for improving black-and-white photographs. The man is an inventor near the turn of the 20th century. Cape swirling about him, he watches technology wed fantasy.

Welcome to the steampunk genre. A stew of science fiction and Victorianism, steampunk has invaded literature, film, and the Wall Street Journal. A few years after James Watt improved the steam engine, protagonists build animatronics, clone cats, and time-travel. At sci-fi conventions, top hats and blast goggles distinguish steampunkers from superheroes.

[Photo]

The closest the author has come to dressing steampunk.

I’ve never read steampunk other than H. G. Wells’s The Time Machine—and other than the scene recapped above. The scene features in The Wolsenberg Clock, a novel by Canadian poet Jay Ruzesky. The novel caught my eye at an Ontario library.

In Ontario, I began researching the intersection of QI with thermodynamics. Thermodynamics is the study of energy, efficiency, and entropy. Entropy quantifies uncertainty about a system’s small-scale properties, given large-scale properties. Consider a room of air molecules. Knowing that the room has a temperature of 75°F, you don’t know whether some molecule is skimming the floor, poking you in the eye, or elsewhere. Ambiguities in molecules’ positions and momenta endow the gas with entropy. Whereas entropy suggests lack of control, work is energy that accomplishes tasks.
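The uncertainty described above can be quantified, for a finite list of probabilities, by the Shannon entropy. A toy calculation (my own illustrative code, not drawn from the text):

```python
import numpy as np

def shannon_entropy(probs):
    """Entropy in bits: uncertainty about small-scale details,
    given only the probabilities assigned to each possibility."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]  # 0 * log(0) contributes nothing
    return float(-np.sum(p * np.log2(p)))

print(shannon_entropy([0.25] * 4))           # 2.0 bits: total ignorance over 4 options
print(shannon_entropy([1.0, 0.0, 0.0, 0.0])) # 0.0 bits: full knowledge
```

A room of air molecules pins down far more than four possibilities, of course; the entropy scales accordingly.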