What is Water 2.0

Before I arrived in Los Angeles, I thought I might need to hit the brakes a bit with some of the radical physics theories I’d encountered during my preliminary research. After all, these were scientists I was meeting: people who “engage in a systematic activity to acquire knowledge that describes and predicts the natural world”, according to Wikipedia. It turns out I wasn’t nearly as far-out as they were.

I could recount numerous anecdotes that exemplify my encounter with the frighteningly intelligent and vivid imagination of the people at LIGO with whom I had the great pleasure of working – Prof. Rana X. Adhikari, Maria Okounkova, Eric Quintero, Maximiliano Isi, Sarah Gossan, and Jameson Graef Rollins – but in the end it all boils down to a parable about fish.

Rana’s version, which he recounted to me at our first meeting, goes as follows: “There are these two young fish swimming along, and a scientist approaches the aquarium and proclaims, ‘We’ve finally discovered the true nature of water!’ And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes, ‘What the hell is water?’” In David Foster Wallace’s more famous version, the scientist is not a scientist but an old fish, who greets them saying, “Morning, boys. How’s the water?”

What is Water

The difference is not circumstantial. Foster Wallace’s version is an argument against “unconsciousness, the default setting, the rat race, the constant gnawing sense of having had, and lost, some infinite thing” – personified by the young fish – and an urgent call for awareness – personified by the old fish. But in Rana’s version, the matter is more hard-won: as long as they are fish, they haven’t the faintest apprehension of the very concept of water: even a wise old fish would fail to notice. In this adaptation, gaining awareness of that which is “so real and essential, so hidden in plain sight all around us, all the time” as Foster Wallace describes it, demands much more than just an effort in mindfulness. It demands imagining the unimaginable.

Albert Einstein once said that “Imagination is more important than knowledge. For knowledge is limited to all we now know and understand, while imagination embraces the entire world, and all there ever will be to know and understand.” But the question remains of how far our imagination can reach, and where the radius ends for us in “what there ever will be to know and understand”, versus that which happens to be. My earlier remark about LIGO scientists’ being far-out does not at all refer to a speculative disposition, which would characterise amateur, anything-goes (and often over-the-edge) pseudo-science. Rather, it refers to the high level of creativity that is demanded of physicists today, and to the untiring curiosity that drives them to expand the limits of that radius, despite all odds.

The possibility of imagination has become an increasingly animating thought within my ongoing project:

As an independent curator of contemporary art, I travelled to Caltech for a 6-week period of research, towards developing an exhibition that will invite the public to engage with some of the highly challenging implications around the concept of time in physics. In it, I identify LIGO’s breakthrough detection of gravitational waves as an unparalleled incentive by which to acquire – in broad cultural terms – a new sense of time that departs from the old and now wholly inadequate one. After LIGO’s announcement proved that time fluctuation not only happens, but that it happened here, to us, on a precise date and time, it is finally possible for a broader public to relate, however abstract some of the concepts from the field of physics may remain. More simply put: we can finally sense that the water is moving.[1]

One century after Einstein’s Theory of General Relativity, most people continue to hold a highly impoverished idea of the nature of time, despite it being perhaps the most fundamental element of our existence. For 100 years there was no blame or shame in this. Because within all possible changes to the three main components of the universe – space, time & energy – the fluctuation of time was always the only one that escaped our sensorial capacities, existing exclusively in our minds, and finding its fullest expression in mathematical language. If you don’t speak mathematics, time fluctuation remains impossible to grasp, and painful to imagine.

But on February 11th, 2016, this situation changed dramatically.

On this date, a televised announcement told the world of the first-ever sensory detection of time-fluctuation, made with the aid of the most sensitive machine ever to be built by mankind. Finally, we have sensorial access to variations in all components of the universe as we know it. What is more, we observe the non-static passage of time through sound, thereby connecting it to the most affective of our senses.

Strain waveforms

Of course, LIGO’s detection is limited to time fluctuation and doesn’t yet make other mind-bending behaviours of time observable. But this is only circumstantial. The key point is that we can take this initial leap, and that it loosens our feet from the cramp of Newtonian fixity. Once in this state, gambolling over to ideas about zero time tunnelling, non-causality, or the future determining the present, for instance, is far more plausible, and no longer painful but rather seductive, at least, perhaps, for the playful at heart.

Taking a slight detour (to be re-routed in a moment): there is a common misconception about children’s allegedly free-spirited creativity. Watching someone aged between around 4 and 15 draw a figure will demonstrate quite clearly just how taut they really are, and that they apply strict schemes that follow reality as they see and learn to see it. Bodies consistently have eyes, mouths, noses, heads, rumps and limbs, correctly placed and in increasingly realistic colours. Ask them to depart from these conventions – “draw one eye on his forehead”, “make her face green” – as masters such as Pablo Picasso and Henri Matisse have done – and they’ll likely become very upset (young adolescents being particularly conservative, reaching the point of panic when challenged to shed consensus).

This is not to compare the lay public (including myself) to children, but to suggest that there’s no inborn capacity – the unaffected, ‘genius’ naïveté that the modernist movements of Primitivism, Art Brut and Outsider Art exalted – for developing a creativity that is of substance. Arriving at a consequential idea, in both art and physics, entails a great deal of acumen and is far from gratuitous, however whimsical the moment in which it sometimes appears. And it’s also to suggest that there’s a necessary process of acquaintance – the knowledge of something through experience – in taking a cognitive leap away from the seemingly obvious nature of reality. If there’s some truth in this, then LIGO’s expansion of our sensorial access to the fluctuation of time, together with artistic approaches that lift the remaining questions and ambiguities of spacetime onto a relational, experiential plane, lay fertile ground on which to begin to foster a new sense of time – on a broad cultural level – however slowly it unfolds.

The first iteration of this project will be an exhibition, to take place in Berlin, in July 2017. It will feature existing and newly commissioned works by established and upcoming artists from Los Angeles and Berlin, working in sound, installation and video, to stage a series of immersive environments that invite the viewers’ bodily interaction.

Though the full selection cannot be disclosed just yet, I would like here to provide a glimpse of two works-in-progress by artist-duo Evelina Domnitch & Dmitry Gelfand, whom I invited to Los Angeles to collaborate in my research with LIGO, and whose contribution has been of great value to the project.

For more details on the exhibition, please stay tuned, and be warmly welcome to visit Berlin in July!

Text & images: courtesy of the artists.

ORBIHEDRON | 2017

A dark vortex in the middle of a water-filled basin emits prismatic bursts of rotating light. Akin to a radiant ergosphere surrounding a spinning black hole, Orbihedron evokes the relativistic as well as quantum interpretation of gravity – the reconciliation of which is essential for unravelling black hole behaviour and the origins of the cosmos. Descending into the eye of the vortex, a white laser beam reaches an impassable singularity that casts a whirling circular shadow on the basin’s floor. The singularity lies at the bottom of a dimple on the water’s surface, the crown of the vortex, which acts as a concave lens focussing the laser beam along the horizon of the “black hole” shadow. Light is seemingly swallowed by the black hole in accordance with general relativity, yet leaks out as quantum theory predicts.

ER = EPR | 2017

Two co-rotating vortices, joined together via a slender vortical bridge, lethargically drift through a body of water. Light hitting the water’s surface transforms the vortex pair into a dynamic lens, projecting two entangled black holes encircled by shimmering halos. As soon as the “wormhole” link between the black holes rips apart, the vortices immediately dissipate, analogously to the collapse of a wave function. Connecting distant black holes, or two sides of the same black hole, might wormholes be an example of cosmic-scale quantum entanglement? This mind-bending conjecture of Juan Maldacena and Leonard Susskind can be traced back to two iconoclastic papers from 1935, previously thought to be unrelated (both by their authors and by numerous generations of readers). One article, the legendary EPR (penned by Einstein, Podolsky and Rosen), engendered the concept of quantum entanglement, or “spooky action at a distance”; the second theorised Einstein–Rosen (ER) bridges, later known as wormholes. Although the widely read EPR paper has led to the second quantum revolution, currently paving the way to quantum simulation and computation, ER has enjoyed very little readership. By equating ER to EPR, the formerly irreconcilable paradigms of physics have the potential to converge: the phenomenon of gravity is imagined in a quantum mechanical context. The theory further implies, according to Maldacena, that the undivided, “reliable structure of space-time is due to the ghostly features of entanglement”.

 

[1] I am here extending our capacity to sense to that of the technology itself, which indeed measured the warping of spacetime. However, in interpreting gravitational waves from a human frame of reference (moving nowhere near the speed of light at which gravitational waves travel), they would seem to be spatial. In fact, the elongation of space (a longer wavelength) directly implies that time slows down (a longer wave-period), so that the two are indistinguishable.

 

Isabel de Sena

How do you hear electronic oscillations with light

For decades, understanding the origin of high temperature superconductivity has been regarded as the Holy Grail by physicists in the condensed matter community. The importance of high temperature superconductivity resides not only in its technological promises, but also in the dazzling number of exotic phases and elementary excitations it puts on display for physicists. These myriad phases and excitations give physicists new dimensions and building bricks for understanding and exploiting the world of collective phenomena. The pseudogap, charge-density-wave, nematic and spin liquid phases, for example, are a few exotica that are found in cuprate high temperature superconductors. Understanding these phases is important for understanding the mechanism behind high temperature superconductivity, but they are also interesting in and of themselves.

The charge-density-wave (CDW) phase in the cuprates – a spontaneous emergence of a periodic modulation of charge density in real space – has particularly garnered a lot of attention. It emerges upon the destruction of the parent antiferromagnetic Mott insulating phase with doping, and it appears to directly compete with superconductivity. Whether or not these features are generic, or maybe even necessary, for high temperature superconductivity is an important question. Unfortunately, no other comparable family of high temperature superconducting materials currently exists that would enable such questions to be answered.


Recently, the iridates have emerged as a possible analog to the cuprates. The single layer variant Sr2IrO4, for example, exhibits signatures of both a pseudogap phase and a high temperature superconducting phase. However, with an increasing parallel being drawn between the iridates and the cuprates in terms of their electronic phases, CDW has so far eluded detection in any iridate, calling into question the validity of this comparison. Rather than studying the single layer variant, we decided to look at the bilayer iridate Sr3Ir2O7 in which a clear Mott insulator to metal transition has been reported with doping.

While CDW has been observed in many materials, what made it elusive in the cuprates for many years is its spatially short-ranged (it extends only a few lattice spacings) and often temporally short-ranged (it blinks in and out of existence quickly) nature. To get a good view of this order, experimentalists had to literally pin it down, using external influences like magnetic fields or chemical dopants to suppress the temporal fluctuations, and then use very sensitive diffraction or scanning-tunneling-based probes to observe it.

But rather than looking in real space for signatures of the CDW order, an alternative approach is to look for them in the time domain. Works by the Gedik group at MIT and the Orenstein group at U.C. Berkeley have shown that one can use ultrafast time-resolved optical reflectivity to “listen” for the tone of a CDW to infer its presence in the cuprates. In these experiments, one impulsively excites a coherent mode of the CDW using a femtosecond laser pulse, much like one would excite the vibrational mode of a tuning fork by impulsively banging it. One then stroboscopically looks for these CDW oscillations via temporally periodic modulations in its optical reflectivity, much like one would listen for the tone produced by the tuning fork. If you manage to hear the tone of the CDW, then you have established its existence!
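The “listening” step amounts to Fourier-transforming the reflectivity trace. Here is a minimal sketch of the idea with entirely invented numbers (a 2 THz mode, illustrative damping times; none of these values come from the actual experiment):

```python
import numpy as np

# Toy pump-probe trace: a coherent CDW mode rides on the transient
# reflectivity as a damped oscillation. All numbers are invented for
# illustration (a 2 THz mode; none come from the actual experiment).
f_cdw = 2.0e12                       # assumed CDW mode frequency (Hz)
dt = 25e-15                          # 25 fs sampling interval
t = np.arange(0, 20e-12, dt)         # 20 ps of pump-probe delay

background = np.exp(-t / 5e-12)                         # electronic relaxation
osc = 0.1 * np.exp(-t / 3e-12) * np.cos(2 * np.pi * f_cdw * t)
dR = background + osc                                   # measured signal

# "Listen" for the tone: subtract the slow background (here known exactly;
# in practice it is fitted), then Fourier-transform the residual.
spectrum = np.abs(np.fft.rfft(dR - background))
freqs = np.fft.rfftfreq(len(t), d=dt)

peak = freqs[np.argmax(spectrum)]
print(f"mode detected at {peak / 1e12:.2f} THz")        # ≈ 2 THz
```

In a real measurement the background subtraction is the delicate part; the toy version above knows the background exactly, which no experiment does.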

We applied a similar approach to Sr3Ir2O7 and its doped versions [hear our experiment]. To our delight, the ringing of a CDW mode sounded immediately upon doping across its Mott insulator to metal transition, implying that the electronic liquid born from the doped Mott insulator is unstable to CDW formation, very similar to the case in cuprates. Also like the case of cuprates, this charge-density-wave is of a special nature: it is either very short-ranged, or temporally fluctuating. Whether or not there is a superconducting phase that competes with the CDW in Sr3Ir2O7 remains to be seen. If so, the phenomenology of the cuprates may really be quite generic. If not, the interesting question of why not is worth pursuing. And who knows, maybe the fact that we have a system that can be controllably tuned between the antiferromagnetic order and the CDW order may find use in technology some day.

Building the future

At the start of the academic year, my high school Physics students want an easy lab with simple, clear-cut data. They are satisfied with a clear-cut conclusion. Open-ended labs, especially those without cookbook procedures, are at first daunting and intimidating. Having to take time to troubleshoot a problem is a painful process for them, as it can be for many. As the year progresses, they seem to grow more comfortable with their own exploration of Physics trends.


Another happy day in Sloan

There is no set manual for real scientific research, for uncharted territory. Exciting, new research has no “right” answer against which to compare your data. And building your own, unique experimental set-up inherently requires much time to minimize new issues. It is interesting to me that when there is less guidance based on previous research, there is a larger possibility for great, new discoveries.

This summer I again retreated from the summer heat, plunging into the Caltech sub-basements to further my understanding of the freshest research, efficient laboratory techniques, and culture in Physics research. The quiet hum of the air conditioner and lights marked an eerie contrast to the non-stop, bustling life of the classroom. It was an even more stark contrast to my 16-month-old daughter’s incessant joyful and curious exploration of the world.


The SEM Chamber

My first project this summer focused on helping to get the SEM (Scanning Electron Microscope) up and running. Once the SEM is functional, the first samples it will scan are solar cells made of graphene nanotubes. If scaled up and mass-produced, methane may be one source of the necessary carbon for graphene. What if we captured methane gases that are already problematically being released into our greenhouse-gas-ridden atmosphere and subsequently used them to make graphene solar cells? What a win-win solution to help with the daunting problem of global climate change!

Helping to set up the SEM involved a variety of interesting tasks: I found the working distance from the SEM gun to the sample holder that would soon be loaded into the chamber. I researched Pirani gauge parts, and later rubber pads to help with damping. I helped to install copper ConFlat flanges for making low-pressure seals. We used sonication to clean parts used at the SEM lab. We found and installed a nitrogen (N2) line to flush out moisture in the SEM chamber. There were numerous rounds of baking out moisture that may have collected in the chamber in the years since this SEM was last in use.


A tube scanner head

During “down time”, such as when the SEM chamber was being pumped down to less than one-part-per-billion pressure with multiple vacuum pumps, we directed our attention to two other projects. The first was making parts for the tube scanner head. Due to the possibility of burning out scanner heads in the alignment process when we first turn on the SEM gun, we needed to be prepared with alternative STM parts. This involved drilling, epoxying, baking, sanding, and soldering tiny pieces.  A diminutive coaxial cable with multiple insulating layers was stripped apart so that we could properly connect the gold conducting wire from within.

During the last week I focused my efforts on returning to an interferometer set-up in the sub-basement of Sloan. Last summer, part of my time was spent learning about and setting up an interferometer system in order to measure the shift of a piezoelectric stack when particular voltages were applied. Once calibrated, these piezos will be used to control the motion of the tips in our lab’s STM (Scanning Tunneling Microscope). This summer was different because we had additional equipment from Thorlabs in order to move further along with the project.


Overhead view of the interferometer set-up.

On the day of arrival of the much-needed parts, I felt like a child at Christmas. Ready, set, go. Racing against the impending end of the internship and start of the upcoming academic year, I worked to assemble our equipment.  


LASER, function generator, amplifier.

This same procedure was completed roughly a decade ago by graduate students in our lab. Now, though, the remaining calibrated piezos have all been used up. In order to continue doing important STM measurements, new piezo stacks need to be calibrated.

A ray of red, coherent light from our LASER is directed to a beamsplitter. One arm of light is directed to a mirror and reflected back to the beamsplitter. Another arm of light is directed to a mirror fixed upon the piezoelectric stack. Depending on the applied voltage and the particular piezo stacks, the orientation and magnitude of the shear vary. A signal generator and amplifier are connected to the opposite end of the piezoelectric stacks to carefully control the voltage signal applied to the piezos. Once the beams are recombined at the beamsplitter, they should interfere, and an interference pattern should be detected on the oscilloscope.
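As a toy model of this set-up, one can compute the fringe pattern for a linear piezo response. Both the wavelength and the calibration constant below are invented for illustration; the point is that counting fringes recovers the displacement-per-volt calibration:

```python
import numpy as np

# Toy model of the set-up: the piezo-mounted mirror moves linearly with
# voltage, d(V) = k * V. Both the wavelength and the calibration constant
# k below are invented; k is what a real calibration would extract.
wavelength = 633e-9                  # assumed red laser line (m)
k = 2.0e-9                           # pretend shear: 2 nm of motion per volt

V = np.linspace(0, 400, 2000)        # voltage ramp from the signal generator
d = k * V                            # mirror displacement
# The reflected beam travels the extra path twice, hence the factor of 2:
intensity = 2 + 2 * np.cos(2 * np.pi * (2 * d) / wavelength)

# One full fringe (bright to bright) corresponds to lambda/2 of mirror
# motion, so counting fringes over the ramp recovers the calibration:
fringes = (2 * d[-1]) / wavelength
k_recovered = fringes * (wavelength / 2) / V[-1]
print(k_recovered)                    # ≈ 2e-9 m/V, the assumed calibration
```

The real difficulty, as described below, is not this arithmetic but getting a clean enough fringe signal on the photodetector in the first place.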


Confirmation that my oscilloscope was working properly

At first it was plain fun setting up the various parts, like fitting puzzle pieces with the various optics devices. The difficulty came later, in troubleshooting. I had little issue with adjusting the set-up so that both beams from the LASER landed directly on the photodetector. Getting a beautiful interference pattern was another matter. Making sense of the output signal from the photodetector on the oscilloscope was also a process. Finding joy and benefit in the learning process, as opposed to frustration in a trying time, is an important lesson in life. Of course it is inevitable that there will be difficulties in life. Can we grow from the learning opportunity as opposed to complaining about the struggle?


What I at first thought was the interference pattern I had been hoping for… Not so fast.

The irony is that just like my students, I wanted an easy, beautiful interference pattern that could be interpreted on our oscilloscope. I had the opportunity to learn through trial and error and from additional research on interferometers. I look forward to hearing from the lab group about the progress that is made on this project during the academic year while I am in the classroom. I am grateful to IQIM and the Yeh Lab Group for allowing me to continue participating in this exciting program.

Upending my equilibrium

Few settings foster equanimity like Canada’s Banff International Research Station (BIRS). Mountains tower above the center, softened by pines. Mornings have a crispness that would turn air fresheners evergreen with envy. The sky looks designed for a laundry-detergent label.


Doesn’t it?

One day into my visit, equilibrium shattered my equanimity.

I was participating in the conference “Beyond i.i.d. in information theory.” What “beyond i.i.d.” means is explained in these articles.  I was to present about resource theories for thermodynamics. Resource theories are simple models developed in quantum information theory. The original thermodynamic resource theory modeled systems that exchange energy and information.

Imagine a quantum computer built from tiny, cold, quantum circuits. An air particle might bounce off the circuit. The bounce can slow the particle down, transferring energy from particle to circuit. The bounce can entangle the particle with the circuit, transferring quantum information from computer to particle.

Suppose that particles bounced off the computer for ages. The computer would thermalize, or reach equilibrium: The computer’s energy would flatline. The computer would reach a state called the canonical ensemble. The canonical ensemble looks like this: e^{-\beta H}/Z.

Joe Renes and I had extended these resource theories. Thermodynamic systems can exchange quantities other than energy and information. Imagine white-bean soup cooling on a stovetop. Gas condenses on the pot’s walls, and liquid evaporates. The soup exchanges not only heat, but also particles, with its environment. Imagine letting the soup cool for ages. It would thermalize to the grand canonical ensemble, e^{-\beta (H - \mu N)}/Z. Joe and I had modeled systems that exchange diverse thermodynamic observables.*

What if, fellow beyond-i.i.d.-er Jonathan Oppenheim asked, those observables didn’t commute with each other?

Mathematical objects called operators represent observables. Let \hat{H} represent a system’s energy, and let \hat{N} represent the number of particles in the system. The operators fail to commute if multiplying them in one order differs from multiplying them in the opposite order: \hat{H}  \hat{N}  \neq  \hat{N}  \hat{H}.
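A two-line numerical check makes this concrete. Below, the spin-1/2 angular-momentum components (the standard minimal example) stand in for a pair of noncommuting observables; they are not the circuit’s actual \hat{H} and \hat{N}:

```python
import numpy as np

# Minimal toy example of noncommuting observables: the x and y components
# of a spin-1/2 (hbar = 1), standing in for a pair like H-hat and N-hat.
Sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
Sy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
Sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

commutator = Sx @ Sy - Sy @ Sx
print(np.allclose(commutator, 0))        # False: order of multiplication matters
print(np.allclose(commutator, 1j * Sz))  # True: in fact [S_x, S_y] = i S_z
```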

Suppose that our quantum circuit has observables represented by noncommuting operators \hat{H} and \hat{N}. The circuit cannot have a well-defined energy and a well-defined particle number simultaneously. Physicists call this inability the Uncertainty Principle. Uncertainty and noncommutation infuse quantum mechanics as a Cashmere Glow™ infuses a Downy fabric softener.


Quantum uncertainty and noncommutation.

I glowed at Jonathan: All the coolness in Canada couldn’t have pleased me more than finding someone interested in that question.** Suppose that a quantum system exchanges observables \hat{Q}_1 and \hat{Q}_2 with the environment. Suppose that \hat{Q}_1 and \hat{Q}_2 don’t commute, like components \hat{S}_x and \hat{S}_y of quantum angular momentum. Would the system thermalize? Would the thermal state have the form e^{\mu_1 \hat{Q}_1 + \mu_2 \hat{Q}_2}/Z? Could we model the system with a resource theory?

Jonathan proposed that we chat.

The chat sucked in beyond-i.i.d.-ers Philippe Faist and Andreas Winter. We debated strategies while walking to dinner. We exchanged results on the conference building’s veranda. We huddled over a breakfast table after colleagues had pushed their chairs back. Information flowed from chalkboard to notebook; energy flowed in the form of coffee; food particles flowed onto the floor as we brushed crumbs from our notebooks.


Exchanges of energy and particles.

The idea morphed and split. It crystallized months later. We characterized, in three ways, the thermal state of a quantum system that exchanges noncommuting observables with its environment.

First, we generalized the microcanonical ensemble. The microcanonical ensemble is the thermal state of an isolated system. An isolated system exchanges no observables with any other system. The quantum computer and the air molecules can form an isolated system. So can the white-bean soup and its kitchen. Our quantum system and its environment form an isolated system. But they cannot necessarily occupy a microcanonical ensemble, thanks to noncommutation.

We generalized the microcanonical ensemble. The generalization involves approximation, unlikely measurement outcomes, and error tolerances. The microcanonical ensemble has a simple definition—sharp and clean as Banff air. We relaxed the definition to accommodate noncommutation. If the microcanonical ensemble resembles laundry detergent, our generalization resembles fabric softener.

Detergent vs. softener

Suppose that our system and its environment occupy this approximate microcanonical ensemble. Tracing out (mathematically ignoring) the environment yields the system’s thermal state. The thermal state basically has the form we expected, \gamma = e^{\sum_j \mu_j \hat{Q}_j}/Z.
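For concreteness, here is a small numerical sketch (my illustration, not a calculation from the paper) of such a state for two noncommuting charges, taking \hat{Q}_1 = \hat{S}_x and \hat{Q}_2 = \hat{S}_y for a single spin-1/2 and made-up values of the \mu_j:

```python
import numpy as np

# Sketch (with invented mu values) of the thermal state
# gamma = e^{sum_j mu_j Q_j} / Z for two noncommuting charges,
# here the spin-1/2 components S_x and S_y.
def exp_herm(A):
    """Matrix exponential of a Hermitian matrix via eigendecomposition."""
    w, v = np.linalg.eigh(A)
    return (v * np.exp(w)) @ v.conj().T

Sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
Sy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
mu1, mu2 = -0.7, -0.3          # made-up "chemical potentials"

M = exp_herm(mu1 * Sx + mu2 * Sy)
gamma = M / np.trace(M)        # divide by Z = Tr e^{sum_j mu_j Q_j}

print(np.isclose(np.trace(gamma).real, 1.0))   # unit trace: a valid state
print(np.all(np.linalg.eigvalsh(gamma) > 0))   # positive, as a state must be

# Because S_x and S_y don't commute, gamma is NOT a product of the
# individual exponentials:
naive = exp_herm(mu1 * Sx) @ exp_herm(mu2 * Sy)
print(np.allclose(M, naive))                   # False
```

The last check is the whole point: with noncommuting charges, the single exponential over the sum cannot be factored, which is what makes the thermodynamics interesting.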

This exponential state, we argued, follows also from time evolution. The white-bean soup equilibrates upon exchanging heat and particles with the kitchen air for ages. Our quantum system can exchange observables \hat{Q}_j with its environment for ages. The system equilibrates, we argued, to the state \gamma. The argument relies on a quantum-information tool called canonical typicality.

Third, we defined a resource theory for thermodynamic exchanges of noncommuting observables. In a thermodynamic resource theory, the thermal states are the worthless states: From a thermal state, one can’t extract energy usable to lift a weight or to power a laptop. The worthless states, we showed, have the form of \gamma.

Three paths lead to the form \gamma of the thermal state of a quantum system that exchanges noncommuting observables with its environment. We published the results this summer.

Not only was Team Banff spilling coffee over \gamma. So were teams at Imperial College London and the University of Bristol. Our conclusions overlap, suggesting that everyone calculated correctly. Our methodologies differ, generating openings for exploration. The mountain passes between our peaks call out for mapping.

So does the path to physical reality. Do these thermal states form in labs? Could they? Cold atoms offer promise for realizations. In addition to experiments and simulations, master equations merit study. Dynamical typicality, Team Banff argued, suggests that \gamma results from equilibration. Master equations model equilibration. Does some Davies-type master equation have \gamma as its fixed point? Email me if you have leads!


Experimentalists, can you realize the thermal state e^{ \sum_j \mu_j \hat{Q}_j } / Z whose charges \hat{Q}_j don’t commute?

 

A photo of Banff could illustrate Merriam-Webster’s entry for “equanimity.” Banff equanimity deepened our understanding of quantum equilibrium. But we wouldn’t have understood quantum equilibrium if questions hadn’t shattered our tranquility. Give me the disequilibrium of recognizing problems, I pray, and the equilibrium to solve them.

 

*By “observable,” I mean “property that you can measure.”

**Teams at Imperial College London and Bristol asked that question, too. More pleasing than three times the coolness in Canada!

March madness and quantum memory

Madness seized me this March. It pounced before newspaper and Facebook feeds began buzzing about basketball.1 I haven’t bought tickets or bet on teams. I don’t obsess over jump-shot statistics. But madness infected me two weeks ago. I began talking with condensed-matter physicists.

Condensed-matter physicists study collections of many particles. Example collections include magnets and crystals. And the semiconductors in the iPhones that report NCAA updates.

Caltech professor Gil Refael studies condensed matter. He specializes in many-body localization. By “many-body,” I mean “involving lots of quantum particles.” By “localization,” I mean “each particle anchors itself to one spot.” We’d expect these particles to spread out, like the eau de hotdog that wafts across a basketball court. But Gil’s particles stay put.


How many-body-localized particles don’t behave.

Experts call many-body localization “MBL.” I’ve accidentally been calling many-body localization “MLB.” Hence the madness. You try injecting baseball into quantum discussions without sounding one out short of an inning.2

I wouldn’t have minded if the madness had erupted in October. The World Series began in October. The World Series involves Major League Baseball, what normal people call “the MLB.” The MLB dominates October; the NCAA dominates March. Preoccupation with the MLB during basketball season embarrasses me. I feel like I’ve bet on the last team that I could remember winning the championship, then realized that that team had last won in 2002.

March madness has been infecting my thoughts about many-body localization. I keep envisioning a localized particle as dribbling a basketball in place, opponents circling, fans screaming, “Go for it!” Then I recall that I’m pondering MBL…I mean, MLB…or the other way around. The dribbler gives way to a baseball player who refuses to abandon first base for second. Then I recall that I should be pondering particles, not playbooks.


Localized particles.

Recollection holds the key to MBL’s importance. Colleagues of Gil’s want to build quantum computers. Computers store information in memories. Memories must retain their contents; information mustn’t dribble away.

Consider recording halftime scores. You could encode the scores in the locations of the particles that form eau de hotdog. (Imagine you have advanced technology that manipulates scent particles.) If Duke had scored one point, you’d put this particle here; if Florida had scored two, you’d put that particle there. The particles—as smells too often do—would drift. You’d lose the information you’d encoded. Better to paint the scores onto scorecards. Dry paint stays put, preserving information.

The quantum particles studied by Gil stay put. They inspire scientists who develop memories for quantum computers. Quantum computation is gunning for a Most Valuable Player plaque in the technology hall of fame. Many-body localized systems could contain Most Valuable Particles.

MVP medal

Remembering the past, some say, can help one read the future. I don’t memorize teams’ records. I can’t advise you about whom to root for. But prospects for quantum memories are brightening. Bet on quantum information science.

1Non-American readers: University basketball teams compete in a tournament each March. The National Collegiate Athletic Association (NCAA) hosts the tournament. Fans glue themselves to TVs, tweet exultations and frustrations, and excommunicate friends who support opposing teams.

2Without being John Preskill.

SQuInTing in the Southwest

The 18th Annual Southwest Quantum Information and Technology (SQuInT) Workshop is an outreach and service activity of the Center for Quantum Information and Control (CQuIC), and takes place this February 18-20, 2016 in Albuquerque, New Mexico (SQuInT2016).  With over 160 participants, 45 talks, and 60 posters, SQuInT has become one of the largest and most diverse meetings in quantum information science in the United States.  Under Chief SQuInT Organizer Prof. Akimasa Miyake, this year’s program includes reports on the groundbreaking experiments on loophole-free violations of Bell’s inequalities, the latest developments in quantum dots, superconductors, and ion and neutral-atom traps, and a wide range of quantum information theory.  The SQuInT 2016 keynote will be delivered by IQIM’s very own Prof. John Preskill.

How did SQuInT get here? Its origin stems from the history of Quantum Information Science (QIS) itself. I joined the faculty at UNM in 1995.  Those were heady times, on the heels of Shor’s algorithm and new developments in quantum information theory, which occurred at inflationary speed.  Simultaneously, Bose–Einstein condensation had just been observed.  These two developments caused a revolution in quantum optics and AMO physics, out of which SQuInT was founded.

I, together with my colleague and now 20-year academic partner, Prof. Poul Jessen at the College of Optical Sciences, University of Arizona, focused on “optical lattices,” a brand-new idea at that time and the subject of Poul’s PhD thesis.  In his dissertation, Poul demonstrated that the motion of laser-cooled atoms, trapped at the antinodes of standing waves, was quantized.  This quantum motion was reminiscent of that seen in atomic ions in Paul traps, and we set out to exploit this in optical lattices.  Indeed, a hot development of the 1990s was the ability to engineer nonclassical states of motion of ions, leveraging the analogy with the Jaynes-Cummings model of cavity QED.  As a side note, this capability was at the heart of the 1995 proposal by Ignacio Cirac & Peter Zoller for ion-trap quantum computing and the immediate demonstration by Chris Monroe & Dave Wineland of the first CNOT gate.

Given these connections, in 1997 I organized a small workshop at UNM entitled Quantum Control of Atomic Motion, which brought together neutral-atom trappers, ion trappers, and quantum opticians. Among the participants were Rainer Blatt, Hideo Mabuchi, Hersch Rabitz, and Dave Wineland.  Hersch’s presence added a new dimension, as we began to understand that the tools of quantum optimal control, previously developed mostly in the context of NMR and physical chemistry, would be important for quantum control of atoms.  The meeting was repeated in 1998 as Quantum Control of Atomic Motion II.  By that time quantum computing had fully taken hold in the community.  Chris Monroe presented his logic-gate results and we presented the first ideas for quantum computing in optical lattices.  The attendees decided we should broaden the scope of the meeting to quantum information science and technology.  Hideo Mabuchi corresponded with Ike Chuang, who was at IBM Almaden in San Jose, California at the time.
Ike, of course, was at the center of the QI revolution, and in December 1998 he assembled a meeting of some of the key players, including Carl Caves, Richard Cleve, Chris Fuchs, Paul Kwiat, Poul Jessen, Hideo Mabuchi, David Meyer, Chris Monroe, John Preskill, Lu Sham, and Birgitta Whaley.

 


SQuInT Founders Meeting, IBM Almaden, San Jose CA, December 1998

 

And thus SQuInT was born.  The first meeting was held in 1999 (SQuInT99) in Albuquerque, New Mexico at a budget hotel known as the Holiday Inn “Mountain View.”  Mostly we had a view of the nearby truck stop, but the meeting was of the highest quality.  Our first session was chaired by Dave Wineland.  The speakers were Serge Haroche, Jeff Kimble, and Hideo Mabuchi.  I’d say we were on the right track!


First Annual SQuInT Workshop, February 1999, Albuquerque NM

At this first meeting we voted on the SQuInT logo, created by Jon Dowling.


Here’s the backstory. Alice and Bob Kokopelli, the Hopi fertility deities, play their flutes to the dreamcatcher.  What has the dreamcatcher caught?  Part of the circuit diagram for quantum teleportation, of course!

At the time, SQuInT was envisioned as a regional network.  As QIS was a new field, the plan was to facilitate collaborations and the exchange of information, given the local strength in the southwestern United States.  Some of the key nodes of the SQuInT Network at the time included Caltech, IBM-Almaden, Los Alamos, NIST Boulder, UA, UCB, UCSB, UCSD, and UNM.  SQuInT took as its mission two key objectives: (1) to build a network where the interdisciplinary subject matter of QIS would grow through direct interactions of theoretical and experimental physicists and computer scientists, as well as chemists, engineers, and mathematicians; and (2) to provide training for students, postdocs, and others entering a newly emerging discipline.  In line with goal (2), the Annual SQuInT Workshop has been a forum friendly to young scientists, where students and postdocs give talks alongside senior leaders in the field, and where new networks and collaborations can form.  In addition, students organized “summer retreats,” which essentially served as summer schools, since there were few courses in QIS at that time.

After its initial founding, SQuInT grew and the Annual Workshop traveled amongst the node institutions.  By the fourth meeting, we had grown to over 75 participants.


 

After its establishment in 2007, CQuIC became the official administrative home of SQuInT.  The Annual Meeting alternates between New Mexico and one of the Node Institutions, of which there are now 30, most across the United States and some international (SQuInTNodes).  These Nodes include universities, national laboratories, and industry, the last of which has an increasing presence given the rapid developments in QI technologies.  SQuInT Node institutions serve on the SQuInT Steering Committee, are the core participants in SQuInT, and can act as local hosts of the Annual Workshop.  Last year’s meeting took place in Berkeley, CA with over 200 participants.


Seventeenth Annual SQuInT Meeting, February 2015, Berkeley CA

 

After 17 years serving as the Chief SQuInT Coordinator (plus 2 years of proto-SQuInT organization), I am proud to hand over the reins to Prof. Akimasa Miyake.  SQuInT remains true to its goals of training, education, and growth of an interdisciplinary subject. Under Akimasa’s organization, we have a top-notch program, and I look forward to attending SQuInT as a participant!

 

 

BTZ black holes for #BlackHoleFriday

Yesterday was a special day. And no, I’m not referring to #BlackFriday, but rather to #BlackHoleFriday. I just learned that NASA spawned this social media campaign three years ago. The timing of this year’s Black Hole Friday is particularly special because we are exactly 100 years + 2 days after Einstein published his field equations of general relativity (GR). When Einstein introduced his equations, he had only one exact solution, the one describing “flat space.” These equations are notoriously difficult to solve, so their introduction sent out a call to arms to mathematically minded physicists and physically minded mathematicians, who scrambled to find new solutions.

If I had to guess, Karl Schwarzschild probably wasn’t sleeping much exactly a century ago. Not only was he deployed to the Russian Front as a soldier in the German Army, but a little more than one month after Einstein introduced his equations, Schwarzschild was the first to find another solution. His solution describes the curvature of spacetime outside of a spherically symmetric mass. It has the incredible property that if the spherical mass is compact enough, then spacetime will be so strongly curved that nothing will be able to escape (at least from the perspective of GR; we believe that there are corrections to this once you add quantum mechanics to the mix). Schwarzschild’s solution took black holes from the realm of clever thought experiments to the status of a testable prediction about how Nature behaves.
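In modern notation, Schwarzschild’s solution reads

ds^2 = -\left(1 - \frac{r_s}{r}\right) c^2 \, dt^2 + \left(1 - \frac{r_s}{r}\right)^{-1} dr^2 + r^2 \left( d\theta^2 + \sin^2\theta \, d\phi^2 \right), \qquad r_s = \frac{2GM}{c^2},

where r_s is the Schwarzschild radius: squeeze a mass M inside a sphere of radius r_s and you get a black hole, from which nothing can escape.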

It’s worth mentioning that between 1916 and 1918, Reissner and Nordström generalized Schwarzschild’s solution to one which also carries electric charge. Kerr found a solution in 1963 which describes a spinning black hole, and this was generalized by Newman et al. in 1965 to a solution which includes both spin (angular momentum) and electric charge. These solutions are symmetric about their spin axis. We can also write sensible equations which describe small perturbations around these solutions.

And that’s pretty much all that we’ve got in terms of exact solutions which are physically relevant to the 3+1 dimensional spacetime that we live in (it takes three spatial coordinates to specify a meeting location and another +1 to specify the time). This is the setting that’s closest to our everyday experience, and these solutions are the jumping-off points for trying to understand the role that black holes play in astrophysics. As I already mentioned, studying GR using pen and paper is quite challenging. But one exciting direction in the study of astrophysical black holes comes from recent progress in the field of numerical relativity, which discretizes the calculations and then uses supercomputers to study approximate time dynamics.


Artist’s rendition of dust and gas in an “accretion disk” orbiting a spinning black hole. Friction in the accretion disk generates temperatures oftentimes exceeding 10 million degrees C (2,000 times the surface temperature of the Sun). This high-temperature region emits x-rays and other detectable EM radiation. The image also shows a jet of plasma. The mechanism driving this plasma jet is not yet well understood. Studying processes like this requires all of the tools that we have available to us: from numerical relativity, to cutting-edge space observatories like NuSTAR, to LIGO in the immediate future (hopefully). Image credit: NASA/Caltech-JPL

I don’t expect many of you to be experts in the history outlined above. And I expect even fewer of you to know that Einstein’s equations still make sense in any number of dimensions. In this context, I want to briefly introduce a 2+1 dimensional solution called the BTZ black hole and outline why it has been astonishingly important since it was introduced 23 years ago by Bañados, Teitelboim, and Zanelli (their paper has been cited over 2,000 times, which is a tremendous number for theoretical physics).

There are many different viewpoints which yield the BTZ black hole, and this is one of them: a time=0 slice of the BTZ black hole obtained by gluing together special curves (geodesics) related to each other by a translation symmetry. The BTZ black hole is a solution of Einstein’s equations in 2+1d which has two asymptotic regions that are causally separated from each other by an event horizon. The arrows leading to “quantum states” come into play when you use the BTZ black hole as a toy model for thinking about quantum gravity.

One of the most striking implications of Einstein’s theory of general relativity is that our universe is described by a curved geometry which we call spacetime. Einstein’s equations describe the dynamical interplay between the curvature of spacetime and the distribution of energy and matter. This may be counterintuitive, but there are many solutions even when there is no matter or energy in the spacetime. We call these vacuum solutions. Vacuum solutions can have positive, negative, or zero “curvature.”

As 2d surfaces: the sphere is positively curved; a saddle has negative curvature; and a plane has zero curvature.
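For the record, these vacuum solutions solve Einstein’s equations with a cosmological constant \Lambda and no matter:

R_{\mu\nu} - \tfrac{1}{2} R \, g_{\mu\nu} + \Lambda g_{\mu\nu} = 0,

where \Lambda > 0, \Lambda = 0, and \Lambda < 0 give the positively curved (de Sitter), flat (Minkowski), and negatively curved (anti-de Sitter) maximally symmetric solutions, respectively.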

It came as a great surprise when BTZ showed in 1992 that there is a vacuum solution in 2+1d which has many of the same properties as the more physical 3+1d black holes mentioned above. But most excitingly — and something that I can’t imagine BTZ could have anticipated — is that their solution has become the toy model of all toy models for trying to understand “quantum gravity.”

GR in 2+1d has many convenient properties. Two beautiful things that happen in 2+1d are that:

  • There are no gravitational waves. Technically, this is because the Riemann tensor is fully determined by the Ricci tensor — the number of degrees of freedom in this system is exactly equal to the number of constraints given by Einstein’s equations. This makes GR in 2+1d something called a “topological field theory,” which is much easier to quantize than its full-blown gauge-theory cousin in 3+1d.
  • The maximally symmetric vacuum solution with negative curvature, which we call anti-de Sitter space, has a beautiful symmetry: this manifold is exactly the “group manifold” SL(2,R). This enables us to translate many challenging analytical questions into simple algebraic computations. In particular, it enables us to find a huge category of solutions which we call multiboundary wormholes, with BTZ being the most famous example.
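The first bullet can be stated as a formula: in three spacetime dimensions the Riemann tensor is fixed algebraically by the Ricci tensor and Ricci scalar,

R_{\mu\nu\rho\sigma} = g_{\mu\rho} R_{\nu\sigma} + g_{\nu\sigma} R_{\mu\rho} - g_{\nu\rho} R_{\mu\sigma} - g_{\mu\sigma} R_{\nu\rho} - \tfrac{1}{2} R \left( g_{\mu\rho} g_{\nu\sigma} - g_{\mu\sigma} g_{\nu\rho} \right),

so once Einstein’s equations determine the Ricci tensor, the full curvature, and hence the local geometry, is determined; no propagating degrees of freedom remain.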

Some “multiboundary wormhole” pictures that I made. The left shows the constant time=0 slice for a few different solutions and what you are left with after gluing according to the equations on the right. These are solutions to GR in 2+1d.

These properties make 2+1d GR particularly useful as a sandbox for making progress towards a theory of quantum gravity. As examples of what this might entail:

  • Classically, a particle is in one definite location. In quantum mechanics, a particle can be in a superposition of places. In quantum gravity, can spacetime be in a superposition of geometries? How does this work?
  • When you go from classical physics to quantum physics, tunneling becomes a thing. Can the same thing happen with quantum gravity? Where we tunnel from one spacetime geometry to another? What controls the transition amplitudes?
  • The holographic principle is an incredibly important idea in modern theoretical physics. It stems from the fact that the entropy of a black hole is proportional to the area of its event horizon — whereas the entropy of a glass of water is proportional to the volume of water inside the glass. We believe that this reduction in dimensionality is wildly significant.
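Quantitatively, the black-hole entropy is given by the Bekenstein-Hawking formula,

S_{BH} = \frac{k_B c^3 A}{4 G \hbar},

where A is the area of the event horizon. The information content scales with the area of a bounding surface rather than with the enclosed volume, which is the reduction in dimensionality referred to above.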

A few years after the holographic principle was introduced in the early 1990s by Gerard ‘t Hooft and Lenny Susskind, Juan Maldacena came up with a concrete manifestation, which is now called the AdS/CFT correspondence. Maldacena’s paper has been cited over 14,000 times, making it one of the most cited theoretical physics papers of all time. However, despite there being a “correspondence,” in practice it’s still very hard to translate questions back and forth between the gravity and quantum sides. The BTZ black hole is the gravity solution where this correspondence is best understood. Its quantum dual is a state called the thermofield double, which is given by: |\Psi_{CFT}\rangle = \frac{1}{\sqrt{Z}} \sum_{n=1}^{\infty} e^{-\beta E_n/2} |n\rangle_1 \otimes |n \rangle_2 . This describes a quantum state which lives on two circles (see my BTZ picture above). There is entanglement between the two circles. If an experimentalist had access to only one of the circles and were asked to figure out what state they have, their best guess would be a “thermal state”: a state that has been exposed to a heat bath for too long and has lost all of its initial quantum coherence.
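That last claim — trace out one circle of the thermofield double and you’re left with a thermal state — is easy to check numerically for a finite toy spectrum. The energies and temperature below are arbitrary choices for illustration; any spectrum works.

```python
import numpy as np

# Toy spectrum with a handful of levels (arbitrary choices for illustration)
energies = np.array([0.0, 1.0, 1.5, 3.0])
beta = 0.7  # inverse temperature, also arbitrary

# Thermofield double: |Psi> = Z^{-1/2} sum_n e^{-beta E_n / 2} |n>_1 |n>_2
# The coefficient matrix c_{nm} = delta_{nm} e^{-beta E_n / 2} is diagonal.
weights = np.exp(-beta * energies / 2)
psi = np.diag(weights).reshape(-1)   # flatten to a state on the doubled system
psi /= np.linalg.norm(psi)           # normalize (this supplies the 1/sqrt(Z))

# Reduced density matrix on side 1: trace out side 2, i.e. rho_1 = c c^dagger
c = psi.reshape(len(energies), len(energies))
rho_1 = c @ c.conj().T

# Thermal (Gibbs) state at the same temperature
gibbs = np.diag(np.exp(-beta * energies))
gibbs /= np.trace(gibbs)

print(np.allclose(rho_1, gibbs))  # → True
```

The squared Boltzmann-like weights e^{-\beta E_n/2} in the state become the Boltzmann weights e^{-\beta E_n} in the reduced density matrix, which is exactly the Gibbs state.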

It is in this sense that the BTZ black hole has been hugely important. It’s also evidence of how mysterious Einstein’s equations remain, even to this day. We still don’t have exact solutions for many settings of interest, like two black holes merging in 3+1d. It was only in 1992 that BTZ came up with their solution, 77 years after Einstein formulated his theory! Judging by historical precedent, exactly solvable toy models are profoundly useful, and BTZ has already proven to be an important signpost as we continue our quest to understand quantum gravity. There’s already broad awareness that astrophysical black holes are fascinating objects. In this post I hope I conveyed a bit of the excitement surrounding how black holes are useful in a different setting: aiding our understanding of quantum gravity. And all of this is in the spirit of #BlackHoleFriday, of course.