Memories of things past

My best friend—who’s held the title of best friend since kindergarten—calls me the keeper of her childhood memories. I recall which toys we played with, the first time I visited her house,1 and which beverages our classmates drank during snack time in kindergarten.2 She wouldn’t be surprised to learn that the first workshop I’ve co-organized centered on memory.

Memory—and the loss of memory—stars in thermodynamics. As an example, take what my husband will probably do this evening: bake tomorrow’s breakfast. I don’t know whether he’ll bake fruit-and-oat cookies, banana muffins, pear muffins, or pumpkin muffins. Whichever he chooses, his baking will create a scent. That scent will waft across the apartment, seep into air vents, and escape into the corridor—will disperse into the environment. By tomorrow evening, nobody will be able to tell by sniffing what my husband will have baked. 

That is, the kitchen’s environment lacks a memory. This lack contributes to our experience of time’s arrow: We sense that time passes partially by smelling less and less of breakfast. Physicists call memoryless systems and processes Markovian.

Our kitchen’s environment is Markovian because it’s large and particles churn through it randomly. But not all environments share these characteristics. Metaphorically speaking, a dispersed memory of breakfast may recollect, return to a kitchen, and influence the following week’s baking. For instance, imagine an atom in a quantum computer, rather than a kitchen in an apartment. A few other atoms may form our atom’s environment. Quantum information may leak from our atom into that environment, swish around in the environment for a time, and then return to haunt our atom. We’d call the atom’s evolution and environment non-Markovian.

I had the good fortune to co-organize a workshop about non-Markovianity—about memory—this February. The workshop took place at the Banff International Research Station, abbreviated BIRS, which you pronounce like the plural of what you say when shivering outdoors in Canada. BIRS operates in the Banff Centre for Arts and Creativity, high in the Rocky Mountains. The Banff Centre could accompany a dictionary entry for pristine, to my mind. The air feels crisp, the trees on nearby peaks stand out against the snow like evergreen fringes on white velvet, and the buildings balance a rustic-mountain-lodge style with the avant-garde. 

The workshop balanced styles, too, but skewed toward the theoretical and abstract. We learned about why the world behaves classically in our everyday experiences; about information-theoretic measures of the distances between quantum states; and about how to simulate, on quantum computers, chemical systems that interact with environments. One talk, though, brought our theory back down to (the snow-dusted) Earth.

Gabriela Schlau-Cohen runs a chemistry lab at MIT. She wants to understand how plants transport energy. Energy arrives at a plant from the sun in the form of light. The light hits a pigment-and-protein complex. If the plant is lucky, the light transforms into a particle-like packet of energy called an exciton. The exciton traverses the receptor complex, then other complexes. Eventually, the exciton finds a spot where it can enable processes such as leaf growth. 

A high fraction of the impinging photons—85%—transform into excitons. How do plants convert and transport energy as efficiently as they do?

Gabriela’s group aims to find out—not by testing natural light-harvesting complexes, but by building complexes themselves. The experimentalists mimic the complex’s protein using DNA. You can fold DNA into almost any shape you want, by choosing the DNA’s base pairs (basic units) adroitly and by using “staples” formed from more DNA scraps. The sculpted molecules are called DNA origami.

Gabriela’s group engineers different DNA structures, analogous to complexes’ proteins, to have different properties. For instance, the experimentalists engineer rigid structures and flexible structures. Then, the group assesses how energy moves through each structure. Each structure forms an environment that influences excitons’ behaviors, similarly to how a memory-containing environment influences an atom.

Courtesy of Gabriela Schlau-Cohen

The Banff environment influenced me, stirring up memories like powder displaced by a skier on the slopes above us. I first participated in a BIRS workshop as a PhD student, and then I returned as a postdoc. Now, I was co-organizing a workshop to which I brought a PhD student of my own. Time flows, as we’re reminded while walking down the mountain from the Banff Centre into town: A cemetery borders part of the path. Time flows, but we belong to that thermodynamically remarkable class of systems that retain memories…memories and a few other treasures that resist change, such as friendships held since kindergarten.

1Plushy versions of Simba and Nala from The Lion King. I remain grateful to her for letting me play at being Nala.

2I’d request milk, another kid would request apple juice, and everyone else would request orange juice.

A (quantum) complex legacy: Part deux

I didn’t fancy the research suggestion emailed by my PhD advisor.

A 2016 email from John Preskill led to my publishing a paper about quantum complexity in 2022, as I explained in last month’s blog post. But I didn’t explain what I thought of his email upon receiving it.

It didn’t float my boat. (Hence my not publishing on it until 2022.)

The suggestion contained ingredients that ordinarily would have caulked any cruise ship of mine: thermodynamics, black-hole-inspired quantum information, and the concept of resources. John had forwarded a paper drafted by Stanford physicists Adam Brown and Lenny Susskind. They act as grand dukes of the community sussing out what happens to information swallowed by black holes. 

From Rare-Gallery

We’re not sure how black holes work. However, physicists often model a black hole with a clump of particles squeezed close together and so forced to interact with each other strongly. The interactions entangle the particles. The clump’s quantum state—let’s call it | \psi(t) \rangle—grows not only complicated with time (t), but also complex in a technical sense: Imagine taking a fresh clump of particles and preparing it in the state | \psi(t) \rangle via a sequence of basic operations, such as quantum gates performable with a quantum computer. The number of basic operations needed is called the complexity of | \psi(t) \rangle. A black hole’s state has a complexity believed to grow in time—and grow and grow and grow—until plateauing. 

This growth echoes the second law of thermodynamics, which helps us understand why time flows in only one direction. According to the second law, every closed, isolated system’s entropy grows until plateauing.1 Adam and Lenny drew parallels between the second law and complexity’s growth.

The less complex a quantum state is, the better it can serve as a resource in quantum computations. Recall, as we did last month, performing calculations in math class. You needed clean scratch paper on which to write the calculations. So does a quantum computer. “Scratch paper,” to a quantum computer, consists of qubits—basic units of quantum information, realized in, for example, atoms or ions. The scratch paper is “clean” if the qubits are in a simple, unentangled quantum state—a low-complexity state. A state’s greatest possible complexity, minus the actual complexity, we can call the state’s uncomplexity. Uncomplexity—a quantum state’s blankness—serves as a resource in quantum computation.
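
In symbols, if \mathcal{C}(|\psi\rangle) denotes a state's complexity and \mathcal{C}_{\max} denotes the greatest complexity available to states of the same number of qubits, then the uncomplexity is \mathcal{C}_{\max} - \mathcal{C}(|\psi\rangle): a measure of how much blank scratch paper the state still holds.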

Manny Knill and Ray Laflamme realized this point in 1998, while quantifying the “power of one clean qubit.” Lenny arrived at a similar conclusion while reasoning about black holes and firewalls. For an introduction to firewalls, see this blog post by John. Suppose that someone—let’s call her Audrey—falls into a black hole. If it contains a firewall, she’ll burn up. But suppose that someone tosses a qubit into the black hole before Audrey falls. The qubit kicks the firewall farther away from the event horizon, so Audrey will remain safe for longer. Also, the qubit increases the uncomplexity of the black hole’s quantum state. Uncomplexity serves as a resource also to Audrey.

A resource is something that’s scarce, valuable, and useful for accomplishing tasks. Different things qualify as resources in different settings. For instance, imagine wanting to communicate quantum information to a friend securely. Entanglement will serve as a resource. How can we quantify and manipulate entanglement? How much entanglement do we need to perform a given communicational or computational task? Quantum scientists answer such questions with a resource theory, a simple information-theoretic model. Theorists have defined resource theories for entanglement, randomness, and more. In many a blog post, I’ve eulogized resource theories for thermodynamic settings. Can anyone define, Adam and Lenny asked, a resource theory for quantum uncomplexity?

Resource thinking pervades our world.

By late 2016, I was a quantum thermodynamicist, I was a resource theorist, and I’d just debuted my first black-hole–inspired quantum information theory. Moreover, I’d coauthored a review about the already-extant resource theory that looked closest to what Adam and Lenny sought. Hence John’s email, I expect. Yet that debut had uncovered reams of questions—questions that, as a budding physicist heady with the discovery of discovery, I could own. Why would I answer a question of someone else’s instead?

So I thanked John, read the paper draft, and pondered it for a few days. Then, I built a research program around my questions and waited for someone else to answer Adam and Lenny.

Three and a half years later, I was still waiting. The notion of uncomplexity as a resource had enchanted the black-hole-information community, so I was preparing a resource-theory talk for a quantum-complexity workshop. The preparations set wheels churning in my mind, and inspiration struck during a long walk.2

After watching my workshop talk, Philippe Faist reached out about collaborating. Philippe is a coauthor, a friend, and a fellow quantum thermodynamicist and resource theorist. Caltech’s influence had sucked him, too, into the black-hole community. We Zoomed throughout the pandemic’s first spring, widening our circle to include Teja Kothakonda, Jonas Haferkamp, and Jens Eisert of Freie Universität Berlin. Then, Anthony Munson joined from my nascent group in Maryland. Physical Review A published our paper, “Resource theory of quantum uncomplexity,” in January.

The next four paragraphs, I’ve geared toward experts. An agent in the resource theory manipulates a set of n qubits. The agent can attempt to perform any gate U on any two qubits. Noise corrupts every real-world gate implementation, though. Hence the agent effects a gate chosen randomly from near U. Such fuzzy gates are free. The agent can’t append or discard any system for free: Appending even a maximally mixed qubit increases the state’s uncomplexity, as Knill and Laflamme showed. 
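
To make "fuzzy gate" concrete, here is a toy way to sample one in Python: perturb the target gate U by a small random Hermitian generator. The Gaussian noise model, the scale epsilon, and the function name fuzzy_gate are illustrative assumptions, not the ensemble defined in our paper.

```python
import numpy as np
from scipy.linalg import expm

def fuzzy_gate(U, epsilon=0.05, rng=None):
    """Return a unitary sampled from near U: U times a small random unitary.
    The noise model (Gaussian Hermitian generator, scale epsilon) is illustrative."""
    if rng is None:
        rng = np.random.default_rng()
    d = U.shape[0]
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    H = (A + A.conj().T) / 2              # random Hermitian generator
    return U @ expm(1j * epsilon * H)     # still unitary, but not exactly U

# Example: a fuzzy CNOT
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
V = fuzzy_gate(CNOT)
print(np.allclose(V @ V.conj().T, np.eye(4)))   # True: V is unitary
print(np.linalg.norm(V - CNOT))                 # small but nonzero deviation from CNOT
```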

Fuzzy gates’ randomness prevents the agent from mapping complex states to uncomplex states for free (with any considerable probability). Complexity only grows or remains constant under fuzzy operations, under appropriate conditions. This growth echoes the second law of thermodynamics. 

We also defined operational tasks—uncomplexity extraction and expenditure analogous to work extraction and expenditure. Then, we bounded the efficiencies with which the agent can perform these tasks. The efficiencies depend on a complexity entropy that we defined—and that’ll star in part trois of this blog-post series.

Now, I want to know what purposes the resource theory of uncomplexity can serve. Can we recast black-hole problems in terms of the resource theory, then leverage resource-theory results to solve those problems? What about problems in condensed matter? Can our resource theory, which quantifies the difficulty of preparing quantum states, merge with the resource theory of magic, which quantifies that difficulty differently?

Unofficial mascot for fuzzy operations

I don’t regret having declined my PhD advisor’s recommendation six years ago. Doing so led me to explore probability theory and measurement theory, collaborate with two experimental labs, and write ten papers with 21 coauthors whom I esteem. But I take my hat off to Adam and Lenny for their question. And I remain grateful to the advisor who kept my goals and interests in mind while checking his email. I hope to serve Anthony and his fellow advisees as well.

1At least, in the thermodynamic limit—if the system is infinitely large. If the system is finite-size, its entropy grows on average.

2…en route to obtaining a marriage license. My husband and I married four months after the pandemic throttled government activities. Hours before the relevant office’s calendar filled up, I scored an appointment to obtain our license. Regarding the metro as off-limits, my then-fiancé and I walked from Cambridge, Massachusetts to downtown Boston for our appointment. I thank him for enduring my requests to stop so that I could write notes.

A (quantum) complex legacy

Early in the fourth year of my PhD, I received a most John-ish email from John Preskill, my PhD advisor. The title read, “thermodynamics of complexity,” and the message was concise the way that the Amazon River is damp: “Might be an interesting subject for you.” 

Below the signature, I found a paper draft by Stanford physicists Adam Brown and Lenny Susskind. Adam is a Brit with an accent and a wit to match his Oxford degree. Lenny, known to the public for his books and lectures, is a New Yorker with an accent that reminds me of my grandfather. Before the physicists posted their paper online, Lenny sought feedback from John, who forwarded me the email.

The paper concerned a confluence of ideas that you’ve probably encountered in the media: string theory, black holes, and quantum information. String theory offers hope for unifying two physical theories: relativity, which describes large systems such as our universe, and quantum theory, which describes small systems such as atoms. A certain type of gravitational system and a certain type of quantum system participate in a duality, or equivalence, known since the 1990s. Our universe isn’t such a gravitational system, but never mind; the duality may still offer a toehold on a theory of quantum gravity. Properties of the gravitational system parallel properties of the quantum system and vice versa. Or so it seemed.

The gravitational system can have two black holes linked by a wormhole. The wormhole’s volume can grow linearly in time for a time exponentially long in the black holes’ entropy. Afterward, the volume hits a ceiling and approximately ceases changing. Which property of the quantum system does the wormhole’s volume parallel?

Envision the quantum system as many particles wedged close together, so that they interact with each other strongly. Initially uncorrelated particles will entangle with each other quickly. A quantum system has properties, such as average particle density, that experimentalists can measure relatively easily. Does such a measurable property—an observable of a small patch of the system—parallel the wormhole volume? No; such observables cease changing much sooner than the wormhole volume does. The same conclusion applies to the entanglement amongst the particles.

What about a more sophisticated property of the particles’ quantum state? Researchers proposed that the state’s complexity parallels the wormhole’s volume. To grasp complexity, imagine a quantum computer performing a computation. When performing computations in math class, you needed blank scratch paper on which to write your calculations. A quantum computer needs the quantum equivalent of blank scratch paper: qubits (basic units of quantum information, realized, for example, as atoms) in a simple, unentangled, “clean” state. The computer performs a sequence of basic operations—quantum logic gates—on the qubits. These operations resemble addition and subtraction but can entangle the qubits. What’s the minimal number of basic operations needed to prepare a desired quantum state (or to “uncompute” a given state to the blank state)? The state’s quantum complexity.1 
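
In symbols (restating the definition just given), write the blank scratch paper of n qubits as |0\rangle^{\otimes n} and the basic gates as U_1, U_2, \ldots. The quantum complexity of a state |\psi\rangle is the least number k of gates such that |\psi\rangle = U_k \cdots U_2 U_1 |0\rangle^{\otimes n}.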

Quantum complexity has loomed large over multiple fields of physics recently: quantum computing, condensed matter, and quantum gravity. The latter, we established, entails a duality between a gravitational system and a quantum system. The quantum system begins in a simple quantum state that grows complicated as the particles interact. The state’s complexity parallels the volume of a wormhole in the gravitational system, according to a conjecture.2 

The conjecture would hold more water if the quantum state’s complexity grew similarly to the wormhole’s volume: linearly in time, for a time exponentially large in the quantum system’s size. Does the complexity grow so? The expectation that it does became the linear-growth conjecture.

Evidence supported the conjecture. For instance, quantum information theorists modeled the quantum particles as interacting randomly, as though undergoing a quantum circuit filled with random quantum gates. Leveraging probability theory,3 the researchers proved that the state’s complexity grows linearly at short times. Also, the complexity grows linearly for long times if each particle can store a great deal of quantum information. But what if the particles are qubits, the smallest and most ubiquitous units of quantum information? The question lingered for years.

Jonas Haferkamp, a PhD student in Berlin, dreamed up an answer to an important version of the question.4 I had the good fortune to help formalize that answer with him and members of his research group: master’s student Teja Kothakonda, postdoc Philippe Faist, and supervisor Jens Eisert. Our paper, published in Nature Physics last year, marked step one in a research adventure catalyzed by John Preskill’s email 4.5 years earlier.

Imagine, again, qubits undergoing a circuit filled with random quantum gates. That circuit has some architecture, or arrangement of gates. Slotting different gates into the architecture effects different transformations5 on the qubits. Consider the set of all transformations implementable with one architecture. This set has some size, which we defined and analyzed.
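
For readers who like to tinker, here is a toy numerical sketch of that picture: a "brickwork" architecture whose slots are filled with Haar-random two-qubit gates acting on a statevector. The brickwork layout, the depth, and the helper-function names are illustrative choices, not the architectures analyzed in our paper.

```python
import numpy as np

def haar_unitary(d, rng):
    """Sample a Haar-random d x d unitary via QR of a complex Gaussian matrix."""
    Z = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
    Q, R = np.linalg.qr(Z)
    phases = np.diag(R) / np.abs(np.diag(R))
    return Q * phases                       # fix the phases so the distribution is Haar

def apply_two_qubit_gate(state, gate, i, n):
    """Apply a 4 x 4 gate to the adjacent qubits (i, i+1) of an n-qubit statevector."""
    psi = np.moveaxis(state.reshape([2] * n), [i, i + 1], [0, 1]).reshape(4, -1)
    psi = gate @ psi
    psi = np.moveaxis(psi.reshape([2, 2] + [2] * (n - 2)), [0, 1], [i, i + 1])
    return psi.reshape(-1)

# One "brickwork" architecture: alternating layers of gates on even and odd bonds.
n, depth = 8, 6
rng = np.random.default_rng(0)
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                              # the blank "scratch paper" |00...0>
for layer in range(depth):
    for i in range(layer % 2, n - 1, 2):
        state = apply_two_qubit_gate(state, haar_unitary(4, rng), i, n)
```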

What happens to the set’s size if you add more gates to the circuit—let the particles interact for longer? We can bound the size’s growth using the mathematical toolkits of algebraic geometry and differential topology. Upon bounding the size’s growth, we can bound the state’s complexity. The complexity, we concluded, grows linearly in time for a time exponentially long in the number of qubits.

Our result lends weight to the complexity-equals-volume hypothesis. The result also introduces algebraic geometry and differential topology into complexity as helpful mathematical toolkits. Finally, the set size that we bounded emerged as a useful concept that may elucidate circuit analyses and machine learning.

John didn’t have machine learning in mind when forwarding me an email in 2017. He didn’t even have in mind proving the linear-growth conjecture. The proof enables step two of the research adventure catalyzed by that email: thermodynamics of quantum complexity, as the email’s title stated. I’ll cover that thermodynamics in its own blog post. The simplest of messages can spin a complex legacy.

The links provided above scarcely scratch the surface of the quantum-complexity literature; for a more complete list, see our paper’s bibliography. For a seminar about the linear-growth paper, see this video hosted by Nima Lashkari’s research group.

1The term complexity has multiple meanings; forget the rest for the purposes of this article.

2According to another conjecture, the quantum state’s complexity parallels a certain space-time region’s action. (An action, in physics, isn’t a motion or a deed or something that Hamlet keeps avoiding. An action is a mathematical object that determines how a system can and can’t change in time.) The first two conjectures snowballed into a paper entitled “Does complexity equal anything?” Whatever it parallels, complexity plays an important role in the gravitational–quantum duality. 

3Experts: Such as unitary t-designs.

4Experts: Our work concerns quantum circuits, rather than evolutions under fixed Hamiltonians. Also, our work concerns exact circuit complexity, the minimal number of gates needed to prepare a state exactly. A natural but tricky extension eluded us: approximate circuit complexity, the minimal number of gates needed to approximate the state.

5Experts: Unitary operators.

Rocks that roll

In Terry Pratchett’s fantasy novel Soul Music, rock ’n roll arrives in Ankh-Morpork. Ankh-Morpork resembles the London of yesteryear—teeming with heroes and cutthroats, palaces and squalor—but also houses vampires, golems, wizards, and a sentient suitcase. Against this backdrop, a young harpist stumbles upon a mysterious guitar. He forms a band with a dwarf and with a troll who plays tuned rocks, after which the trio calls its style “Music with Rocks In.” The rest of the story consists of satire, drums, and rocks that roll. 

The topic of rolling rocks sounds like it should elicit more yawns than an Elvis concert elicited screams. But rocks’ rolling helped recent University of Maryland physics PhD student Zackery Benson win a National Research Council Fellowship. He and his advisor, Wolfgang Losert, converted me into a fan of granular flow.

What I’ve been studying recently. Kind of.

Grains make up materials throughout the galaxy, such as the substance of avalanches. Many granular materials undergo repeated forcing by their environments. For instance, the grains that form an asteroid suffer bombardment from particles flying through outer space. The gravel beneath train tracks is compressed whenever a train passes. 

Often, a pattern characterizes the forces in a granular system’s environment. For instance, trains in a particular weight class may traverse some patch of gravel, and the trains may arrive with a particular frequency. Some granular systems come to encode information about those patterns in their microscopic configurations and large-scale properties. So granular flow—little rocks that roll—can impact materials science, engineering, geophysics, and thermodynamics.

Granular flow sounds so elementary, you might expect us to have known everything about it since long before the Beatles’ time. But we didn’t even know until recently how to measure rolling in granular flows. 

Envision a grain as a tiny sphere, like a globe of the Earth. Scientists focused mostly on how far grains are translated through space in a flow, analogously to how far a globe travels across a desktop if flicked. Recently, scientists measured how far a grain rotates about one axis, like a globe fixed in a frame. Sans frame, though, a globe can spin about more than one axis—about three independent axes. Zack performed the first measurement of all the rotations and translations of all the particles in a granular flow.

Each grain was an acrylic bead about as wide as my pinky nail. Two holes were drilled into each bead, forming an X, for reasons I’ll explain. 

Image credit: Benson et al., Phys. Rev. Lett. 129, 048001 (2022).

Zack dumped over 10,000 beads into a rectangular container. Then, he poured in a fluid that filled the spaces between the grains. Placing a weight atop the grains, he exerted a constant pressure on them. Zack would push one of the container’s walls inward, compressing the grains similarly to how a train compresses gravel. Then, he’d decompress the beads. He repeated this compression cycle many times.

Image credit: Benson et al., Phys. Rev. E 103, 062906 (2021).

Each cycle consisted of many steps: Zack would compress the beads a tiny amount, pause, snap pictures, and then compress a tiny amount more. During each pause, the camera activated a fluorescent dye in the fluid, which looked clear in the photographs. Lacking the fluorescent dye, the beads showed up as dark patches. Clear X’s cut through the dark patches, as dye filled the cavities drilled into the beads. From the X’s, Zack inferred every grain’s orientation. He inferred how every grain rotated by comparing the orientation in one snapshot with the orientation in the next snapshot. 

Image credit: Benson et al., Phys. Rev. Lett. 129, 048001 (2022).
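
As a toy illustration of that last inference step (with made-up numbers and hypothetical variable names, not the lab's actual analysis code), one can estimate a bead's rotation between snapshots by finding the rotation that best aligns the X's two arms across consecutive frames:

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Hypothetical input: unit vectors along a bead's two drilled axes (the "X"),
# extracted from one snapshot and from the next.
axes_before = np.array([[1.0, 0.0, 0.0],
                        [0.0, 1.0, 0.0]])
axes_after = np.array([[0.936, 0.350, 0.035],
                       [-0.348, 0.937, -0.031]])

# Best-fit rotation taking the earlier orientation to the later one.
rot, rssd = Rotation.align_vectors(axes_after, axes_before)
angle_deg = np.degrees(rot.magnitude())    # net rotation angle between snapshots
axis = rot.as_rotvec() / rot.magnitude()   # rotation axis (unit vector)
print(f"rotated {angle_deg:.1f} degrees about axis {np.round(axis, 2)}")
```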

Wolfgang’s lab had been trying for fifteen years to measure all the motions in a granular flow. The feat required experimental and computational skill. I appreciated the chance to play a minor role in analyzing the data. Physical Review Letters published our paper last month.

From Zack’s measurements, we learned about the unique roles played by rotations in granular flow. For instance, rotations dominate the motion in a granular system’s bulk, far from the container’s walls. Importantly, the bulk dissipates the most energy. Also, whereas translations are reversible—however far grains shift while compressed, they tend to shift oppositely while decompressed—rotations are not. Such irreversibility can contribute to materials’ aging.

In Soul Music, the spirit of rock ’n roll—conceived of as a force in its own right—offers the guitarist the opportunity to never age. He can live fast, die young, and enjoy immortality as a legend, for his guitar comes from a dusty little shop not entirely of Ankh-Morpork’s world. Such shops deal in fate and fortune, the author maintains. He also takes a dig at the River Ankh, which flows through the city of Ankh-Morpork. The Ankh’s waters hold so much garbage, excrement, midnight victims, and other muck that they scarcely count as waters:

And there was even an Ankh-Morpork legend, wasn’t there, about some old drum [ . . . ] that was supposed to bang itself if an enemy fleet was seen sailing up the Ankh? The legend had died out in recent centuries, partly because this was the Age of Reason and also because no enemy fleet could sail up the Ankh without a gang of men with shovels going in front.

Such a drum would qualify as magic easily, but don’t underestimate the sludge. As a granular-flow system, it’s more incredible than you might expect.

Quantum connections

We were seated in the open-air back of a boat, motoring around the Stockholm archipelago. The Swedish colors fluttered above our heads; the occasional speedboat zipped past, rocking us in its wake; and wildflowers dotted the bank on either side. Suddenly, a wood-trimmed boat glided by, and the captain waved from his perch.

The gesture surprised me. If I were in a vehicle of the sort most familiar to me—a car—I wouldn’t wave to other drivers. In a tram, I wouldn’t wave to passengers on a parallel track. Granted, trams and cars are closed, whereas boats can be open-air. But even as a pedestrian in a downtown crossing, I wouldn’t wave to everyone I passed. Yet, as boat after boat pulled alongside us, we received salutation after salutation.

The outing marked the midpoint of the Quantum Connections summer school. Physicists Frank Wilczek, Antti Niemi, and colleagues coordinate the school, which draws students and lecturers from across the globe. Although sponsored by Stockholm University, the school takes place at a century-old villa whose name I wish I could pronounce: Högberga Gård. The villa nestles atop a cliff on an island in the archipelago. We ventured off the island after a week of lectures.

Charlie Marcus lectured about materials formed from superconductors and semiconductors; John Martinis, about superconducting qubits; Jianwei Pan, about quantum advantages; and others, about symmetries, particle statistics, and more. Feeling like an ant among giants, I lectured about quantum thermodynamics. Two other lectures linked quantum physics with gravity—and in a way you might not expect. I appreciated the opportunity to reconnect with the lecturer: Igor Pikovski.

Cruising around Stockholm

Igor doesn’t know it, but he’s one of the reasons why I joined the Harvard-Smithsonian Institute for Theoretical Atomic, Molecular, and Optical Physics (ITAMP) as an ITAMP Postdoctoral Fellow in 2018. He’d held the fellowship beginning a few years before, and he’d earned a reputation for kindness and consideration. Also, his research struck me as some of the most fulfilling that one could undertake.

If you’ve heard about the intersection of quantum physics and gravity, you’ve probably heard of approaches other than Igor’s. For instance, physicists are trying to construct a theory of quantum gravity, which would describe black holes and the universe’s origin. Such a “theory of everything” would reduce to Einstein’s general theory of relativity when applied to planets and would reduce to quantum theory when applied to atoms. In another example, physicists leverage quantum technologies to observe properties of gravity. Such technologies enabled the observatory LIGO to register gravitational waves—ripples in space-time. 

Igor and his colleagues pursue a different goal: to observe phenomena whose explanations depend on quantum theory and on gravity.

In his lectures, Igor illustrated with an experiment first performed in 1975. The experiment relies on what happens if you jump: You gain energy associated with resisting the Earth’s gravitational pull—gravitational potential energy. A quantum object’s energy determines how the object’s quantum state changes in time. The experimentalists applied this fact to a beam of neutrons. 

They put the beam in a superposition of two locations: closer to the Earth’s surface and farther away. The closer component changed in time in one way, and the farther component changed another way. After a while, the scientists recombined the components. The two interfered with each other similarly to the waves created by two raindrops falling near each other on a puddle. The interference evidenced gravity’s effect on the neutrons’ quantum state.
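
For a rough sense of scale (a back-of-the-envelope estimate, not the experiment's actual geometry), the phase difference between the two components is roughly the gravitational-potential-energy difference times the traversal time, divided by the reduced Planck constant. The height separation and traversal time below are illustrative guesses:

```python
hbar = 1.054571817e-34   # reduced Planck constant, J s
m_n = 1.67492749e-27     # neutron mass, kg
g = 9.81                 # gravitational acceleration, m / s^2

delta_h = 0.02           # height separation between the two paths (illustrative), m
t = 3e-5                 # time spent at different heights (illustrative), s

delta_phase = m_n * g * delta_h * t / hbar   # phase difference, radians
print(f"{delta_phase:.0f} radians")          # of order 100 radians for these numbers
```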

Summer-school venue. I’d easily say it’s gorgeous but not easily pronounce its name.

The experimentalists approximated gravity as dominated by the Earth alone. But other masses can influence the gravitational field noticeably. What if you put a mass in a superposition of different locations? What would happen to space-time?

Or imagine two quantum particles too far apart to interact with each other significantly. Could a gravitational field entangle the particles by carrying quantum correlations from one to the other?

Physicists including Igor ponder these questions…and then ponder how experimentalists could test their predictions. The more an object influences gravity, the more massive the object tends to be, and the more easily the object tends to decohere—to spill the quantum information that it holds into its surroundings.

The “gravity-quantum interface,” as Igor entitled his lectures, epitomizes what I hoped to study in college, as a high-school student entranced by physics, math, and philosophy. What’s more curious and puzzling than superpositions, entanglement, and space-time? What’s more fundamental than quantum theory and gravity? Little wonder that connecting them inspires wonder.

But we humans are suckers for connections. I appreciated the opportunity to reconnect with a colleague during the summer school. Boaters on the Stockholm archipelago waved to our cohort as they passed. And who knows—gravitational influences may even have rippled between the boats, entangling us a little.

Requisite physicist-visiting-Stockholm photo

With thanks to the summer-school organizers, including Pouya Peighami and Elizabeth Yang, for their invitation and hospitality.

One equation to rule them all?

In lieu of composing a blog post this month, I’m publishing an article in Quanta Magazine. The article provides an introduction to fluctuation relations, souped-up variations on the second law of thermodynamics, which helps us understand why time flows in only one direction. The earliest fluctuation relations described classical systems, such as single strands of DNA. Many quantum versions have been proved since. Their proliferation contrasts with the stereotype of physicists as obsessed with unification—with slimming down a cadre of equations into one über-equation. Will one quantum fluctuation relation emerge to rule them all? Maybe, and maybe not. Maybe the multiplicity of quantum fluctuation relations reflects the richness of quantum thermodynamics.
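
For context, one classical fluctuation relation that such introductions often lead with is Jarzynski's equality: if a system at inverse temperature \beta is driven between two configurations whose equilibrium free energies differ by \Delta F, the work W spent on the system obeys \langle e^{-\beta W} \rangle = e^{-\beta \Delta F}, the average running over many repetitions of the protocol. Averaging the exponential of the work, rather than the work itself, is what earns the relation its "souped-up" status: the familiar second-law statement \langle W \rangle \geq \Delta F follows from it.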

You can read more in Quanta Magazine here and yet more in chapter 9 of my book. For recent advances in fluctuation relations, as opposed to the broad introduction there, check out earlier Quantum Frontiers posts here, here, here, here, and here.

The power of being able to say “I can explain that”

Caltech condensed-matter theorist Gil Refael explained his scientific raison d’être early in my grad-school career: “What really gets me going is seeing a plot [of experimental data] and being able to say, ‘I can explain that.’” The quote has stuck with me almost word for word. When I heard it, I was working deep in abstract quantum information theory and thermodynamics, proving theorems about thought experiments. Embedding myself in pure ideas has always held an aura of romance for me, so I nodded along without seconding Gil’s view.

Roughly nine years later, I concede his point.

The revelation walloped me last month, as I was polishing a paper with experimental collaborators. Members of the Institute for Quantum Optics and Quantum Information (IQOQI) in Innsbruck, Austria—Florian Kranzl, Manoj Joshi, and Christian Roos—had performed an experiment in trapped-ion guru Rainer Blatt’s lab. Their work realized an experimental proposal that I’d designed with fellow theorists near the beginning of my postdoc stint. We aimed to observe signatures of particularly quantum thermalization.

Throughout the universe, small systems exchange stuff with their environments. For instance, the Earth exchanges heat and light with the rest of the solar system. After exchanging stuff for long enough, the small system equilibrates with the environment: Large-scale properties of the small system (such as its volume and energy) remain fairly constant; and as much stuff enters the small system as leaves, on average. The Earth remains far from equilibrium, which is why we aren’t dead yet.

Far from equilibrium and proud of it

In many cases, in equilibrium, the small system shares properties of the environment, such as the environment’s temperature. In these cases, we say that the small system has thermalized and, if it’s quantum, has reached a thermal state.

The stuff exchanged can consist of energy, particles, electric charge, and more. Unlike classical planets, quantum systems can exchange things that participate in quantum uncertainty relations (experts: that fail to commute). Quantum uncertainty mucks up derivations of the thermal state’s mathematical form. Some of us quantum thermodynamicists discovered the mucking up—and identified exchanges of quantum-uncertain things as particularly nonclassical thermodynamics—only a few years ago. We reworked conventional thermodynamic arguments to accommodate this quantum uncertainty. The small system, we concluded, likely equilibrates to near a thermal state whose mathematical form depends on the quantum-uncertain stuff—what we termed a non-Abelian thermal state. I wanted to see this equilibration in the lab. So I proposed an experiment with theory collaborators; and Manoj, Florian, and Christian took a risk on us.

The experimentalists arrayed between six and fifteen ions in a line. Two ions formed the small system, and the rest formed the quantum environment. The ions exchanged the x-, y-, and z-components of their spin angular momentum—stuff that participates in quantum uncertainty relations. The ions began with a fairly well-defined amount of each spin component, as described in another blog post. The ions exchanged stuff for a while, and then the experimentalists measured the small system’s quantum state.

The small system equilibrated to near the non-Abelian thermal state, we found. No conventional thermal state modeled the results as accurately. Score!

My postdoc and numerical-simulation wizard Aleks Lasek modeled the experiment on his computer. The small system, he found, remained farther from the non-Abelian thermal state in his simulation than in the experiment. Aleks plotted the small system’s distance to the non-Abelian thermal state against the ion chain’s length. The points produced experimentally sat lower down than the points produced numerically. Why?

I think I can explain that, I said. The two ions exchange stuff with the rest of the ions, which serve as a quantum environment. But the two ions exchange stuff also with the wider world, such as stray electromagnetic fields. The latter exchanges may push the small system farther toward equilibrium than the extra ions alone do.

Fortunately for the development of my explanatory skills, collaborators prodded me to hone my argument. The wider world, they pointed out, effectively has a very high temperature—an infinite temperature.1 Equilibrating with that environment, the two ions would acquire an infinite temperature themselves. The two ions would approach an infinite-temperature thermal state, which differs from the non-Abelian thermal state we aimed to observe.

Fair, I said. But the extra ions probably have a fairly high temperature themselves. So the non-Abelian thermal state is probably close to the infinite-temperature thermal state. Analogously, if someone cooks goulash similarly to his father, and the father cooks goulash similarly to his grandfather, then the youngest chef cooks goulash similarly to his grandfather. If the wider world pushes the two ions to equilibrate to infinite temperature, then, because the infinite-temperature state lies near the non-Abelian thermal state, the wider world pushes the two ions to equilibrate to near the non-Abelian thermal state.
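
The argument leans on a limit one can check directly: as the inverse temperature \beta approaches zero (as the temperature grows without bound), the exponent in e^{-\beta (H - \sum_a \mu_a Q_a)} / Z vanishes, so the non-Abelian thermal state approaches \mathbb{1}/d, the maximally mixed state on the small system's d-dimensional Hilbert space. A high-temperature non-Abelian thermal state therefore lies close to the infinite-temperature state.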

Tasty, tasty thermodynamics

I plugged numbers into a few equations to check that the extra ions do have a high temperature. (Perhaps I should have done so before proposing the argument above, but my collaborators were kind enough not to call me out.) 

Aleks hammered the nail into the problem’s coffin by incorporating into his simulations the two ions’ interaction with an infinite-temperature wider world. His numerical data points dropped to near the experimental data points. The new plot supported my story.

I can explain that! Aleks’s results buoyed me the whole next day; I found myself smiling at random times throughout the afternoon. Not that I’d explained a grand mystery, like the unexpected hiss heard by Arno Penzias and Robert Wilson when they turned on a powerful antenna in 1964. The hiss turned out to come from the cosmic microwave background (CMB), a collection of photons that fill the visible universe. The CMB provided evidence for the then-controversial Big Bang theory of the universe’s origin. Discovering the CMB earned Penzias and Wilson a Nobel Prize. If the noise caused by the CMB was music to cosmologists’ ears, the noise in our experiment is the quiet wailing of a shy banshee. But it’s our experiment’s noise, and we understand it now.

The experience hasn’t weaned me off the romance of proving theorems about thought experiments. Theorems about thermodynamic quantum uncertainty inspired the experiment that yielded the plot that confused us. But I now second Gil’s sentiment. In the throes of an experiment, “I can explain that” can feel like a battle cry.

1Experts: The wider world effectively has an infinite temperature because (i) the dominant decoherence is dephasing relative to the \sigma_z product eigenbasis and (ii) the experimentalists rotate their qubits often, to simulate a rotationally invariant Hamiltonian evolution. So the qubits effectively undergo dephasing relative to the \sigma_x, \sigma_y, and \sigma_z eigenbases.

Building a Koi pond with Lie algebras

When I was growing up, one of my favourite places was the shabby all-you-can-eat buffet near our house. We’d walk in, my mom would approach the hostess to explain that, despite my being abnormally large for my age, I qualified for kids-eat-free, and I would peel away to stare at the Koi pond. The display of different fish rolling over one another was bewitching. Ten-year-old me would have been giddy to build my own Koi pond, and now I finally have. However, I built one using Lie algebras.

The different fish swimming in the Koi pond are, in many ways, like charges being exchanged between subsystems. A “charge” is any globally conserved quantity. Examples of charges include energy, particles, electric charge, and angular momentum. Consider a system consisting of a cup of coffee in your office. The coffee will dynamically exchange charges with your office in the form of heat energy. Still, the total energy of the coffee and office is conserved (assuming your office walls are really well insulated). In this example, we had one type of charge (heat energy) and two subsystems (coffee and office). Consider now a closed system consisting of many subsystems and many different types of charges. The closed system is like the finite Koi pond, with the different charges like the different fish species. The charges can move around locally, but the total number of charges is globally fixed, like how the fish swim around but can’t escape the pond. Also, the presence of one type of charge can alter another’s movement, just as a big fish might block a little one’s path.

Unfortunately, the Koi pond analogy reaches its limit when we move to quantum charges. Classically, charges commute. This means that we can simultaneously determine the amount of each charge in our system at each given moment. In quantum mechanics, this isn’t necessarily true. In other words, classically, I can count the number of glossy fish and matt fish. But, in quantum mechanics, I can’t.
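
Here is that noncommutation in a few lines of numerical Python, with a qubit's spin components standing in for the charges (the factor of two comes from using Pauli matrices rather than spin operators):

```python
import numpy as np

# Pauli matrices: (up to a constant factor) the x-, y-, and z-components of a qubit's spin
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

commutator = sx @ sy - sy @ sx
print(np.allclose(commutator, 0))        # False: the charges don't commute
print(np.allclose(commutator, 2j * sz))  # True: [sx, sy] = 2i sz
```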

So why does this matter? Subsystems exchanging charges are prevalent in thermodynamics. Quantum thermodynamics extends thermodynamics to include small systems and quantum effects. Noncommutation underlies many important quantum phenomena. Hence, studying the exchange of noncommuting charges is pivotal in understanding quantum thermodynamics. Consequently, noncommuting charges have emerged as a rapidly growing subfield of quantum thermodynamics. Many interesting results have been discovered from no longer assuming that charges commute (such as these). Until recently, most of these discoveries have been theoretical. Bridging these discoveries to experimental reality requires Hamiltonians (functions that tell you how your system evolves in time) that move charges locally but conserve them globally. Last year it was unknown whether these Hamiltonians exist, what they look like generally, how to build them, and for what charges you could find them.

Nicole Yunger Halpern (NIST physicist, my co-advisor, and Quantum Frontiers blogger) and I developed a prescription for building Koi ponds for noncommuting charges. Our prescription allows you to systematically build Hamiltonians that overtly move noncommuting charges between subsystems while conserving the charges globally. These Hamiltonians are built using Lie algebras, abstract mathematical tools that can describe many physical quantities (including everything in the standard model of particle physics and the space-time metric). Our results were recently published in npj QI. We hope that our prescription will bolster the efforts to bridge the results of noncommuting charges to experimental reality.
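
As a minimal sketch of the kind of Hamiltonian the prescription targets (shown for the simplest case, two qubits exchanging the three spin components, rather than the general Lie-algebra construction in our paper), a Heisenberg-style coupling moves each spin component between the qubits while conserving every component globally:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)

# Two-qubit Heisenberg coupling: shuffles spin components between the qubits.
H = sum(np.kron(s, s) for s in (sx, sy, sz))

# Global charges: the total x-, y-, and z-components of spin.
for s in (sx, sy, sz):
    Q_total = np.kron(s, I2) + np.kron(I2, s)
    print(np.allclose(H @ Q_total - Q_total @ H, 0))   # True: each charge is conserved
```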

In the end, a little group theory was all I needed for my Koi pond. Maybe I’ll build a treehouse next with calculus or a remote control car with combinatorics.

Space-time and the city

I felt like a gum ball trying to squeeze my way out of a gum-ball machine. 

I was one of 50-ish physicists crammed into the lobby—and in the doorway, down the stairs, and onto the sidewalk—of a Manhattan hotel last December. Everyone had received a COVID vaccine, and the omicron variant hadn’t yet begun chewing up North America. Everyone had arrived on the same bus that evening, feeding on the neon-bright views of Fifth Avenue through dinnertime. Everyone wanted to check in and offload suitcases before experiencing firsthand the reason for the nickname “the city that never sleeps.” So everyone was jumbled together in what passed for a line.

We’d just passed the halfway point of the week during which I was pretending to be a string theorist. I do that whenever my research butts up against black holes, chaos, quantum gravity (the attempt to unify quantum physics with Einstein’s general theory of relativity), and alternative space-times. These topics fall under the heading “It from Qubit,” which calls for understanding puzzling physics (“It”) by analyzing how quantum systems process information (“Qubit”). The “It from Qubit” crowd convenes for one week each December, to share progress and collaborate.1 The group spends Monday through Wednesday at Princeton’s Institute for Advanced Study (IAS), dogged by photographs of Einstein, busts of Einstein, and roads named after Einstein. A bus ride later, the group spends Thursday and Friday at the Simons Foundation in New York City.

I don’t usually attend “It from Qubit” gatherings, as I’m actually a quantum information theorist and quantum thermodynamicist. Having admitted as much during the talk I presented at the IAS, I failed at pretending to be a string theorist. Happily, I adore being the most ignorant person in a roomful of experts, as the experience teaches me oodles. At lunch and dinner, I’d plunk down next to people I hadn’t spoken to and ask what they see as trending in the “It from Qubit” community. 

One buzzword (replicas), I’d first picked up on shortly before the pandemic had begun. Having lived a frenetic life, that trend seemed to be declining. Rising buzzwords (factorization and islands), I hadn’t heard in black-hole contexts before. People were still tossing around terms from when I’d first forayed into “It from Qubit” (scrambling and out-of-time-ordered correlator), but differently from then. Five years ago, the terms identified the latest craze. Now, they sounded entrenched, as though everyone expected everyone else to know and accept their significance.

One buzzword labeled my excuse for joining the workshops: complexity. Complexity wears as many meanings as the stereotypical New Yorker wears items of black clothing. Last month, guest blogger Logan Hillberry wrote about complexity that emerges in networks such as brains and social media. To “It from Qubit,” complexity quantifies the difficulty of preparing a quantum system in a desired state. Physicists have conjectured that a certain quantum state’s complexity parallels properties of gravitational systems, such as the length of a wormhole that connects two black holes. The wormhole’s length grows steadily for a time exponentially large in the gravitational system’s size. So, to support the conjecture, researchers have been trying to prove that complexity typically grows similarly. Collaborators and I proved that it does, as I explained in my talk and as I’ll explain in a future blog post. Other speakers discussed experimental complexities, as well as the relationship between complexity and a simplified version of Einstein’s equations for general relativity.

Inside the Simons Foundation on Fifth Avenue in Manhattan

I learned a bushel of physics, moonlighting as a string theorist that week. The gum-ball-machine lobby, though, retaught me something I’d learned long before the pandemic. Around the time I squeezed inside the hotel, a postdoc struck up a conversation with the others of us who were clogging the doorway. We had a decent fraction of an hour to fill; so we chatted about quantum thermodynamics, grant applications, and black holes. I asked what the postdoc was working on, he explained a property of black holes, and it reminded me of a property of thermodynamics. I’d nearly reached the front desk when I realized that, out of the sheer pleasure of jawing about physics with physicists in person, I no longer wanted to reach the front desk. The moment dangles in my memory like a crystal ornament from the lobby’s tree—pendant from the pandemic, a few inches from the vaccines suspended on one side and from omicron on the other. For that moment, in a lobby buoyed by holiday lights, wrapped in enough warmth that I’d forgotten the December chill outside, I belonged to the “It from Qubit” community as I hadn’t belonged to any community in 22 months.

Happy new year.

Presenting at the IAS was a blast. Photo credit: Jonathan Oppenheim.

1In person or virtually, pandemic-dependently.

Thanks to the organizers of the IAS workshop—Ahmed Almheiri, Adam Bouland, Brian Swingle—for the invitation to present and to the organizers of the Simons Foundation workshop—Patrick Hayden and Matt Headrick—for the invitation to attend.

Balancing the tradeoff

So much to do, so little time. Tending to one task is inevitably at the cost of another, so how does one decide how to spend their time? In the first few years of my PhD, I balanced problem sets, literature reviews, and group meetings, but to the detriment of my hobbies. I have played drums my entire life, but I largely fell out of practice in graduate school. Recently, I made time to play with a group of musicians, even landing a couple gigs in downtown Austin, Texas, “live music capital of the world.” I have found attending to my non-physics interests makes my research hours more productive and less taxing. Finding the right balance of on- versus off-time has been key to my success as my PhD enters its final year.

Of course, life within physics is also full of tradeoffs. My day job is as an experimentalist. I use tightly focused laser beams, known as optical tweezers, to levitate micrometer-sized glass spheres. I monitor a single microsphere’s motion as it undergoes collisions with air molecules, and I study the system as an environmental sensor of temperature, fluid flow, and acoustic waves; however, by night I am a computational physicist. I code simulations of interacting qubits subject to kinetic constraints, so-called quantum cellular automata (QCA). My QCA work started a few years ago for my Master’s degree, but my interest in the subject persists. I recently co-authored one paper summarizing the work so far and another detailing an experimental implementation.

The author doing his part to “keep Austin weird” by playing the drums dressed as a grackle (note the beak), the central-Texas bird notorious for overrunning grocery store parking lots.
Balancing research interests: Trapping a glass microsphere with optical tweezers.
Balancing research interests: Visualizing the time evolution of four different QCA rules.

QCA, the subject of this post, are themselves tradeoff-aware systems. To see what I mean, first consider their classical counterparts, cellular automata. In their simplest construction, the system is a one-dimensional string of bits. Each bit takes a value of 0 or 1 (white or black). The bitstring changes in discrete time steps based on a simultaneously-applied local update rule: Each bit, along with its two nearest neighbors, determines the next state of the central bit. Put another way, a bit either flips, i.e., changes 0 to 1 or 1 to 0, or remains unchanged over a timestep depending on the state of that bit’s local neighborhood. Thus, by choosing a particular rule, one encodes a tradeoff between activity (bit flips) and inactivity (bit remains unchanged). Despite their simple construction, cellular automata dynamics are diverse; they can produce fractals and encryption-quality random numbers. One rule even has the ability to run arbitrary computer algorithms, a property known as universal computation.

Classical cellular automata. Left: rule 90 producing the fractal Sierpiński’s triangle. Middle: rule 30 can be used to generate random numbers. Right: rule 110 is capable of universal computation.
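
Here is a compact simulation of an elementary cellular automaton like the ones pictured above. The construction is standard; the function and variable names are mine:

```python
import numpy as np

def ca_evolve(rule, init, steps):
    """Evolve a 1D elementary cellular automaton with periodic boundaries.
    The 8-bit binary expansion of `rule` gives the next bit for each of the
    eight (left, center, right) neighborhood configurations."""
    table = [(rule >> i) & 1 for i in range(8)]   # lookup table indexed by 4*l + 2*c + r
    rows = [np.array(init, dtype=int)]
    for _ in range(steps):
        prev = rows[-1]
        left, right = np.roll(prev, 1), np.roll(prev, -1)
        idx = 4 * left + 2 * prev + right
        rows.append(np.array([table[i] for i in idx]))
    return np.array(rows)

# Rule 90 from a single 1 produces the Sierpinski-triangle fractal.
width = 65
init = np.zeros(width, dtype=int)
init[width // 2] = 1
for row in ca_evolve(90, init, 32):
    print("".join("#" if b else " " for b in row))
```

Swapping 90 for 30 or 110 reproduces the other two behaviors in the figure.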

In QCA, bits are promoted to qubits. Instead of being just 0 or 1 like a bit, a qubit can be a continuous mixture of both 0 and 1, a property called superposition. In QCA, a qubit’s two neighbors being 0 or 1 determine whether or not it changes. For example, when in an active neighborhood configuration, a qubit can be coded to change from 0 to “0 plus 1” or from 1 to “0 minus 1”. This is already a head-scratcher, but things get even weirder. If a qubit’s neighbors are in a superposition, then the center qubit can become entangled with those neighbors. Entanglement correlates qubits in a way that is not possible with classical bits.

Do QCA support the emergent complexity observed in their classical cousins? What are the effects of a continuous state space, superposition, and entanglement? My colleagues and I attacked these questions by re-examining many-body physics tools through the lens of complexity science. Singing the lead, we have a workhorse of quantum and solid-state physics: two-point correlations. Singing harmony we have the bread-and-butter of network analysis: complex-network measures. The duet between the two tells the story of structured correlations in QCA dynamics.

In a bit more detail, at each QCA timestep we calculate the mutual information between each pair of qubits i and j. Doing so reveals how much there is to learn about one qubit by measuring another, including effects of quantum entanglement. Visualizing each qubit as a node, the mutual information can be depicted as weighted links between nodes: the more correlated two qubits are, the more strongly they are linked. The collection of nodes and links makes a network. Some QCA form unstructured, randomly-linked networks while others are highly structured.
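
In code, the recipe looks roughly like this (a bare-bones sketch for small qubit counts, with function names of my choosing, not our actual analysis pipeline):

```python
import numpy as np
from itertools import combinations

def reduced_state(state, keep, n):
    """Reduced density matrix of an n-qubit pure state on the qubits listed in `keep`."""
    psi = np.moveaxis(state.reshape([2] * n), keep, list(range(len(keep))))
    psi = psi.reshape(2 ** len(keep), -1)
    return psi @ psi.conj().T

def entropy(rho):
    """Von Neumann entropy in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def mutual_information_network(state, n):
    """Weighted adjacency matrix M with M[i, j] = I(i : j) for every qubit pair."""
    S1 = [entropy(reduced_state(state, [i], n)) for i in range(n)]
    M = np.zeros((n, n))
    for i, j in combinations(range(n), 2):
        M[i, j] = M[j, i] = S1[i] + S1[j] - entropy(reduced_state(state, [i, j], n))
    return M

# Example: in a 3-qubit GHZ state, every pair of qubits shares one bit of mutual information.
n = 3
ghz = np.zeros(2 ** n, dtype=complex)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)
print(np.round(mutual_information_network(ghz, n), 3))
```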

Complex-network measures are designed to highlight certain structural patterns within a network. Historically, these measures have been used to study diverse networked systems like friend groups on Facebook, biomolecule pathways in metabolism, and functional connectivity in the brain. Remarkably, the most structured QCA networks we observed quantitatively resemble those of the complex systems just mentioned despite their simple construction and quantum unitary dynamics.

Visualizing mutual information networks. Left: A Goldilocks-QCA generated network. Right: a random network.

What’s more, the particular QCA that generate the most complex networks are those that balance the activity-inactivity trade-off. From this observation, we formulate what we call the Goldilocks principle: QCA that generate the most complexity are those that change a qubit if and only if the qubit’s neighbors contain an equal number of 1’s and 0’s. The Goldilocks rules are neither too inactive nor too active, balancing the tradeoff to be “just right.”  We demonstrated the Goldilocks principle for QCA with nearest-neighbor constraints as well as QCA with nearest-and-next-nearest-neighbor constraints.
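
To make the rule concrete, here is a toy sketch of one Goldilocks-style update. It uses sequential updates and a Hadamard as the activation unitary; the papers use partitioned, simultaneous updates and a specific rule-numbering scheme, so treat this as a cartoon rather than our exact model:

```python
import numpy as np
from functools import reduce

I2 = np.eye(2, dtype=complex)
P0 = np.diag([1.0, 0.0]).astype(complex)   # projector onto |0>
P1 = np.diag([0.0, 1.0]).astype(complex)   # projector onto |1>
Hd = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # "activation" unitary

def kron(*ops):
    return reduce(np.kron, ops)

# Goldilocks condition: act on the center qubit only when its two neighbors
# hold one 1 and one 0; otherwise leave the center qubit alone.
U3 = (kron(P0, Hd, P1) + kron(P1, Hd, P0)      # active neighborhoods
      + kron(P0, I2, P0) + kron(P1, I2, P1))   # inactive neighborhoods

def sweep(state, n):
    """One sweep over the interior qubits of an n-qubit chain (sequential updates)."""
    for c in range(1, n - 1):
        full = kron(np.eye(2 ** (c - 1)), U3, np.eye(2 ** (n - c - 2)))
        state = full @ state
    return state

n = 7
state = np.zeros(2 ** n, dtype=complex)
state[2 ** (n - 1 - n // 2)] = 1.0     # a single 1 in the middle of a chain of 0's
for _ in range(3):
    state = sweep(state, n)
print(round(float(np.linalg.norm(state)), 6))   # 1.0: the update is unitary
```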

To my delight, the scientific conclusions of my QCA research resonate with broader lessons-learned from my time as a PhD student: Life is full of trade-offs, and finding the right balance is key to achieving that “just right” feeling.