Happy 200th birthday, Carnot’s theorem!

In Kenneth Grahame’s 1908 novel The Wind in the Willows, a Mole meets a Water Rat who lives on a River. The Rat explains how the River permeates his life: “It’s brother and sister to me, and aunts, and company, and food and drink, and (naturally) washing.” As the River plays many roles in the Rat’s life, so does Carnot’s theorem play many roles in a thermodynamicist’s.

Nicolas Léonard Sadi Carnot lived in France during the turn of the 19th century. His father named him Sadi after the 13th-century Persian poet Saadi Shirazi. Said father led a colorful life himself,1 working as a mathematician, engineer, and military commander for and before the Napoleonic Empire. Sadi Carnot studied in Paris at the École Polytechnique, whose members populate a “Who’s Who” list of science and engineering. 

As Carnot grew up, the Industrial Revolution was humming. Steam engines were producing reliable energy on vast scales; factories were booming; and economies were transforming. France’s old enemy Britain enjoyed two advantages. The first consisted of inventors: Englishmen Thomas Savery and Thomas Newcomen invented the steam engine. Scotsman James Watt then improved upon Newcomen’s design until it became practical. Second, northern Britain contained loads of coal that industrialists could mine to power her engines. France had less coal. So if you were a French engineer during Carnot’s lifetime, you should have cared about engines’ efficiencies—how effectively engines used fuel.2

Carnot proved a fundamental limitation on engines’ efficiencies. His theorem governs engines that draw energy from heat—rather than from, say, the motional energy of water cascading down a waterfall. In Carnot’s argument, a heat engine interacts with a cold environment and a hot environment. (Many car engines fall into this category: the hot environment is burning gasoline. The cold environment is the surrounding air into which the car dumps exhaust.) Heat flows from the hot environment to the cold. The engine siphons off some heat and converts it into work. Work is coordinated, well-organized energy that one can directly harness to perform a useful task, such as turning a turbine. In contrast, heat is the disordered energy of particles shuffling about randomly. Heat engines transform random heat into coordinated work.

In The Wind in the Willows, Toad drives motorcars likely powered by internal combustion, rather than by a steam engine of the sort that powered the Industrial Revolution.

An engine’s efficiency is the bang we get for our buck—the upshot we gain, compared to the cost we spend. Running an engine costs the heat that flows between the environments: the more heat flows, the more the hot environment cools, so the less effectively it can serve as a hot environment in the future. An analogous statement concerns the cold environment. So a heat engine’s efficiency is the work produced, divided by the heat spent.

Carnot upper-bounded the efficiency achievable by every heat engine of the sort described above. Let T_{\rm C} denote the cold environment’s temperature; and T_{\rm H}, the hot environment’s. The efficiency can’t exceed 1 - \frac{ T_{\rm C} }{ T_{\rm H} }. What a simple formula for such an extensive class of objects! Carnot’s theorem governs not only many car engines (Otto engines), but also the Stirling engine that competed with the steam engine, its cousin the Ericsson engine, and more.
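For readers who like to compute, the bound fits in a few lines of Python. This is only an illustrative sketch; the function name and the sample temperatures are mine:

```python
def carnot_efficiency(t_cold: float, t_hot: float) -> float:
    """Upper bound on the efficiency of any heat engine operating between
    environments at absolute temperatures t_cold < t_hot (in kelvins)."""
    if not 0 < t_cold < t_hot:
        raise ValueError("require 0 < t_cold < t_hot (absolute temperatures)")
    return 1 - t_cold / t_hot

# A rough car-engine scenario: burning gasoline near 1000 K,
# ambient air near 300 K.
print(carnot_efficiency(300.0, 1000.0))  # ≈ 0.7
```

Note that the temperatures must be absolute (kelvins); the formula breaks down for Celsius or Fahrenheit readings.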

In addition to generality and simplicity, Carnot’s bound boasts practical and fundamental significance. Capping engine efficiencies caps the output one can expect of a machine, factory, or economy. The cap also prevents engineers from wasting their time daydreaming about more-efficient engines. 

More fundamentally than these applications, Carnot’s theorem encapsulates the second law of thermodynamics. The second law helps us understand why time flows in only one direction. And what’s deeper or more foundational than time’s arrow? People often cast the second law in terms of entropy, but many equivalent formulations express the law’s contents. The formulations share a flavor often synopsized with “You can’t win.” Just as we can’t grow younger, we can’t beat Carnot’s bound on engines. 

Video courtesy of FQxI

One might expect no engine to achieve the greatest efficiency imaginable: 1 - \frac{ T_{\rm C} }{ T_{\rm H} }, called the Carnot efficiency. This expectation is incorrect in one way and correct in another. Carnot did design an engine that could operate at his eponymous efficiency: an eponymous engine. A Carnot engine can manifest as the thermodynamicist’s favorite physical system: a gas in a box topped by a movable piston. The gas undergoes four strokes, or steps, to perform work. The strokes form a closed cycle, returning the gas to its initial conditions.3 

Steampunk artist Todd Cahill beautifully illustrated the Carnot cycle for my book. The gas performs useful work because a teapot sits atop the piston. Pushing the piston upward, the gas lifts the teapot. You can find a more detailed description of Carnot’s engine in Chapter 4 of the book, but I’ll recap the cycle here.

The gas expands during stroke 1, pushing the piston and so outputting work. Maintaining contact with the hot environment, the gas remains at the temperature T_{\rm H}. The gas then disconnects from the hot environment. Yet the gas continues to expand throughout stroke 2, lifting the teapot further. Forfeiting energy, the gas cools. It ends stroke 2 at the temperature T_{\rm C}.

The gas contacts the cold environment throughout stroke 3. The piston pushes on the gas, compressing it. At the end of the stroke, the gas disconnects from the cold environment. The piston continues compressing the gas throughout stroke 4, performing more work on the gas. This work warms the gas back up to T_{\rm H}.

In summary, Carnot’s engine begins hot, performs work, cools down, has work performed on it, and warms back up. The gas performs more work on the piston than the piston performs on it. Therefore, the teapot rises (during strokes 1 and 2) more than it descends (during strokes 3 and 4). 
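For readers who want to check the bookkeeping themselves, here is a sketch in Python for an ideal monatomic gas. The function name, the parameter values, and the monatomic assumption are mine, not Carnot’s; real engines are far messier:

```python
from math import log

def carnot_cycle(t_hot, t_cold, v1, v2, n_moles=1.0, gamma=5/3):
    """Work done by an ideal monatomic gas over one Carnot cycle.

    Stroke 1: isothermal expansion at t_hot, from volume v1 to v2.
    Stroke 2: adiabatic expansion, cooling the gas to t_cold.
    Stroke 3: isothermal compression at t_cold.
    Stroke 4: adiabatic compression, warming the gas back to t_hot.
    Returns (net work output, heat absorbed from the hot environment).
    """
    R = 8.314  # gas constant, J/(mol K)
    # Adiabats obey T V^(gamma-1) = const, which fixes the cold-stroke volumes.
    ratio = (t_hot / t_cold) ** (1 / (gamma - 1))
    v3, v4 = v2 * ratio, v1 * ratio
    w1 = n_moles * R * t_hot * log(v2 / v1)            # work output, stroke 1
    w2 = n_moles * R * (t_hot - t_cold) / (gamma - 1)  # adiabatic, stroke 2
    w3 = n_moles * R * t_cold * log(v4 / v3)           # negative: compression
    w4 = -w2                                           # stroke 4 undoes stroke 2
    q_hot = w1  # heat absorbed during the isothermal hot stroke
    return w1 + w2 + w3 + w4, q_hot

net, q = carnot_cycle(500.0, 300.0, 1.0, 2.0)
print(net / q)  # matches 1 - 300/500 = 0.4
```

Dividing the net work by the heat drawn from the hot environment recovers 1 - T_C/T_H, as Carnot promised, regardless of the volumes chosen.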

At what cost, if the engine operates at the Carnot efficiency? The engine mustn’t waste heat. One wastes heat by roiling up the gas unnecessarily—by expanding or compressing it too quickly. The gas must stay in equilibrium, a calm, quiescent state. One can keep the gas quiescent only by running the cycle infinitely slowly. The cycle will take an infinitely long time, outputting zero power (work per unit time). So one can achieve the perfect efficiency only in principle, not in practice, and only by sacrificing power. Again, you can’t win.

Efficiency trades off with power.

Carnot’s theorem may sound like the Eeyore of physics, all negativity and depression. But I view it as a companion and backdrop as rich, for thermodynamicists, as the River is for the Water Rat. Carnot’s theorem curbs diverse technologies in practical settings. It captures the second law, a foundational principle. The Carnot cycle provides intuition, serving as a simple example on which thermodynamicists try out new ideas, such as quantum engines. Carnot’s theorem also provides what physicists call a sanity check: whenever a researcher devises a new (for example, quantum) heat engine, they can confirm that the engine obeys Carnot’s theorem, to help confirm their proposal’s accuracy. Carnot’s theorem also serves as a school exercise and a historical tipping point: the theorem initiated the development of thermodynamics, which continues to this day. 

So Carnot’s theorem is practical and fundamental, pedagogical and cutting-edge—brother and sister, and aunts, and company, and food and drink. I just wouldn’t recommend trying to wash your socks in Carnot’s theorem.

1To a theoretical physicist, working as a mathematician and an engineer amounts to leading a colorful life.

2People other than Industrial Revolution–era French engineers should care, too.

3A cycle doesn’t return the hot and cold environments to their initial conditions, as explained above.

Now published: Building Quantum Computers

Building Quantum Computers: A Practical Introduction by Shayan Majidy, Christopher Wilson, and Raymond Laflamme has been published by Cambridge University Press and will be released in the US on September 30. The authors invited me to write a Foreword for the book, which I was happy to do. The publisher kindly granted permission for me to post the Foreword here on Quantum Frontiers.

Foreword

The principles of quantum mechanics, which as far as we know govern all natural phenomena, were discovered in 1925. For 99 years we have built on that achievement to reach a comprehensive understanding of much of the physical world, from molecules to materials to elementary particles and much more. No comparably revolutionary advance in fundamental science has occurred since 1925. But a new revolution is in the offing.

Up until now, most of what we have learned about the quantum world has resulted from considering the behavior of individual particles — for example a single electron propagating as a wave through a crystal, unfazed by barriers that seem to stand in its way. Understanding that single-particle physics has enabled us to explore nature in unprecedented ways, and to build information technologies that have profoundly transformed our lives.

What’s happening now is we’re learning how to instruct particles to evolve in coordinated ways that can’t be accurately described in terms of the behavior of one particle at a time. The particles, as we like to say, can become entangled. Many particles, like electrons or photons or atoms, when highly entangled, exhibit an extraordinary complexity that we can’t capture with the most powerful of today’s supercomputers, or with our current theories of how nature works. That opens extraordinary opportunities for new discoveries and new applications.

Most temptingly, we anticipate that by building and operating large-scale quantum computers, which control the evolution of very complex entangled quantum systems, we will be able to solve some computational problems that are far beyond the reach of today’s digital computers. The concept of a quantum computer was proposed over 40 years ago, and the task of building quantum computing hardware has been pursued in earnest since the 1990s. After decades of steady progress, quantum information processors with hundreds of qubits have become feasible and are scientifically valuable. But we may need quantum processors with millions of qubits to realize practical applications of broad interest. There is still a long way to go.

Why is it taking so long? A conventional computer processes bits, where each bit could be, say, a switch which is either on or off. To build highly complex entangled quantum states, the fundamental information-carrying component of a quantum computer must be what we call a “qubit” rather than a bit. The trouble is that qubits are much more fragile than bits — when a qubit interacts with its environment, the information it carries is irreversibly damaged, a process called decoherence. To perform reliable logical operations on qubits, we need to prevent decoherence by keeping the qubits nearly perfectly isolated from their environment. That’s very hard to do. And because a qubit, unlike a bit, can change continuously, precisely controlling a qubit is a further challenge, even when decoherence is in check.

While theorists may find it convenient to regard a qubit (or a bit) as an abstract object, in an actual processor a qubit needs to be encoded in a particular physical system. There are many options. It might, for example, be encoded in a single atom which can be in either one of two long-lived internal states. Or the spin of a single atomic nucleus or electron which points either up or down along some axis. Or a single photon that occupies either one of two possible optical modes. These are all remarkable encodings, because the qubit resides in a very simple single quantum system, yet, thanks to technical advances over several decades, we have learned to control such qubits reasonably well. Alternatively, the qubit could be encoded in a more complex system, like a circuit conducting electricity without resistance at very low temperature. This is also remarkable, because although the qubit involves the collective motion of billions of pairs of electrons, we have learned to make it behave as though it were a single atom.

To run a quantum computer, we need to manipulate individual qubits and perform entangling operations on pairs of qubits. Once we can perform such single-qubit and two-qubit “quantum gates” with sufficient accuracy, and measure and initialize the qubits as well, then in principle we can perform any conceivable quantum computation by assembling sufficiently many qubits and executing sufficiently many gates.

It’s a daunting engineering challenge to build and operate a quantum system of sufficient complexity to solve very hard computation problems. That systems engineering task, and the potential practical applications of such a machine, are both beyond the scope of Building Quantum Computers. Instead the focus is on the computer’s elementary constituents for four different qubit modalities: nuclear spins, photons, trapped atomic ions, and superconducting circuits. Each type of qubit has its own fascinating story, told here expertly and with admirable clarity.

For each modality a crucial question must be addressed: how to produce well-controlled entangling interactions between two qubits. Answers vary. Spins have interactions that are always on, and can be “refocused” by applying suitable pulses. Photons hardly interact with one another at all, but such interactions can be mocked up using appropriate measurements. Because of their Coulomb repulsion, trapped ions have shared normal modes of vibration that can be manipulated to generate entanglement. Couplings and frequencies of superconducting qubits can be tuned to turn interactions on and off. The physics underlying each scheme is instructive, with valuable lessons for the quantum informationists to heed.

Various proposed quantum information processing platforms have characteristic strengths and weaknesses, which are clearly delineated in this book. For now it is important to pursue a variety of hardware approaches in parallel, because we don’t know for sure which ones have the best long term prospects. Furthermore, different qubit technologies might be best suited for different applications, or a hybrid of different technologies might be the best choice in some settings. The truth is that we are still in the early stages of developing quantum computing systems, and there is plenty of potential for surprises that could dramatically alter the outlook.

Building large-scale quantum computers is a grand challenge facing 21st-century science and technology. And we’re just getting started. The qubits and quantum gates of the distant future may look very different from what is described in this book, but the authors have made wise choices in selecting material that is likely to have enduring value. Beyond that, the book is highly accessible and fun to read. As quantum technology grows ever more sophisticated, I expect the study and control of highly complex many-particle systems to become an increasingly central theme of physical science. If so, Building Quantum Computers will be treasured reading for years to come.

John Preskill
Pasadena, California


Announcing the quantum-steampunk creative-writing course!

Why not run a quantum-steampunk creative-writing course?

Quantum steampunk, as Quantum Frontiers regulars know, is the aesthetic and spirit of a growing scientific field. Steampunk is a subgenre of science fiction. In it, futuristic technologies invade Victorian-era settings: submarines, time machines, and clockwork octopodes populate La Belle Époque, a recently liberated Haiti, and Sherlock Holmes’s London. A similar invasion characterizes my research field, quantum thermodynamics: thermodynamics is the study of heat, work, temperature, and efficiency. The Industrial Revolution spurred the theory’s development during the 1800s. The theory’s original subjects—nineteenth-century engines—were large and massive and contained enormous numbers of particles. Such engines obey the classical mechanics developed during the 1600s. Hence thermodynamics needs re-envisioning for quantum systems. To extend the theory’s laws and applications, quantum thermodynamicists use mathematical and experimental tools from quantum information science. Quantum information science is, in part, the understanding of quantum systems through how they store and process information. The toolkit is partially cutting-edge and partially futuristic, as full-scale quantum computers remain under construction. So applying quantum information to thermodynamics—quantum thermodynamics—strikes me as the real-world incarnation of steampunk.

But the thought of a quantum-steampunk creative-writing course had never occurred to me, and I hesitated over it. Quantum-steampunk blog posts, I could handle. A book, I could handle. Even a short-story contest, I’d handled. But a course? The idea yawned like the pitch-dark mouth of an unknown cavern in my imagination.

But the more I mulled over Edward Daschle’s suggestion, the more I warmed to it. Edward was completing a master’s degree in creative writing at the University of Maryland (UMD), specializing in science fiction. His mentor Emily Brandchaft Mitchell had sung his praises via email. In 2023, Emily had served as a judge for the Quantum-Steampunk Short-Story Contest. She works as a professor of English at UMD, writes fiction, and specializes in the study of genre. I reached out to her last spring about collaborating on a grant for quantum-inspired art, and she pointed to her protégé.

Who won me over. Edward and I are co-teaching “Writing Quantum Steampunk: Science-Fiction Workshop” during spring 2025.

The course will alternate between science and science fiction. Under Edward’s direction, we’ll read and discuss published fiction. We’ll also learn about what genres are and how they come to be. Students will try out writing styles by composing short stories themselves. Everyone will provide feedback about each other’s writing: what works, what’s confusing, and where there’s room for improvement. 

The published fiction chosen will mirror the scientific subjects we’ll cover: quantum physics; quantum technologies; and thermodynamics, including quantum thermodynamics. I’ll lead this part of the course. The scientific studies will interleave with the story reading, writing, and workshopping. Students will learn about the science behind the science fiction while contributing to the growing subgenre of quantum steampunk.

We aim to attract students from across campus: physics, English, the Jiménez-Porter Writers’ House, computer science, mathematics, and engineering—plus any other departments whose students have curiosity and creativity to spare. The course already has four cross-listings—Arts and Humanities 270, Physics 299Q, Computer Science 298Q, and Mechanical Engineering 299Q—and will probably acquire a fifth (Chemistry 298Q). You can earn a Distributive Studies: Scholarship in Practice (DSSP) General Education requirement, and undergraduate and graduate students are welcome. QuICS—the Joint Center for Quantum Information and Computer Science, my home base—is paying Edward’s salary through a seed grant. Ross Angelella, the director of the Writers’ House, arranged logistics and doused us with enthusiasm. I’m proud of how organizations across the university are uniting to support the course.

The diversity we seek, though, poses a challenge. The course lacks prerequisites, so I’ll need to teach at a level comprehensible to the non-science students. I’d enjoy doing so, but I’m concerned about boring the science students. Ideally, the science students will help me teach, while the non-science students will challenge us with foundational questions that force us to rethink basic concepts. Also, I hope that non-science students will galvanize discussions about ethical and sociological implications of quantum technologies. But how can one ensure that conversation will flow?

This summer, Edward and I traded candidate stories for the syllabus. Based on his suggestions, I recommend touring science fiction under an expert’s guidance. I enjoyed, for a few hours each weekend, sinking into the worlds of Ted Chiang, Ursula K. Le Guin, N. K. Jemisin, Ken Liu, and others. My scientific background informed my reading more than I’d expected. Some authors, I could tell, had researched their subjects thoroughly. When they transitioned from science into fiction, I trusted and followed them. Other authors tossed jargon into their writing but evidenced a lack of deep understanding. One author nailed technical details about quantum computation, initially impressing me, but missed the big picture: his conflict hinged on a misunderstanding about entanglement. I see all these stories as affording opportunities for learning and teaching, in different ways.

Students can begin registering for “Writing Quantum Steampunk: Science-Fiction Workshop” on October 24. We can offer only 15 seats, due to Writers’ House standards, so secure yours as soon as you can. Part of me still wonders how the Hilbert space I came to be co-teaching a quantum-steampunk creative-writing course.1 But I look forward to reading with you next spring!


1A Hilbert space is a mathematical object that represents a quantum system. But you needn’t know that to succeed in the course.

Always appropriate

I met boatloads of physicists as a master’s student at the Perimeter Institute for Theoretical Physics in Waterloo, Canada. Researchers pass through Perimeter like diplomats through my current neighborhood—the Washington, DC area—except that Perimeter’s visitors speak math instead of legalese and hardly any of them wear ties. But Nilanjana Datta, a mathematician at the University of Cambridge, stood out. She was one of the sharpest, most on-the-ball thinkers I’d ever encountered. Also, she presented two academic talks in a little black dress.

The academic year had nearly ended, and I was undertaking research at the intersection of thermodynamics and quantum information theory for the first time. My mentors and I were applying a mathematical toolkit then in vogue, thanks to Nilanjana and colleagues of hers: one-shot quantum information theory. To explain one-shot information theory, I should review ordinary information theory. Information theory is the study of how efficiently we can perform information-processing tasks, such as sending messages over a channel. 

Say I want to send you n copies of a message. Into how few bits (units of information) can I compress the n copies? First, suppose that the message is classical, such that a telephone could convey it. The average number of bits needed per copy equals the message’s Shannon entropy, a measure of your uncertainty about which message I’m sending. Now, suppose that the message is quantum. The average number of quantum bits needed per copy is the von Neumann entropy, now a measure of your uncertainty. At least, the answer is the Shannon or von Neumann entropy in the limit as n approaches infinity. This limit appears disconnected from reality, as the universe seems not to contain an infinite amount of anything, let alone telephone messages. Yet the limit simplifies the mathematics involved and approximates some real-world problems.
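To make the classical half of the story concrete, here is a sketch in Python of the Shannon entropy. The function name and the example distribution are mine:

```python
from math import log2

def shannon_entropy(probs):
    """Average number of bits needed per copy to compress messages
    drawn from this distribution, in the many-copy limit."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A biased source: "yes" with probability 0.9, "no" with probability 0.1.
h = shannon_entropy([0.9, 0.1])
print(h)  # ≈ 0.469 bits per copy, far below the 1 bit a naive encoding uses
```

The von Neumann entropy generalizes this formula: one applies it to the eigenvalues of a density matrix rather than to a classical probability distribution.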

But the limit doesn’t approximate every real-world problem. What if I want to send only one copy of my message—one shot? One-shot information theory concerns how efficiently we can process finite amounts of information. Nilanjana and colleagues had defined entropies beyond Shannon’s and von Neumann’s, as well as proving properties of those entropies. The field’s cofounders also showed that these entropies quantify the optimal rates at which we can process finite amounts of information.
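Two of the simplest one-shot entropies have closed forms, sketched below in Python. The definitions are the standard Rényi orders ∞ and 0, though the literature works mostly with “smoothed” variants, which are more involved; treat this as an illustrative sketch:

```python
from math import log2

def min_entropy(probs):
    """One-shot min-entropy H_min = -log2(max p): governs, for instance,
    how many nearly uniform random bits one sample can yield."""
    return -log2(max(probs))

def max_entropy(probs):
    """Renyi-0 entropy, one convention for H_max: log2 of the support size,
    the bits needed to compress a single sample with zero error."""
    return log2(sum(1 for p in probs if p > 0))

probs = [0.7, 0.1, 0.1, 0.1]
print(min_entropy(probs), max_entropy(probs))  # ≈ 0.515 and 2.0
```

The Shannon entropy always lies between these two extremes, and all three coincide for a uniform distribution.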

My mentors and I were applying one-shot information theory to quantum thermodynamics. I’d read papers of Nilanjana’s and spoken with her virtually (we probably used Skype back then). When I learned that she’d visit Waterloo in June, I was a kitten looking forward to a saucer of cream.

Nilanjana didn’t disappoint. First, she presented a seminar at Perimeter. I recall her discussing a resource theory (a simple information-theoretic model) for entanglement manipulation. One often models entanglement manipulators as experimentalists who can perform local operations and classical communications: each experimentalist can poke and prod the quantum system in their lab, as well as link their labs via telephone. We abbreviate the set of local operations and classical communications as LOCC. Nilanjana broadened my view to the superset SEP, the operations that map every separable (unentangled) state to a separable state.
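To give separability a concrete face, here is an illustrative sketch in Python. It uses the Peres–Horodecki (positive-partial-transpose) test, which happens to characterize separability exactly for two qubits; this example is my own addition, not part of Nilanjana’s talk:

```python
import numpy as np

def partial_transpose(rho):
    """Transpose the second qubit's indices of a two-qubit density matrix."""
    return rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)

def is_ppt(rho):
    """Peres-Horodecki test: for two qubits, rho is separable exactly when
    its partial transpose has no negative eigenvalues."""
    return np.linalg.eigvalsh(partial_transpose(rho)).min() >= -1e-12

# A product (separable) state |0><0| tensor |+><+| passes the test...
plus = np.array([1.0, 1.0]) / np.sqrt(2)
product = np.kron(np.outer([1.0, 0.0], [1.0, 0.0]), np.outer(plus, plus))
# ...while the maximally entangled Bell state fails it.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
print(is_ppt(product), is_ppt(np.outer(bell, bell)))  # True False
```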

Kudos to John Preskill for hunting down this screenshot of the video of Nilanjana’s seminar. The author appears on the left.

Then, because she eats seminars for breakfast, Nilanjana presented an even more distinguished talk the same day: a colloquium. It took place at the University of Waterloo’s Institute for Quantum Computing (IQC), a nearly half-hour walk from Perimeter. Would I be willing to escort Nilanjana between the two institutes? I most certainly would.

Nilanjana and I arrived at the IQC auditorium before anyone else except the colloquium’s host, Debbie Leung. Debbie is a University of Waterloo professor and another of the most rigorous quantum information theorists I know. I sat a little behind the two of them and marveled. Here were two of the scions of the science I was joining. Pinch me.

My relationship with Nilanjana deepened over the years. The first year of my PhD, she hosted a seminar by me at the University of Cambridge (although I didn’t present a colloquium later that day). Afterward, I wrote a Quantum Frontiers post about her research with PhD student Felix Leditzky. The two of them introduced me to second-order asymptotics. Second-order asymptotics dictate the rate at which one-shot entropies approach standard entropies as n (the number of copies of a message I’m compressing, say) grows large. 

The following year, Nilanjana and colleagues hosted me at “Beyond i.i.d. in Information Theory,” an annual conference dedicated to one-shot information theory. We convened in the mountains of Banff, Canada, about which I wrote another blog post. Come to think of it, Nilanjana lies behind many of my blog posts, as she lies behind many of my papers.

But I haven’t explained about the little black dress. Nilanjana wore one when presenting at Perimeter and the IQC. That year, I concluded that pants and shorts caused me so much discomfort, I’d wear only skirts and dresses. So I stuck out in physics gatherings like a theorem in a newspaper. My mother had schooled me in the historical and socioeconomic significance of the little black dress. Coco Chanel invented the slim, simple, elegant dress style during the 1920s. It helped free women from stifling, time-consuming petticoats and corsets: a few decades beforehand, dressing could last much of the morning—and then one would change clothes for the afternoon and then for the evening. The little black dress offered women freedom of movement, improved health, and control over their schedules. Better, the little black dress could suit most activities, from office work to dinner with friends.

Yet I didn’t recall ever having seen anyone present physics in a little black dress.

I almost never use this verb, but Nilanjana rocked that little black dress. She imbued it with all the professionalism and competence ever associated with it. Also, Nilanjana had long, dark hair, like mine (although I’ve never achieved her hair’s length); and she wore it loose, as I liked to. I recall admiring the hair hanging down her back after she received a question during the IQC colloquium. She’d whirled around to write the answer on the board, in the rapid-fire manner characteristic of her intellect. If one of the most incisive scientists I knew could wear dresses and long hair, then so could I.

Felix is now an assistant professor at the University of Illinois in Urbana-Champaign. I recently spoke with him and Mark Wilde, another one-shot information theorist and a guest blogger on Quantum Frontiers. The conversation led me to reminisce about the day I met Nilanjana. I haven’t visited Cambridge in years, and my research has expanded from one-shot thermodynamics into many-body physics. But one never forgets the classics.

Building a Visceral Understanding of Quantum Phenomena

A great childhood memory of mine comes from first playing “The Incredible Machine” on PC in the early ’90s. For those not in the know, this is a physics-based puzzle game about building Rube Goldberg-style contraptions to achieve given tasks. What made this game a standout for me was the freedom that it granted players. In many levels you were given a disparate set of components (e.g. strings, pulleys, rubber bands, scissors, conveyor belts, Pokie the Cat…) and it was entirely up to you to “MacGyver” your way to some kind of solution (MacGyver being, incidentally, my favorite TV show from that time period). In other words, it was often a creative exercise in designing your own solution, rather than “connecting the dots” to find a single intended solution. Growing up with games like this undoubtedly had significant influence in directing me to my profession as a research scientist: a job which is often about finding novel or creative solutions to a task given a limited set of tools.

From the late ’90s onwards, puzzle games like “The Incredible Machine” largely went out of fashion as developers focused more on 3D games that exploited the latest hardware advances. However, the genre saw a resurgence in the 2010s, spearheaded by the developer “Zachtronics,” who released a plethora of popular, and exceptionally challenging, logic- and programming-based puzzle games (some of my favorites include Opus Magnum and TIS-100). Zachtronics games similarly encouraged players to solve problems through creative designs, but also had the side effect of helping players develop and practice tangible programming skills (e.g. design patterns, control flow, optimization). This is a really great way to learn, I thought to myself.

So, fast-forward several years, while teaching undergraduate/graduate quantum courses at Georgia Tech I began thinking about whether it would be possible to incorporate quantum mechanics (and specifically quantum circuits) into a Zachtronics-style puzzle game. My thinking was that such a game might provide an opportunity for students to experiment with quantum through a hands-on approach, one that encouraged creativity and self-directed exploration. I was also hoping that representing quantum processes through a visual language that emphasized geometry, rather than mathematical language, could help students develop intuition in this setting. These thoughts ultimately led to the development of The Qubit Factory. At its core, this is a quantum circuit simulator with a graphic interface (not too dissimilar to the Quirk quantum circuit simulator) but providing a structured sequence of challenges, many based on tasks of real-life importance to quantum computing, that players must construct circuits to solve.

An example level of The Qubit Factory in action, showcasing a potential solution to a task involving quantum error correction. The column of “?” tiles represents a noisy channel that has a small chance of flipping any qubit that passes through. Players are challenged to send qubits from the input on the left to the output on the right while mitigating errors that occur due to this noisy channel. The solution shown here is based on a bit-flip code, although a more advanced strategy is required to earn a bonus star for the level!

Quantum Gamification and The Qubit Factory

My goal in designing The Qubit Factory was to provide an accurate simulation of quantum mechanics (although not necessarily a complete one), such that players could learn some authentic, working knowledge about quantum computers and how they differ from regular computers. However, I also wanted to make a game that was accessible to the layperson (i.e. someone without prior knowledge of quantum mechanics or its mathematical foundations, such as linear algebra). These goals, which largely oppose one another, are not easy to balance!

A key step in achieving this balance was to find a suitable visual depiction of quantum states and processes; here the Bloch sphere, which provides a simple geometric representation of qubit states, was ideal. However, it is also here that I made my first major compromise to the scope of the physics within the game by restricting the game state to real-valued wave-functions (which in turn implies that only gates that transform qubits within the X-Z plane are allowed). I feel that this compromise was ultimately the correct choice: it greatly enhanced the visual clarity by allowing qubits to be represented as arrows on a flat disk rather than on a sphere, and similarly allowed the action of single-qubit gates to be depicted clearly (i.e. as rotations and flips on the disk). Some purists may object to this limitation on the grounds that it prevents universal quantum computation, but my counterpoint would be that many interesting quantum tasks and algorithms can still be performed within this restricted scope. In a similar spirit, I decided to forgo the standard quantum circuit notation: instead I used stylized circuits to emphasize the geometric interpretation, as demonstrated in the example below. This choice was made with the intention of allowing players to infer the action of gates from the visual design alone.
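Concretely, restricting to real amplitudes means every allowed gate is a real 2×2 orthogonal matrix: either a rotation within the X-Z plane or a flip about an axis in that plane. A minimal numpy sketch (my own notation, not the game’s code):

```python
import numpy as np

def rotation(theta):
    """Rotate a qubit's Bloch vector by theta within the X-Z plane."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

X = np.array([[0.0, 1.0], [1.0, 0.0]])    # flip about the X axis
Z = np.array([[1.0, 0.0], [0.0, -1.0]])   # flip about the Z axis
H = (X + Z) / np.sqrt(2)                  # Hadamard: flip about the X+Z diagonal

# Each gate is a real orthogonal matrix, so real wave-functions stay real:
for gate in (rotation(0.7), X, Z, H):
    assert np.allclose(gate @ gate.T, np.eye(2))
```

The rotations have determinant +1 and the flips have determinant −1, matching the “rotations and flips on the disk” picture above.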

A quantum circuit in conventional notation versus the same circuit depicted in The Qubit Factory.

Okay, so while the Bloch sphere provides a nice way to represent (unentangled) single-qubit states, we also need a way to represent entangled states of multiple qubits. Here I took some creative license and showed entangled states as blinking through their basis states. I found this visualization works well for conveying simple states, such as the singlet state presented below, but players can also view the complete list of wave-function amplitudes if necessary.

\textrm{Singlet: }\left| \psi \right\rangle = \tfrac{1}{\sqrt{2}} \left( \left| \uparrow \downarrow \right\rangle - \left| \downarrow \uparrow \right\rangle \right)

A singlet state is created by entangling a pair of qubits via a CNOT gate.
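For the curious, here is one standard recipe for building the singlet with a Hadamard and a CNOT (a sketch of the math only; the in-game level lays out its own gates):

```python
import numpy as np

# Build the singlet (|ud> - |du>)/sqrt(2): start from |dd>, put the
# first qubit into superposition with a Hadamard, then apply a CNOT.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],     # control = first qubit
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

psi = np.zeros(4)
psi[3] = 1.0                       # |dd>, basis order: uu, ud, du, dd
psi = np.kron(H, np.eye(2)) @ psi  # (|ud> - |dd>)/sqrt(2)
psi = CNOT @ psi                   # (|ud> - |du>)/sqrt(2): the singlet
```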

Although the blinking effect is not a perfect solution for displaying superpositions, I think that it is useful in conveying key aspects like uncertainty and correlation. The animation below shows an example of the entangled wave-function collapsing when one of the qubits is measured.

A single qubit from a singlet is measured. While each qubit has a 50/50 chance of giving ▲ or ▼ when measured individually, once one qubit is measured the other qubit collapses to the anti-aligned state.
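This collapse behavior is easy to reproduce numerically: sampling joint outcomes from the singlet’s Born-rule probabilities shows that each qubit alone is a fair coin, yet the pair is always anti-aligned. A small hypothetical sketch (my own, not the game’s code):

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample joint measurement outcomes of the singlet: individually each
# qubit looks 50/50, but the two outcomes are always anti-aligned.

psi = np.array([0, 1, -1, 0]) / np.sqrt(2)   # basis order: uu, ud, du, dd
probs = np.abs(psi) ** 2                     # Born rule: [0, 1/2, 1/2, 0]

samples = rng.choice(4, size=10_000, p=probs)
first = samples // 2                         # 0 = up, 1 = down
second = samples % 2
# first is ~50/50, yet first and second never agree.
```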

So, thus far, I have described a quantum circuit simulator with some added visual cues and animations, but how can this be turned into a game? Here, I leaned heavily on the existing example of Zachtronics (and Zachtronics-like) games: each level in The Qubit Factory provides the player with some input bits/qubits and requires the player to perform some logical task in order to produce a set of desired outputs. Some of the levels within the game are highly structured, similar to textbook exercises. They aim to teach a specific concept and may have only a narrow set of potential solutions. An example of such a structured level is the first quantum level (lvl QI.A), which tasks the player with inverting a sequence of single-qubit gates. Of course, this problem would be trivial to those of you already familiar with quantum mechanics: you could use the linear-algebra result (AB)^\dag = B^\dag A^\dag together with the knowledge that quantum gates are unitary, so the Hermitian conjugate of each gate doubles as its inverse. But what if you didn’t know quantum mechanics, or even linear algebra? Could this problem be solved through logical reasoning alone? This is where I think the visuals really help; players should be able to infer several key points from geometry alone:

  • the inverse of a flip (or mirroring about some axis) is another equal flip.
  • the inverse of a rotation is an equal rotation in the opposite direction.
  • the last transformation done on each qubit should be the first transformation to be inverted.

So I think it is plausible that, even without prior knowledge of quantum mechanics or linear algebra, a player could not only solve the level but also grasp some important concepts (i.e. that quantum gates are invertible and that the order in which they are applied matters).

An early level challenges the player to invert the action of the 3 gates on the left. A solution is given on the right, formed by composing the inverse of each gate in reverse order.
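Numerically, this solution boils down to the (AB)^\dag = B^\dag A^\dag rule. A quick check with made-up gates (my own example sequence, not the level’s; in the game’s real-valued setting, each inverse is just a transpose):

```python
import numpy as np

def rotation(theta):
    """Rotation within the X-Z plane (a real orthogonal matrix)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

X = np.array([[0.0, 1.0], [1.0, 0.0]])          # a flip: its own inverse

# A made-up three-gate sequence g1, g2, g3, applied to a qubit in order:
gates = [X, rotation(np.pi / 3), rotation(np.pi / 5) @ X]

# Applying g1 then g2 then g3 corresponds to the matrix product g3 g2 g1...
forward = np.linalg.multi_dot(gates[::-1])

# ...so the inverse sequence applies g3^T first and g1^T last,
# i.e. the matrix product g1^T g2^T g3^T:
inverse = np.linalg.multi_dot([g.T for g in gates])

assert np.allclose(inverse @ forward, np.eye(2))
```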

Many of the levels in The Qubit Factory are also designed to be open-ended. Such levels, which often begin with a blank factory, have no single intended solution. The player is instead expected to use experimentation and creativity to design their own solution; this is the setting where I feel that the “game” format really shines. An example of an open-ended level is QIII.E, which gives the player 4 copies of a single-qubit state \left| \psi \right\rangle, guaranteed to be either the +Z or +X eigenstate, and tasks the player with determining which state they have been given. Those familiar with quantum computing will recognize this as a relatively simple problem in state tomography. There are many viable strategies that could be employed to solve this task (and I am not even sure of the optimal one myself). However, by circumventing the need for a mathematical calculation, The Qubit Factory allows players to easily and quickly explore different approaches. Hopefully this allows players to find effective strategies through trial and error, gaining some understanding of state tomography (and why it is challenging) in the process.

An example of a level in action! This level challenges the player to construct a circuit that can identify an unknown qubit state given several identical copies; a task in state tomography. The solution shown here uses a cascaded sequence of measurements, where the result of one measurement is used to control the axis of a subsequent measurement.
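To make the trial-and-error concrete, here is one simple strategy simulated classically (surely not optimal, and purely my own sketch rather than the game’s intended solution): measure two copies along Z and two along X, exploiting the fact that each candidate state can never yield the −1 outcome in its own basis.

```python
import numpy as np

rng = np.random.default_rng(2)

def measure_z(state):
    return 1 if state == '+Z' else rng.choice([1, -1])  # +X is 50/50 in Z

def measure_x(state):
    return 1 if state == '+X' else rng.choice([1, -1])  # +Z is 50/50 in X

def guess(state):
    if any(measure_z(state) == -1 for _ in range(2)):
        return '+X'            # +Z never yields a -1 Z outcome
    if any(measure_x(state) == -1 for _ in range(2)):
        return '+Z'            # +X never yields a -1 X outcome
    return '+Z'                # all outcomes were +1: ambiguous, so guess

trials = 100_000
correct = 0
for _ in range(trials):
    state = rng.choice(['+Z', '+X'])
    correct += guess(state) == state
success = correct / trials     # works out to 7/8 for this strategy
```

A success rate of 7/8 with four copies already hints at why tomography is hard: no finite number of copies identifies the state with certainty under this naive approach.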

The Qubit Factory begins with levels covering the basics of qubits, gates and measurements. It later progresses to more advanced concepts like superpositions, basis changes and entangled states. Finally it culminates with levels based on introductory quantum protocols and algorithms (including quantum error correction, state tomography, super-dense coding, quantum repeaters, entanglement distillation and more). Even if you are familiar with the aforementioned material you should still be in for a substantial challenge, so please check it out if that sounds like your thing!

The Potential of Quantum Games

I believe that interactive games have great potential to provide new opportunities for people to better understand the quantum realm (a position shared by the IQIM, members of which have developed several projects in this area). As young children, playing is how we discover the world around us and build intuition for the rules that govern it. This is perhaps a significant reason why quantum mechanics is often a challenge for new students to learn; we don’t have direct experience or intuition with the quantum world in the same way that we do with the classical world. A quote from John Preskill puts it very succinctly:

“Perhaps kids who grow up playing quantum games will acquire a visceral understanding of quantum phenomena that our generation lacks.”


The Qubit Factory can be played at www.qubitfactory.io

My favorite rocket scientist

Whenever someone protests, “I’m not a rocket scientist,” I think of my friend Jamie Rankin. Jamie is a researcher at Princeton University, and she showed me her lab this June. When I first met Jamie, she was testing instruments to be launched on NASA’s Parker Solar Probe. The spacecraft has approached closer to the sun than any of its predecessors. It took off in August 2018—fittingly, from my view, as I’d completed my PhD a few months earlier and met Jamie near the beginning of my PhD.

During my first term of Caltech courses, I noticed Jamie in one of my classes. She seemed sensible and approachable, so I invited her to check our answers against each other on homework assignments. Our homework checks evolved into studying together for qualifying exams—tests of basic physics knowledge, which serve as gateways to a PhD. The studying gave way to eating lunch together on weekends. After a quiet morning at my desk, I’d bring a sandwich to a shady patch of lawn in front of Caltech’s institute for chemical and biological research. (Pasadena lawns are suitable for eating on regardless of the season.) Jamie would regale me—as her token theorist friend—with tales of suiting up to use clean rooms; of puzzling out instrument breakages; and of working for the legendary Ed Stone, who’d headed NASA’s Jet Propulsion Laboratory (JPL).1

The Voyager probes were constructed at JPL during the 1970s. I’m guessing you’ve heard of Voyager, given how the project captured the public’s imagination. I heard about it on an educational audiotape when I was little. The probes sent us data about planets far out in our solar system. For instance, Voyager 2 was the first spacecraft to approach Neptune, as well as the first to approach four planets past Earth (Jupiter, Saturn, Uranus, and Neptune). But the probes’ mission still hasn’t ended. In 2012, Voyager 1 became the first human-made object to enter interstellar space. Both spacecraft continue to transmit data. They also carry Golden Records, disks that encode sounds from Earth—a greeting to any intelligent aliens who find the probes.

Jamie published the first PhD thesis about data collected by Voyager. She now serves as Deputy Project Scientist for Voyager, despite her early-career status. The news didn’t surprise me much; I’d known for years how dependable and diligent she is.

A theorist intrudes on Jamie’s Princeton lab

As much as I appreciated those qualities in Jamie, though, what struck me more was her good-heartedness. In college, I found fellow undergrads to be interested and interesting, energetic and caring, open to deep conversations and self-evaluation—what one might expect of Dartmouth. At Caltech, I found grad students to be candid, generous, and open-hearted. Would you have expected as much from the tech school’s tech school—the distilled essence of the purification of concentrated Science? I didn’t. But I appreciated what I found, and Jamie epitomized it.

The back of the lab coat I borrowed

Jamie moved to Princeton after graduating. I’d moved to Harvard, and then I moved to NIST. We fell out of touch; the pandemic prevented her from attending my wedding, and we spoke maybe once a year. But, this June, I visited Princeton for the annual workshop of the Institute for Robust Quantum Simulation. We didn’t eat sandwiches on a lawn, but we ate dinner together, and she showed me around the lab she’d built. (I never did suit up for a clean-room tour at Caltech.)

In many ways, Jamie Rankin remains my favorite rocket scientist.


1Ed passed away between the drafting and publishing of this post. He oversaw my PhD class’s first-year seminar course. Each week, one faculty member would present to us about their research over pizza. Ed had landed the best teaching gig, I thought: continual learning about diverse, cutting-edge physics. So I associate Ed with intellectual breadth, curiosity, and the scent of baked cheese.

Quantum Frontiers salutes an English teacher

If I ever mention a crazy high-school English teacher to you, I might be referring to Mr. Lukacs. One morning, before the first bell rang, I found him wandering among the lockers, wearing a white beard and a mischievous grin. (The school had pronounced the day “Dress Up as Your Favorite Writer” Day, or some such designation, but still.1) Mr. Lukacs was carrying a copy of Leaves of Grass, a book by the nineteenth-century American poet Walt Whitman, and yawping. To yawp is to cry out, and Whitman garnered acclaim for weaving such colloquialisms into his poetry. “I sound my barbaric yawp over the roofs of the world,” he wrote in Leaves of Grass—as Mr. Lukacs illustrated until the bells rang for class. And, for all I know, until the final bell.

I call Mr. Lukacs one of my crazy high-school English teachers despite never having taken any course of his.2 He served as the faculty advisor for the school’s literary magazine, on whose editorial board I served. As a freshman and sophomore, I kept my head down and scarcely came to know Mr. Lukacs. He wore small, round glasses and a bowtie. As though to ham up the idiosyncrasy, he kept a basket of bowties in his classroom. His hair had grayed, he spoke slowly, and he laughed in startling little bursts that resembled gasps. 

Junior year, I served as co-editor-in-chief of the literary magazine; and, senior year, as editor-in-chief. I grew to conjecture that Mr. Lukacs spoke slowly because he was hunting for the optimal word to use next. Finding that word cost him a pause, but learning his choice enriched the listener. And Mr. Lukacs adored literature. You could hear, when he read aloud, how he invested himself in it. 

I once submitted to the literary magazine a poem about string theory, inspired by a Brian Greene book.3 As you might expect, if you’ve ever read about string theory, the poem invoked music. Mr. Lukacs pretended to no expertise in science; he even had a feud with the calculus teacher.4 But he wrote that the poem made him feel like dancing.

You might fear that Mr. Lukacs too strongly echoed the protagonist of Dead Poets Society to harbor any originality. The 1989 film Dead Poets Society stars Robin Williams as an English teacher who inspires students to discover their own voices, including by yawping à la Whitman. But Mr. Lukacs leaned into the film, with a gleeful sort of exultation. He even interviewed one of the costars, who’d left acting to teach, for a job. The interview took place beside a cardboard-cutout advertisement for Dead Poets Society—a possession, I’m guessing, of Mr. Lukacs’s.

This winter, friends of Mr. Lukacs’s helped him create a YouTube video for his former students. He sounded as he had twenty years before. But he said goodbye, expecting his cancer journey to end soon. Since watching the video, I’ve been waffling between reading Goodbye, Mr. Chips—a classic novella I learned of around the time the video debuted—and avoiding it. I’m not sure what Mr. Lukacs would advise—probably to read, rather than not to read. But I like the thought of saluting a literary-magazine advisor on Quantum Frontiers. We became Facebook friends years ago; and, although I’ve rarely seen activity by him, he’s occasionally effused over some physics post of mine.

Physics brought me to the Washington, DC area, where a Whitman quote greets entrants to the Dupont Circle metro station. The DC area also houses Abraham Lincoln’s Cottage, where the president moved with his wife. They sought quietude to mourn their son Willie, who’d succumbed to an illness. Lincoln rode from the cottage to the White House every day. Whitman lived along his commute, according to a panel in the visitors’ center. I was tickled to learn that the two men used to exchange bows during that commute—one giant of politics and one giant of literature.

I wrote the text above this paragraph, as well as the text below, within a few weeks of watching the YouTube video. The transition between the two bothered me; it felt too abrupt. But I asked Mr. Lukacs via email whether he’d mind my posting the story. I never heard back. I learned why this weekend: he’d passed away on Friday. The announcement said, “please consider doing something that reminds you of George in the coming days. Read a few lines of a cherished text. Marvel at a hummingbird…” So I determined to publish the story without approval. I can think of no tribute more fitting than a personal essay published on a quantum blog that’s charted my intellectual journey of the past decade.

Here’s to another giant of literature. Goodbye, Mr. Lukacs.

Image from wmata.com

1I was too boring to dress up as anyone.

2I call him one of my crazy high-school English teachers because his wife merits the epithet, too. She called herself senile, enacted the climax of Jude the Obscure with a student’s person-shaped pencil case, and occasionally imitated a chipmunk; but damn, do I know my chiasmus from my caesura because of her.

3That fact sounds hackneyed to me now. But I’m proud never to have entertained grand dreams of discovering a theory of everything.

4AKA my crazy high-school calculus teacher. My high school had loads of crazy teachers, but it also had loads of excellent teachers, and the crazy ones formed a subset of the excellent ones.

How I didn’t become a philosopher (but wound up presenting a named philosophy lecture anyway)

Many people ask why I became a theoretical physicist. The answer runs through philosophy—which I thought, for years, I’d left behind in college.

My formal relationship with philosophy originated with Mr. Bohrer. My high school classified him as a religion teacher, but he co-opted our junior-year religion course into a philosophy course. He introduced us to Plato’s cave, metaphysics, and the pursuit of the essence beneath the skin of appearance. The essence of reality overlaps with quantum theory and relativity, which fascinated him. Not that he understood them, he’d hasten to clarify. But he passed along that fascination to me. I’d always loved dealing in abstract ideas, so the notion of studying the nature of the universe attracted me. A friend and I joked about growing up to be philosophers and—on account of not being able to find jobs—living in cardboard boxes next to each other.

After graduating from high school, I searched for more of the same in Dartmouth College’s philosophy department. I began with two prerequisites for the philosophy major: Moral Philosophy and Informal Logic. I adored those courses, but I adored all my courses.

As a sophomore, I embarked upon an upper-level philosophy course: philosophy of mind. I was one of the course’s youngest students, but the professor assured me that I’d accumulated enough background information in science and philosophy classes. Yet he and the older students threw around technical terms, such as qualia, that I’d never heard of. Those terms resurfaced in the assigned reading, again without definitions. I struggled to follow the conversation.

Meanwhile, I’d been cycling through the sciences. I’d taken my high school’s highest-level physics course, senior year—AP Physics C: Mechanics and Electromagnetism. So, upon enrolling in college, I made the rounds of biology, chemistry, and computer science. I cycled back to physics at the beginning of sophomore year, taking Modern Physics I in parallel with Informal Logic. The physics professor, Miles Blencowe, told me, “I want to see physics in your major.” I did, too, I assured him. But I wanted to see most subjects in my major.

Miles, together with department chair Jay Lawrence, helped me incorporate multiple subjects into a physics-centric program. The major, called “Physics Modified,” stood halfway between the physics major and the create-your-own major offered at some American liberal-arts colleges. The program began with heaps of prerequisite courses across multiple departments. Then, I chose upper-level physics courses, a math course, two history courses, and a philosophy course. I could scarcely believe that I’d planted myself in a physics department; although I’d loved physics since my first course in it, I loved all subjects, and nobody in my family did anything close to physics. But my major would provide a well-rounded view of the subject.

From shortly after I declared my Physics Modified major. Photo from outside the National Academy of Sciences headquarters in Washington, DC.

The major’s philosophy course was an independent study on quantum theory. In one project, I dissected the “EPR paper” published by Einstein, Podolsky, and Rosen (EPR) in 1935. It introduced the paradox that now underlies our understanding of entanglement. But who reads the EPR paper in physics courses nowadays? I appreciated having the space to grapple with the original text. Still, I wanted to understand the paper more deeply; the philosophy course pushed me toward upper-level physics classes.

What I thought of as my last chance at philosophy evaporated during my senior spring. I wanted to apply to graduate programs soon, but I hadn’t decided which subject to pursue. The philosophy and history of physics remained on the table. A history-of-physics course, taught by cosmologist Marcelo Gleiser, settled the matter. I worked my rear off in that course, and I learned loads—but I already knew some of the material from physics courses. Moreover, I knew the material more deeply than the level at which the course covered it. I couldn’t stand the thought of understanding the rest of physics only at this surface level. So I resolved to burrow into physics in graduate school. 

Appropriately, Marcelo published a book with a philosopher (and an astrophysicist) this March.

Burrow I did: after a stint in condensed-matter research, I submerged up to my eyeballs in quantum field theory and differential geometry at the Perimeter Scholars International master’s program. My research there bridged quantum information theory and quantum foundations. I appreciated the balance of fundamental thinking and possible applications to quantum-information-processing technologies. The rigorous mathematical style (lemma-theorem-corollary-lemma-theorem-corollary) appealed to my penchant for abstract thinking. Eating lunch with the Perimeter Institute’s quantum-foundations group, I felt at home.

Craving more research at the intersection of quantum thermodynamics and information theory, I enrolled at Caltech for my PhD. As I’d scarcely believed that I’d committed myself to my college’s physics department, I could scarcely believe that I was enrolling in a tech school. I was such a child of the liberal arts! But the liberal arts include the sciences, and I ended up wrapping Caltech’s hardcore vibe around myself like a favorite denim jacket.

Caltech kindled interests in condensed matter; atomic, molecular, and optical physics; and even high-energy physics. Theorists at Caltech thought not only abstractly, but also about physical platforms; so I started to, as well. I began collaborating with experimentalists as a postdoc, and I’m now working with as many labs as I can interface with at once. I’ve collaborated on experiments performed with superconducting qubits, photons, trapped ions, and jammed grains. Developing an abstract idea, then nursing it from mathematics to reality, satisfies me. I’m even trying to redirect quantum thermodynamics from foundational insights to practical applications.

At the University of Toronto in 2022, with my experimental collaborator Batuhan Yılmaz—and a real optics table!

So I did a double-take upon receiving an invitation to present a named lecture at the University of Pittsburgh Center for Philosophy of Science. Even I, despite not being a philosopher, had heard of the cachet of Pitt’s philosophy-of-science program. Why on Earth had I received the invitation? I felt the same incredulity as when I’d handed my heart to Dartmouth’s physics department and then to a tech school. But now, instead of laughing at the image of myself as a physicist, I couldn’t see past it.

Why had I received that invitation? I did a triple-take. At Perimeter, I’d begun undertaking research on resource theories—simple, information-theoretic models for situations in which constraints restrict the operations one can perform. Hardly anyone worked on resource theories then, although they form a popular field now. Philosophers like them, and I’ve worked with multiple classes of resource theories by now.

More recently, I’ve worked with contextuality, a feature that distinguishes quantum theory from classical theories. And I’ve even coauthored papers about closed timelike curves (CTCs), hypothetical worldlines that travel backward in time. CTCs are consistent with general relativity, but we don’t know whether they exist in reality. Regardless, one can simulate CTCs, using entanglement. Collaborators and I applied CTC simulations to metrology—to protocols for measuring quantities precisely. So we kept a foot in practicality and a foot in foundations.

Perhaps the idea of presenting a named lecture on the philosophy of science wasn’t hopelessly bonkers. All right, then. I’d present it.

Presenting at the Center for Philosophy of Science

This March, I presented an ALS Lecture (an Annual Lecture Series Lecture, redundantly) entitled “Field notes on the second law of quantum thermodynamics from a quantum physicist.” Scientists formulated the second law in the early 1800s. It helps us understand why time appears to flow in only one direction. I described three enhancements of that understanding, which have grown from quantum thermodynamics and nonequilibrium statistical mechanics: resource-theory results, fluctuation theorems, and thermodynamic applications of entanglement. I also enjoyed talking with Center faculty and graduate students during the afternoon and evening. Then—being a child of the liberal arts—I stayed in Pittsburgh for half the following Saturday to visit the Carnegie Museum of Art.

With a copy of a statue of the goddess Sekhmet. She lives in the Carnegie Museum of Natural History, which shares a building with the art museum, from which I detoured to see the natural-history museum’s ancient-Egypt area (as Quantum Frontiers regulars won’t be surprised to hear).

Don’t get me wrong: I’m a physicist, not a philosopher. I don’t have the training to undertake philosophy, and I have enough work to do in pursuit of my physics goals. But my high-school self would approve—that self is still me.

The quantum gold rush

Even if you don’t recognize the name, you probably recognize the saguaro cactus. It’s the archetype of the cactus, a column from which protrude arms bent at right angles like elbows. As my husband pointed out, the cactus emoji is a saguaro: 🌵. In Tucson, Arizona, even the airport has a saguaro crop sufficient for staging a Western short film. I didn’t have a film to shoot, but the garden set the stage for another adventure: the ITAMP winter school on quantum thermodynamics.

Tucson airport

ITAMP is the Institute for Theoretical Atomic, Molecular, and Optical Physics (the Optical is silent). Harvard University and the Smithsonian Institution share ITAMP, where I worked as a postdoc. ITAMP hosted the first quantum-thermodynamics conference to take place on US soil, in 2017. Also, ITAMP hosts a winter school in Arizona every February. (If you lived in the Boston area, you might want to escape to the southwest then, too.) The winter school’s topic varies from year to year.

How about a winter school on quantum thermodynamics? ITAMP’s director, Hossein Sadeghpour, asked me when I visited Cambridge, Massachusetts last spring.

Let’s do it, I said. 

Lecturers came from near and far. Kanu Sinha, of the University of Arizona, spoke about how electric charges fluctuate in the quantum vacuum. Fluctuations feature also in extensions of the second law of thermodynamics, which helps explain why time flows in only one direction. Gabriel Landi, from the University of Rochester, lectured about these fluctuation relations. ITAMP Postdoctoral Fellow Ceren Dag explained why many-particle quantum systems register time’s arrow. Ferdinand Schmidt-Kaler described the many-particle quantum systems—the trapped ions—in his lab at the University of Mainz.

Ronnie Kosloff, of Hebrew University in Jerusalem, lectured about quantum engines. Nelly Ng, an Assistant Professor at Nanyang Technological University, has featured on Quantum Frontiers at least three times. She described resource theories—information-theoretic models—for thermodynamics. Information and energy both serve as resources in thermodynamics and computation, I explained in my lectures.

The 2024 ITAMP winter school

The winter school took place at the conference center adjacent to Biosphere 2. Biosphere 2 is an enclosure that contains several miniature climate zones, including a coastal fog desert, a rainforest, and an ocean. You might have heard of Biosphere 2 due to two experiments staged there during the 1990s: in each experiment, a group of people was sealed in the enclosure. The participants harvested their own food and weren’t supposed to receive any matter from outside. The first experiment lasted for two years. The group, though, ran out of oxygen, which a support crew pumped in. Research at Biosphere 2 contributes to our understanding of ecosystems and space colonization.

Fascinating as the landscape inside Biosphere 2 is, so is the landscape outside. The winter school included an afternoon hike, and my husband and I explored the territory around the enclosure.

Did you see any snakes? my best friend asked after I returned home.

No, I said. But we were chased by a vicious beast. 

On our first afternoon, my husband and I followed an overgrown path away from the biosphere to an almost deserted-looking cluster of buildings. We eventually encountered what looked like a warehouse from which noises were emanating. Outside hung a sign with which I resonated.

Scientists, I thought. Indeed, a researcher emerged from the warehouse and described his work to us. His group was preparing to seal off a building where they were simulating a Martian environment. He also warned us about the territory we were about to enter, especially the creature that roosted there. We were too curious to retreat, though, so we set off into a ghost town.

At least, that’s what the other winter-school participants called the area, later in the week—a ghost town. My husband and I had already surveyed the administrative offices, conference center, and other buildings used by biosphere personnel today. Personnel in the 1980s used a different set of buildings. I don’t know why one site gave way to the other. But the old buildings survive—as what passes for ancient ruins to many Americans. 

Weeds have grown up in the cracks in an old parking lot’s tarmac. A sign outside one door says, “Classroom”; below it is a sign that must not have been correct in decades: “Class in progress.” Through the glass doors of the old visitors’ center, we glimpsed cushioned benches and what appeared to be a diorama exhibit; outside, feathers and bird droppings covered the ground. I searched for a tumbleweed emoji, to illustrate the atmosphere, but found only a tumbler one: 🥃.

After exploring, my husband and I rested in the shade of an empty building, drank some of the water we’d brought, and turned around. We began retracing our steps past the defunct visitors’ center. Suddenly, a monstrous Presence loomed on our right. 

I can’t tell you how large it was; I only glimpsed it before turning and firmly not running away. But the Presence loomed. And it confirmed what I’d guessed upon finding the feathers and droppings earlier: the old visitors’ center now served as the Lair of the Beast.

The Mars researcher had warned us about the aggressive male turkey who ruled the ghost town. The turkey, the researcher had said, hated men—especially men wearing blue. My husband, naturally, was wearing a blue shirt. You might be able to outrun him, the researcher added pensively.

My husband zipped up his black jacket over the blue shirt. I advised him to walk confidently and not too quickly. Hikes in bear country, as well as summers at Busch Gardens Zoo Camp, gave me the impression that we mustn’t run; the turkey would probably chase us, get riled up, and excite himself to violence. So we walked, and the monstrous turkey escorted us. For surprisingly and frighteningly many minutes. 

The turkey kept scolding us in monosyllabic squawks, which sounded increasingly close to the back of my head. I didn’t turn around to look, but he sounded inches away. I occasionally responded in the soothing voice I was taught to use on horses. But my husband and I marched increasingly quickly.

We left the old visitors’ center, curved around, and climbed most of a hill before ceasing to threaten the turkey—or before he ceased to threaten us. He squawked a final warning and fell back. My husband and I found ourselves amid the guest houses of workshops past, shaky but unmolested. Not that the turkey wreaks much violence, according to the Mars researcher: at most, he beats his wings against people and scratches up their cars (especially blue ones). But we were relieved to return to civilization.

Afternoon hike at Catalina State Park, a drive away from Biosphere 2. (Yes, that’s a KITP hat.)

The ITAMP winter school reminded me of Roughing It, a Mark Twain book I finished this year. Twain chronicled the adventures he’d experienced out West during the 1860s. The Gold Rush, he wrote, attracted the top young men of all nations. The quantum-technologies gold rush has been attracting the top young people of all nations, and the winter school evidenced their eagerness. Yet the winter school also evidenced how many women have risen to the top: 10 of the 24 registrants were women, as were four of the seven lecturers.1 

The winter-school participants in the shuttle I rode from the Tucson airport to Biosphere 2

We’ll see to what extent the quantum-technologies gold rush plays out like Mark Twain’s. Ours at least involves a ghost town and ferocious southwestern critters.

1For reference, when I applied to graduate programs, I was told that approximately 20% of physics PhD students nationwide were women. The percentage of women drops as one progresses up the academic chain to postdocs and then to faculty members. And primarily PhD students and postdocs registered for the winter school.

The rain in Portugal

My husband taught me how to pronounce the name of the city where I’d be presenting a talk late last July: Aveiro, Portugal. Having studied Spanish, I pronounced the name as Ah-VEH-roh, with a v partway to a hard b. But my husband had studied Portuguese, so he recommended Ah-VAI-roo.

His accuracy impressed me when I heard the name pronounced by the organizer of the conference I was participating in—Theory of Quantum Computation, or TQC. Lídia del Rio grew up in Portugal and studied at the University of Aveiro, so I bow to her in matters of Portuguese pronunciation. I bow to her also for organizing one of the world’s largest annual quantum-computation conferences (with substantial help—fellow quantum physicist Nuriya Nurgalieva shared the burden). Moreover, Lídia cofounded Quantum, a journal that’s risen from a Gedankenexperiment to a go-to venue in six years. So she gives the impression of being able to manage anything.

Aveiro architecture

Watching Lídia open TQC gave me pause. I met her in 2013, the summer before beginning my PhD at Caltech. She was pursuing her PhD at ETH Zürich, which I was visiting. Lídia took me dancing at an Argentine-tango studio one evening. Now, she’d invited me to speak at an international conference that she was coordinating.

Lídia and me in Zürich as PhD students
Lídia opening TQC

Lídia wasn’t the only speaker who gave me pause; so did the three other invited speakers. I’d met every one of them when each of us was a grad student or a postdoc.

Richard Küng described classical shadows, a technique for extracting information about quantum states via measurements. Suppose we wish to infer diverse properties of a quantum state \rho (diverse observables’ expectation values). We have to measure many copies of \rho—some number n of copies. The community expected n to grow exponentially with the system’s size—for instance, with the number of qubits in a quantum computer’s register. We can get away with far fewer copies, Richard and collaborators showed, by randomizing our measurements. 
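To give a flavor of the randomization, here’s a minimal single-qubit sketch of the classical-shadows recipe—a toy illustration, not Richard’s full construction, and every name and parameter in it is mine. We simulate measuring copies of a state \rho in randomly chosen Pauli bases, then invert the measurement channel to form “shadow” snapshots whose average estimates any observable’s expectation value:

```python
import numpy as np

# Toy single-qubit classical shadows: measure copies of rho in random Pauli
# bases, then build snapshots rho_hat = 3|b><b| - I (the inverted measurement
# channel). Averaging Tr(O rho_hat) over snapshots estimates Tr(O rho).

rng = np.random.default_rng(0)

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def shadow_estimate(rho, observable, n_snapshots):
    """Estimate Tr(observable @ rho) from random Pauli-basis measurements."""
    total = 0.0
    for _ in range(n_snapshots):
        basis = [X, Y, Z][rng.integers(3)]       # pick a random Pauli basis
        _, evecs = np.linalg.eigh(basis)
        probs = np.array([np.real(v.conj() @ rho @ v) for v in evecs.T])
        probs = np.clip(probs, 0, 1)
        b = rng.choice(2, p=probs / probs.sum()) # Born-rule outcome
        v = evecs[:, b]
        snapshot = 3 * np.outer(v, v.conj()) - I2  # inverted channel
        total += np.real(np.trace(observable @ snapshot))
    return total / n_snapshots

rho = np.array([[1, 0], [0, 0]], dtype=complex)  # the state |0><0|
est = shadow_estimate(rho, Z, 5000)              # true value: <Z> = 1
```

Each snapshot is a wildly inaccurate caricature of \rho, but the snapshots are unbiased, so their average converges on the true expectation value.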

Richard postdocked at Caltech while I was a grad student there. Two properties of his stand out in my memory: his describing, during group meetings, the math he’d been exploring and the Austrian accent in which he described that math.

Did this restaurant’s owners realize that quantum physicists were descending on their city? I have no idea.

Also while I was a grad student, Daniel Stilck França visited Caltech. Daniel’s TQC talk conveyed skepticism about whether near-term quantum computers can beat classical computers in optimization problems. Near-term quantum computers are NISQ (noisy, intermediate-scale quantum) devices. Daniel studied how noise (particularly local depolarizing noise) propagates through NISQ circuits. Imagine a quantum computer suffering from a 1% noise error. The quantum computer loses its advantage over classical competitors after 10 layers of gates, Daniel concluded. Nor does he expect error mitigation—a bandaid en route to the sutures of quantum error correction—to help much.
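To see why shallow circuits hit a wall so quickly, here’s a back-of-the-envelope toy—emphatically not Daniel’s analysis, and the 1%-per-gate rate and ten-qubit register are my illustrative assumptions. If each gate errs independently with probability p, the chance that an entire circuit runs error-free shrinks geometrically with the gate count:

```python
# Toy model: a constant per-gate error rate erodes a circuit's signal
# geometrically. Assumes one gate per qubit per layer, errors independent.

def error_free_probability(p: float, qubits: int, layers: int) -> float:
    """Probability that no gate errs anywhere in the circuit."""
    return (1.0 - p) ** (qubits * layers)

p = 0.01      # hypothetical 1% error per gate
qubits = 10   # hypothetical register size

for layers in (1, 5, 10, 20):
    print(layers, round(error_free_probability(p, qubits, layers), 3))
    # → 1 0.904 / 5 0.605 / 10 0.366 / 20 0.134
```

In this crude picture, by ten layers the ten-qubit circuit runs cleanly only about a third of the time—suggestive of why, in Daniel’s far more careful accounting, the quantum advantage evaporates around that depth.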

I’d coauthored a paper with the fourth invited speaker, Adam Bene Watts. He was a PhD student at MIT, and I was a postdoc. At the time, he resembled the 20th-century entanglement guru John Bell. Adam still resembles Bell, but he’s moved to Canada.

Adam speaking at TQC
From a 2021 Quantum Frontiers post of mine. I was tickled to see that TQC’s organizers used the photo from my 2021 post as Adam’s speaker photo.

Adam distinguished what we can compute using simple quantum circuits but not using simple classical ones. His results fall under the heading of complexity theory, about which one can rarely prove anything. Complexity theorists cling to their jobs by assuming conjectures widely expected to be true. Atop the assumptions, or conditions, they construct “conditional” proofs. Adam proved unconditional claims in complexity theory, thanks to the simplicity of the circuits he compared.

In my estimation, the talks conveyed cautious optimism: according to Adam, we can prove modest claims unconditionally in complexity theory. According to Richard, we can spare ourselves trials while measuring certain properties of quantum systems. Even Daniel’s talk inspired more optimism than he intended: a few years ago, the community couldn’t predict how well noisy, short-depth quantum circuits would perform. So his defeatism, rooted in evidence, marks an advance.

Aveiro nurtures optimism, I expect most visitors would agree. Sunshine drenches the city, and the canals sparkle—literally sparkle, as though devised by Elsa at a higher temperature than usual. Fresh fruit seems to wend its way into every meal.1 Art nouveau flowers scale the architecture, and fanciful designs pattern the tiled sidewalks.

What’s more, quantum information theorists of my generation were making good. Three riveted me in their talks, and another co-orchestrated one of the world’s largest quantum-computation gatherings. To think that she’d taken me dancing years before ascending to the global stage.

My husband and I made do, during our visit, by cobbling together our Spanish, his Portuguese, and occasional English. Could I hold a conversation with the Portuguese I gleaned? As adroitly as a NISQ circuit could beat a classical computer. But perhaps we’ll return to Portugal, and experimentalists are doubling down on quantum error correction. I remain cautiously optimistic.

1As do eggs, I was intrigued to discover. Enjoyed a hardboiled egg at breakfast? Have a fried egg on your hamburger at lunch. And another on your steak at dinner. And candied egg yolks for dessert.

This article takes its title from a book by former US Poet Laureate Billy Collins. The title alludes to a song in the musical My Fair Lady, “The Rain in Spain.” The song has grown so famous that I don’t think twice upon hearing the name. “The rain in Portugal” did lead me to think twice—and so did TQC.

With thanks to Lídia and Nuriya for their hospitality. You can submit to TQC2024 here.