When I went to school in the 20th century, “quantum measurements” in the laboratory were typically performed on ensembles of similarly prepared systems. In the 21st century, it is becoming increasingly routine to perform quantum measurements on single atoms, photons, electrons, or phonons. The 2012 Nobel Prize in Physics recognizes two of the heroes who led these revolutionary advances, Serge Haroche and Dave Wineland. Good summaries of their outstanding achievements can be found at the Nobel Prize site and at Physics Today.
Serge Haroche developed cavity quantum electrodynamics in the microwave regime. Among other impressive accomplishments, his group has performed “nondemolition” measurements of the number of photons stored in a cavity (that is, the photons can be counted without any of the photons being absorbed). The measurement is done by preparing a rubidium atom in a superposition of two quantum states. As the Rb atom traverses the cavity, the energy splitting of these two states is slightly perturbed by the cavity’s quantized electromagnetic field, resulting in a detectable phase shift that depends on the number of photons present. (Caltech’s Jeff Kimble, the Director of IQIM, has pioneered the development of analogous capabilities for optical photons.)
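To make the idea concrete, here is a toy illustration of my own (not the actual Paris experiment or its data analysis): suppose each probe atom picks up a phase n·φ0 proportional to the photon number n, a Ramsey sequence converts that phase into the probability of detecting the atom in its upper state, and a Bayesian update over the detection record pins down n without a single photon being absorbed. The per-photon phase φ0, the photon-number range, and the detection model below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

n_max = 7        # consider photon numbers 0..n_max
phi0 = 0.7       # assumed phase shift per photon (radians)
true_n = 3       # the "unknown" photon number stored in the cavity

def p_detect_e(n, phi_ramsey):
    """Probability of finding a probe atom in its upper state after the
    Ramsey sequence, given n photons: the cavity shifts the fringe by n*phi0."""
    return 0.5 * (1.0 + np.cos(n * phi0 + phi_ramsey))

posterior = np.ones(n_max + 1) / (n_max + 1)   # flat prior over photon number

for atom in range(50):                          # send 50 probe atoms through
    phi_r = rng.uniform(0.0, 2.0 * np.pi)       # vary the Ramsey reference phase
    outcome_e = rng.random() < p_detect_e(true_n, phi_r)
    # Bayesian update: likelihood of this atom's outcome for each candidate n
    likelihood = np.array([p_detect_e(n, phi_r) if outcome_e
                           else 1.0 - p_detect_e(n, phi_r)
                           for n in range(n_max + 1)])
    posterior *= likelihood
    posterior /= posterior.sum()

print("posterior over photon number:", np.round(posterior, 3))
print("most likely n:", int(np.argmax(posterior)))
```

After a few dozen simulated atoms the posterior collapses onto the true photon number. That is the essence of a nondemolition counting measurement: information leaves the cavity through the atoms’ phases, not through absorption.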
Dave Wineland developed the technology for trapping individual atomic ions or small groups of ions using electromagnetic fields, and controlling the ions with laser light. His group performed the first demonstration of a coherent quantum logic gate, and they have remained at the forefront of quantum information processing ever since. They pioneered and mastered the trick of manipulating the internal quantum states of the ions by exploiting the coupling between these states and the quantized vibrational modes (phonons) of the trapped ions. They have also used quantum logic to realize the world’s most accurate clock (17 decimal places of accuracy), which exploits the frequency stability of an aluminum ion by transferring its quantum state to a magnesium ion that can be more easily detected with lasers. This clock is sensitive enough to detect the slowing of time due to the gravitational red shift when lowered by 30 cm in the earth’s gravitational field.
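As a rough sanity check (my back-of-the-envelope arithmetic, not NIST’s analysis), the fractional frequency shift from a height change Δh near the earth’s surface is gΔh/c², which for 30 cm works out to a few parts in 10^17, right at the edge of what a 17-digit clock can resolve:

```python
# Fractional frequency shift from the gravitational red shift: df/f = g * dh / c^2
g = 9.81        # m/s^2, local gravitational acceleration
dh = 0.30       # m, the 30 cm height change
c = 2.998e8     # m/s, speed of light

shift = g * dh / c**2
print(f"fractional frequency shift: {shift:.1e}")   # ~3e-17
```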
With his signature mustache and self-effacing manner, Dave Wineland is not only one of the world’s greatest experimental physicists, but also one of the nicest. His brilliant experiments and crystal clear talks have inspired countless physicists working in quantum science, not just ion trappers but also those using a wide variety of other experimental platforms.
Dave has spent most of his career at the National Institute of Standards and Technology (NIST) in Boulder, Colorado. I once heard Dave say that he liked working at NIST because “in 30 years nobody told me what to do.” I don’t know whether that is literally true, but if it is even partially true it may help to explain why Dave joins three other NIST-affiliated physicists who have received Nobel Prizes: Bill Phillips, Eric Cornell, and “Jan” Hall.
I don’t know Serge Haroche very well, but I once spent a delightful evening sitting next to him at dinner in an excellent French restaurant in Leiden. The occasion, almost exactly 10 years ago, was a Symposium to celebrate the 100th anniversary of H. A. Lorentz’s Nobel Prize in Physics, and the dinner guests (there were about 20 of us) included the head of the Royal Dutch Academy of Sciences and the Rector Magnificus of the University of Leiden (which I suppose is what we in the US would call the “President”). I was invited because I happened to be a visiting professor in Leiden at the time, but I had not anticipated such a classy gathering, so had not brought a jacket or tie. When I realized what I had gotten myself into I rushed to a nearby store and picked up a tie and a black V-neck sweater to pull over my Levi's, but I was under-dressed, to put it mildly. Looking back, I don’t understand why I was not more embarrassed.
Anyway, among other things we discussed, Serge filled me in on the responsibilities of a Professor at the Collège de France. It’s a great honor, but also a challenge, because each year one must lecture on fresh material, without repeating any topic from lectures in previous years. In 2001 he had taught quantum computing using my online lecture notes, so I was pleased to hear that I had eased his burden, at least for one year.
On another memorable occasion, Serge and I both appeared in a panel discussion at a conference on quantum computing in 1996, at the Institute for Theoretical Physics (now the KITP) in Santa Barbara. Serge and a colleague had published a pessimistic article in Physics Today, “Quantum computing: dream or nightmare?” In his remarks for the panel, he repeated this theme, warning that overcoming the damaging effects of decoherence (uncontrolled interactions with the environment which make quantum systems behave classically, and which Serge had studied experimentally in great detail) is a far more daunting task than theorists had imagined. I struck a more optimistic note, hoping that the (then) recently discovered principles of quantum error correction might be the sword that could slay the dragon. I’m not sure how Haroche feels about this issue now. Wineland, too, has often cautioned that the quest for large-scale quantum computers will be a long and difficult struggle.
This exchange provided me with an opportunity to engage in some cringe-worthy rhetorical excess when I wrote up a version of my remarks. Having (apparently) not learned my lesson, I’ll quote the concluding paragraph, which somehow seems appropriate as we celebrate Haroche’s and Wineland’s well earned prizes:
“Serge Haroche, while a leader at the frontier of experimental quantum computing, continues to deride the vision of practical quantum computers as an impossible dream that can come to fruition only in the wake of some as yet unglimpsed revolution in physics. As everyone at this meeting knows well, building a quantum computer will be an enormous technical challenge, and perhaps the naysayers will be vindicated in the end. Surely, their skepticism is reasonable. But to me, quantum computing is not an impossible dream; it is a possible dream. It is a dream that can be held without flouting the laws of physics as currently understood. It is a dream that can stimulate an enormously productive collaboration of experimenters and theorists seeking deep insights into the nature of decoherence. It is a dream that can be pursued by responsible scientists determined to explore, without prejudice, the potential of a fascinating and powerful new idea. It is a dream that could change the world. So let us dream.”
I think I am a year or two younger than John Preskill, but what strikes me as interesting is that all those old 20th-century textbooks from which I learned quantum mechanics rarely mentioned that all the experiments supporting QM were on ensembles. I guess I was a naive theorist, but in the beginning I thought all those descriptions of single particles in a superposition of states were actually supported by experiments and not just extrapolations from ensemble experiments.
Anyway, glad to see these guys win a Nobel Prize. And one of them, Haroche, has a nice, easy-to-read textbook on much of his work: Exploring the Quantum: Atoms, Cavities, and Photons.
Apologies, Gentlemen, but I don’t understand this “novelty about being individual”. Since the very beginning, the 1920s, quantum mechanics excelled especially in describing small microscopic, i.e. “individual”, systems, but since the beginning its description of these small systems was statistical in character, so to verify the predictions – the predicted probabilities – one needed to repeat the same experiment many times (or do many experiments, or many of “anything”, after which the probabilities are converted to observables with a small error margin). Does someone really claim here that something has changed about these basic facts? I think that “individual” really means “attempting to delay the measurement/decoherence as much as possible”, but it’s still true that the individual microscopic events are only predicted probabilistically, right?
Dear Lubos, could you elaborate on your question? Were those microscopic events predicted probabilistically in the works under consideration (and is that the purpose of such experiments, after all)? And can statistical methods be successfully applied to the control of individual quantum systems, and if so, how?
Yes, quantum mechanics makes probabilistic predictions concerning the possible outcomes of quantum measurements — that hasn’t changed. But once we can make repeated measurements of a single quantum system like a trapped ion, new phenomena occur.
For example, with lasers one can simultaneously drive the transitions 1 -> 2 and 1 -> 3 in a single atom, where 1 is the ground state, and the decay 3 -> 1 is fast while the decay 2 -> 1 is slow. The atom glows due to the rapid 1 -> 3 -> 1 cycling transition, as photons are absorbed from the laser and then re-emitted, but intermittently the atom turns dark because it has become “shelved” in the long-lived level 2. In this and other ways, repeated observations of a single atom are quite different from observations of an ensemble of many atoms. (For a discussion, see for example: http://tf.nist.gov/general/pdf/1098.pdf.)
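Here is a toy illustration of that intermittent fluorescence (the rates and count levels below are made-up numbers, and a faithful treatment would use the full quantum-jump or master-equation formalism): a single atom’s detected photon count blinks between bright stretches, while the fast 1 -> 3 -> 1 cycle scatters photons, and dark stretches, while the atom sits shelved in the long-lived level 2.

```python
import numpy as np

rng = np.random.default_rng(0)

dt = 0.01            # time bin (s), illustrative
shelve_rate = 1.0    # assumed rate of jumping into the long-lived level 2 (1/s)
deshelve_rate = 3.0  # assumed rate of returning from level 2 to level 1 (1/s)
bright_mean = 20     # mean photon counts per bin while the atom cycles 1->3->1
dark_mean = 0.2      # mean background counts per bin while shelved in level 2

shelved = False
signal = []
for _ in range(2000):                               # 20 s of simulated observation
    if shelved and rng.random() < deshelve_rate * dt:
        shelved = False                             # quantum jump: atom brightens again
    elif not shelved and rng.random() < shelve_rate * dt:
        shelved = True                              # quantum jump: atom goes dark
    signal.append(rng.poisson(dark_mean if shelved else bright_mean))

signal = np.array(signal)
print("fraction of dark bins:", np.mean(signal < 5))
```

Averaged over an ensemble, the jumps would wash out into a steady mean fluorescence; only by watching one atom repeatedly do the individual quantum jumps become visible as a random telegraph signal.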
You are right that an important goal is to delay decoherence, and in particular to preserve coherence in a system with many atoms (or qubits realized by other physical systems), where we can control the atoms individually. Such individual control is essential for operating a quantum computer.
Thanks, Prof. Preskill, understood. So the new essence is that the repetition needed to do statistics is done with a single system instead of with many parallel ones. Still, it’s governed by the same statistical laws. In this business, there’s the question of whether it’s pure science or applied science. I would still think that whatever is new here is new as applied science only, and that can only be properly evaluated as a real advance once it becomes useful.
Lubos, how can one describe an individual system governed by statistical laws without the risk of producing a hidden-variable model based on such governance?
Dear Alexander, I am not sure whether I understand what you wrote. But if I do, there is no risk of producing a *viable* hidden-variable model, because such models either contradict locality, which follows from special relativity, or they fail to predict correlations exceeding Bell’s bound, so they’re dead. But if they were alive, it would be interesting, not a “risk”.
Dear Lubos, maybe it is a trivial remark, but I think it is better to say that an individual quantum system is governed by quantum laws, not by statistical ones.
I heard that the gravitational wave detection community (Braginsky and maybe some other people) first proposed QND in the 1970s. Is it true that Haroche demonstrated it for the first time in the 1990s, and that there was no experimental verification of this idea for 20 years?