3. The First Quantum Physicists

By Zeus, Soddy, they’d have us out as alchemists!

—ERNEST RUTHERFORD


As A COLLEGE freshman I had my first contact with the quantum theory by purchasing a copy of Quantum Mechanics by Leonard Schiff, who became my teacher in graduate school. I read his book and worked out the problems. Quantum mechanics was for me an exercise in solving differential equations. To my freshman’s mind, unencumbered by any bias from the older, classical physics, the quantum theory presented no problems. It was simply an abstract mathematical description of atomic processes. I had no sense of the “quantum weirdness” of the atomic world; it was the earlier theory of special relativity, with its space contractions and time dilations, that seemed bizarre to me. But as I continued my study my reaction reversed—relativity seemed less bizarre and more in accord with common sense, while the quantum theory seemed more and more “weird.” Pursuing the mathematics of the quantum theory, I felt pushed beyond common sense into unimagined areas. Later, I found out that my experience paralleled that of the physicists who first discovered the new quantum theory. First they discovered the mathematical equations of the quantum theory which worked experimentally; then they pondered the equations and their meaning for the real world, developing an interpretation which departed radically from naive realism. As I realized what the abstract mathematics of quantum theory was actually saying, the world became a very strange place indeed. I became uncomfortable. I would like to share that discomfort with you.

What is this quantum weirdness? The physics of the new quantum theory can be contrasted with the older Newtonian physics which it replaced. Newton’s laws brought order to the visible world of ordinary objects and events like stones falling, the motion of the planets, the flow of rivers and the tides. The primary characteristics of the Newtonian world view were its determinism—the clockwork universe determined from the beginning to the end of time—and its objectivity—the assumption that stones and planets objectively exist even if we do not directly observe them; turn your back on them and they are still there.

In the quantum theory these common-sense interpretations of the world (like determinism and objectivity) cannot be maintained. Although the quantum world is rationally comprehensible, it cannot be visualized like the Newtonian world. And that is not just because the atomic and subatomic world of quanta is very small, but because the visual conventions we adopt from the world of ordinary objects do not apply to quantum objects. For example, we can visualize that a stone can be both at rest and at a precise place. But it is meaningless to speak of a quantum particle such as an electron resting at a point in space. Furthermore, electrons can materialize in places where Newton’s laws say they can’t be. Physicists and mathematicians have shown that thinking about quantum particles as ordinary objects is in conflict with experiment.

Not only does quantum theory deny the standard idea of objectivity; it has also destroyed the deterministic world view. According to the quantum theory, some events, such as electrons jumping between orbits in atoms, occur at random. There just isn’t any physical law that will ever tell us when an electron is going to jump; the best we can do is to give the probability of a jump. The smallest wheels of the great clockwork, the atoms, do not obey deterministic laws.

The inventors of the quantum theory found yet another contrast with the Newtonian world view—the observer-created reality. They found that the quantum theory requires that what an observer decides to measure influences the measurement. What is actually going on in the quantum world depends on how we decide to observe it. The world just isn’t “there” independent of our observing it; what is “there” depends in part on what we choose to see—reality is partially created by the observer.

These properties of the quantum world—its lack of objectivity, its indeterminacy, and the observer-created reality—which distinguish it from the ordinary world perceived by our senses I refer to as “quantum weirdness.” Einstein resisted quantum weirdness, especially the notion of an observer-created reality. The fact that an observer was directly involved with the outcome of measurements clashed with his deterministic world view that nature was indifferent to human choices.

Something inside of us doesn’t want to understand quantum reality. Intellectually we accept it because it is mathematically consistent and agrees brilliantly with experiment. Yet the mind is not able to rest. The way in which physicists and others have trouble grasping quantum reality reminds me of the way children respond when confronting a concept they do not yet grasp. Jean Piaget, the psychologist, studied this phenomenon in children. If a child of a certain age is shown a collection of transparent vessels, all of very different shape and filled with liquid to the same level, the child thinks that all the vessels hold the same amount of liquid. The child does not yet grasp that the amount of liquid has to do with volume, not just height. If the correct way of viewing the problem is explained, the child will often understand it but then immediately revert to the old way of thinking. Only after a certain age, around six or seven years, is the child able to grasp the relation between amount and volume. Coming to grips with quantum reality is like that. After you think you have grasped it and some picture of quantum reality forms in your mind, you immediately revert to the old, classical way of thinking, just like the children in Piaget’s experiment.

It is important to realize that the microworld of atoms, electrons, and elementary particles is not entirely unlike the classical world, the physical world of naive realism. A single atom can be isolated in a box; electrons and other particles leave tracks on photographic emulsions or in cloud chambers. We can push them around using electric and magnetic fields. Experimentalists can measure certain properties of these tiny objects, such as their mass, electric charge, spin, and magnetization. Physicists, like most people, think of microworld particles in just that way. They are just tiny little things. We can make particle beams of them or bounce them off each other and make them dance to our tune. Where is the quantum weirdness? What is so hard to grasp?

The quantum weirdness comes when you start to ask certain kinds of questions about atoms, electrons, and photons. And it comes only when you ask these special questions and set up experiments to try to answer them. For example, if you try to measure precisely both the position of an electron and its velocity by repeated measurements, you find it can’t be done. Every time you measure its position the velocity changes, and vice versa—the electron has a kind of quantum slipperiness. If the electron were an ordinary object, you would be able to determine simultaneously both its position and its velocity. But the electron is a quantum particle, and the ordinary idea of objectivity fails. Until you start asking detailed questions about quantum particles, such as what is the precise position and velocity of a particle, you can live happily in a paradise of naive realism.
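For readers who want the quantitative statement behind this slipperiness, it is Heisenberg's uncertainty relation, given here in standard modern notation rather than quoted from any derivation above:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar \equiv \frac{h}{2\pi}
```

Here $\Delta x$ and $\Delta p$ are the uncertainties in a particle's position and momentum, and $h$ is Planck's constant. The smaller you make one uncertainty, the larger the other must become; no measurement, however clever, evades the bound.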

Once a person recognizes that the quantum weirdness of the microworld is unavoidable, he can take two attitudes. The first is to forget it and stick to the mathematics of quantum theory. In that way he will find the right answers and make progress in discovering the laws of the microworld. Most theoretical physicists, following the lead of Paul Dirac and Werner Heisenberg, who laid the mathematical foundations of the new quantum theory, take this attitude. The second attitude is the philosopher’s approach, which tries to interpret the quantum weirdness of the microworld in terms of physical reality. They are interested in developing a conceptual picture of the quantum world that is intelligible as well as mathematically consistent. Niels Bohr founded that attitude for modern physics, and he had much to say about the interpretation of reality.

The story of the discovery of the quantum theory began with Max Planck’s determination of the black-body radiation law, the giant first step in 1900. The main feature of the old quantum theory was that it represented attempts on the part of physicists to fit the idea of Planck’s quantum—that there was a discrete element in nature—into classical Newtonian physics. In his work on black-body radiation, Planck introduced a new constant into physics, called h, which was a measure of the amount of discreteness in atomic processes. When Planck did his work, physicists thought that atoms could have any value for their total energy—energy was a continuous variable. But Planck’s quantum hypothesis implied that energy exchange was quantized. Although the introduction of a quantum of energy had no basis in classical physics, it was not yet clear that the new theory required a radical break with classical concepts. Theoretical physicists first tried to reconcile Planck’s quantum hypothesis with classical physics.
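In modern textbook notation (not a quotation from Planck), the quantum hypothesis says that a light source of frequency $\nu$ can exchange energy only in whole multiples of a basic unit set by the new constant:

```latex
E = n\,h\nu, \qquad n = 0, 1, 2, \ldots, \qquad h \approx 6.626 \times 10^{-34}\ \text{J·s}
```

The minuteness of $h$ is why this discreteness goes entirely unnoticed in the everyday world of ordinary objects.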

Physicists are conservative revolutionaries. They do not give up tried and tested principles until experimental evidence—or an appeal to logical and conceptual simplicity—forces them into a new, sometimes revolutionary, viewpoint. Such conservatism is at the core of the critical structure of inquiry. Pseudoscientists lack that commitment to existing principles, preferring instead to introduce all sorts of ideas from the outside. Werner Heisenberg commented, “Modern theory did not arise from revolutionary ideas which have been, so to speak, introduced into the exact sciences from without. On the contrary, they have forced their way into research which was attempting consistently to carry out the program of classical physics—they arise out of its very nature.” The old quantum theory represented a program to reconcile the quanta with classical physics.

Einstein took up Planck’s idea in his 1905 paper on the photoelectric effect. Planck had assumed that the sources of light exchanged quantized energy. Einstein, going a step further, assumed that light was itself quantized—that light consisted of particles called photons. This revolutionary idea broke with the then well-established wave theory of light—reason enough for most physicists to reject it. Other physicists resisted Einstein’s proposal because it explained only the photoelectric effect, which was hardly direct evidence for the photon. But Einstein held firm to the notion of a wave-particle duality for light and attempted to reconcile these apparently contradictory properties of light, but without success.

The theoretical ideas of Planck and Einstein which advanced the quantum theory were a response to experiments which opened a whole new realm of natural phenomena. By the end of the nineteenth century a great number of puzzling new properties of matter were discovered; for the first time, scientists were making direct contact with atomic processes. Roentgen discovered the penetrating X-rays in 1895. Henri Becquerel discovered radioactivity in 1896, and the Curies isolated radium in 1898. In 1897, J. J. Thomson discovered the electron, a new elementary particle. A puzzling discovery was that under certain circumstances atoms emit spectral lines of light. If a substance is heated or if an electric current is passed through a gas of atoms, the substance or gas will emit light. If the spectrum of the light is analyzed by a prism that splits off the various colors, only definite colored lines appear in the spectrum. Neon lights are a familiar example. Each chemical element has a definite and unique set of colored lines, called its line spectrum. No one had any explanation for this phenomenon in the nineteenth century. Yet here lay the experimental clue to the structure of the atom.

Ernest Rutherford was already a famous experimentalist for his discovery of the radioactive transformation of elements with Frederick Soddy when he came to Manchester University. Rutherford and Soddy had found that chemical elements, previously thought to be immutable, changed in the process of radioactivity. Soddy suggested they should call the new process “radioactive transmutation.” Transmutation of the elements, such as lead into gold, was an ancient alchemical dream already discredited by nineteenth-century chemists and physicists. Soddy’s suggestion was met by Rutherford’s sharp reply, “By Zeus, Soddy, they’d have us out as alchemists!” But in fact they had discovered transmutation of the elements.

At Manchester, Rutherford was studying alpha particles, stable positively charged helium nuclei that are emitted by radioactive substances. Rutherford, who did not have the patience to do long hours of counting scintillations on a screen that detected alpha-particle bombardment, unleashed a young assistant, Ernest Marsden, on an experiment. The experiment is beautiful for its simplicity. A radioactive source of alpha particles is placed near a metal foil (Marsden used gold foil). The alpha particles are projectiles, like little bullets being fired at the foil. Most of the alpha particles go straight through the foil and are detected on a screen. However, on a hunch, Rutherford asked Marsden to look for alpha particles that were strongly scattered by the foil and widely deflected. By placing the detecting screen away from the line of sight to the alpha source, Marsden found a few deflected alphas. He observed that some even scattered back toward the alpha source. It would be like firing bullets at a piece of tissue paper only to find some bullets bounced backward. This discovery initiated a series of experiments.

What caused some alpha particles to scatter backward from the gold foil? Rutherford knew the alpha particles were positively charged. In the gold foil these particles would sometimes pass close to the atomic nuclei, also positively charged. Since like charges repel each other, this caused the large deflections of some alpha particles off the atomic nuclei. By carefully studying these deflections, Rutherford determined the major features of atomic structure. A window on the microworld opened.
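The careful study of the deflections can be summarized in a single formula. In standard modern notation (a textbook result, not derived in the text above), Rutherford's scattering law gives the rate at which alpha particles of energy $E$ and charge $ze$ are deflected through an angle $\theta$ by a nucleus of charge $Ze$:

```latex
\frac{d\sigma}{d\Omega} \;=\; \left(\frac{z Z e^{2}}{4E}\right)^{\!2} \frac{1}{\sin^{4}(\theta/2)}
```

(in Gaussian units, with $z = 2$ for the alpha particle). The close agreement of this formula with the observed counts, even at very large angles, is what established that the atom's positive charge is concentrated in a tiny core.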

The idea of atoms, held by many people, was that they were without parts, completely elementary, the end of all material structure—a building block for the rest of matter. While a few theoretical physicists speculated about the possibility of atomic structure, there was no experimental support for such speculations. Rutherford’s simple scattering experiment gave humankind its first glimpse into the structure of the atom.

The picture of the atom Rutherford announced in May 1911 was that most of the mass of the atom was concentrated in a tiny, positively charged core, later called the nucleus, while the negatively charged electrons, with very small mass, formed a large cloud about the nucleus, accounting for the size of the atom. The massive nucleus was ten thousand times smaller than the atom. Rutherford’s atom was like a little solar system with the nucleus as the sun and the electrons as the planets and with electric forces instead of gravity binding the system together.

Although Rutherford’s scattering experiments were compelling, from the standpoint of classical physics his planetary picture of the atom was completely unstable. According to classical physics, the electron in orbit about the nucleus should radiate away its energy in the form of electromagnetic waves and fall rapidly into the nucleus. Physicists knew that according to the laws of classical physics, Rutherford’s atom ought to collapse. But there it was nevertheless. This unsatisfactory situation soon changed dramatically. Around 1912, Rutherford wrote from Manchester to his friend Boltwood, “Bohr, a Dane, has pulled out of Cambridge and turned up here to get some experience in radioactive work.” Niels Bohr, a student of J. J. Thomson’s at Cambridge, actually spent less than half a year at Manchester before returning to his native Copenhagen. However, brief as Bohr’s visit was, Rutherford made a lasting impact on the young Dane.

Bohr, challenged by the problem of atomic structure, took an imaginatively daring step: He simply dispensed with some of the rules of classical physics and instead applied the quantum theory of Planck and Einstein to the problem of atomic structure. Remarkably, the few features of the quantum theory already known at the time could solve the problem—as long as one did not worry about the conflict with classical physics. Bohr simply assumed that the electrons in orbit about the nucleus do not radiate light and that the light emitted by atoms is due to some other physics. He showed that Planck’s idea of energy quantization implied that only specific orbits for the electrons are allowed. In order to ensure the stability of atoms, Bohr postulated a lowest orbit beyond which the orbiting electron could not fall. When an electron drops from a higher orbit to a lower one, thereby losing energy, the atom containing that electron emits light, which carries off the lost energy. Because only certain electron orbits are allowed, only certain jumps of the electron between orbits can take place, and consequently the energy of the emitted light is quantized. Since the energy of light is related to its color, only specific colors of light can be emitted by atoms. In this way Bohr’s theoretical model of the atom accounts for the existence of the mysterious spectral lines. The experimentally observed fact that each different atom emitted light with unique and distinct colors revealed the quantum structure of atoms.
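Bohr's two assumptions can be written compactly in standard modern notation (a textbook summary, not Bohr's own formulation). Only orbits whose angular momentum $L$ is a whole multiple of Planck's constant divided by $2\pi$ are allowed, and a jump between allowed orbits emits light whose frequency $\nu$ is fixed by the energy difference:

```latex
L = n\,\frac{h}{2\pi}, \quad n = 1, 2, 3, \ldots \qquad\qquad h\nu = E_{\text{higher}} - E_{\text{lower}}
```

The first condition picks out the discrete orbits and guarantees a lowest one; the second turns discrete orbits into discrete spectral lines.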

One way of imagining the energy levels of Bohr’s atom is to think of a musical stringed instrument like a harp. Each string when plucked has a definite vibration or sound. Similarly, when an electron jumps orbits in an atom there results the emission of a light wave with a definite vibration or color. That is the origin of the discrete light spectrum.

Bohr applied his novel ideas to the simplest atom, hydrogen, which consists of a single proton with a single electron in orbit about it. The advantage of studying such a simple atom is that the allowed orbits of the electron could be precisely calculated and hence the spectrum of light from hydrogen determined. Bohr’s calculations of the hydrogen light spectrum based on his theoretical model of the atom agreed adequately with the experimentally observed spectrum. Such agreement between theory and experiment could not be an accident. It meant that the combination of ideas Bohr took from the quantum theory really worked—the scientific imagination made its first successful step into the quantum structure of the atoms. The ancient capability of the human mind to comprehend a new environment, in this case the atomic structure of matter, was again powerfully reinforced.
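Bohr's hydrogen calculation is simple enough to repeat. The sketch below is my own illustration, using the Rydberg formula that Bohr's model reproduces (with an approximate value of the Rydberg constant); it computes the wavelengths of hydrogen's visible Balmer lines, the jumps that end on the second orbit:

```python
# Illustration (not from the text): the Bohr model reproduces the Rydberg
# formula for hydrogen's spectral lines,
#     1/wavelength = R * (1/n_final**2 - 1/n_initial**2),
# for an electron dropping from orbit n_initial to the lower orbit n_final.

RYDBERG = 1.0968e7  # Rydberg constant for hydrogen, in 1/m (approximate)

def wavelength_nm(n_initial: int, n_final: int) -> float:
    """Wavelength of light emitted when the electron drops n_initial -> n_final."""
    if n_initial <= n_final:
        raise ValueError("emission requires a drop to a lower orbit")
    inv_wavelength = RYDBERG * (1 / n_final**2 - 1 / n_initial**2)  # in 1/m
    return 1e9 / inv_wavelength  # convert metres to nanometres

# The visible Balmer series: jumps ending on the second orbit.
for n in (3, 4, 5):
    print(f"{n} -> 2: {wavelength_nm(n, 2):.1f} nm")
```

The three wavelengths that come out, roughly 656, 486, and 434 nanometres, match the red, blue-green, and violet lines actually observed in hydrogen's spectrum. That agreement, to good accuracy, is the success Bohr achieved.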

Theoretical physicists seized Bohr’s ideas and applied them to more complicated atoms. But Bohr’s model, like every great scientific advance, raised many new questions—questions that couldn’t be asked before. When does an electron change its orbit and cause light to be emitted from the atom? What causes a particular jump? What direction does the emitted light take off in, and why? These questions troubled Einstein. According to classical physics, the laws of motion precisely determine the future behavior of a physical system like an atom. But atoms emitting light seemed to behave spontaneously and indeterminately. Atoms jump. But why and in what direction? The same spontaneity, Einstein realized, characterized radioactivity.

At first, physicists tried to fit the behavior of atoms into the framework of the classical theory of electromagnetism and made desperate attempts to answer the enigma of the quantum jumps without using light quanta. In 1924, Niels Bohr, Hendrik Kramers, and John Slater wrote an article advocating this approach at the expense of abandoning the laws of energy and momentum conservation at the level of the atom—a revolutionary proposal, because these laws are among the most well-tested physical laws. At the time of this proposal there had been no direct experimental evidence that these conservation laws worked for individual atomic processes. However, it soon came. Arthur H. Compton and A. W. Simon scattered individual photons, the light particles, from electrons. Using a Wilson cloud chamber, a device that displayed the tracks of individual electrons, they verified to a high degree of accuracy the conservation laws for individual atomic processes. For most physicists, these experiments, done in 1925, vindicated Einstein’s 1905 proposal of the light quantum.

Through a multitude of new atomic experiments such as Rutherford’s and Compton’s, the structure of the atom was revealed. These experiments forced theoretical physicists into a new and unfamiliar world; the usual rules of classical physics no longer seemed to work. In the atom the human mind was shown a new message—a new physics revealed in the structure of the atomic microworld. The world view of determinism, supported by centuries of experiment and physical theory, was about to fall.

Bohr accepted the results of the experiments of Compton and Simon—both the correctness of the conservation laws and the existence of the light quantum, or photon. He concluded, in July 1925, “One must be prepared for the fact that the required generalization of the classical electrodynamical theory demands a profound revolution in the concepts on which the description of nature has until now been founded.” Bohr was ready for the revolution. It soon came. The first shot had already been fired on a small island in the North Sea.