Quantum physics is often regarded as obscure and weird. While it can certainly be counterintuitive, the reputation for obscurity is misplaced. Quantum theory explains the interactions of electrons and other subatomic particles, and of photons of light. As such, it provides a key foundation of our understanding of the world in general. Nearly everything we interact with is composed of these quantum particles. Whether we are thinking of matter, light, or phenomena such as electricity and magnetism, these tiny components are at work.
We may never experience quantum objects as separate entities, but quantum phenomena have a huge impact on our lives. It has been estimated that thirty-five percent of GDP in developed countries involves technology—notably electronics, but also materials science, medicine, and more—that could not have been developed without a knowledge of the theory behind the amazing quantum.
So, where does the apparent strangeness come from? That word “quantum” refers to something that comes in chunks rather than being continuous. And the result of applying this chunky approach to the natural world proved a shock to its discoverers. It turned out that quantum entities are very different from the objects that we can see and touch. Quantum particles do not behave like tiny tennis balls. Left to their own devices, they cease to have distinct properties such as location and direction of spin. Instead, they exist solely as an array of probabilities until they interact with something else. Before that interaction takes place, all we can say about a quantum particle is that it has a certain probability of being here, another probability of being there, and so on.
This is very different from the familiar probability of the toss of a coin. When we toss a fair coin, there is a 50/50 chance of it being heads or tails. Fifty percent of the time that we look at the tossed coin, it will be heads, and fifty percent of the time, it will be tails. However, in reality, once the coin has been tossed, it has a specific value with one hundred percent certainty—we just do not know what that value is until we look. But in quantum theory, all that exists until we take a look at the quantum equivalent of a coin is the probabilities.
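That difference can be caricatured in a few lines of Python (a toy illustration, not real quantum mechanics; the function names and the amplitude parameter are invented for the sketch). The classical coin has a definite value from the moment it is tossed; the quantum version has only an amplitude until it is “looked at,” with the Born rule turning the squared amplitude into an outcome probability:

```python
import random

def classical_coin():
    """The value is fixed at the toss; looking merely reveals it."""
    value = random.choice(["heads", "tails"])  # decided before we look
    return value

def quantum_coin(amplitude_heads=0.5 ** 0.5):
    """No value exists before measurement, only an amplitude.
    Born rule: probability of an outcome = |amplitude| squared."""
    p_heads = abs(amplitude_heads) ** 2
    return "heads" if random.random() < p_heads else "tails"

# Over many trials, both give 50/50 statistics
trials = 100_000
frac = sum(quantum_coin() == "heads" for _ in range(trials)) / trials
print(f"heads fraction: {frac:.2f}")
```

The catch, of course, is that for a single coin-like system the two functions produce indistinguishable statistics; it takes the entanglement experiments described later in this book to rule out the hidden “decided before we look” picture.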
It is easy to regard quantum particles as strange. But we need to bear in mind that this is what nature is like. The only reason we think of such behavior as weird is that we are used to the way large-scale objects work—and, in a sense, it is their behavior that is odd, because they do not seem like the ordinary quantum particles that make them up. The biggest struggle that quantum physicists have had over the years has not been with the science, but with finding an interpretation of what is happening that could form a bridge between everyday observations and events at the quantum level. Even today, there is no consensus among physicists on how quantum theory should be interpreted. Many simply accept that the math works well and get on with it, a philosophy known as “shut up and calculate.”
This lack of fixed values for properties of particles did not sit comfortably with some of the earliest scientists involved in quantum theory at the beginning of the twentieth century. Notably, both Max Planck, who came up with the basic concept that light could be quantized, and Albert Einstein, who showed that this quantization was real and not just a useful calculating tool, hated the intrusion of probability into what they felt should be the fixed and measurable reality of nature. Einstein was convinced for his entire career that beneath the apparent randomness and probability there was some structure, something that behaved like the “ordinary” physical world. Yet all the evidence is that he was wrong.
The younger players, starting with Niels Bohr, and people such as Erwin Schrödinger, Werner Heisenberg, Paul Dirac, and Max Born, quantified probability-driven quantum behavior during the 1920s. Their progress was remarkable. These were theoreticians who had little time for experiment. Their ideas could be described as inspired guesswork. And yet the mathematics they developed matched what was later observed in experiments with impressive accuracy.
From the 1930s to the present day, there has been a whole string of technological advances in electronics, the development of the laser, the increasing employment of superconductivity, and more, each of which made direct use of the supposedly weird behavior of quantum particles. It is hard to deny something exists when you build it into gadgets found in every home. And the trigger for quantum physics to move from obscurity to center stage would be World War II.
Many of the key players in the second and third generation of quantum physicists, from Niels Bohr to Richard Feynman, played a significant role in World War II. Their involvement primarily revolved around nuclear fission. In 1938, German physicist Otto Hahn and Austrian physicist Lise Meitner demonstrated nuclear fission, a quantum process, subject to the same influence of probability as other behaviors of quantum particles. In itself, nuclear fission was interesting, but the importance of the process became clear when combined with the idea of the chain reaction. It could either run as a controlled reaction, generating heat, or given its head, it could run away with itself in an ever-increasing cascade, producing a nuclear explosion.
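The split between a controlled reaction and a runaway cascade comes down to a single number: the average count of new fissions each fission triggers. A toy calculation (illustrative only; the generation count is an arbitrary choice) shows how sharply the two behaviors diverge:

```python
def neutron_population(k, generations):
    """Toy chain-reaction model: each fission triggers,
    on average, k new fissions in the next generation."""
    population = 1.0
    for _ in range(generations):
        population *= k
    return population

# Controlled (k = 1): the population holds steady, releasing heat.
print(neutron_population(1.0, 80))

# Runaway (k = 2): after 80 generations the cascade has grown
# by a factor of 2 to the power 80, around 1.2e24 -- an explosion.
print(neutron_population(2.0, 80))
```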
As the world headed unsteadily toward all-out war, there was a fear that Germany—with Denmark and Austria key centers for quantum physics—would produce a nuclear weapon, giving it a terrifying military advantage. In response to this threat, one of the first of the familiar names in the quantum theory story to become involved was Albert Einstein. Einstein was a lifelong pacifist, and it had not occurred to him that the intersection of E = mc² and nuclear decay could produce a devastating bomb. He was asked to sign letters to the US authorities—and President Roosevelt was persuaded into action, setting up the Manhattan Project, which saw the United States produce and deploy the first atomic bombs in 1945.
Many key quantum physicists left continental Europe, either because they had a Jewish background or because they were horrified by the rise of the Nazis. Schrödinger went to Ireland and Born to Scotland. Meitner, who had moved to Stockholm, was invited to join the Manhattan Project, but wanted nothing to do with the bomb. Meanwhile, a young Feynman was recruited into the project. Bohr helped refugee scientists from Germany find new academic homes. He remained in occupied Denmark, but refused to be involved with the German nuclear program. It was in Copenhagen that he was visited by the most controversial of his colleagues, Heisenberg, who led the German project. Exactly what happened in the meeting has never been clear—but it seems likely that Heisenberg hoped to get help from Bohr. Bohr escaped to Sweden in 1943 when it seemed likely he would be arrested. He became a regular presence at Los Alamos, where the US bomb was developed, acting as a consultant.
In the end, Heisenberg failed—whether, as he later claimed, because he did not want to produce a weapon, or because it was simply too difficult. The vast Manhattan Project succeeded, and quantum physics changed the world. Wartime also saw electronics start to take off as early electronic computers were constructed to help with the war effort. The Colossus development at Bletchley Park in the UK went into full operation in 1944, cracking German ciphers, while in the United States, the more sophisticated ENIAC was running by 1946, making calculations for hydrogen bomb development.
These early computers used traditional vacuum tubes, which were fragile, bulky, and needed a lot of energy to run. They were the last leading-edge electronic development for which an appreciation of quantum theory was not essential. It is no surprise that quantum physics came to the fore just one year after ENIAC went live, with the development of the first working transistor. The wartime developments showed the potential for electronics to transform the world, but it took quantum devices to make electronics feasible as mass-market products.
To explore the development of quantum science, and applications from lasers and transistors through superconducting magnets and quantum computers, we will divide the subject into four sections, pulling together fifty-two bite-size articles with features covering key aspects and characters in the development of our quantum understanding of the world.
The first chapter, Foundations, brings in Planck’s initial (and in his words, desperate) invocation of the quantum to explain an odd behavior of hot, glowing objects. We will see how Einstein showed the concept was real, and how the way different atoms give off and absorb a range of colors of light is central to Bohr’s model of a quantum atom. Here, electrons cannot occupy just any orbit, as planets around a star can, but exist only in fixed shells, jumping between them in quantum leaps.
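The link between those fixed shells and the colors atoms give off can be sketched numerically. Each quantum leap between shells releases a photon whose wavelength follows from the shell numbers; for hydrogen this is captured by the Rydberg formula (the function name below is invented for the sketch, and the constant is the standard hydrogen value):

```python
# Rydberg constant for hydrogen, in inverse meters
RYDBERG = 1.0967758e7

def wavelength_nm(n_lower, n_upper):
    """Wavelength of the photon emitted when an electron leaps
    from shell n_upper down to shell n_lower (Rydberg formula)."""
    inv_wavelength = RYDBERG * (1 / n_lower**2 - 1 / n_upper**2)
    return 1e9 / inv_wavelength  # convert meters to nanometers

# Leaps down to shell 2 (the Balmer series) give visible light
for n in (3, 4, 5):
    print(f"shell {n} -> 2: {wavelength_nm(2, n):.0f} nm")
```

The three leaps come out at roughly 656, 486, and 434 nanometers: the familiar red, blue-green, and violet lines in hydrogen’s spectrum.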
We will discover how quantum physics blurs the concepts of a wave and a particle and how the mathematical developments to explain quantum behavior brought probability into our understanding, leading to the taunting thought experiment that is Schrödinger’s cat. We will see how Heisenberg’s uncertainty principle and Pauli’s exclusion principle made it clear that we could never know everything about quantum systems, and how these quantum principles shape the reactions of chemical elements. And we will find out how quantum physics brought in a new property of quantum particles called spin—which has nothing to do with rotating.
In the second chapter, Quantum Behavior, we discover the implications of probability’s involvement and how physicists attempted to reconcile the probabilistic nature of particles with the apparently ordinary behavior of the objects made up of them. We will see how the concepts of fields and infinite seas of negative-energy electrons transformed the mathematical representation of the quantum, and how all the interactions of matter and light came under the quantum banner. We will explore strange quantum concepts such as zero-point energy, quantum tunneling, and experiments where particles appear to travel faster than light.
For the third chapter, Interpretation & Entanglement, we move on to two of the strangest aspects of quantum science. We discover why, uniquely in physics, quantum theory has a wide range of interpretations (even though the mathematical outcomes remain the same, whichever interpretation is used). And, with quantum entanglement, we uncover Einstein’s greatest challenge to quantum theory. He was the first to show that the strange quantum effect of entanglement implies that a measurement on one of a pair of specially linked quantum particles will be instantly reflected in the other particle, even if it is on the opposite side of the universe. Einstein felt that quantum entanglement proved that quantum theory was irreparably flawed, as this “spooky action at a distance” seemed impossible. But experiments have shown that entanglement exists and can be used both for unbreakable encryption and to transfer quantum properties from one particle to another.
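The experiments in question are Bell tests, and the arithmetic behind them is short enough to sketch (assuming the standard CHSH combination of measurements on a pair of entangled spin-½ particles in the so-called singlet state). Any theory of the kind Einstein hoped for, where the particles carry hidden definite values, keeps the quantity S below 2; quantum theory predicts more:

```python
import math

def correlation(a, b):
    """Quantum prediction for the correlation between singlet-state
    spin measurements along directions at angles a and b (radians)."""
    return -math.cos(a - b)

# Standard CHSH measurement angles for the two experimenters
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(correlation(a1, b1) - correlation(a1, b2)
        + correlation(a2, b1) + correlation(a2, b2))
print(S)  # 2 * sqrt(2), about 2.83 -- above the hidden-value limit of 2
```

Real experiments measure values close to this quantum prediction, which is why entanglement is accepted as genuine rather than a flaw in the theory.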
The final chapter, The Amazing Quantum, concentrates on a mix of applications and special quantum states of matter. We discover the purely quantum origins of the laser, transistor, electron microscope, and MRI scanner. The last of these requires superconductivity, a quantum phenomenon that is still not wholly understood. Elsewhere, we see other quantum oddities such as superfluids, which, once started, carry on moving indefinitely and can climb out of a vessel on their own. And we find out why quantum effects turn up in biology, before considering the ultimate quantum challenge. Can quantum physics ever be made compatible with Einstein’s general theory of relativity and its explanation of gravity?
Quantum physics may be strange—but that does not make it incomprehensible, just amazing and wonderful. This is, after all, the science that describes the behavior of the atoms that make you and everything around you—not to mention the light that enables you to see and carries the energy from the Sun that makes life on Earth possible. Oh, and without which we would have no phones or televisions or computers or internet. So, what better subject for a crash course?