The mathematics describing our world depends on universal constants of nature—but what if these are not constant? We may be forced to acknowledge that we as mere mortals are blind to the many dimensions of space–time that must exist beyond our perception.
Nearly two billion years ago, beneath what is now Gabon, Africa, in a region of the country called Oklo, an underground stream filtered through layers of sandstone rock. The water carried uranium and, over the course of many years, this radioactive element settled out into the sandstone and gradually built up into a seam of ore. At some point a geological upheaval thrust this vein onto its side and the flowing water began to erode the sandstone, concentrating the ore even more. Then, 1.7 billion years ago, the deeply buried uranium reached a critical mass and burst into life, becoming a natural nuclear reactor. It turned on and off repeatedly over the next few million years and eventually, as the mix of uranium isotopes changed, the reactor stopped.
In 1972, humans began to mine the ore to supply an increasingly uranium-hungry world, and it was then that scientists first noticed that some of the ore seemed to have already undergone nuclear fission reactions. As they analyzed the ore and pieced together the evidence for Oklo having behaved as a natural reactor, another oddity surfaced. The nature of the nuclear reaction appeared to have changed, and that could only happen if the laws of physics had changed too. Research conducted in 2004 showed that the strength of the force governing the nuclear reaction rate at Oklo had differed from its present-day value by a tiny amount: less than five parts in 100 million.
Ever since the time of Johannes Kepler in the early 17th century, physicists and astronomers have enjoyed unprecedented success in describing nature with mathematics. The equations derived have become our way of understanding the laws of physics and of predicting the behavior of physical systems. Newton’s Theory of Universal Gravitation, for example, was published in 1687. It tells us that the gravitational force between two objects is proportional to the product of their masses and inversely proportional to the square of the distance between them. Is it possible that such a law could change over time? Perhaps one day it may depend upon the cube of the distance, or on just half of one of the masses. This is surely a categorical impossibility. We can look out into the Universe and see celestial objects millions, even billions, of light years away whose behavior we can make sense of by applying the laws of physics as we understand them today. This strongly suggests that the laws apply throughout the whole Universe and that they do not change, or change very little, with time. Any possible changes must be very subtle: they cannot be changes in the mathematics themselves, so the eyes of suspicion fall on the “constants.”
There are many so-called constants in nature. They are the values that cannot be derived from theory, and so can only be determined by measurement. They are used in the laws of physics as conversion factors to create exact mathematical relationships between quantities. In the case of gravity, mass in kilograms and distance in meters are equated to a force in newtons by Newton’s “gravitational constant.” This is often referred to as “Big G” because it is denoted in the equation by a capital G.
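The relationship described above can be written compactly. With the masses in kilograms and the separation in meters, Big G converts the combination into a force in newtons (the value quoted is the modern CODATA figure):

```latex
F = G\,\frac{m_1 m_2}{r^2},
\qquad
G \approx 6.674 \times 10^{-11}\ \mathrm{N\,m^2\,kg^{-2}}
```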
Some of the constants are self-explanatory, such as the speed of light. Others seem more abstruse, such as the Planck constant, which governs the way nature breaks energy up into small “packets.” Despite calling these quantities constants, there has been a creeping suspicion over the last 15 years or so that some of them might be changing slowly with time—particularly the speed of light.
In 1993, physicist John Moffat published his solution to the cosmological horizon problem. This is the tricky observation that the temperature of the cosmic microwave background is virtually the same regardless of which direction we look (see How Did the Universe Form?), so completely disconnected regions of the Universe have somehow reached the same temperature. Traditional physics can only explain this if the Universe was driven into a sudden period of exponential expansion, known as “inflation.” However, inflation does not have any firm foundations in physics, meaning that what actually drove this supposed expansion remains a mystery. This shortcoming led some to look for alternative theories of temperature equalization. Moffat pointed out that if the speed of light had been higher in the past, photons of light could have traveled much further and so could have equalized the temperature across a much wider expanse of space without the need to invoke inflation.
Other physicists used the same idea to perform a new analysis of the cosmological flatness problem (see How Did the Universe Form?), and showed that they could also account for this without inflation, providing that the speed of light was extremely high during the first moments of the Universe’s life and then fell quickly to near its modern value. Astronomers cannot directly test for such a circumstance since they cannot see the fleeting period of time immediately after the Big Bang. But they can study distant quasars—early galaxies powered by matter falling into black holes (see What is a Black Hole?)—in the hopes of catching the last vestiges of any change in the speed of light. In order to detect such a change they look at something called the “fine-structure constant,” which defines the strength of the electromagnetic force in relation to the other forces of nature and determines the pattern of spectral lines from light sources (see What Are Stars Made of?). Its value depends on the speed of light and, importantly, it is what scientists call “dimensionless.”
Physicists have to be careful when drawing conclusions from the measurement of constants that have units attached to them. For example, the speed of light is measured in meters per second, or any other units of length and time chosen. If a variation is measured, the researcher cannot be sure whether it is the speed of light that has truly varied; it could just as well be the rate at which the clock has ticked, or the length of the ruler, that has changed. To avoid this confusion, physicists tend to concentrate on examining dimensionless constants when searching for natural variability. If you measure the ratio of, say, the proton’s mass to the electron’s mass, then the units—kilograms—will cancel out and the resulting constant you get will simply be a number. If something weird happens to the way you define a kilogram, that will be canceled out in the ratio and not affect your conclusion. So, if the value of the ratio changes by even the smallest amount, you can be certain that at least one of those masses is actually changing in some way.
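The proton-to-electron mass ratio mentioned above illustrates the point: the kilograms cancel, leaving a pure number that no change of units can touch.

```latex
\mu = \frac{m_p}{m_e}
    = \frac{1.6726 \times 10^{-27}\ \mathrm{kg}}{9.1094 \times 10^{-31}\ \mathrm{kg}}
    \approx 1836
```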
The fine-structure constant is just such a dimensionless constant. It is obtained by combining the speed of light with the Planck constant of energy and the charge on an electron. It affects the outer structure of each atom, which controls the way the atomic electrons react with passing light beams. If the speed of light were to change with the passage of time, the fine-structure constant would also change, and the characteristic spectral lines of all the atoms would change as well.
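In modern notation the fine-structure constant combines exactly the quantities listed above: the electron charge e, the (reduced) Planck constant ħ, the speed of light c, and the permittivity of free space ε₀. The units cancel, leaving the famous pure number:

```latex
\alpha = \frac{e^2}{4\pi\varepsilon_0 \hbar c} \approx \frac{1}{137.036}
```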
This is exactly what one group of astronomers believe they have seen. In 1999, John Webb of the University of New South Wales used the world’s largest optical telescope to observe 128 quasars out to distances of around 10 billion light years. Webb’s team collected the quasar light, split it into spectra, and looked for the fingerprints of intervening atoms. Their analysis showed that the spectral lines changed in a way that was consistent with the fine-structure constant having increased slightly during the course of cosmic history, by around 1 part in 100,000 during those 10 billion years.
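As a back-of-the-envelope sketch of the scale involved, suppose (purely for illustration; this linear drift is not the team's actual model) that the reported shift accumulated evenly over the lookback time:

```python
# Rough average drift rate implied by Webb's result, assuming
# (illustratively) a linear change over cosmic history.
total_fractional_change = 1e-5   # ~1 part in 100,000
lookback_years = 10e9            # ~10 billion years

rate_per_year = total_fractional_change / lookback_years
print(f"average drift ~ {rate_per_year:.0e} per year")  # ~1e-15 per year
```

A drift of roughly one part in a thousand trillion per year is far below anything a laboratory clock could have caught directly, which is why the quasar method matters.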
Numerous groups are now working to verify, or disprove, the variation of the fine-structure constant. If they confirm it, then scientists will have to decide which of the constants that define it is actually varying. Is it the speed of light, the charge on the electron or the Planck constant of energy? Most people suspect it is the speed of light, because of the way a change could help solve the horizon problem of uniform temperature across the Universe. Whichever it is, the discovery of a changing constant would have enormous consequences for our understanding of the Universe. It would point to physics beyond Einstein, perhaps even to the elusive “theory of everything.”
Most physicists believe that the best candidate for a theory of everything is string theory (see Was Einstein Right?). This complex mathematical theory replaces particles with wiggling strings, but the wiggling takes place in more dimensions than the three we are directly familiar with. We see the strings as particles because, rather like icebergs, there is a lot going on “beneath the surface.” According to string theory, only if all the higher dimensions are taken into account will the values of the physical constants remain truly constant. Hence, string theory allows the constants of nature to appear to change in the dimensions that we perceive. If we could measure such a change, string theory would let us prove the existence of higher dimensions and see how they are behaving.
“There are grounds for cautious optimism that we may now be near the end of the search for the ultimate laws of nature.”
STEPHEN HAWKING CONTEMPORARY PHYSICIST
The strength of gravity has been another target for physicists searching for variations in the constants of nature. The difficulty is that Big G, which encapsulates the strength of gravity, is one of the most elusive constants in nature because despite sculpting the Universe on its largest scales, gravity is the weakest of the forces. This means that Big G is difficult to measure accurately. It was over a century after Newton’s derivation of universal gravity that the first successful measurement of Big G was made, and another century before the realization sank in that this is what had been achieved.
Working in 1797, Henry Cavendish managed to measure the force of gravity generated between two lead balls, one 12 inches in diameter and the other much smaller. He did this using a piece of equipment called a torsion balance, which transformed the small gravitational force between the two weights into a twisting of the apparatus, which could be seen and measured. Cavendish then weighed the smaller lead ball, which gave him the force of the Earth’s gravitational field, and compared the two forces to obtain the density of the Earth. This was a quantity much desired by astronomers of the time, because they could use it to calculate the density of the Solar System’s other objects.
It was only in the late 19th century that scientists came to view Newton’s gravitational constant as something fundamental to science. And then they went back to Cavendish’s data to calculate a value for Big G. Since then Big G has been measured to greater and greater accuracy, even though there have been some hiccups along the way. In 1987, scientists thought Big G was known to an accuracy of 0.013 percent. Improved experiments in 1998 forced this to be reassessed to a lesser accuracy of just 0.15 percent. Even today, the value of Big G is extraordinarily imprecise when compared with the force of electromagnetism, which is known to 2.5 million times greater accuracy than is gravity. It is this lack of precision that has led to most of the speculation about whether the constant might be changing in value slowly over time, in effect changing the strength of gravity. Such a variation would gradually change the orbits of stars and planets, affect the sizes of celestial objects, and even determine how brightly stars shine.
Most recently, lunar laser ranging experiments (see Was Einstein Right?) have shown that the value of Big G cannot have changed by more than one part in a million per year, otherwise their sensitive measurements would have picked up this variation in their 40-year observation of the Moon’s orbit. This does not mean that Big G has remained constant; it simply indicates that any variation has been smaller than one part in a million. So astronomers continue to collect lunar laser ranging data in an attempt to extract ever-finer measurements and search for long-term effects. At the same time, other physicists are searching for temporary changes in the strength of gravity brought on by the movement of Earth around its orbit. This could give us a clue to the physics beyond Einstein.
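Written out, the bound quoted above limits the fractional rate of change of Big G:

```latex
\left|\frac{\dot G}{G}\right| < 10^{-6}\ \text{per year}
```

Over the 40-year span of the lunar laser ranging record, even a drift right at this limit would shift G by less than a few parts in 100,000, which is why ever-finer measurements are needed to push the bound down further.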
Einstein’s theories of relativity rest upon the central tenet that the laws of physics are the same, no matter where or when you are located in the Universe or how you happen to be moving. How to transform what one observer can see into the viewpoint of another is known as the “Lorentz transformation.” It is named after Hendrik Antoon Lorentz, who caught a glimpse of this behavior earlier than Einstein. He derived a mathematical expression to describe it, but did not know how to correctly interpret it. If the constants of nature change, the Lorentz transformation no longer works precisely, and a “Lorentz violation” is said to have taken place.
The great success of Einstein’s laws in predicting behavior in the Universe tells us that any Lorentz violation must be small. (This is no surprise: the tiny variation in Mercury’s orbit, which showed where Newton’s theory diverged from reality, was so small it went unnoticed for about a century and a half after Newton’s theory was published.) String theory, however, allows small Lorentz violations to have taken place in the Big Bang. If they did, these will have imprinted themselves on the fabric of space–time and could make the laws of physics vary not over billions of years but at minuscule levels over the course of a single year, as Earth orbits the Sun and so travels in different directions through space. Think of it a little like the Earth orbiting on a hill, with the slope making it slightly harder for the Earth to climb uphill than to roll downhill. This would show up in the speed at which a ball falls to the ground: it may take slightly longer when the Earth is traveling “uphill” than when it is coming “downhill” six months later. Effectively, as the Earth travels in different directions through space in the course of its orbit, Earth’s gravity will change by a minuscule amount. The obvious way to test for this is to drop objects throughout the year and measure their rate of fall. Comparing measurements taken six months apart should yield the greatest difference, because then the Earth is traveling in opposite directions. The best place to conduct the experiment is in space, because when an object is in free fall, small gravitational variations can be measured very precisely. A number of missions hoping to pursue this research are currently on the drawing board.
Physicists will continue to search for changes in the constants of nature—both long-term and short-term effects—for as long as they believe that string theory is the way to unite gravity with the other forces. By measuring the amount of change, they will be able to home in on the correct version of string theory and understand better its picture of a multi-dimensional Universe. Newton’s theory of gravity is said to have been inspired by watching an apple fall to the ground; earlier, Galileo is said to have dropped objects from tall buildings to discover that all objects fall at the same speed, regardless of their composition or mass. It may not be coincidental that our next breakthrough in understanding the Universe could come from measuring falling objects in orbit.
“In 2056, I think you’ll be able to buy T-shirts on which are printed equations describing the unified laws of our Universe.”
MAX TEGMARK CONTEMPORARY PHYSICIST