Beyond the quantum world lies the realm of information
This will be the last question physics answers—if, that is, it is even possible to do so. Constructing a grand unified theory of forces is all very well, and the hunt for quantum gravity, the theory that unites the science of the very small with the science of the very large, is exciting and useful. But neither of these will answer the fundamental question: what is reality made from?
Some may argue that this quest lies beyond the reaches of science. But it is in the very nature of physics to find answers to seemingly impossible questions. The history of physics is littered with “impossible” tasks that have turned out to be very possible indeed. It is easy to forget that Archimedes stunned the ancient world with his innovative thinking. Whether or not he really worked out a way to tell if the king’s crown was made from pure gold, he achieved enough of a scientific reputation to have his life protected by Roman mandate. Similarly, Newton’s description of how gravity works seems obvious now, but its formulation was truly a tour de force at the time.
St. Augustine described magnetism as not unlike a miracle, but now we know the microscopic processes behind the whole gamut of electromagnetic phenomena. The physics done in the past seems a little prosaic now; indeed, the concepts are so straightforward to us that we learn many of them as children. So, will schoolchildren of the future be bored in lessons on the fundamental nature of reality?
Humans have tried to explore the nature of reality since at least the time of the ancient Greeks, and it is to them that we can trace our own quest. The Greeks had several schools of thought on this question. Perhaps most influential was Plato, who believed in a realm of perfect abstractions of physical entities. Everything in the material world drew its existence from these “ideal forms,” and was but a shadow of its ideal counterpart.
This realm, accessible only through the training of the mind, was not just about physical objects, such as trees or mountains. It also applied to mathematical ideas: Plato envisioned an ideal mathematical reality populated by the five regular geometrical shapes we now call the Platonic solids. These solids created a connection between the mathematical and the physical. In his dialogue Timaeus, for instance, Plato linked the cube to the Earth: “To Earth, then, let us assign the cubic form, for Earth is the most immovable of the four and the most plastic of all bodies, and that which has the most stable bases must of necessity be of such a nature.” A similar logic linked the tetrahedron to fire, the icosahedron to water, the octahedron to air and the dodecahedron to a mysterious “fifth element,” or quintessence.
Though it may sound like mystical mumbo-jumbo, we cannot entirely dismiss the notion of realms of ideals. All the tools of modern physics that have been gathered for use in the quest for the ultimate nature of reality have their root in mathematics—and mathematicians still can’t agree on whether mathematics is an invention of our minds or an abstract world into which mathematicians venture in order to make discoveries.
The mathematician Roger Penrose has suggested that, if we are to understand what reality is, we may need to address this fundamental issue. Our best description of reality might have to involve a kind of trinity, he suggests: physical reality is only discernible because of the “mental reality”—or consciousness—constructed by our brains, and can only be described if we believe that our equations and laws of physics come from some “mathematical reality” that exists in parallel with our physical world.
In a philosophical version of the paper, scissors, stone game, Penrose suggests there is a cyclic dependency between the three realities. Only through the equations of mathematics can we describe the fundamental physical particles such as the electron, so mathematical reality trumps physical reality. But physical reality, in the form of the brain’s neurons, gives rise to mental reality. And because mathematics is abstract, mental reality gives rise to mathematical reality. Mathematical trumps physical, physical trumps mental, and mental trumps mathematical.
There is something, however, that seems to lie beyond all three of these notions of reality—something that pushes our notion of the ultimate nature of reality into an even more abstract realm. That something is information. We hold information in our minds, we manipulate it mathematically, and it is always wedded to physical things: information cannot exist without something—a splash of ink on paper, DNA, a photon of light—to sit on. That’s why, in 1991, the IBM researcher Rolf Landauer made a statement that still sounds odd today: “Information is physical.”
What Landauer meant was that information is not some abstract concept, a convenient shorthand for what gets transferred in communications. Wherever you find information, it is inextricably linked to some physical system. Information is carried in the arrangement of molecules on a strand of DNA, enabling the propagation and evolution of life. It is encoded in the charge on a capacitor in an electrical circuit—allowing us to build the information storage and processing facilities we call computers. It is written into the quantum state of a photon of light, allowing telephone conversations to be sent through optical fibers. Wherever information exists, it takes a physical form.
The idea has come to be known as “Landauer’s principle,” and it has sparked a revolutionary way of thinking about information. If information is physical, could it be that everything physical is actually just information? There are at least three good reasons to believe this to be true. First is the fact that information seems to be spookily connected to laws that govern the universe.
Perhaps the deepest notion in our understanding of the cosmos is special relativity’s assertion that there is an unbreakable speed limit: the speed of light (see What is Time?). This has enabled us to make sense of countless phenomena in astronomy and cosmology. But it may just be that the limited speed of light is a result of the limited speed of information. Is the theory of relativity actually an offshoot of information theory?
Information theory didn’t start out as something profound. It was developed by Claude Shannon, a mathematician and engineer who worked at Bell Labs in the 1940s. The main thrust of his work was to find ways to increase the speed at which information could be squeezed down a telephone wire, or through an electrical circuit. He developed techniques for “compressing” information to optimize this, but also found fundamental limitations. Shannon discovered that each communications channel has a maximum capacity, and there is also a maximum efficiency with which information can be sent without it getting lost in transmission.
The measure of information is the “bit,” which is short for binary digit. Computers, for example, run on the binary number system: every number, every instruction, is encoded as a series of 0s and 1s. Though information can be stored in ways that offer more than two alternatives—DNA uses the four “base” molecules, adenine, thymine, cytosine and guanine, for example—these can always be built from a binary system. The two-alternative system of the bit is the simplest, most fundamental means of storing and transmitting information.
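As a toy illustration of that last point, here is a short sketch (the base-to-bit mapping below is an arbitrary choice made for the example, not anything nature itself uses) showing how a four-letter alphabet such as DNA’s collapses into two bits per symbol:

```python
# Toy illustration: any four-symbol alphabet, such as DNA's bases,
# can be re-expressed in bits. The mapping below is an arbitrary
# choice made for this example.
BASE_TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}

def encode(sequence: str) -> str:
    """Convert a DNA sequence into a string of binary digits."""
    return "".join(BASE_TO_BITS[base] for base in sequence)

print(encode("GATTACA"))  # -> 10001111000100, two bits per base
```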
The other important factor in information theory is the “bandwidth” of the information channel. Whether it is an Internet connection or the connection between the memory and the processor in your computer, the bandwidth provides a measure of how many bits can pass through each second. Any channel for information will contain a certain amount of noise that will cause errors in the transmission of information. When NASA sends radio signals through the Earth’s atmosphere, for instance, the signal can be distorted by atmospheric conditions, turning a 0 into a 1 or vice versa.
Shannon worked out that, given a particular signal-to-noise ratio and bandwidth, there is an upper limit on how fast information can be transmitted through the channel without any loss. The latest cell phone and satellite TV systems work to within 1 percent of this “Shannon limit.” However, they cannot get to it, or get past it. It is a little like the speed of light in relativity: the closer you come to this fundamental limit, the harder it is to do any better.
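Shannon’s limit can be written down compactly. In the standard statement of his result (the symbols here are the textbook ones rather than anything defined earlier in this chapter), a channel of bandwidth B hertz, carrying a signal of power S over noise of power N, can transmit without errors at a rate of at most

$$ C = B \log_2\!\left(1 + \frac{S}{N}\right) $$

bits per second. A channel 1 MHz wide whose signal is a thousand times stronger than its noise, for example, tops out at just under 10 million bits per second, and no amount of clever engineering can push a real system past that figure.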
Why should information be like the speed of light? Is it because, like light, information is related to the fundamental underlying structure of physical reality? That is certainly what a growing number of researchers believe—especially those who research black holes.
Black holes are the second reason to think information is part of the answer to our question about the nature of reality. Nothing that falls within the spherical region known as the “event horizon” surrounding a black hole ever escapes. That means black holes are, effectively, sinks for information. Everything they swallow has information encoded on it in the form of atomic states, spins of particles and so on. So what happens to that information?
In the 1970s, Stephen Hawking showed that black holes slowly evaporate, emitting “Hawking radiation.” The trouble is, this radiation does not contain any information. The laws of physics dictate that information, like energy, cannot be destroyed, which means it must go somewhere. After decades of debate, physicists now believe that the information is encoded in the microscopic structure of space and time at the black hole’s event horizon.
Since the event horizon is a 2D structure—the surface of a sphere that surrounds the black hole—that means the information describing 3D objects such as atoms can be encoded on a 2D surface. Extrapolating from this idea, some researchers have demonstrated that the whole universe can be viewed in the same way. The boundary of our universe is essentially the 2D surface of a sphere. The information that appears to exist within the sphere could actually be held on the surrounding 2D surface. Just as an apparently 3D hologram results from a carefully designed projection of light onto a 2D surface, our 3D reality could well be a hologram projected from information held at the edge of the universe. In other words, everything you think of as physical stems from information.
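This idea has a standard quantitative counterpart, worth stating as a formula (it is the textbook Bekenstein–Hawking result, not something peculiar to the argument above): the entropy of a black hole, and hence the amount of information its horizon can hold, grows with the area A of the horizon rather than with the volume it encloses,

$$ S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar}, $$

which works out at roughly one bit of information for every few patches of horizon a Planck length (about 10⁻³⁵ meters) on a side.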
There is even a hint of experimental support for this idea. In 2008, US particle physicist Craig Hogan was trying to work out how we might test the holographic projection idea. He worked out that the boundary of the universe could hold only a limited amount of information, and that, when the information was projected into the 3D space of the universe, this limit would manifest as a kind of pixelation effect in our physical reality. We would, effectively, see the dots; space and time, Hogan suggested, should look grainy if you could view them on small enough scales.
The kinds of scales involved mean it would only be detectable in the most sensitive instruments we have—the gravitational wave detectors looking for the ripples in space and time that result from violent cosmological events such as the collision of two black holes. And so Hogan sent his best idea of how the graininess of space–time would affect these instruments to the scientists at GEO600, a gravitational wave detector sited in Hanover, Germany.
As it turned out, the GEO600 researchers had been having problems with noise in their detectors. And that noise had exactly the same characteristics as Hogan’s anticipated signal. It’s not yet conclusive evidence, but it suggests that the “holographic principle”—that everything is ultimately composed of information that resides at the edge of the universe—is at least worth taking seriously.
The third reason to consider information so important comes from quantum theory, our best set of rules for how things behave on subatomic scales. Quantum theory has been astonishingly successful, its predictions matching experiments without fail. But it is not the final answer to understanding the nature of reality. Though it provides a way to describe what happens in subatomic systems, it does not tell us why things behave as they do (see What Happened to Schrödinger’s Cat?). In fact, it leaves us perplexed about many aspects of the behavior of these systems, giving room for philosophers to wax lyrical about the absence of objective reality, and the limits of experimental science.
There are more than half a dozen philosophical interpretations of the limited view that quantum theory gives us. There is no way to choose between them because all of them are consistent with all the experiments. The only way out, it seems, is to find what lies beneath quantum theory—and that appears to be information. There is an obvious link between quantum theory and information: with bits and quanta, both information theory and quantum theory rely on a fundamental, indivisible quantity. But there is also a more subtle connection. The strangeness of the quantum world could arise from limits on the amount of information carried by a quantum particle.
One reason for thinking this is the Heisenberg uncertainty principle, which says that, if you know some things about a quantum system with perfect accuracy, there are other things that you cannot know at all (see Is Everything Ultimately Random?). Heisenberg deduced his principle from the equations of quantum theory, and we have so far had to accept that this is “just the way it is.” By considering aspects of information theory, however, we can achieve a somewhat more satisfactory explanation.
A quantum particle such as an electron has a property called spin, which is binary (up or down) and can be measured in any of the three spatial dimensions. If an electron’s spin can only carry one bit of information, the first measurement on the electron will use that bit up; there is no more spin information available to measurements in the other dimensions. The outcome of any such subsequent measurements will be random—exactly what is predicted by the Heisenberg uncertainty principle.
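A minimal numerical sketch makes the point, using nothing more than the textbook rules for a spin-1/2 particle (the code is a toy illustration of those rules, not of any particular information-based formalism): prepare the spin so that it is definitely “up” along the z-axis, and the same rules that make a z-measurement certain make an x-measurement a pure coin flip.

```python
import numpy as np

# Spin-1/2 "up" state along z, and the "up" state along x.
up_z = np.array([1, 0], dtype=complex)
up_x = np.array([1, 1], dtype=complex) / np.sqrt(2)

# The electron's single "bit" of spin information is spent on z: it is definitely up_z.
state = up_z

def probability(outcome, state):
    """Born rule: probability that a measurement finds `state` in `outcome`."""
    return abs(np.vdot(outcome, state)) ** 2

print(probability(up_z, state))  # 1.0 -> the z-measurement is certain
print(probability(up_x, state))  # 0.5 -> the x-measurement is a coin flip
```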
There are indications that information theory might also be able to make some sense of the puzzling phenomenon of quantum entanglement, which allows a “spooky” link between two particles. Entanglement is certainly all about carrying and sharing information. Put simply, it dictates that, after two particles interact, each particle’s quantum state—the full description of its position, momentum, spin and so on—resides not within that particle, but is shared between the two.
The spookiness of entanglement lies in the fact that the particles can be placed in a quantum state that is “indefinite.” Just as Schrödinger’s cat is alive and dead until it is observed (see What Happened to Schrödinger’s Cat?), the entangled pair can have a mixture of spins—they can be “up” and “down” at the same time—until someone measures the spin.
When a measurement forces one particle to a particular spin, the spin of the other particle will be made definite. Einstein hated this because it looks like the observation of one particle can change the state of another one, no matter how far apart they are (see Can I Change the Universe with a Single Glance?). If an entangled pair of particles can carry only a limited amount of information in their spin states, however, that provides a way out of the weirdness.
The quantum version of information theory says an entangled pair can carry only two bits of information. If those two bits encode something like “the spins are the same when measured in the X dimension,” and “the spins are opposite when measured in the Y dimension,” that gives a description of the spin states of both particles—but leaves no room for information about the spin of an individual particle.
That’s why the first measurement appears to give a random result, yet the result of the second measurement can be predicted with perfect accuracy. Though it gives the illusion of a “spooky” transfer of information between the particles, it’s actually just that the first measurement gives us more information. Given the first result, and the nature of the link between the spins, the second particle’s spin can be deduced with simple logic.
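The same kind of toy sketch works for an entangled pair (again using the standard quantum rules, with the “singlet” state as a conventional example rather than anything specified above): each particle’s own result is a coin flip, yet once the first result is in, the second is fixed.

```python
import numpy as np

# Joint basis states for two spins: |00>, |01>, |10>, |11> (0 = up, 1 = down).
# Singlet state (|01> - |10>)/sqrt(2): the two spins always come out opposite.
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def measure_pair(state, rng):
    """Measure both spins along z and return the pair of results."""
    probs = np.abs(state) ** 2          # Born rule over the four joint outcomes
    probs = probs / probs.sum()         # guard against floating-point rounding
    outcome = rng.choice(4, p=probs)    # 0..3 encodes the two results as bits
    return outcome >> 1, outcome & 1

rng = np.random.default_rng(0)
print([measure_pair(singlet, rng) for _ in range(10)])
# Each first result is random (roughly half 0s, half 1s),
# but the second result is always the opposite of the first.
```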
Quantum researchers have only just begun to appreciate that information might be the key to understanding their discipline, and don’t yet have many solid explanations for how this might work. But if information does lie at the root of quantum theory, that seems somehow appropriate. We are living in what has been dubbed the “information age,” where optical fibers and satellite transmissions fire information around the world at astonishing speeds and intensities. All these technologies work because of our understanding of the quantum world—the laser and the microchip are both spin-offs from quantum theory. It seems only right that the last question in physics should tie information theory and quantum theory together.
So where does this leave us in the search for the ultimate nature of reality? If you can describe anything as an “it,” as a real entity, it ultimately appears to come from a bit of information—or a large collection of them. We get, as the physicist John Archibald Wheeler put it, “it from bit.” In 1990, Wheeler declared that, “Tomorrow we will have learned to understand and express all of physics in the language of information.” That “tomorrow” has not yet come, but perhaps it is appearing on the horizon at last.
However, we simply cannot know how far we are on the path to discovering the ultimate nature of reality. Over the past century or so, our investigations of reality have taken us from the realm of the atomic to the subatomic, right down to the idea of energetic fluctuations in the fabric of space and time. It seems that the fundamental nature of reality goes deeper than this, into abstract notions of mathematics and information. But is that the end?
Physicists are painfully aware that any and all of their conjectures could be a million miles from the truth. They work within the current limits of knowledge and the limits of the human imagination. Both seem to recede as we discover more about the world, but never disappear. If the end of physics is on the horizon today, it is worth remembering that it has always seemed to be there. It would be hubris to think we are taking the final steps toward understanding the very core of reality; there is undoubtedly plenty of distance left for physicists to cover. But when the journey is so deeply fascinating, that can only be cause for celebration.