Where Next?

When someone writes down an equation, there isn’t a sudden clap of thunder after which everything is different. Most equations have little or no effect (I write them down all the time, and believe me, I know). But even the greatest and most influential equations need help to change the world – efficient ways to solve them, people with the imagination and drive to exploit what they tell us, machinery, resources, materials, money. Bearing this in mind, equations have repeatedly opened up new directions for humanity, and acted as our guides as we explore them.

It took a lot more than seventeen equations to get us where we are today. My list is a selection of some of the most influential, and each of them required a host of others before it became seriously useful. But each of the seventeen fully deserves inclusion, because it played a pivotal role in history. Pythagoras led to practical methods for surveying our lands and navigating our way to new ones. Newton tells us how planets move and how to send space probes to explore them. Maxwell provided a vital clue that led to radio, TV, and modern communications. Shannon derived unavoidable limits to how efficient those communications can be.

Often, what an equation led to was quite different from what interested its inventors or discoverers. Who would have predicted in the fifteenth century that a baffling, apparently impossible number, stumbled upon while solving algebra problems, would be indelibly linked to the even more baffling and apparently impossible world of quantum physics – let alone that this would pave the road to miraculous devices that can solve a million algebra problems every second, and let us instantly be seen and heard by friends on the other side of the planet? How would Fourier have reacted if he had been told that his new method for studying heat flow would be built into machines the size of a pack of cards, able to paint extraordinarily accurate and detailed pictures of anything they are pointed at – in colour, even moving, with thousands of them contained in something the size of a coin?

Equations trigger events, and events, to paraphrase former British Prime Minister Harold Macmillan, are what keep us awake at night. When a revolutionary equation is unleashed, it develops a life of its own. The consequences can be good or bad, even when the original intention was benevolent, as it was for every one of my seventeen. Einstein’s new physics gave us a new understanding of the world, but one of the things we used it for was nuclear weapons. Not as directly as popular myth claims, but it played its part nonetheless. The Black–Scholes equation created a vibrant financial sector and then threatened to destroy it. Equations are what we make of them, and the world can be changed for the worse as well as for the better.

Equations come in many kinds. Some are mathematical truths, tautologies: think of Napier’s logarithms. But tautologies can still be powerful aids to human thought and deed. Some are statements about the physical world, which for all we know could have been different. Equations of this kind tell us nature’s laws, and solving them tells us the consequences of those laws. Some have both elements: Pythagoras’s equation is a theorem in Euclid’s geometry, but it also governs measurements made by surveyors and navigators. Some are little better than definitions – but i and information tell us a great deal, once we have defined them.

Some equations are universally valid. Some describe the world very accurately, but not perfectly. Some are less accurate, confined to more limited realms, yet offer vital insights. Some are basically plain wrong, yet they can act as stepping-stones to something better. Even so, they may have a huge effect.

Some even open up difficult questions, philosophical in nature, about the world we live in and our own place within it. The problem of quantum measurement, dramatised by Schrödinger’s hapless cat, is one such. The second law of thermodynamics raises deep issues about disorder and the arrow of time. In both cases, some of the apparent paradoxes can be resolved, in part, by thinking less about the content of the equation and more about the context in which it applies. Not the symbols, but the boundary conditions. The arrow of time is not a problem about entropy: it’s a problem about the context in which we think about entropy.

Existing equations can acquire new importance. The search for fusion power, as a clean alternative to nuclear power and fossil fuels, requires an understanding of how extremely hot gas, forming a plasma, moves in a magnetic field. The atoms of the gas lose electrons and become electrically charged. So the problem is one in magnetohydrodynamics, requiring a combination of the existing equations for fluid flow and for electromagnetism. The combination leads to new phenomena, suggesting how to keep the plasma stable at the temperatures needed to produce fusion. The equations are old favourites.

There is (or may be) one equation, above all, that physicists and cosmologists would give their eye teeth to lay hands on: a Theory of Everything, which in Einstein’s day was called a Unified Field Theory. This is the long-sought equation that unifies quantum mechanics and relativity, and Einstein spent his later years in a fruitless quest to find it. These two theories are both successful, but their successes occur in different domains: the very small and the very large. When they overlap, they are incompatible. For example, quantum mechanics is linear, relativity isn’t. Wanted: an equation that explains why both are so successful, but does the job of both with no logical inconsistencies. There are many candidates for a Theory of Everything, the best known being the theory of superstrings. This, among other things, introduces extra dimensions of space: six of them, seven in some versions. Superstrings are mathematically elegant, but there is no convincing evidence for them as a description of nature. In any case, it is desperately hard to carry out the calculations needed to extract quantitative predictions from superstring theory.

For all we know, there may not be a Theory of Everything. All of our equations for the physical world may just be oversimplified models, describing limited realms of nature in a way that we can understand, but not capturing the deep structure of reality. Even if nature truly obeys rigid laws, they might not be expressible as equations.

Even if equations are relevant, they need not be simple. They might be so complicated that we can’t even write them down. The 3 billion DNA bases of the human genome are, in a sense, part of the equation for a human being. They are parameters that might be inserted into a more general equation for biological development. It is (barely) possible to print the genome on paper; it would need about two thousand books the size of this one. It fits into a computer memory fairly easily. But it’s only one tiny part of any hypothetical human equation.
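The figures here are easy to check with a back-of-envelope calculation. The sketch below is mine, not the author's: it assumes that each of the four DNA bases (A, C, G, T) can be encoded in two bits, and that a printed book of roughly this size holds about 1.5 million characters – both round numbers chosen for illustration.

```python
# Back-of-envelope check on the genome figures in the text.
# Assumptions (for illustration only): 2 bits per base,
# ~1.5 million characters per printed book.

bases = 3_000_000_000            # ~3 billion bases in the human genome
bits = bases * 2                 # 2 bits encode one of A, C, G, T
megabytes = bits / 8 / 1_000_000
print(f"storage: {megabytes:.0f} MB")   # modest for a computer memory

chars_per_book = 1_500_000       # assumed: ~500 pages of ~3000 characters
books = bases / chars_per_book
print(f"printed: about {books:.0f} books")
```

On those assumptions the raw genome occupies about 750 megabytes – small by modern standards – while printing it at one character per base really would fill around two thousand volumes.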

When equations become that complex, we need help. Computers are already extracting equations from big sets of data, in circumstances where the usual human methods fail or are too opaque to be useful. An approach based on evolutionary computing extracts significant patterns: specifically, formulas for conserved quantities – things that don’t change. One such system, Eureqa, developed by Michael Schmidt and Hod Lipson, has scored some successes. Software like this might help. Or it might not lead anywhere that really matters.

Some scientists, especially those with backgrounds in computing, think that it’s time we abandoned traditional equations altogether, especially continuum ones like ordinary and partial differential equations. The future is discrete: it comes in whole numbers, and equations should give way to algorithms – recipes for calculating things. Instead of solving the equations, we should simulate the world digitally by running the algorithms. Indeed, the world itself may be digital. Stephen Wolfram made a case for this view in his controversial book A New Kind of Science, which advocates a type of complex system called a cellular automaton. This is an array of cells, typically small squares, each of which can exist in one of several distinct states. The cells interact with their neighbours according to fixed rules. They look a bit like an eighties computer game, with coloured blocks chasing each other over the screen.

Wolfram puts forward several reasons why cellular automata should be superior to traditional mathematical equations. In particular, some of them can carry out any calculation that could be performed by a computer, the simplest being the famous Rule 110 automaton. This can find successive digits of π, solve the three-body equations numerically, implement the Black–Scholes formula for a call option – whatever. Traditional methods for solving equations are more limited. I don’t find this argument terribly convincing, because it is also true that any cellular automaton can be simulated by a traditional dynamical system. What counts is not whether one mathematical system can simulate another, but which is most effective for solving problems or providing insights. It’s quicker to sum a traditional series for π by hand than it is to calculate the same number of digits using the Rule 110 automaton.
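To make the comparison concrete, here is a minimal sketch of the kind of automaton Wolfram studies. Each cell is 0 or 1, and its next state depends only on itself and its two neighbours; the eight possible neighbourhood patterns are read off from the binary digits of the rule number, which is 110 for the famous automaton mentioned above. This is an illustrative toy on a small wrapped-around row, not a serious computational tool.

```python
# A one-dimensional, two-state cellular automaton (Wolfram numbering).
# For Rule 110, the binary digits of 110 give the next state for each
# of the eight possible (left, centre, right) neighbourhood patterns.

def step(cells, rule=110):
    """Apply one update to a row of 0/1 cells with wraparound edges."""
    n = len(cells)
    return [
        # Read the neighbourhood as a 3-bit number, then look up
        # the corresponding bit of the rule number.
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and print a few generations.
row = [0] * 31
row[15] = 1
for _ in range(6):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Running it produces the characteristic ragged triangle growing to the left of the initial cell – simple rules, complicated behaviour, which is precisely Wolfram’s point.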

However, it is still entirely credible that we might soon find new laws of nature based on discrete, digital structures and systems. The future may consist of algorithms, not equations. But until that day dawns, if ever, our greatest insights into nature’s laws take the form of equations, and we should learn to understand them and appreciate them. Equations have a track record. They really have changed the world – and they will change it again.