From zero to hero
These days we take zero for granted as a symbol for an absence of things. We think of it as just another number, like 1, 2 or 3. But it’s not. Europe was late to accept this invention, and for a while it was even banned. Richard Webb delves into the enigma that is zero.
I used to have seven goats. I bartered three for corn; I gave one to each of my three daughters as dowry; one was stolen. How many goats do I have now?
This is not a trick question. Oddly, though, for much of human history we have not had the mathematical wherewithal to supply an answer.
There is evidence of counting that stretches back five millennia in Egypt, Mesopotamia and Persia. Yet even by the most generous definition, a mathematical conception of nothing—a zero—has existed for less than half that time. Even then, the civilizations that discovered it missed its point entirely. In Europe, indifference, myopia and fear stunted zero’s development for centuries. What is it about zero that stopped it becoming a hero?
This is a tangled story of two zeros: zero as a symbol to represent nothing, and zero as a number that can be used in calculations and has its own mathematical properties. It is natural to think the two are the same. History teaches us something different.
Zero the symbol was in fact the first of the two to pop up by a long chalk. This is the sort of character familiar from a number such as the year 2012 in our calendar. Here it acts as a placeholder in our “positional” numerical notation, whose crucial feature is that a digit’s value depends on where it is in a number. Take 2012, for example: a “2” crops up twice, once to mean 2 and once to mean 2,000. That’s because our positional system uses “base” 10—so a move of one place to the left in a number means a digit’s worth increases by a further power of 10.
It is through such machinations that the string of digits “2012” comes to have the properties of a number with the value equal to 2 × 10³ + 0 × 10² + 1 × 10¹ + 2. Zero’s role is pivotal: were it not for its unambiguous presence, we might easily mistake 2,012 for 212, or perhaps 20,012, and our calculations could be out by hundreds or thousands.
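For readers who want the bookkeeping spelled out, here is a minimal sketch in Python of how a positional digit string acquires its value; the helper name positional_value is purely an illustrative label.

```python
def positional_value(digits, base=10):
    """Value of a digit string under positional notation.

    Reading left to right, each step multiplies the running total by the
    base (shifting every digit already read one place to the left) before
    adding the next digit.
    """
    value = 0
    for d in digits:
        value = value * base + int(d)
    return value

print(positional_value("2012"))   # 2012: the zero keeps the leading 2 worth 2,000
print(positional_value("212"))    # 212: drop the zero and the value collapses
```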
The first positional number system was used to calculate the passage of the seasons and the years in Babylonia, modern-day Iraq, from around 1800 bc onward. Its base was not 10, but 60. It didn’t have a symbol for every whole number up to the base, unlike the “dynamic” system of digits running from 1 to 9 that is the bread-and-butter of our base-10 system. Instead it had just two symbols, for 1 and 10, which were clumped together in groups with a maximum headcount of 59. For example, 2,012 equates to 33 × 60¹ + 32, and so it would have been represented by two adjacent groups of symbols: one clump of three 10s and three ones; and a second clump of three 10s and two ones.
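The same clumping can be carried out mechanically. A small illustrative sketch, assuming positive whole numbers, splits a number into its base-60 pieces:

```python
def to_base_60(n):
    """Split a positive whole number into base-60 'clumps', most significant first."""
    clumps = []
    while n > 0:
        clumps.insert(0, n % 60)
        n //= 60
    return clumps

print(to_base_60(2012))  # [33, 32], i.e. 33 sixties plus 32 units
```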
This particular number has nothing missing. Quite generally, though, for the first 15 centuries or so of the Babylonian positional numbering system the absence of any power of 60 in the transcription of any number was marked not by a symbol, but (if you were lucky) just by a gap. What changed around 300 bc we don’t know; perhaps one egregious confusion of positions too many. But it seems to have been at around this time that a third symbol, a curious confection of two left-slanting arrows (see timeline), started to fill missing places in the stargazers’ calculations.
This was the world’s first zero. Some seven centuries later, on the other side of the world, it was invented a second time. Mayan priest-astronomers in Central America began to use a snail-shell-like symbol to fill gaps in the (almost) base-20 positional “long-count” system they used to calculate their calendar.
Zero as a placeholder was clearly a useful concept, then. It is a frustration entirely typical of zero’s vexed history, though, that neither the Babylonians nor the Mayans realized quite how useful it could be.
In any dynamic, positional number system, a placeholder zero assumes almost unannounced a new guise: it becomes a mathematical “operator” that brings the full power of the system’s base to bear. This becomes obvious when we consider the result of adding a placeholder zero to the end of a decimal number string. The number 2,012 becomes 20,120, magically multiplied by the base of 10. We intuitively take advantage of this characteristic whenever we sum two or more numbers, and the total of a column ticks over from 9 to 10. We “carry the one” and leave a zero to ensure the right answer. The simplicity of such algorithms is the source of our system’s supple muscularity in manipulating numbers.
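A brief illustrative snippet makes both effects concrete, using ordinary base-10 arithmetic:

```python
base = 10
n = 2012

# Appending a placeholder zero shifts every digit one place to the left,
# which is exactly a multiplication by the base:
print(int(str(n) + "0"))  # 20120
print(n * base)           # 20120, so the two agree

# The same mechanism underlies "carrying the one": when a column's total
# reaches the base, a zero is left in that column and a one is pushed
# into the next column to the left.
print(9 + 1)              # 10: a 0 in the units column, a carried 1 in the tens
```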
We shouldn’t blame the Babylonians or Mayans for missing out on such subtlety: various blemishes in their numerical systems made it hard to spot. And so, although they found zero the symbol, they missed zero the number.
Zero is admittedly not an entirely welcome addition to the pantheon of numbers. Accepting it invites all sorts of logical wrinkles that, if not handled with due care and attention, can bring the entire number system crashing down. Adding zero to itself does not result in any increase in its size, as it does for any other number. Multiply any number, however big, by zero and it collapses down to zero. And let’s not even delve into what happens when we divide a number by zero.
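Those wrinkles are easy to demonstrate. A short illustrative snippet, with 42 standing in for “any other number”:

```python
x = 42  # an arbitrary stand-in for "any other number"

print(x + 0)   # 42: adding zero leaves the number unchanged, unlike any other addend
print(x * 0)   # 0: multiplying by zero collapses any number, however big, to zero

try:
    x / 0      # dividing by zero is where the trouble really starts
except ZeroDivisionError as err:
    print("Division by zero is left undefined:", err)
```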
Classical Greece, the next civilization to handle the concept, was certainly not keen to tackle zero’s complexities. Greek thought was wedded to the idea that numbers expressed geometrical shapes; and what shape would correspond to something that wasn’t there? It could only be the total absence of something, the void—a concept that the dominant cosmology of the time had banished.
Largely the product of Aristotle and his disciples, this world view saw the planets and stars as embedded in a series of concentric celestial spheres of finite extent. These spheres were filled with an ethereal substance, all centered on Earth and set in motion by an “unmoved mover.” It was a picture later eagerly co-opted by Christian philosophy, which saw in the unmoved mover a ready-made identity for God. And since there was no place for a void in this cosmology, it followed that it—and everything associated with it—was a godless concept.
Eastern philosophy, rooted in ideas of eternal cycles of creation and destruction, had no such qualms. And so the next great staging post in zero’s journey was not to Babylon’s west, but to its east. It is found in Brahmasphutasiddhanta, a treatise on the relationship of mathematics to the physical world written in India around ad 628 by the astronomer Brahmagupta.
Brahmagupta was the first person we see treating numbers as purely abstract quantities separate from any physical or geometrical reality. This allowed him to consider unorthodox questions that the Babylonians and Greeks had ignored or dismissed, such as what happens when you subtract from one number a number of greater size. In geometrical terms this is a nonsense: what area is left when a larger area is subtracted? Equally, how could I ever have sold or bartered more goats than I had in the first place? As soon as numbers become abstract entities, however, a whole new world of possibilities is opened up—the world of negative numbers.
The result was a continuous number line stretching as far as you could see in both directions, showing both positive and negative numbers. Sitting in the middle of this line, a distinct point along it at the threshold between the positive and negative worlds, was sunya, the nothingness. Indian mathematicians had dared to look into the void—and a new number had emerged.
It was not long before they unified this new number with zero the symbol. While a Christian Syrian bishop writes in 662 that Hindu mathematicians did calculations “by means of nine signs,” an inscription of dedication at a temple in the great medieval fort at Gwalior, south of Delhi in India, shows that two centuries later the nine had become ten. A zero—a squashed-egg symbol recognizably close to our own—had been incorporated into the canon, a full member of a dynamic positional number system running from 0 to 9. It marked the birth of the purely abstract number system now used throughout the world, and soon spawned a new way of doing mathematics to go with it: algebra.
News of these innovations took a long time to filter through to Europe. It was only in 1202 that a young Italian, Leonardo of Pisa—better remembered as Fibonacci—published a book, Liber Abaci, in which he presented details of the Arabic counting system he had encountered on a journey to the Mediterranean’s southern shores, and demonstrated the superiority of this notation over the abacus for the deft performance of complex calculations.
While merchants and bankers were quickly convinced of the Hindu–Arabic system’s usefulness, the governing authorities were less enamored. In 1299, the rulers of the Italian city of Florence banned the use of the Hindu–Arabic numerals, including zero. They considered the ability to inflate a number’s value hugely simply by adding a digit on the end—a facility not available in the then-dominant, non-positional system of Roman numerals—to be an open invitation to fraud.
Zero the number had an even harder time. Schisms, upheavals, reformation and counter-reformation in the church meant a continuing debate as to the worth of Aristotle’s ideas about the cosmos, and with it the orthodoxy or otherwise of the void. Only the Copernican revolution—the crystal-sphere-shattering revelation that the Earth moves around the sun—began, slowly, to shake European mathematics free of the shackles of Aristotelian cosmology from the 16th century onward.
By the 17th century, the scene was set for zero’s final triumph. It is hard to point to a single event that marked it. Perhaps it was the advent of the coordinate system invented by the French philosopher and mathematician René Descartes. His Cartesian system married algebra and geometry to give every geometrical shape a new symbolic representation with zero, the unmoving heart of the coordinate system, at its center. Zero was far from irrelevant to geometry, as the Greeks had suggested: it was essential to it. Soon afterward, the new tool of calculus showed that you had first to appreciate how zero merged into the infinitesimally small to explain how anything in the cosmos could change its position at all—a star, a planet, a hare overtaking a tortoise. Zero was itself the prime mover.
Thus a better understanding of zero became the fuse of the scientific revolution that followed. Subsequent events have confirmed just how essential zero is to mathematics and all that builds on it. Primed with the concept from a young age, and looking at zero sitting quietly in a number today, we find it hard to see how it could ever have caused so much confusion and distress. A case, most definitely, of much ado about nothing.
To read more about zero, go to “Zero, zip, zilch.”