Afterword:
The Measure of All Things
Perfection is the child of time.
—BISHOP JOSEPH HALL, WORKS (1625)
Humankind has for most of its civilized existence been in the habit of measuring things. How far from this river to that hill? How tall is this man, that tree? How much milk shall I barter? What weight is that cow? How much length of cloth is required? How much time has elapsed since the sun rose this morning? And what is the time right now? All life depends to some extent on measurement, and in the very earliest days of social organization a clear indication of advancement and sophistication was the degree to which systems of measurement had been established, codified, agreed to, and employed.
The naming of units of measurement was of course one of the first orders of business in early civilization—the cubits of the Babylonians were probably the first units of length; there were the unciae of the Romans, the grain, the carat, the toise, the catty—and the yard and the half yard, the span, the finger, and the nail of early England.
The later development of precision, however, demanded not so much a range of exotically named units, but trusted standards against which these lengths and weights and volumes and times and speeds, in whatever units they happened to be designated, could be measured.
The development of standards is necessarily very much more modern than the creation of units, and over the years the debate about standards has steadily evolved. In summary it can be divided into three camps: whether standards are, and should be, based on tangible human-scale entities (the thumb or the knuckle for the inch, say); on created objects (man-made rods of brass or cylinders of platinum, say); or on absolute aspects of the natural world, carefully observed aspects which are immutable and constant and eternal.
IT WAS GALILEO who took the first step, in 1582, and by the simple act of noticing something quite mundane. It may or may not be legend: that while sitting in his pew in the cathedral at Pisa he watched the lantern over the nave swinging back and forth, and doing so at a regular rate. He experimented with a pendulum and found that the rate of the swing depended not on the weight of the pendulum bob, but on the length of the pendulum itself. The longer the pendulum arm, the slower and more languid the back-and-forth interval. A short pendulum would result in a more rapid tick-tock, tick-tock. By way of Galileo’s simple observation, length and time were seen to be linked—a linkage that made it possible for a length to be derived not simply from the dimensions of limbs and knuckles and strides, but from the hitherto quite unanticipated observation of the passage of time.
A century later an English divine, John Wilkins, proposed employing Galileo’s discovery to create an entirely new fundamental unit, one that had nothing to do with the then-traditional standard in England, which was a rod that was more or less officially declared to be the length of a yard. In a paper published in 1668, Wilkins proposed quite simply making a pendulum that had a beat of exactly one second—and then, whatever length of pendulum arm resulted would become the new unit. He took his concept further: a unit of volume could be created from this length; and a unit of mass could be made by filling the resulting volume with distilled water. All three of these new proposed units, of length, volume, and mass, could then be divided or multiplied by ten—a proposal which made the Reverend Wilkins, at least nominally, the inventor of the idea of a metric system. Sad to say, the committee set up to investigate the plan of this remarkable figure* never reported, and his proposal faded into oblivion.
Except that one aspect of the Wilkins proposal did resonate—albeit a century later—across the Channel in Paris, and with the support of the powerful cleric and diplomat Talleyrand. The formal proposal, which Talleyrand put to the National Assembly two years after the Revolution, in 1791, exactly duplicated Wilkins’s ideas, refining them only to the extent that the one-second beating pendulum be suspended at a known location along the latitude of 45 degrees North. (Varying gravitational fields cause pendulums to behave in varying ways: sticking to one latitude would help mitigate that problem.)
But Talleyrand’s proposal fell afoul of the postrevolutionary zeal of the times. The Republican Calendar had been introduced by some of the ardent firebrands of the day, and for a while France was gripped by a mad confusion of new-named months (Fructidor, Pluviôse, and Vendémiaire among them), ten-day weeks (beginning on primidi and ending on décadi), and ten-hour days—with each hour being divided into one hundred minutes and each minute into a hundred seconds. Since Talleyrand’s proposed second did not match the Revolutionary Second (which was 13.6 percent shorter than a conventional second of the Ancien Régime) the National Assembly, gripped by the new orthodoxy, rejected the idea wholesale.
And it would be more than two further centuries before the fundamental importance of the second was fully accepted. For now, in the minds of eighteenth-century French assemblymen, length was a concept vastly preferable to time.
For in dismissing Talleyrand they turned instead to another idea, brand-new, which was linked to a natural aspect of the Earth, and so in their view more suitably revolutionary. Either the meridian of the Earth or its equator should be measured, they said, and divided into forty million equal parts, with each one of these parts being the new fundamental measure of length. After some vigorous debate, the parliamentarians opted for the meridian, in part because it passed through Paris; they then also decreed that to make the project manageable the meridian be measured not in its entirety, but only in the quarter of it that ran from the North Pole to the equator—a quarter of the way around, in other words. This quarter should then be divided into ten million parts—with the length of the fractional part then being named the meter (from the Greek noun μέτρον, a measure).
A great survey was promptly commissioned by the French parliament to determine the exact length of the chosen meridian—or a tenth part of it, an arc subtending about nine degrees (a tenth of the ninety degrees of a quarter-meridian), and which, using today’s measurement, would be about a thousand kilometers long. It would necessarily be measured in the length units of eighteenth-century France: the toise (about six feet long), divided into six pieds du roi, each pied divided into twelve pouces, and these further divided into twelve lignes. But these units were of no consequence—because all that mattered was that the total length be known and then be divided by ten million—with whatever resulted becoming the measure that was now desired, a creation of France to be eventually gifted to the world.
The proposed survey line ran from Dunkirk in the north to Barcelona in the south, each port city self-evidently at sea level. Since this nine-odd-degree arc was located around the middle of the meridian—Dunkirk is at 51 degrees North and Barcelona 41 degrees North, with the midpoint of 45 degrees North being the village of Saint-Médard-de-Guizières in the Gironde—it was thought likely the oblate nature of the Earth’s shape, the bulge that afflicts its sphericity and makes it resemble more of an orange than a football, would be most evident and so easier to counter with calculation. (To confirm the Earth’s shape the French Academy of Sciences had earlier sent out two expeditions, one to Peru and the other to Lapland, to compare the length of a degree of latitude near the equator with one near the pole: both confirmed the orange shape that Isaac Newton had predicted decades before.)
The story of the triangulation of the meridian in France and Spain, which was carried out by Pierre Méchain and Jean-Baptiste Delambre over six tumultuous years during the worst of the postrevolutionary terror, is the stuff of heroic adventure. On numerous occasions the pair escaped great violence (but not jail time) only by the skin of their teeth. The story is also outside the scope of this account, for what matters to precision engineers of the future—and to engineers all over the world, since that one remarkable survey led to the establishment of the metric system still in use today—is what the French did once the survey results were in. And that mostly involved the making of bronze or platinum rods.
The survey results were announced in April 1799. The length of the meridian quadrant was calculated from the extrapolated survey findings to be 5,130,740 toise. All that was required was that bars and rods be cut or cast that were one ten-millionth of that number—0.5130740 toise, in other words. And that length would be, henceforward, the standard measure—the standard meter—of postrevolutionary France.
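The arithmetic of the definition can be sketched in a few lines of Python; the conversion factor of roughly 1.949 meters to the toise is a modern approximation assumed here for illustration, not a figure from the survey itself.

```python
# The 1799 survey result: the meridian quadrant, in pre-metric French units.
quadrant_in_toise = 5_130_740          # pole-to-equator distance, in toise

# The meter was defined as one ten-millionth of that quadrant.
meter_in_toise = quadrant_in_toise / 10_000_000
print(f"One meter = {meter_in_toise:.7f} toise")        # 0.5130740 toise

# Illustration only: the toise of the period is usually reckoned today at
# about 1.949 modern meters, so the definition closes neatly on itself.
TOISE_IN_MODERN_METERS = 1.949                          # assumed, approximate
print(f"Check: {meter_in_toise * TOISE_IN_MODERN_METERS:.4f} modern meters")
```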
The commissioners then ordered this length to be cast out of platinum, as what is known as an étalon—a standard. A former court goldsmith named Marc Étienne Janety had been selected to make it, and was called back from Marseille, where he had been sheltering from the excesses of the Terror. The result of his labors exists to this day—the Meter of the Archives, a bar of pure platinum that is twenty-five millimeters wide and four millimeters deep, and exactly, exactly, one meter in length. On June 22, 1799, this meter was officially presented to the National Assembly.
But that was not all: for in addition to the platinum rod that was the meter, so also there came with it a few months later a pure platinum cylinder which, it was explained, was the étalon of mass, the kilogram. Janety had made this one too, and also from platinum, thirty-nine millimeters tall and thirty-nine millimeters in diameter, stored in a neat octagonal box with the label proclaiming, in good Revolutionary calendric detail, “Kilogramme Conforme à la loi du 18 Germinal An 3, présenté le 4 Messidor An 7.”
The two properties of length and mass were now inextricably and ineradicably connected. For once the standard of length had been determined, that length could be employed to determine a volume; and, using a standard material to fill that volume, a mass could be determined too.* And so in Paris at the exhausting end of the eighteenth century it was decided to create a new standard for mass based on a formula of elegant simplicity. One-tenth of the newly presented meter—technically a decimeter—could be set as the side of an exactly manufactured cube. This cubic decimeter would be called a litre measure, and it would be made as precisely as possible out of steel or silver. It would then be filled entirely with pure distilled water, the water held as close as possible to the temperature of 4 degrees Celsius, the temperature at which water is at its densest. The resulting volume, this one liter of this particular water, would then be defined as having a mass of one kilogram.
The platinum object made by the goldsmith M. Janety was duly cast, and adjusted until it exactly balanced the weight of that cubic decimeter of water. And that platinum object—very much smaller than the water, of course, since platinum was so much denser, by a factor of almost twenty-two—would from December 10, 1799, henceforward be the kilogram.
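The factor of almost twenty-two can be checked against modern handbook densities; the values below (about 21.45 g/cm³ for platinum, 1.000 g/cm³ for water near 4 degrees Celsius) are assumptions supplied for illustration rather than the figures the commissioners used.

```python
# Approximate modern densities, in grams per cubic centimeter (assumed values).
DENSITY_WATER_4C = 1.000       # water near 4 degrees Celsius
DENSITY_PLATINUM = 21.45       # pure platinum at room temperature

# By the 1799 definition, one kilogram of water at 4 degrees C fills one liter.
water_volume_cm3 = 1000.0 / DENSITY_WATER_4C        # 1000 cm^3, i.e. one liter

# The platinum etalon balancing it is very much smaller.
platinum_volume_cm3 = 1000.0 / DENSITY_PLATINUM     # about 46.6 cm^3

print(f"Water volume:    {water_volume_cm3:7.1f} cm^3")
print(f"Platinum volume: {platinum_volume_cm3:7.1f} cm^3")
print(f"Density ratio:   {DENSITY_PLATINUM / DENSITY_WATER_4C:.1f}")  # about 21.5
```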
The Kilogram of the Archives and the Meter of the Archives, from which the kilogram had been determined, were thus the new fundamentals of what would soon be a new world order of weights and measures. The metric system was now officially born.
These two icons of its founding are still in existence, in a steel safe deep within the Archives Nationales de France in the Marais, in central Paris. One resides in an octagonal black leather-covered box, the other in a long and thin box of reddish-brown leather.
Except that—and this is a constant feature in the universe of measurement—these beauteous objects were eventually found to be wanting.
Years after they had been fashioned, the meridian line on which they had been based was resurveyed, and to widespread chagrin and dismay it was discovered that there were errors in Delambre and Méchain’s six-year eighteenth-century survey, and that their calculation of the length of the meridian was off. Not by much, but by enough for the physical Meter of the Archives to be shown to be two-tenths of a millimeter shorter than the newly calculated version. And it follows that if the meter was wrong, then the cubic meter and the cubic decimeter and the liter-of-water equivalent in platinum, which would be the kilogram, would be wrong also.
So a cumbersome process was set in train to create a set of wholly new prototypes, which would be as perfect in their exactitude as late nineteenth-century science could manage. It took more than seven decades for the international community to agree, and many further years to make the requisite cache of bars and cylinders. The mechanics of their making illustrates just how far the idea of precision had come in the century since John Wilkinson had bored his cylinders for James Watt. The need to make the standards as near-perfect as imaginable was to become the stuff of obsession.
Fifty international delegates—all of them men, all of them white, and almost all of them with lengthy beards—gathered for the first meeting of the International Metre Commission in Paris in September 1872 to begin the process. They met in the former medieval priory of St. Martin des Champs, by then home to the Conservatoire National des Arts et Métiers, one of the world’s greatest repositories of scientific instruments.*
The countries that would decide the future of the world’s measurement system included all the then-great Western powers—Britain, the United States, Russia, Austria-Hungary, the Ottoman Empire—but pointedly, neither China nor Japan. Their sessions, and those of their associated conferences—most notably the Diplomatic Conference of the Metre, which was more concerned with national policies, less with the technical aspects of making prototypes—went on for what at this remove seems an interminable period.
All of the meetings would, however, lead eventually to the signing, on May 20, 1875, of the Treaty of the Metre. It mandated the formation of the BIPM, the present-day International Bureau of Weights and Measures, which would make its home in the Pavillon de Breteuil at Sèvres, outside Paris, where it remains today. Between them these bodies, at various times and in various ways, would commission the making of a set of vital new prototypes.
It took nearly fifteen years for the defining set of internationally agreed standard measures to be created, for the new standard artifacts to be cast, machined, milled, measured, polished, and offered up for the world’s approval. On September 28, 1889, a ceremony was held in Paris to distribute them.
The two best made, each so perfect in appearance and so exact in dimension that they were in consequence nominated to be the international prototypes, had by now been chosen: they were the International Prototype Meter, to be known hereafter by the blackletter M, and the International Prototype Kilogram—Le Grand K—designated by the blackletter K. Both of these platinum-iridium alloy objects were to remain for all future time under heavy security in the basement of the Pavillon de Breteuil.
All the others were then, and for this September day only, on display in the Pavillon’s observatory. The stubby little kilograms gleamed under glass cloches (the national standards under a pair of glass cloches, the IPK itself under three); the slender meter bars lay in wooden tubes that were further enclosed in brass tubes with special fixtures to keep them safe while they traveled.
Certificates of authenticity had been engraved on heavy Japanese paper by the Parisian society printer Stern. Each of these certificates had a formulaic rubric that gave the properties of the body it accompanied: platinum-iridium cylinder No. 39, for example, had the notation “46.402mL 1kg -0.118mg,” which is decoded as meaning the cylinder had a volume of 46.402 milliliters and was lighter than 1 kilogram by 0.118 milligrams. Certificates for the meters were a little more complicated: for instance, one of the meter bars was noted as being “1m + 6μ.0 + 8μ.664T + 0μ.00100T²,” which meant that at 0 degrees Celsius it was 6 micrometers longer than 1 meter, and that at 1 degree Celsius its length would have grown by a little more than a further 8.665 micrometers.
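That rubric is simply a small thermal-expansion polynomial in the bar’s temperature; the short sketch below (the function name and the sample temperatures are mine, not the certificate’s) evaluates it.

```python
def bar_length_m(temp_c: float) -> float:
    """Length of the example meter bar, per its 1889 certificate rubric:
    1 m + 6.0 um + 8.664 um * T + 0.00100 um * T^2, with T in degrees Celsius."""
    MICROMETER = 1e-6  # one micrometer, in meters
    return 1.0 + (6.0 + 8.664 * temp_c + 0.00100 * temp_c ** 2) * MICROMETER

for t in (0, 1, 20):
    excess_um = (bar_length_m(t) - 1.0) * 1e6
    print(f"At {t:2d} deg C the bar exceeds one meter by {excess_um:.3f} micrometers")
# At 0 deg C: 6.000; at 1 deg C: 14.665 (8.665 more than at 0); at 20 deg C: about 179.7
```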
Three urns stood on a dais in the room, and officials had put into each of them paper slips bearing the numbers of the remaining standards—they were to be distributed among the member states by lottery. And so, in midafternoon of that warm autumn Saturday, the world lined up as if for the distribution of sporting season tickets. Officials called out the countries’ names, in alphabetical order, in French. Allemagne was first, Suisse last. The draw took an hour. When it was all over the United States had received Kilograms 4 and 20, and Meters 21 and 27.* Britain had acquired Meter 16 and Kilogram 18; Japan (which by this time had signed the 1875 treaty),† Meter 22 and Kilogram 6.
And so, by the end of the day, the delegates set off from Paris with their invaluable bounties—all packed away in boxes (the kilograms removed from their cloches for travel), and with all the bills paid. These were not insubstantial: the cost of a platinum-iridium meter was 10,151 francs; the kilogram a comparative steal at 3,105 francs. Within days or weeks (the Japanese took theirs back by ship) the new standards were safely in the metrology institutes that were by now being established in capitals all around the world. They were all kept safe and sound—though none so safe and sound as the International Prototypes, M and K, which were now to be taken to the basement and plunged into sempiternal darkness, incomparable, accurate, and fantastically precise. In safes nearby were six so-called témoins—witnesses, which would be regularly compared against the masters. These too would remain exact and perpetually inviolate.
Except, not exactly. Not so fast. The overseers of metrology’s fundamentals had been charged with the task of eternal vigilance, of always looking for still better standards than these. And in time they did indeed find one.
THE FIRST CLUES that there might be a better system had come some years before, in 1870, long before these platinum talismans were wrought into their final, definitive shapes and sizes. The Scots physicist James Clerk Maxwell, at the British Association for the Advancement of Science annual meeting in Liverpool, had made a speech that threw a wrench into everything that had been done. His words still ring in the ears of metrologists around the world. He reminded his listeners that modern measuring had begun with the survey and then the resurvey of the French meridian, and the derivation of the metric units from the results:
Yet, after all, the dimensions of our Earth and its time of rotation, though, relatively to our present means of comparison, [are] very permanent, [they] are not so by physical necessity. The Earth might contract by cooling, or it might be enlarged by a layer of meteorites falling on it, or its rate of revolution might slowly slacken, and yet it would continue to be as much a planet as before. But a molecule, say, of hydrogen, if either its mass or its time of vibration were to be altered in the least, would no longer be a molecule of hydrogen.
If, then, we wish to obtain standards of length, time and mass which shall be absolutely permanent, we must seek them not in the dimensions, or the motion, or the mass of our planet, but in the wavelength, the period of vibration, and the absolute mass of these imperishable and unalterable and perfectly similar molecules.
What Maxwell had done was challenge the scientific basis for all systems of measurement up to that moment. It had long been self-evident that a system based on the dimensions of the human body—thumbs, arms, stride, and so forth—was essentially unreliable, subjective, variable, and useless. Now Maxwell was suggesting that standards previously assumed reliable, like fractions of a quadrant of the Earth’s meridian, or the swing of a pendulum or the length of a day, were not necessarily usefully constant either. The only true constants in nature, he declared, were to be found on a fundamental, atomic level.
And by this time scientific progress was providing windows into that atom, revealing structures and properties hitherto undreamed of. These structures and properties, by their very nature truly and eternally unvarying, should, Maxwell was saying, next be employed as the standards against which all else is measured. To do otherwise was simply illogical. Fundamental nature possessed the finest standards—the only standards, in fact—so why not employ them?
It was the wavelength of light that was the atomic fundamental first used to try to define the standard measure of length, the meter. Light, after all, is a visible form of radiation produced by the excitation of atoms: their electrons, lifted to higher energy states, give off light as they jump back down to lower ones. Different atoms produce light across different spectra, with different wavelengths and colors, and so produce different and identifiable lines on a spectrometer.
It took a further hundred years to convince the international community of the wisdom of linking length to light and its wavelength. To the graybeards who then ran the world, abandoning the certitudes of Earth for the behavior of light was akin to believing that the continents could move—a simply preposterous idea. But just as in 1965, when the theory of plate tectonics was first advanced and continental drift was suddenly seen as obvious, a reality hidden in plain sight, so it was much the same in metrology as it had been in geology: the notion of using atoms and the wavelength of the light they can emit as a standard for measuring everything snapped into place in a sudden moment of rational realization.
It was a late nineteenth-century Massachusetts genius named Charles Sanders Peirce who had that first moment, who first tied the two together. Few men of his generation can have been more brilliant—or more infuriatingly, insanely troublesome. He was many things—a mathematician, a philosopher, a surveyor, a logician, a philanderer of heroic proportions, and a man crippled with pain (a facial nerve problem), with mental illnesses (severe bipolar disorder most probably), and with a profound inability to keep his temper in check. On the plus side of the ledger: he could stand before a blackboard and write a mathematical theory on it with his right hand on the right side and, simultaneously, write its solution with his left hand on the left. On the minus side: he was once sued by his cook for hitting her with a brick. He drank. He took laudanum. He was much married, and was pathologically unfaithful.
But it was Peirce who in 1877 first took a pure and brilliant source of incandescent yellow sodium light, and tried as hard as he might to measure—in meters, thereby establishing the dimensional link between light and length—the black spectral line it produced when run through a diffraction grating, a kind of high-precision prism. It was one of the numberless misfortunes of his seventy-five years that this experiment never quite succeeded—there were problems with the expansion of the glass of the grating, problems with the thermometers used to measure the temperature of the glass. But he nevertheless published a short paper in the American Journal of Science, and by doing so laid historical claim to being the first to try. Had he succeeded his name would be on the lips of all. As it was he died obscurely in 1914, and in abject poverty, having to beg stale bread from the local bakery. He is long forgotten, except by a very few who agree with such as Bertrand Russell, who called Peirce “the greatest American thinker, ever.”
By 1927, after much badgering by scientists who were convinced by Maxwell’s argument that this was the best approach to setting an inviolable standard, the world’s weights and measures community came, if somewhat grumpily, to an agreement. They first accepted, formally, that the wavelength of one particular element’s light had been measured, in fractions of a meter—a very small number. They then agreed that, by multiplication, the meter could be defined as a certain number of those wavelengths—by comparison a very big number, specified to at least seven significant figures. Multiply the one by the other and one gets, essentially, one meter.
The element in question was cadmium—a bluish, silvery, and quite poisonous zinc-like metal that was used for a while (with nickel) in batteries and to corrosion-proof steel and now is used to make (with tellurium) solar panels. It emits a very pure red light when heated, and from its spectral line the wavelength could be determined—so accurately that the International Astronomical Union used its wavelength to define a new and very tiny unit of length, the Ångstrom—one ten-billionth of a meter, 10⁻¹⁰ m.
The wavelength of cadmium’s red line was measured and defined as 6,438.46963 Ångstroms. Twenty years later, with the weights and measures officials in Paris now accepting both the principle and the choice of cadmium (although making its red-line wavelength slightly fuzzier by losing the final number 3, rendering it as 6,438.4696Å), the meter could have been very easily defined by simple arithmetic as 1,553,164 of those wavelengths. (Multiplying the first figure by the second gives 1.000, essentially.)
But—and in the tortuous history of the meter, this is hardly surprising—cadmium then turned out to be not quite good enough. Its spectral line, when examined closely, was found not to be as fine and pure as had been thought—the samples of cadmium were probably mixtures of different isotopes of the metal, spoiling the hoped-for coherence of the emitted light. And so it happens that the meter never was formally defined in terms of cadmium. Much else was, but not the sacrosanct meter. The platinum-iridium bar clung on gamely through all the various meetings of the weights and measures committees, surviving all the siren-like temptations of other radiations—until finally, in 1960, there came agreement.
The world settled on krypton. This inert gas, which was only discovered in trace amounts in the air in 1898, is perhaps best known as the most commonly used gas in neon signs, which are seldom filled with neon at all. More important, in this long quest to define the meter in terms of wavelength, krypton has a spectral signature with extremely sharp emission lines. Krypton-86 is one of the six stable isotopes that occur naturally,* and on October 14, 1960, the International Committee on Weights and Measures decided, nearly unanimously, that this gas, with its formidable coherence and with the exactly known wavelength of its emissions of reddish-orange radiation (6,057.80211Å) would be the ideal candidate to do for the meter what cadmium had done for the Ångstrom.
And so, with the delegates observing that the meter was still not defined with “sufficient precision for the needs of today’s metrology,” it was agreed that henceforward the meter would be defined as “the length equal to 1,650,763.73 wavelengths in vacuum of the radiation corresponding to the transition between the levels 2p₁₀ and 5d₅ of the krypton-86 atom.”
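Both wavelength-counting recipes, the cadmium one that was never adopted and the krypton-86 one that was, can be checked with a line or two of arithmetic; the sketch below simply multiplies the stated wavelengths by the stated counts.

```python
ANGSTROM = 1e-10   # meters

# Cadmium red line: the candidate that was never formally adopted for the meter.
cadmium_wavelength_m = 6438.4696 * ANGSTROM
print(f"Cadmium:    {1_553_164 * cadmium_wavelength_m:.8f} m")     # about 0.99999992 m

# Krypton-86 orange-red line: the definition adopted on October 14, 1960.
krypton_wavelength_m = 6057.80211 * ANGSTROM
print(f"Krypton-86: {1_650_763.73 * krypton_wavelength_m:.8f} m")  # 1.00000000 m
```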
And with that simple declarative sentence so the old one-meter platinum bar was pronounced, essentially, useless. It had lived since 1889 as the ultimate standard for all length measurement: Ludwig Wittgenstein had once observed, with confusing but accurate drollery, “There is one thing of which one can say neither that it is one meter long, nor that it is not one meter long, and that is the standard meter in Paris.” No longer, for from October 14, 1960, onward, there was no standard meter remaining in Paris, nor anywhere else. This measurement had left the physical world and entered the absolutism and indifference of the universe.
MUCH ELSE BESIDES went on at the 1960 conference, which is held every four peacetime years, usually in Paris, and which on this occasion was perhaps the most seminal event in metrology since the invention of the science. Most memorably, the 1960 event saw the formal launch of the present-day International System of Units, known generally as SI, the initials derived from the French Système International d’Unités. Most of the world now knows, accepts, recognizes, and uses the SI—with its seven units: of length (the much-aforementioned meter); of time (the second); of electric current (the ampere); of temperature (the kelvin);* of light intensity (the candela); of the amount of a substance (the mole); and of mass (the kilogram). Six of these units are now defined in terms of natural phenomena—generally, of radiation and the behavior of or the number of atoms.
So much else came out of the meeting: the base units; the derived units—like the hertz, the volt, the farad, ohm, lumen, becquerel, henry, coulomb; the authorized prefixes for big and small—at the upper end the deca, kilo, giga, tera, exa, zetta, and yotta (this last being 10²⁴) and at the lower deci, milli, nano, pico, femto, zepto, and yocto (this, to preserve metrologic symmetry, denoting the phenomenally tiny 10⁻²⁴).
But what did not come out of the meeting was anything definite to relieve the condition of the other old standard, Le Grand K. The delegates—who had created an entirely new measurement system, after all—left Paris that late October, leaving behind them, condemned to remain locked up in a dark cellar under its triple crowns of glass, the melancholy standard mass of the kilogram, moping, miserable, a relic of an earlier century. It would take almost another sixty years for them to find a replacement, and for the highly polished solid metal cylinder, about as tall and wide as a Zippo lighter and about the size of a golf ball, to be relieved of its responsibilities of being the mass against which all the world’s kilograms could and would be measured: in late 2018, it is to be removed from under its well-guarded basement cloche and placed in a museum—a relic of former times, of more ancient technologies.
And since the kilogram’s replacement was to occur so much later than that of the meter, it enjoyed the benefits of metrology’s even newer technological evolution. For it was to become related to a unit that had long been overlooked as the key to all others—and that is the unit of time, the second.
IT HAS TO do with the notion of frequency, which is after all the inverse of time—it is the number of occurrences of something per second. And frequency now figures in the definitions of no fewer than six of today’s seven foundational units of measurement.* Frequency is just about everywhere.
Three examples will suffice.
The candela, the unit that suggests the brightness of a source of light, would seem at first blush to have absolutely nothing to do with time. But it has: the international community now defines the candela as the luminous intensity, in a given direction, of a source that emits monochromatic radiation of frequency 540 × 10¹² cycles per second and that has a radiant intensity in that direction of 1/683 watt per steradian. Light is here officially related to the second. It is officially linked to the concept of time.
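To put that reference frequency in more familiar terms, dividing the speed of light by 540 × 10¹² hertz gives the corresponding wavelength, roughly 555 nanometers: the green light to which human eyes are most sensitive. A short check:

```python
SPEED_OF_LIGHT = 299_792_458     # meters per second, exact by definition
CANDELA_FREQUENCY_HZ = 540e12    # the frequency named in the candela's definition

wavelength_nm = SPEED_OF_LIGHT / CANDELA_FREQUENCY_HZ * 1e9
print(f"Wavelength: {wavelength_nm:.0f} nm")   # about 555 nm: green light
```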
The length of the meter, to select another of the seven units as an example, is now also defined in terms of the second—it is the length of the path traveled by light in vacuum during a time interval of 1/299,792,458 of a second. Length, henceforward (or since 1983, when it was so defined), is thus also related to time. A relationship that is agreed to by all.
And the much-vaunted kilogram, until lately defined as the carefully milled platinum cylinder in Paris, will soon reappear, defined this time in terms of the speed of light—and connected to it by way of the famous Planck constant, which, without going into the details of the thing, is a number, 6.62607004 × 10⁻³⁴ m²·kg/s, that, as the symbols imply, is also firmly linked to frequency, and thus to the second. Mass is thus defined in terms of time. The whole world now agrees it should be so: that time underpins everything.
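One way to see the link, offered here purely as an illustration rather than as a description of how the redefinition is actually realized in the laboratory, is to combine E = mc² with E = hν: once the Planck constant is fixed, any mass corresponds to a definite, and enormous, frequency.

```python
# Constants as quoted in the text; c is exact, h is the value given above.
PLANCK_H = 6.62607004e-34        # joule-seconds, i.e. kg * m^2 / s
SPEED_OF_LIGHT = 299_792_458     # meters per second

def equivalent_frequency_hz(mass_kg: float) -> float:
    """Frequency of a photon carrying the rest-mass energy of mass_kg,
    via E = m * c**2 = h * nu. Illustrative only: it shows how fixing h
    ties the kilogram to the meter and, above all, to the second."""
    return mass_kg * SPEED_OF_LIGHT ** 2 / PLANCK_H

print(f"1 kg corresponds to about {equivalent_frequency_hz(1.0):.3e} Hz")
# roughly 1.356e50 Hz
```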
Just as Galileo had so presciently realized when he gazed up at the lantern in Pisa. Just as Wilkins had later proposed, and the Prince of Talleyrand seconded. All are connected by time.
AND YET—JUST what is time?
“If no one asks me,” Saint Augustine is said to have remarked, “I know what it is. If I wish to explain it to him who asks, I do not know.” Time moves, we know that. But how does it move? What is its moving, exactly? And why does it only move forward, in one direction? And so far as time is concerned, what does direction mean, exactly? Can one be any more precise than simply to say, as Einstein once did, that time is what clocks measure?
All such questions are suddenly especially pertinent.
HOW WE ARRANGE—and how in history we have arranged—the accumulations of time is a matter of choice. On the matters of minutes and hours and days most generally agree*—after all, the sun’s rising and setting have long dictated the nature of time, creating a top-down arrangement made for the convenience of human society, and one that allowed for the notion, still current as recently as the 1950s, that the second, at the bottom of this top-down arrangement, be defined as 1/86,400th of the passage of a single day.
Beyond days—up into the other human constructs known in English as weeks and months and years—the arrangements became wildly different according to the vagaries of religion and custom and the caprices of personalities. But it is the considered aim of modern metrologists that, when it comes to the basic unit, the second, all should agree, exactly. So far as larger units of time are concerned, all are free to do as they wish. But the second itself is sacrosanct.
Until 1967 the second was very much linked to a natural phenomenon—as a fraction of the length of the day, at the top of the top-down pyramid—by way of a sundial or of a seconds pendulum, which ticked away the duration of a day at intervals determined by the length of the pendulum itself. It was easy enough—if time-consuming—to adjust the length of a pendulum until it ticked at intervals of 1/86,400 of the period between two successive sun-at-zenith moments we call noon. Easier still to apply the equation from schooldays, T = 2π√(l/g), where l is the length of the pendulum, g is the acceleration of gravity, and T is the period of the pendulum’s swing.
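A seconds pendulum beats once a second, so its full back-and-forth period is two seconds; putting that into the schoolroom formula gives a length of very nearly one meter, which is part of what made the pendulum proposals of Wilkins and Talleyrand so seductive. A minimal sketch, assuming the standard value g = 9.80665 m/s²:

```python
import math

G = 9.80665      # standard acceleration of gravity, m/s^2 (assumed value)
PERIOD_S = 2.0   # a seconds pendulum beats every second, so its full period is 2 s

# Invert T = 2 * pi * sqrt(l / g) to recover the pendulum length l.
length_m = G * (PERIOD_S / (2 * math.pi)) ** 2
print(f"Length of a seconds pendulum: {length_m:.4f} m")   # about 0.9936 m
```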
To deduce the second from the day is indeed easy enough. The greater problem, recognized from antiquity, is that the length of the day itself turned out to be almost infinitely variable, due to a range of causes both local—such as the frictional effects of the tides—and astronomical—such as the wobbling-top precession of the Earth’s axis and the steady slowing (and occasional random speeding-up) of its rotation. For how can a second be accurately defined if the standard against which it is measured is inherently unstable? This was James Clerk Maxwell’s singular problem, once again.
The way that this problem was first dealt with was to replace the day at the top of the notional pyramid with the much larger unit of the year—and to measure increments of time as fractions of the year, of the passage of time taken for the Earth to make one complete turn around the sun. The notion of ephemeris time was born with this decision—ephemeris time being based on the movements of the planets and the stars as recorded from centuries of observation.
Tables known as ephemerides—almanacs, to use a less confusing term—get better and better as the years go on, because of ever-more sophisticated observations, first from telescopes and, later on, from satellites. And so the modern concept of ephemeris time, defined by the Jet Propulsion Laboratory in Pasadena, became standard in 1952.
A second was then defined as 1/31,556,925.975 of a year—and not just any year, but the year 1900 beginning on January 0—this last being a way of using the midnight handover of 1899 December 31 to 1900 January 1 as the starting point, and ignoring the fact that, inconveniently for some, years—being a human construct—never begin with a day labeled as 0. Our counting system does (0.5); our clocks do (00:23 h); but our calendars (January 1, never January 0) do not.
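That odd-looking denominator is simply the number of seconds in a tropical year; dividing it by 86,400 recovers the familiar 365-and-a-bit days, as the sketch below confirms.

```python
EPHEMERIS_YEAR_SECONDS = 31_556_925.975   # seconds in the tropical year 1900, as quoted
SECONDS_PER_DAY = 86_400

days = EPHEMERIS_YEAR_SECONDS / SECONDS_PER_DAY
print(f"{days:.4f} days")   # about 365.2422 days
```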
But then the year itself, based as it is on the wanderings of a planet around a star, was found to be just as arbitrary and as wanting in precision as was the day, and so better still was needed. As it happens a better solution was waiting in the wings: Maxwell’s answer. That there are things in nature, most especially in atomic and subatomic nature, that vibrate at frequencies which never, ever, ever change. Or not to any measurable degree.
Quartz, as we discovered back at Seiko, is one such. The seconds presented by a quartz-based timekeeper were unvaryingly precise seconds; and the seconds they soundlessly accumulated turned into precise minutes, precise hours, precise days.
And yet, just as with Maxwell’s argument against using a human-scale or even a planetary-scale basis for defining the meter and the kilogram, so in the latter half of the twentieth century it became clear that though quartz is good enough for the average consumer of time, it is manifestly not good enough for the scientist, nor for the national metrology institutes around the world. Which led to the evolution of the standards that are in use today, and which employ one or more members of the more recently invented families of atomic clocks.
In an atomic timekeeper the same basic principle applies—that a naturally occurring substance can be induced to vibrate at a certain fixed and measurable rate. With a quartz crystal, it was the simple and easily knowable property of its vibration under the influence of an applied electric field that made it so attractive a candidate for timekeeping. With an atom, the frequency was a more delicate thing: it required that an electron in orbit around the nucleus of a candidate element be persuaded to shift to another orbit—to make a quantum leap, or a quantum jump, this being the origin of the phrase. It had been known since the nineteenth century that when an electron performs such a leap between energy levels, falling back down toward its ground state, it emits a highly stable burp of electromagnetic radiation.
The radiation from such an atomic transition, it was said by many physicists, was so exact and so stable that it might well one day be used as the basis of a clock. The basic concept was first demonstrated in the United States in 1949, in a precursor to the laser, the maser, which employed molecules of ammonia.
The first true atomic clock was invented by a Briton, Louis Essen, in 1955, when he and a colleague, Jack Parry, made a model and used as its heartbeat the transition of electrons orbiting the nucleus of atoms of the metal cesium. This might seem a curious choice: cesium is the softest of all metals—almost liquid at room temperature—and is a pale gold-colored substance that ignites spontaneously in the air and explodes when in contact with water. However, it now has a use and value beyond all measure, since in transition it emits radiation at such a steady and unvarying beat that the scientists at Sèvres readily agreed, in 1967, and after much badgering by Louis Essen and Britain’s National Physical Laboratory, where he worked, that it be used as the basis for a new definition of the second.
As it remains today. The definition of the second today is quite simply, if simply be the word, the duration of 9,192,631,770 cycles of the microwave radiation corresponding to the transition between two hyperfine energy levels of the ground state of cesium 133. The ten-digit number, daunting though it may sound at first, is known by every metrologist worth his or her salt, and is as familiarly and frequently bandied about as might be an American telephone number, which it digitally resembles.
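For a sense of scale, the radiation in question is a microwave one: dividing the speed of light by 9,192,631,770 hertz gives a wavelength of a little over three centimeters, as the short sketch below shows.

```python
SPEED_OF_LIGHT = 299_792_458    # meters per second, exact by definition
CESIUM_HZ = 9_192_631_770       # cycles per second: the definition of the second

wavelength_cm = SPEED_OF_LIGHT / CESIUM_HZ * 100
print(f"Cesium-133 hyperfine wavelength: {wavelength_cm:.2f} cm")   # about 3.26 cm
```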
Cesium clocks are now everywhere, costly and bulky though they still may be. There are said to be 320 of them, all checked against one another—the American master clocks checked every twelve minutes to eradicate nanosecond errors. All these are then checked themselves by squadrons of even more accurate timekeepers called cesium fountain clocks, of which there are a dozen, and which employ lasers to roil a mess of cesium atoms inside a steel vessel and derive even greater accuracies than their simpler siblings.
In America, the master clocks are in Maryland and Colorado; and the GPS system—the highly precise and time-based creation described in chapter 8—is given its critical time data from an ensemble of no fewer than fifty-seven cesium clocks held at the U.S. Naval Observatory* in Washington, DC, and which in turn are augmented by a further twenty-four at the formidably well-protected Schriever Air Force Base in Colorado.
The accuracy of these clocks and the claimed accuracies of even newer ones that are being constructed or experimented with at various standards laboratories around the world—the ytterbium clock being studied at the National Institute of Standards and Technology outside Gaithersburg, Maryland, being a prime example—begin to verge on the barely credible. Britain’s National Physical Laboratory, for example, has claimed that while a standard cesium clock is accurate to about one part in 10¹³, with its fine-tuned cesium fountain clock known as NPL-CsF2 the second can be measured to a fractional precision of 2.3 × 10⁻¹⁶, or 0.000 000 000 000 000 23.
This means it would neither lose nor gain a second in 138 million years.
Now there is talk of quantum logic clocks and optical clocks that deliver even more remarkable figures, one with a claimed accuracy of 8.6 × 10⁻¹⁸, meaning that time would be kept impeccably for billions of years, and the charming concept of taking the fob watch from the pocket every few days and lovingly adjusting it would be gone forever, both from the human imagination and from memory.
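Both longevity claims follow from simple arithmetic: a fractional accuracy of 2.3 × 10⁻¹⁶ means roughly one second of drift every 1/(2.3 × 10⁻¹⁶) seconds, and likewise for 8.6 × 10⁻¹⁸. The sketch below turns those fractions into years.

```python
SECONDS_PER_YEAR = 365.2422 * 86_400    # about 3.156e7 seconds

def years_to_drift_one_second(fractional_accuracy: float) -> float:
    """Roughly how long a clock of the given fractional accuracy takes to
    drift by one whole second."""
    return 1.0 / fractional_accuracy / SECONDS_PER_YEAR

print(f"2.3e-16: {years_to_drift_one_second(2.3e-16):.2e} years")  # ~1.4e8, i.e. 138 million
print(f"8.6e-18: {years_to_drift_one_second(8.6e-18):.2e} years")  # ~3.7e9, i.e. billions
```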
IT IS INTO this rarefied world of precise chronometry that science has now jumped—pouring money and equipment and personnel into matters relating specifically to the measurement of the bizarreries of time—and for the simple reason, fully recognized by teams of metrologists, that time underpins everything. “Everything” even includes, it now seems, the property of gravity. A clock that is on a table just five centimeters higher than another will run very slightly faster—barely measurably faster, but incontrovertibly so—than its partner. And this is simply because it is less affected by the Earth’s gravity, the planet’s center being that tiny number of centimeters more distant.
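The size of that effect can be estimated from the standard weak-field formula Δf/f ≈ gΔh/c²; for a five-centimeter height difference it comes out near 5 × 10⁻¹⁸, which is precisely the regime the newest optical clocks can resolve. A minimal sketch, assuming standard values for g and c:

```python
G = 9.80665                    # m/s^2, standard gravity (assumed value)
SPEED_OF_LIGHT = 299_792_458   # m/s, exact
HEIGHT_DIFFERENCE_M = 0.05     # the five centimeters mentioned in the text

# Weak-field gravitational time dilation: the fractional rate difference between
# two clocks separated vertically by HEIGHT_DIFFERENCE_M near the Earth's surface.
fractional_shift = G * HEIGHT_DIFFERENCE_M / SPEED_OF_LIGHT ** 2
print(f"Fractional frequency shift: {fractional_shift:.2e}")   # about 5.5e-18
```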
THIS LINK, BETWEEN time and gravity, is now proven. And this is a happenstance of modern physics that in China—where much work is being conducted on the nature of time—has a certain unanticipated charm. There is a delight of synchrony for the metrologists who are conducting time-related experiments in their brand-new and handsomely funded laboratories near Beijing. For outside the very front door of their research center there stands a gift from England’s main metrology institute, the National Physical Laboratory in Teddington, west of London.
It is a sapling apple tree.
Outwardly it looks quite ordinary—just one tree among a copse of others. But this happens to be a very special tree indeed. If the Beijing summers are warm and not too dry, it will bear apples of the variety known as Flower of Kent, which are said to be crunchy, juicy, and acidic. But this is not the reason. It is the tree’s pedigree that marks it out as unique.
The apple tree’s immediate ancestor, before the NPL gave it as a gift, was grown from a scion that had been grafted in the 1940s at a fruit-research station south of London; the scion had in turn been taken from a tree, planted in the 1820s, in the garden of an abbey in Buckinghamshire. This in turn was a relic of a mighty tree that had been blown down in a great historic storm that had devastated a country estate a little farther north, that of Woolsthorpe Manor in Lincolnshire.
And Woolsthorpe Manor was the home of Sir Isaac Newton. It was to Lincolnshire that Newton had fled from Cambridge in 1666—and it was here, during the summer of that annus mirabilis, that he famously observed the apple falling from the tree. It was here, wondering what force might have impelled the apple’s fall, that he came up with the notion of gravity as a force that affected both this humble fruit and, by logical extension, the constant motion and altitude of the moon in orbit around the planet Earth.
So, Isaac Newton’s apple tree—or more properly a child descendant of it—now flowers and fruits in a Beijing garden, beside where the Ming emperors once buried their dead, where one can see the Great Wall running along the mountain ridges, and where China’s latest generation of scientists are confirming their intellectual ambitions by working out, with the greatest accuracy, the effect that gravity has upon the steady beat of time.
Where, in other words, they are trying to establish and prove a physical, traceable connection between on the one hand the mysterious force that keeps us all rooted here on Earth, and on the other the fundamental steady tick of duration. The duration by which, fundamentally, we measure everything that we make and use, and which in turn helps establish for us with unfailing exactitude the precision that allows the modern world to function.