5. Time

In October 1967, a group of scientists from around the world gathered in Paris for a conference with the unassuming name “The General Conference on Weights and Measures.” If you’ve had the questionable fortune to attend an academic conference before, you probably have some sense of how these affairs go: papers are presented, along with an interminable series of panel discussions, broken up by casual networking over coffee; there’s gossip and infighting at the hotel bar at night; everyone has a tolerably good time, and not a whole lot gets done. But the General Conference on Weights and Measures broke from that venerable tradition. On October 13, 1967, the attendees agreed to change the very definition of time.

For almost the entire span of human history, time had been calculated by tracking the heavenly rhythms of celestial bodies. Like the earth itself, our sense of time revolved around the sun. Days were defined by the cycle of sunrise and sunset, months by the cycles of the moon, years by the slow but predictable rhythms of the seasons. For most of that stretch, of course, we misunderstood what was causing those patterns, assuming that the sun was revolving around the earth, and not the reverse. Slowly, we built tools to measure the flow of time more predictably: sundials to track the passage of the day; celestial observatories such as Stonehenge to track seasonal milestones like the summer solstice. We began dividing up time into shorter units—seconds, minutes, hours—with many of those units relying on the base-12 and base-60 counting systems passed down from the ancient Egyptians and Sumerians. Time was defined by grade-school division: a minute was one-sixtieth of an hour, an hour was one-twenty-fourth of a day. And a day was simply the time that passed between the two moments when the sun was highest in the sky.

But starting about sixty years ago, as our tools of measuring time increased in precision, we began to notice flaws in that celestial metronome. The clockwork of the heavens turned out to be, well, a bit wobbly. And that’s what the General Conference on Weights and Measures set out to address in 1967. If we were going to be truly accurate with our measurements of time, we needed to trade the largest entity in the solar system for one of the smallest.

Nundinal calendar, Rome. The ancient Etruscans developed an eight-day market week, known as the nundinal cycle, around the eighth or seventh century BC.

MEASURED PURELY BY TOURIST ATTENTION, the Duomo of Pisa is generally overshadowed by its famous leaning neighbor next door, but the thousand-year-old cathedral, with its brilliant white stone and marble façade, is in many ways a more impressive structure than the tilted bell tower beside it. Stand at the base of the nave and gaze up toward the fourteenth-century apse mosaic, and you can re-create a moment of absentminded distraction that would ultimately transform our relationship to time. Suspended from the ceiling is a collection of altar lamps. They are motionless now, but legend has it that in 1583, a nineteen-year-old student at the University of Pisa attended prayers at the cathedral and, while daydreaming in the pews, noticed one of the altar lamps swaying back and forth. While his companions dutifully recited the Nicene Creed around him, the student became almost hypnotized by the lamp’s regular motion. No matter how large the arc, the lamp appeared to take the same amount of time to swing back and forth. As the arc decreased in length, the speed of the lamp decreased as well. To confirm his observations, the student measured the lamp’s swing against the only reliable clock he could find: his own pulse.

Most nineteen-year-olds figure out less scientific ways to be distracted while attending mass, but this college freshman happened to be Galileo Galilei. That Galileo was daydreaming about time and rhythm shouldn’t surprise us: his father was a music theorist and played the lute. In the middle of the sixteenth century, playing music would have been one of the most temporally precise activities in everyday culture. (The musical term “tempo” comes from the Italian word for time.) But machines that could keep a reliable beat didn’t exist in Galileo’s age; the metronome wouldn’t be invented for another few centuries. So watching the altar lamp sway back and forth with such regularity planted the seed of an idea in Galileo’s young mind. As is so often the case, however, it would take decades before the seed would blossom into something useful.

Galileo spent the next twenty years becoming a professor of mathematics, experimenting with telescopes, and more or less inventing modern science, but he managed to keep the memory of that swinging altar lamp alive in his mind. Increasingly obsessed with the science of dynamics, the study of how objects move through space, he decided to build a pendulum that would re-create what he had observed in the Duomo of Pisa so many years before. He discovered that the time it takes a pendulum to swing is not dependent on the size of the arc or the mass of the object swinging, but only on the length of the string. “The marvelous property of the pendulum,” he wrote to fellow scientist Giovanni Battista Baliani, “is that it makes all its vibrations, large or small, in equal times.”

In equal times. In Galileo’s age, any natural phenomenon or mechanical device that displayed this rhythmic precision seemed miraculous. Most Italian towns in that period had large, unwieldy mechanical clocks that kept a loose version of the correct time, but they had to be corrected by sundial readings constantly or they would lose as much as twenty minutes a day. In other words, the state of the art in timekeeping technology was challenged by just staying accurate on the scale of days. The idea of a timepiece that might be accurate to the second was preposterous.

The swinging altar lamp inside the Duomo of Pisa

Preposterous, and seemingly unnecessary. Just like Frederic Tudor’s ice trade, it was an innovation that had no natural market. You couldn’t keep accurate time in the middle of the sixteenth century, but no one really noticed, because there was no need for split-second accuracy. There were no buses to catch, or TV shows to watch, or conference calls to join. If you knew roughly what hour of the day it was, you could get by just fine.

The need for split-second accuracy would emerge not from the calendar but from the map. This was the first great age of global navigation, after all. Inspired by Columbus, ships were sailing to the Far East and the newly discovered Americas, with vast fortunes awaiting those who navigated the oceans successfully. (And almost certain death awaiting those who got lost.) But sailors lacked any way to determine longitude at sea. Latitude you could gauge just by looking up at the sky. But before modern navigation technology, the only way to figure out a ship’s longitude involved two clocks. One clock was set to the exact time of your origin point (assuming you knew the longitude of that location). The other clock recorded the current time at your location at sea. The difference between the two times told you your longitudinal position: every four minutes of difference translated to one degree of longitude, or sixty-eight miles at the equator.
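A minimal sketch of that shipboard arithmetic, in Python (the clock readings below are invented for the example; only the four-minutes-per-degree rule and the sixty-eight-mile figure come from the method described above):

```python
# Longitude by comparing clocks: four minutes of difference between the
# home-port clock and local time equals one degree of longitude.
MILES_PER_DEGREE_AT_EQUATOR = 68  # approximate figure cited above

def longitude_from_clocks(home_port_hours, local_hours):
    """Degrees of longitude west of the home port, from two clock readings."""
    difference_minutes = (home_port_hours - local_hours) * 60
    return difference_minutes / 4  # 4 minutes of time = 1 degree

# Invented example: the ship observes local noon (12.0) while the
# home-port chronometer reads 2:30 in the afternoon (14.5).
degrees_west = longitude_from_clocks(14.5, 12.0)
print(degrees_west)                                # 37.5 degrees west of port
print(degrees_west * MILES_PER_DEGREE_AT_EQUATOR)  # 2550.0 miles at the equator
```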

Galileo Galilei

In clear weather, you could easily reset the ship clock through accurate readings of the sun’s position. The problem was the home-port clock. With timekeeping technology losing or gaining up to twenty minutes a day, it was practically useless on day two of the journey. All across Europe, bounties were offered for anyone who could solve the problem of determining longitude at sea: Philip III of Spain offered a life pension in ducats, while the famous Longitude Prize in England promised more than a million dollars in today’s currency. The urgency of the problem—and the economic rewards for solving it—brought Galileo’s mind back to the pursuit of “equal time” that had first captured his imagination at the age of nineteen. His astronomical observations had suggested that the regular eclipses of Jupiter’s moons might be useful for navigators keeping time at sea, but the method he devised was too complicated (and not as accurate as he had hoped). And so he returned, one last time, to the pendulum.

Fifty-eight years in the making, his slow hunch about the pendulum’s “marvelous property” had finally begun to take shape. The idea lay at the intersection point of multiple disciplines and interests: Galileo’s memory of the altar lamp, his studies of motion and the moons of Jupiter, the rise of a global shipping industry, and its new demand for clocks that would be accurate to the second. Physics, astronomy, maritime navigation, and the daydreams of a college student: all these different strains converged in Galileo’s mind. Aided by his son, he began drawing up plans for the first pendulum clock.

By the end of the next century, the pendulum clock had become a regular sight throughout Europe, particularly in England—in workplaces, town squares, even well-to-do homes. The British historian E. P. Thompson, in a brilliant essay on time and industrialization published in the late 1960s, noted that in the literature of the period, one of the telltale signs that a character has raised himself a rung or two up the socioeconomic ladder is the acquisition of a pocket watch. But these new timepieces were not just fashion accessories. A hundred times more accurate than its predecessors—losing or gaining only a minute or so a week—the pendulum clock brought about a change in the perception of time that we still live with today.

Drawing of the pendulum clock designed by Italian physicist, mathematician, astronomer, and philosopher Galileo Galilei, 1638–1659.

WHEN WE THINK ABOUT the technology that created the industrial revolution, we naturally conjure up the thunderous steam engines and steam-powered looms. But beneath the cacophony of the mills, a softer but equally important sound was everywhere: the ticking of pendulum clocks, quietly keeping time.

Imagine some alternative history where, for whatever reason, timekeeping technology lags behind the development of the other machines that catalyzed the industrial age. Would the industrial revolution have even happened? You can make a reasonably good case that the answer is no. Without clocks, the industrial takeoff that began in England in the middle of the eighteenth century would, at the very least, have taken much longer to reach escape velocity—for several reasons. Accurate clocks, thanks to their unrivaled ability to determine longitude at sea, greatly reduced the risks of global shipping networks, which gave the first industrialists a constant supply of raw materials and access to overseas markets. In the late 1600s and early 1700s, the most reliable watches in the world were manufactured in England, which created a pool of expertise in fine-tool manufacture that would prove to be incredibly handy when the demands of industrial innovation arrived, just as the glassmaking expertise that produced spectacles opened the door for telescopes and microscopes. The watchmakers were the advance guard of what would become industrial engineering.

Marine chronometer, from the Clockmakers’ Museum at the city’s Guildhall, London

More than anything else, though, industrial life needed clock time to regulate the new working day. In older agrarian or feudal economies, units of time were likely to be described in terms of the time required to complete a task. The day was divided not into abstract, mathematical units, but into a series of activities: instead of fifteen minutes, time was described as how long it would take to milk the cow or nail soles to a new pair of shoes. Instead of being paid by the hour, craftsmen were conventionally paid by the piece produced—what was commonly called “taken-work”—and their daily schedules were almost comically unregulated. Thompson cites the diary of one farming weaver from 1782 or 1783 as an example of scattered routines of pre-industrial work:

On a rainy day, he might weave 8½ yards; on October 14th he carried his finished piece, and so wove only 4¾ yards; on the 23rd he worked out till 3 o’clock, wove two yards before the sun set. . . . Apart from harvesting and threshing, churning, ditching and gardening, we have these entries: “Wove 2½ yards the Cow having calved she required much attendance.” On January 25th he wove 2 yards, walked to a nearby village, and did “sundry jobbs [sic] about the lathe and in the yard and wrote a letter in the evening.” Other occupations include jobbing with a horse and cart, picking cherries, working on a mill dam, attending a Baptist association, and a public hanging.

Try showing up for work in a modern office on that kind of clock. (Not even famously laid-back Google could tolerate that level of eccentricity.) For an industrialist trying to synchronize the actions of hundreds of workers with the mechanical tempo of the first factories, this kind of desultory work life was unmanageable. And so the creation of a viable industrial workforce required a profound reshaping of the human perception of time. The pottery manufacturer Josiah Wedgwood, whose Staffordshire works mark the very beginnings of industrial England, first implemented the convention of “clocking in” to work each day. (The lovely double entendre of “punching the clock” would have been meaningless to anyone born before 1700.) The whole idea of an “hourly wage”—now practically universal in the modern world—came out of the time regimen of the industrial age. In such a system, Thompson writes, “the employer must use the time of his labour, and see it is not wasted. . . . Time is now currency: it is not passed but spent.”

For the first generations living through this transformation, the invention of “time discipline” was deeply disorienting. Today, most of us in the developed world—and increasingly in the developing world—have been acclimated to the strict regimen of clock time from an early age. (Sit in on your average kindergarten classroom and you’ll see the extensive focus on explaining and reinforcing the day’s schedule.) The natural rhythms of tasks and leisure had to be forcibly replaced with an abstract grid. When you spend your whole life inside that grid, it seems like second nature, but when you are experiencing it for the first time, as the laborers of industrial England did in the second half of the eighteenth century, it arrives as a shock to the system. Timepieces were not just tools to help you coordinate the day’s events, but something more ominous: the “deadly statistical clock,” in Dickens’s Hard Times, “which measured every second with a beat like a rap upon a coffin lid.”

Workers punching the time clock at the Rouge Plant of the Ford Motor Company.

Naturally, that new regimen provoked a backlash. Not so much from the working classes—who began operating within the dictates of clock time by demanding overtime wages or shorter workdays—but rather from the aesthetes. To be a Romantic at the turn of the nineteenth century was in part to break from the growing tyranny of clock time: to sleep late, ramble aimlessly through the city, refuse to live by the “statistical clocks” that governed economic life. In The Prelude, Wordsworth announces his break from the “keepers of our time”:

The guides, the wardens of our faculties
And stewards of our labour, watchful men
And skillful in the usury of time
Sages, who in their prescience would control
All accidents, and to the very road
Which they have fashioned would confine us down
Like engines . . .

The time discipline of the pendulum clock took the informal flow of experience and nailed it to a mathematical grid. If time is a river, the pendulum clock turned it into a canal of evenly spaced locks, engineered for the rhythms of industry. Once again, an increase in our ability to measure things turned out to be as important as our ability to make them.

Portrait of Aaron Lufkin Dennison

That power to measure time was not distributed evenly through society: pocket watches remained luxury items until the middle of the nineteenth century, when a Massachusetts cobbler’s son named Aaron Dennison borrowed the new process of manufacturing armaments using standardized, interchangeable parts and applied the same techniques to watchmaking. At the time, the production of advanced watches involved more than a hundred distinct jobs: one person would make individual flea-sized screws by turning a piece of steel on a thread; another would inscribe watch cases; and so on. Dennison had a vision of machines mass-producing identical tiny screws that could then be put into any watch of the same model, and machines that would engrave cases with speed and precision. His vision took him through a bankruptcy or two, and earned him the nickname “the Lunatic of Boston” in the local press. But eventually, in the early 1860s, he hit on the idea of making a cheaper watch, without the jeweled ornamentation that traditionally adorned pocket watches. It would be the first watch targeted at the mass market, not just the well-to-do.

Dennison’s “Wm. Ellery” watch—named after one of the signers of the Declaration of Independence, William Ellery—became a breakout hit, particularly with the soldiers of the Civil War. More than 160,000 watches were sold; even Abraham Lincoln owned and carried a “Wm. Ellery” watch. Dennison turned a luxury item into a must-have commodity. In 1850, the average pocket watch cost $40; by 1878, a Dennison unjeweled watch cost just $3.50.

With watches spiking in popularity across the country, a Minnesota railroad agent named Richard Warren Sears stumbled across a box of unwanted watches from a local jeweler, and turned a tidy profit selling them to other station agents. Inspired by his success, he partnered with a Chicago businessman named Alvah Roebuck, and together they launched a mail-order publication showcasing a range of watch designs: the Sears, Roebuck catalog. Those fifteen pounds of mail-order catalogs currently weighing down your mailbox? They all started with the must-have gadget of the late nineteenth century: the consumer-grade pocket watch.

Unknown soldier with pocket watch, 1860s (Library of Congress)

WHEN DENNISON FIRST STARTED thinking about democratizing time in America, the clocks of the period remained woefully irregular in one key respect. Local time—in cities and towns across the United States—was now accurate to the second, if you consulted a public clock in a place where time discipline was particularly crucial. But there were literally thousands of distinct local times. Clock time had been democratized, but it had not yet been standardized. Thanks to Dennison, watches were spreading quickly through the population, but they were all running at different times. In the United States, each town and village ran at its own independent pace—with clocks synced to the sun’s position in the sky. As you moved west or east, even a few miles, the shifting relationship to the sun would produce a different time on a sundial. You could be standing in one city at 6:00 p.m., but just three towns over, the correct time would be 6:05. If you asked what time it was 150 years ago, you would have received at least twenty-three different answers in the state of Indiana, twenty-seven in Michigan, and thirty-eight in Wisconsin.

The strangest thing about this irregularity is the fact that no one noticed it. You couldn’t talk directly to someone three towns over, and it took an hour or two to get there by unreliable roads at low speeds. So a few minutes of fuzziness in the respective clocks of each town didn’t even register. But once people (and information) began to travel faster, the lack of standardization suddenly became a massive problem. Telegraphs and railroads exposed the hidden blurriness of nonstandardized clock time, just as, centuries before, the invention of the book had exposed the need for spectacles among the first generation of European readers.

The rewinding of a big Dennison watch (an operation performed once a year) in the district of Holborn, London.

Trains moving east or west—longitudinally—slide continuously across local solar time, which shifts by four minutes for every degree of longitude. At the speed of a nineteenth-century express, that could mean adjusting your watch by roughly four minutes for every hour of travel. In addition, each railroad was running on its own clock, which meant that making a journey in the nineteenth century took some formidable number crunching. You’d leave New York at 8:00 a.m. New York time, catching the 8:05 on Columbia Railroad time, and arrive in Baltimore three hours later, at 10:54 Baltimore time, which was, technically speaking, 11:05 Columbia Railroad time, where you would wait ten minutes and then catch the 11:01 B&O train to Wheeling, West Virginia, which was, technically speaking again, the 10:49 train if you were on Wheeling time, and 11:10 if your watch was still keeping New York time. And the funny thing is, all those different times were the right ones, at least measured by the sun’s position in the sky. What made time easily measured by sundial made it infuriating by railroad.
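To make that drift concrete, here is a minimal sketch in Python; the town names and longitudes are invented for the illustration, and the only rule it encodes is the four-minutes-per-degree shift described above.

```python
from datetime import datetime, timedelta

# Invented longitudes (degrees west of a reference); not historical values.
towns = {"Town A": 74.0, "Town B": 76.5, "Town C": 80.7}

# One physical instant: noon on Town A's local solar clock.
reference_town = "Town A"
reference_time = datetime(1870, 6, 1, 12, 0)

def local_solar_time(ref_time, ref_lon_west, town_lon_west):
    """Local solar clocks read 4 minutes earlier per degree farther west."""
    minutes_earlier = 4.0 * (town_lon_west - ref_lon_west)
    return ref_time - timedelta(minutes=minutes_earlier)

for town, lon in towns.items():
    t = local_solar_time(reference_time, towns[reference_town], lon)
    print(f"{town}: {t:%H:%M} local solar time")
# Town A: 12:00, Town B: 11:50, Town C: 11:33 -- all at the same moment.
```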

The British had dealt with this problem by standardizing the entire country on Greenwich Mean Time in the late 1840s, synchronizing railroad clocks by telegraph. (To this day, clocks in every air traffic control center and cockpit around the world report Greenwich time; GMT is the single time zone of the sky.) But the United States was too sprawling to run off of one clock, particularly after the transcontinental railroad opened in 1869. With eight thousand towns across the country, each on its own clock, and over a hundred thousand miles of railroad track connecting them, the need for some kind of standardized system became overwhelming. For several decades, various proposals circulated for standardizing U.S. time, but nothing solidified. The logistical hurdles of coordinating schedules and clocks were immense, and somehow standardized time seemed to spark a strange feeling of resentment among ordinary citizens, as though it were an act against nature itself. A Cincinnati paper editorialized against standard time: “It is simply preposterous. . . . Let the people of Cincinnati stick to the truth as it is written by the sun, moon and stars.”

The United States remained temporally challenged until the early 1880s, when a railroad engineer named William F. Allen took on the cause. As the editor of a guide to railroad timetables, Allen knew firsthand how Byzantine the existing time system was. At a railroad convention in St. Louis in 1883, Allen presented a map that proposed a shift from fifty distinct railroad times to the four time zones that are still in use, more than a century later: Eastern, Central, Mountain, and Pacific. Allen designed the map so that the divisions between time zones zigzagged slightly to correspond to the points where the major railroad lines connected, instead of having the divisions run straight down meridian lines.

Persuaded by Allen’s plan, the railroad bosses gave him just nine months to make his idea a reality. He launched an energetic campaign of letter-writing and arm-twisting to convince observatories and city councils. It was an extraordinarily challenging campaign, but somehow Allen managed to pull it off. On November 18, 1883, the United States experienced one of the strangest days in the history of clock time, what became known as “the day of two noons.” Eastern Standard Time, as Allen had defined it, ran exactly four minutes behind local New York time. On that November day, the Manhattan church bells rang out the old New York noon, and then four minutes later, a second noon was announced by a second ringing: the very first 12:00 p.m., EST. The second noon was broadcast out across the country via telegraph, allowing railroad lines and town squares all the way to the Pacific to synchronize their clocks.

The very next year, an international conference fixed the prime meridian at Greenwich and established GMT as the world’s reference clock, and the whole globe was eventually divided into time zones keyed to it. The world had begun to break free from the celestial rhythms of the solar system. Consulting the sun was no longer the most accurate way to tell the time. Instead, pulses of electricity traveling by telegraph wire from distant cities kept our clocks in sync.

ONE OF THE STRANGE PROPERTIES of the measurement of time is that it doesn’t belong neatly to a single scientific discipline. In fact, each leap forward in our ability to measure time has involved a handoff from one discipline to another. The shift from sundials to pendulum clocks relied on a shift from astronomy to dynamics, the physics of motion. The next revolution in time would depend on electromechanics. With each revolution, though, the general pattern remained the same: scientists discover some natural phenomenon that displays the propensity for keeping “equal time” that Galileo had observed in the altar lamps, and before long a wave of inventors and engineers begin using that new tempo to synchronize their devices. In the 1880s, Pierre and Jacques Curie first detected a curious property of certain crystals, including quartz, the very same material that had been so revolutionary for the glassmakers of Murano: squeeze these crystals and they generate a small electrical charge, and apply a charge to them and they flex in response. (This property came to be known as “piezoelectricity.”) Drive such a crystal with an alternating current and it vibrates at a remarkably stable frequency.

The quartz crystal’s remarkable ability to expand and contract in “equal time” was first exploited by radio engineers in the 1920s, who used it to lock radio transmissions to consistent frequencies. In 1928, W. A. Marrison of Bell Labs built the first clock that kept time from the regular vibrations of a quartz crystal. Quartz clocks lost or gained only a thousandth of a second per day, and were far less vulnerable to atmospheric changes in temperature or humidity, not to mention movement, than pendulum clocks. Once again, the accuracy with which we measured time had increased by several orders of magnitude.

For the first few decades after Marrison’s invention, quartz clocks became the de facto timekeeping devices for scientific or industrial use; standard U.S. time was kept by quartz clocks starting in the 1930s. But by the 1970s, the technology had gotten cheap enough for a mass market, with the emergence of the first quartz-based wristwatches. Today, just about every consumer appliance that has a clock on it—microwaves, alarm clocks, wristwatches, automobile clocks—runs on the equal time of quartz piezoelectricity. That transformation was predictable enough. Someone invents a better clock, and the first iterations are too expensive for consumer use. But eventually the price falls, and the new clock enters mainstream life. No surprise there. Once again, the surprise comes from somewhere else, from some other field that wouldn’t initially seem to be all that dependent on time. New ways of measuring create new possibilities for making. With quartz time, that new possibility was computation.

A microprocessor is an extraordinary technological achievement on many levels, but few are as essential as this: computer chips are masters of time discipline. Think of the coordination needs of the industrial factory: thousands of short, repetitive tasks performed in proper sequence by hundreds of individuals. A microprocessor requires the same kind of time discipline, only the units being coordinated are bits of information instead of the hands and bodies of millworkers. (When Charles Babbage first invented a programmable computer in the middle of the Victorian Age, he called the CPU “the mill” for a reason.) And instead of thousands of operations per minute, the microprocessor is executing billions of calculations per second, while shuffling information in and out of other microchips on the circuit board. Those operations are all coordinated by a master clock, now almost without exception made of quartz. (This is why tinkering with your computer to make it go faster than it was engineered to run is called “overclocking.”) A modern computer is the assemblage of many different technologies and modes of knowledge: the symbolic logic of programming languages, the electrical engineering of the circuit board, the visual language of interface design. But without the microsecond accuracy of a quartz clock, modern computers would be useless.

The accuracy of the quartz clock made its pendulum predecessors seem hopelessly erratic. But it had a similar effect on the ultimate timekeepers: the earth and the sun. Once we started measuring days with quartz clocks, we discovered that the length of the day was not as reliable as we had thought. Days shortened or lengthened in semi-chaotic ways thanks to the drag of the tides on the surface of the planet, wind blowing over mountain ranges, or the inner motion of the earth’s molten core. If we really wanted to keep exact time, we couldn’t rely on the earth’s rotation. We needed a better timepiece. Quartz let us “see” that the seemingly equal times of a solar day weren’t nearly as equal as we had assumed. It was, in a way, the deathblow to the pre-Copernican universe. Not only was the earth not the center of the universe, but its rotation wasn’t even consistent enough to define a day accurately. A block of vibrating sand could do the job much better.

KEEPING PROPER TIME IS ULTIMATELY all about finding—or making—things that oscillate in consistent rhythms: the sun rising in the sky, the moon waxing and waning, the altar lamp, the quartz crystal. The discovery of the atom in the early days of the twentieth century—led by scientists such as Niels Bohr and Werner Heisenberg—set in motion a series of spectacular and deadly innovations in energy and weaponry: nuclear power plants, hydrogen bombs. But the new science of the atom also revealed a less celebrated, but equally significant, discovery: the most consistent oscillator known to man. Studying the behavior of electrons orbiting within a cesium atom, Bohr noticed that they moved with an astonishing regularity. Untroubled by the chaotic drag of mountain ranges or tides, the electrons tapped out a rhythm that was several orders of magnitude more reliable than the earth’s rotation.

The first atomic clocks were built in the mid-1950s, and immediately set a new standard of accuracy: we were now capable of measuring nanoseconds, a thousand times more accurate than the microseconds of quartz. That leap forward was what ultimately enabled the General Conference on Weights and Measures in 1967 to declare that it was time to reinvent time. In the new era, the master time for the planet would be measured in atomic seconds: “the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom.” A day was no longer the time it took the earth to complete one rotation. A day became 86,400 atomic seconds, ticked off on 270 synchronized atomic clocks around the world.
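As a back-of-the-envelope sketch of what that definition implies (in Python, using only the two figures quoted above), the number of cesium oscillations in a single atomic day works out to roughly 794 trillion:

```python
# The cesium-133 transition frequency that defines the atomic second.
PERIODS_PER_SECOND = 9_192_631_770
SECONDS_PER_DAY = 86_400  # a "day" as redefined after 1967

periods_per_day = PERIODS_PER_SECOND * SECONDS_PER_DAY
print(f"{periods_per_day:,} cesium periods per atomic day")
# 794,243,384,928,000 cesium periods per atomic day
```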

The old timekeepers didn’t die off completely, though. Modern atomic clocks actually tick off the seconds using a quartz mechanism, relying on the cesium atom and its electrons to correct any random aberrations in the quartz timekeeping. And civil time is still adjusted periodically to follow the chaotic drift of the earth’s rotation, adding or subtracting a leap second so that the atomic and solar rhythms don’t get too far out of sync. The multiple scientific fields of time discipline—astronomy, electromechanics, subatomic physics—are all embedded within the master clock.

The rise of the nanosecond might seem like an arcane shift, interesting only to the sort of person who attends a conference on weights and measures. And yet everyday life has been radically transformed by the rise of atomic time. Global air travel, telephone networks, financial markets—all rely on the nanosecond accuracy of the atomic clock. (Rid the world of these modern clocks, and the much vilified practice of high-frequency trading would disappear in a nanosecond.) Every time you glance down at your smartphone to check your location, you are unwittingly consulting a network of twenty-four atomic clocks housed in satellites orbiting high above you. Those satellites are sending out the most elemental of signals, again and again, in perpetuity: the time is 11:48:25.084738 . . . the time is 11:48:25.084739. . . . When your phone tries to figure out its location, it pulls down at least three of these time stamps from satellites, each reporting a slightly different time thanks to the duration it takes the signal to travel from satellite to the GPS receiver in your hand. A satellite reporting a later time is closer than one reporting an earlier time. Since the satellites have perfectly predictable locations, the phone can calculate its exact position by triangulating among the three different time stamps. Like the naval navigators of the eighteenth century, GPS determines your location by comparing clocks. This is in fact one of the recurring stories of the history of the clock: each new advance in timekeeping enables a corresponding advance in our mastery of geography—from ships, to railroads, to air traffic, to GPS. It’s an idea that Einstein would have appreciated: measuring time turns out to be key to measuring space.
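Here is a minimal sketch of that comparison in Python, flattened to two dimensions with made-up beacon coordinates; it assumes the receiver’s own clock is already perfect, whereas a real GPS receiver also has to solve for its own clock error (which is why, in practice, it listens to a fourth satellite).

```python
# Toy 2-D "GPS": recover a position from signal travel times to three
# beacons at known coordinates. All positions and times are invented.
C = 299_792_458.0  # speed of light, meters per second

beacons = [(0.0, 0.0), (20_000_000.0, 0.0), (0.0, 20_000_000.0)]  # meters
true_position = (7_000_000.0, 5_000_000.0)

def travel_time(beacon, point):
    """Seconds for a signal to travel from a beacon to a point."""
    dx, dy = point[0] - beacon[0], point[1] - beacon[1]
    return (dx * dx + dy * dy) ** 0.5 / C

# What the receiver actually measures: one delay per beacon (closer
# beacons report "later" time stamps because their signals arrive sooner).
delays = [travel_time(b, true_position) for b in beacons]
ranges = [d * C for d in delays]  # convert delays back into distances

# Subtracting the first circle equation from the other two leaves a pair
# of linear equations in (x, y), which we solve directly.
(x1, y1), (x2, y2), (x3, y3) = beacons
r1, r2, r3 = ranges
a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
det = a11 * a22 - a12 * a21
x = (b1 * a22 - b2 * a12) / det
y = (a11 * b2 - a21 * b1) / det
print(f"Recovered position: ({x:,.0f}, {y:,.0f}) meters")  # matches true_position
```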

Professor Charles H. Townes, executive of the physics department at Columbia University, is shown with the “atomic clock” in the university’s physics department. Date released: January 25, 1955.

The next time you glance down at your phone to check what time it is or where you are, the way you might have glanced at a watch or a map just two decades ago, think about the immense, layered network of human ingenuity that has been put in place to make that gesture possible. Embedded in your ability to tell the time is the understanding of how electrons circulate within cesium atoms; the knowledge of how to send microwave signals from satellites and how to measure the exact speed with which they travel; the ability to position satellites in reliable orbits above the earth, and of course the actual rocket science needed to get them off the ground; the ability to trigger steady vibrations in a block of silicon dioxide—not to mention all the advances in computation and microelectronics and network science necessary to process and represent that information on your phone. You don’t need to know any of these things to tell the time now, but that’s the way progress works: the more we build up these vast repositories of scientific and technological understanding, the more we conceal them. Your mind is silently assisted by all that knowledge each time you check your phone to see what time it is, but the knowledge itself is hidden from view. That is a great convenience, of course, but it can obscure just how far we’ve come since Galileo’s altar-lamp daydreams in the Duomo of Pisa.

AT FIRST GLANCE, THE STORY of time’s measurement would seem to be all about acceleration, dividing up the day into smaller and smaller increments so that we can move things faster: bodies, dollars, bits. But time in the atomic age has also moved in the exact opposite direction: slowing things down, not speeding them up; measuring in eons, not microseconds. In the 1890s, while working on her doctoral thesis in Paris, Marie Curie proposed for the first time that radiation was not some kind of chemical reaction between molecules, but something intrinsic to the atom—a discovery so critical to the development of physics, in fact, that she would become the first woman ever to win a Nobel Prize. Her research quickly drew the attention of her husband, Pierre Curie, who abandoned his own research into crystals to focus on radiation. Together they discovered that radioactive elements decayed at constant rates. The half-life of carbon 14, for instance, is 5,730 years. Leave some carbon 14 lying around for five thousand years or so, and you’ll find that half of it is gone.

Once again, science had discovered a new source of “equal time”—only this clock wasn’t ticking out the microseconds of quartz oscillations, or the nanoseconds of cesium electrons. Radiocarbon decay was ticking on the scale of centuries or millennia. Pierre Curie had surmised that the decay rate of certain elements might be used as a “clock” to determine the age of rocks. But the best-known such clock, radiocarbon dating, wasn’t perfected until the late 1940s. Most clocks focus on measuring the present: What time is it right now? But radiocarbon clocks are all about the past. Different elements decay at wildly different rates, which means that they are like clocks running at different time scales. Carbon 14 “ticks” every five thousand years, but potassium 40 “ticks” every 1.3 billion years. That makes radiocarbon dating an ideal clock for the deep time of human history, while potassium 40 measures geologic time, the history of the planet itself. Radiometric dating has been critical in determining the age of the earth itself, establishing the most convincing scientific evidence that the biblical story of the earth being six thousand years old is just that: a story, not fact. We have immense knowledge about the prehistoric migrations of humans across the globe in large part thanks to carbon dating. In a sense, the “equal time” of radioactive decay has turned prehistoric time into history. When Homo sapiens first crossed the Bering Land Bridge into the Americas more than ten thousand years ago, there were no historians capable of writing down a narrative account of their journey. Yet their story was nonetheless captured by the carbon in their bones and the charcoal deposits they left behind at campsites. It was a story written in the language of atomic physics. But we couldn’t read that story without a new kind of clock. Without radiometric dating, the deep time of human migrations or geologic change would be like a history book where all the pages have been randomly shuffled: teeming with facts but lacking chronology and causation. Knowing what time it was turned that raw data into meaning.
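As a rough sketch of how such a clock is read (a Python illustration with an invented sample measurement; the 5,730-year half-life is the figure quoted earlier):

```python
import math

CARBON_14_HALF_LIFE_YEARS = 5_730

def age_from_remaining_fraction(fraction_remaining):
    """Years elapsed, given the fraction of the original carbon 14 left.

    Decay follows N(t) = N0 * (1/2) ** (t / half_life), so solving for t
    gives t = half_life * log2(1 / fraction_remaining).
    """
    return CARBON_14_HALF_LIFE_YEARS * math.log2(1.0 / fraction_remaining)

# Invented example: a charcoal sample retaining 25 percent of its original
# carbon 14 has been "ticking" for two half-lives.
print(round(age_from_remaining_fraction(0.25)))  # 11460 years
```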

HIGH IN THE SOUTHERN SNAKE RANGE of eastern Nevada, a grove of bristlecone pines grows in the dry, alkaline soil. The pines are small trees for conifers, rarely more than thirty feet high, gnarled by the constant winds rolling across the desert range. We know from carbon dating (and tree rings) that some of them are more than five thousand years old—the oldest living things on the planet.

At some point, several years from now, a clock will be buried in the soil beneath those pines, a clock designed to measure time on the scale of civilizations, not seconds. It will be—as its primary designer, the computer scientist Danny Hillis, puts it—“a clock that ticks once a year. The century hand advances once every 100 years, and the cuckoo comes out on the millennium.” It is being engineered to keep time for at least ten thousand years, roughly the length of human civilization to date. It is an exercise in a different kind of time discipline: the discipline of avoiding short-term thinking, of forcing yourself to think about our actions and their consequences on the scale of centuries and millennia. Borrowing a wonderful phrase from the musician and artist Brian Eno, the device is called “the Clock of the Long Now.”

The Clock of the Long Now

The organization behind this device, the Long Now Foundation—cofounded by Hillis, Eno, Stewart Brand, and a few other visionaries—aims to build a number of ten-thousand-year clocks. (The first one is being constructed for a mountainside location in West Texas.) Why go to such extravagant lengths to build a clock that might tick only once in your lifetime? Because new modes of measuring force us to think about the world in a new light. Just as the microseconds of quartz and cesium opened up new ideas that transformed everyday life in countless ways, the slow time of the Long Now clock helps us think in new ways about the future. As Long Now board member Kevin Kelly puts it:

If you have a Clock ticking for 10,000 years what kinds of generational-scale questions and projects will it suggest? If a Clock can keep going for ten millennia, shouldn’t we make sure our civilization does as well? If the Clock keeps going after we are personally long dead, why not attempt other projects that require future generations to finish? The larger question is, as virologist Jonas Salk once asked, “Are we being good ancestors?”

This is the strange paradox of time in the atomic age: we live in ever shorter increments, guided by clocks that tick invisibly with immaculate precision; we have short attention spans and have surrendered our natural rhythms to the abstract grid of clock time. And yet simultaneously, we have the capacity to imagine and record histories that are thousands or millions of years old, to trace chains of cause and effect that span dozens of generations. We can wonder what time it is and glance down at our phone and get an answer that is accurate to the split-second, but we can also appreciate that the answer was, in a sense, five hundred years in the making: from Galileo’s altar lamp to Niels Bohr’s cesium, from the chronometer to Sputnik. Compared to an ordinary human being from Galileo’s age, our time horizons have expanded in both directions: from the microsecond to the millennium.

Which measure of time will win out in the end: our narrow focus on the short term, or our gift for the long now? Will we be high-frequency traders or good ancestors? For that question, only time will tell.