CHAPTER 20
              OF CALENDARS AND DIALS

Time must never be thought of as pre-existing in any sense; it is a manufactured quantity.

—HERMANN BONDI1

A calendar is a tool which cannot be justified by either logic or astronomy.

—E. J. BICKERMAN2

“EVEN THE MOST SOPHISTICATED PEOPLE LOSE THEIR HEADS OVER the mystery of counting time,” wrote Umberto Eco in the run-up to the current millennium. He was stating a general truth, although he was specifically talking about the confusion over whether the millennium would arrive on December 31, 1999, or the same day in the year 2000—which turned on the question of whether year one began at zero or a year after zero—the zero referred to, of course, being the moment of Christ’s birth, that itself being debatable.3 This system of counting time from the birth of Christ was initiated in the sixth century A.D. by Dionysius Exiguus; before him, dates were calculated from the reign of Diocletian (A.D. 284-305) onward, or from the beginning of the world (measured with an unreal precision), but Dionysius got the year of the Nativity wrong, so calculations were off from the word go.

In 1956, many school examinations in Britain, as well as plays and other celebrations marking the two thousandth anniversary of Caesar’s assassination, caused substantial embarrassment when people realized that without the year zero the deed was only 1,999 years ago. By the same reckoning, Jesus died at thirty-two, not thirty-three. There are of course other systems for creating calendars and measuring years. On September 12, 2007, Ethiopians ushered in their own millennium, based on a calendar more than seven years behind the Gregorian. Then there is the town of Hamelin, which long dated its municipal records not by the year of grace but from July 26, 1284, when, say its archives, 130 children were taken from the city—apparently by the Pied Piper—never to be seen again.4

There are yet more possible sources of confusion. Calendars still divide about equally on when the seasons begin: is the vernal equinox March 20 or 21? It is still a matter of contention among historians when Britain entered the First World War, since her ultimatum said hostilities would begin at midnight, but neglected to say whether by London or Berlin time. More recently, the raid on Pearl Harbor in 1941, which is commemorated as having taken place on December 7 in Hawaii, occurred on December 8 in Japan, which lies on the other side of the International Date Line.

Ethiopians celebrating the new millennium—our September 12, 2007. (Illustration 20.1)

We can put to one side the business of correctly reckoning time zones, eras, and millennia; but what of years and months? All calendric calculations involve some form of astronomy, whether observations of the Sun or the Moon or both, and every major civilization has developed at least one form of calendar acknowledging its own commemorations and festivals. Not to mark time is to be lost out of time: in the early 1940s, an anthropologist studying the Sirionó tribe in Bolivia would tellingly note, “No records of time are kept, and no type of calendar exists.” The tribe was, he concluded, “man in the raw state of nature.”5

Almost all early civilizations started with lunar calendars—the Babylonians, Greeks, Jews, and Egyptians in the Middle East; the Aztecs and Inca in the Americas; and the Chinese and Hindus in Asia—then transformed them into lunar-solar hybrids. The Moon, after all, before the advent of sophisticated mapping systems, was more reliable than any star (moon, meaning “the measurer”; thus month, “the time measured”); but it often differed from the natural calendar of farmer and shepherd. As a result, any given society might employ three calendars: one for government use, one for religious observances, and yet a third for day-to-day purposes.

Of all the myriad forms known to history, only four calendars have been based on the Sun alone: the Egyptian (eventually), the Achaemenian Later Avestan (as used in Persia from 559 to 331 B.C.), that developed by the Mayas and adopted by the Aztecs, and our own Julian/Gregorian. Even in these, the lunar element was never suppressed completely.6 Whatever the community, its main religious dates shifted yearly against the seasons, be it Easter, Eid, Deepavali, the Chinese New Year, or Yom Kippur.

The difficulties confronting the early calendar makers, whether using Sun or Moon, were stupendous. The only way to tell the time was by the Sun during daylight hours and by the stars at night, and in the course of history each people, sometimes each generation, attempted its own solution. An added pressure came shortly before 3200 B.C. with the invention of writing: as literacy spread, people sought to assign dates to records, letters, and inventories that would be commonly recognized and understood.

Whenever a calendar had to be created, such was its importance that it acquired a dimension of real power. In ancient and medieval China, since its emperors were considered the embodiment of the heavens’ will, each change of reign—even more important, each change of dynasty—required that a fresh calendar be created, fixing different festival days, and yet new dates to plant or harvest, in order to show that a new disposition of celestial influences had asserted itself. (One might wonder just how many such changes could be made.) This tradition was well established by Han times (206 B.C. to A.D. 220) and the era between the early Han Dynasty and the Ming Dynasty of A.D. 1368 gave rise to some forty new calendars. “For an agricultural economy,” observes Joseph Needham of all this, “astronomical knowledge as a regulator of the calendar was of prime importance. He who could give a calendar to the people would become their leader.… The promulgation of the calendar by the emperor was a right corresponding to the issue of minted coins, with image and superscription, by Western rulers. The use of it signified recognition of imperial authority.”*

From about 2000 B.C., the Babylonians set their calendar from direct astronomical observation. The official day began at sunset, except for the first of each new month, as reckoned from the sighting of its young crescent moon. If this couldn’t be seen—owing to poor visibility or because it was too close to the Sun—the beginning of the month was simply deferred, although common sense required that no month last more than thirty days. Strictly, a lunar month is the time it takes the Moon to pass through each of its phases—new moon, half, full moon—and return to its original position: 29 days, 12 hours, 44 minutes, and 3 seconds. At some point in the fourth century B.C. yet another system was devised, the Babylonians conceiving the notion of a “mean” sun—a fictitious sun moving at a uniform rate—which, without essential change, is what guides our reckonings today.

The Chinese, like the Babylonians, relied wholly on observation, and seem to have drawn few deductions from the errors that resulted: their almanacs were continually falling off course. Over two thousand years they made more than fifty revisions, not all of which marked changes of reign. Their first calendar went by the Moon, with alternating months of twenty-nine and thirty days, shortchanging the year by eleven days. By about the sixth century B.C., their astronomers, again echoing the Babylonians, recognized a nineteen-year cycle after which the phases of the Moon recurred on the same day of the solar year, so they took to adding extra months as necessary, to “save the phenomena”—that is, to keep calculation in step with nature. By the first century B.C. they had adopted a system of twenty-four fortnightly periods, each such period corresponding to a 15-degree motion of the Sun along the ecliptic (its apparent path in the sky). Their year began on February 5, and the days of their months had names that, in imagery at least, put other cultures to shame—for instance, “The Awakening of Insects” (March 7); “Grain in Ear” (June 7); and “Descent of Hoarfrost” (October 24). By the sixth Christian century, the irregular apparent motion of the Sun had been taken into account, and eventually the motions of Sun and Moon were incorporated.*

The Islamic calendar was at first lunar, then changed to a lunar-solar mix (the rightward-facing crescent, the sign of a new moon, appears on the flags of many Muslim nations), achieving a degree of accuracy by adding days where necessary. However, in A.D. 632 Muhammad, angered that certain communities were altering the sacred months in which fighting was prohibited, forbade all such intercalation, saying that the addition of days to make the calendar conform to the solar year violated the commands of God. He then introduced a purely lunar calendar, but this makes the Islamic year only 354 or 355 days long, so that over the course of thirty-four or so of our years, Muslim holidays will slip through all the seasons.

Strict adherence to the lunar calendar also means that the declaration of a new month depends on the actual sighting of its new moon, so the calendar cannot be determined ahead of time and the months begin unpredictably. New moons, says the Qur’an, “are fixed times for the people and for the pilgrimage [to Mecca].” Each month, the official declaration of the Moon’s sighting is eagerly awaited—although as Islam has dispersed geographically, it has become harder to adhere to an exclusively lunar calendar. In 1971, the shah of Iran switched his country from the Islamic (lunar) calendar to the Persian (solar), the better to celebrate the 2,500th anniversary of the pre-Islamic Peacock Throne, prompting an intensive program worldwide to create an international Islamic calendar, a mixture of calculation and sighting; but no version has yet been universally accepted.

In Jewish observance, the day divides into six watches—as reflected in the Psalms—from midnight to midnight. To keep pace with the solar year, Jewish leap years add an extra month in the third, sixth, eighth, eleventh, fourteenth, seventeenth, and nineteenth years of each nineteen-year period. Every twenty-eighth year, in the month of Nisan—when the position of the Sun is calculated to be the same as it was on the day of Creation—Orthodox Jews engage in a blessing of the Sun (the last one was in 2009). As with the Muslim calendar, the months of the Jewish register are reckoned by the Moon, beginning when its crescent first appears in the western sky.
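The fixed pattern makes the rule easy to state in code. Below is a minimal sketch in Python, assuming the conventional numbering in which year 1 of the Hebrew era opens a nineteen-year cycle (an assumption, not something stated above):

```python
# Leap (13-month) years fall at positions 3, 6, 8, 11, 14, 17, and 19
# of each nineteen-year cycle, as listed in the text.
LEAP_POSITIONS = {3, 6, 8, 11, 14, 17, 19}

def is_hebrew_leap(year: int) -> bool:
    """True if the given year of the Hebrew era adds the extra month.
    Assumes year 1 of the era begins a cycle (the usual convention)."""
    position = year % 19 or 19    # position 1-19 within the current cycle
    return position in LEAP_POSITIONS

print(is_hebrew_leap(5784))  # True: 5784 (2023-24) sits at position 8
```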

The Egyptians began with a lunar model, but it, like others of its time, proved so inaccurate that (just as later) a year of twelve lunar months was a full eleven days apart from a solar year, and festival dates again drifted badly. They elected to divide their calendar into three seasons—the flooding of the Nile, its subsidence, and the harvest—each lasting four lunar months. To make a full year, they added another month whenever Sirius rose late in the twelfth month. Eventually, once they recognized that the length of the year was close to 365 days, they simply tacked five extra days onto the final month.

After moving to a solar year, they devised a more sophisticated calendar using a roster of thirty-six stars located around Sirius, the fresh sighting of each star marking a new day. Each of the thirty-six “decans” (now so called because they came into view ten days apart) would be invisible for seventy days before rising. At any one time, eighteen decans covered the period from sunset to sunrise: three each were assigned to the in-between periods of dusk and dawn, and the remaining twelve marked total darkness (a word rendered in Egyptian by the powerful phrase “what grips the bowels”). From this emerged the twenty-four-hour day, with hours that varied in length through the year—daylight hours being longer in summer, for example—until a new dynasty, beginning the Egyptian New Kingdom (1539 B.C. on), introduced the sixty-minute hour. They added an extra day every four years.

The Mayan method of measuring time was impressive but mystifying. They had a calendar that, by A.D. 800, used both agricultural and solar cycles, one of which was made up of eighteen months, each of twenty days—followed by an intercalary “month” of five days, called the uayeb—with names such as Pop, Zip, Zec, Mol, Yax, and Zac, which have a fine ring; but these were later subsumed into a calendar of thirteen twenty-eight-day months based on the Moon. Their initial, solar version remains accurate to within about three seconds per year—closer to the true year than the Gregorian.7 Its “long count” ends on December 21, 2012, which coincides with various unusual alignments in our solar system sufficiently well to have inspired myriad theories (under the general rubric of “the Mayan Prophecy”) about a coming apocalypse.
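The long count itself is plain positional arithmetic. A sketch, using the standard scholarly place values (kin, uinal, tun, katun, baktun), which are assumptions here, not figures given in the text:

```python
# Standard long-count place values, in days (kin): a uinal is 20 kin, a tun
# 18 uinals (360 days), a katun 20 tuns, and a baktun 20 katuns (144,000 days).
def long_count_days(baktun: int, katun: int, tun: int, uinal: int, kin: int) -> int:
    return baktun * 144_000 + katun * 7_200 + tun * 360 + uinal * 20 + kin

# The era that closes on December 21, 2012, runs thirteen baktuns from its zero date:
days = long_count_days(13, 0, 0, 0, 0)
print(days, days / 365.2422)  # 1,872,000 days, a little over 5,125 years
```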

The Aztec calendar consisted of a 365-day cycle called xiuhpohualli (year count) and a 260-day ritual cycle called tonalpohualli (day count). The former made up the agricultural calendar, since it had to take account of the Sun; the latter was the sacred calendar. The Inca likewise had two versions. When in 2004 I visited the mountain city of Cuzco, I was told that its massive sun pillars (sadly destroyed by the Spaniards) had been used to fix the planting of crops. More recently, a range of thirteen stone towers within a 2,300-year-old ruin 250 miles north of Lima has been revealed as a solar index (something like Stonehenge), employed by the Inca to help run their empire; but there also coexisted a star-determined agricultural calendar consulted by the lower classes.

In astronomy, the Inca, like the Aztecs, were not as developed as the Mayans; yet they had an elaborate calendar of twelve lunar months, which they corrected from time to time according to sightings from Cuzco. The four months bracketing the solstices honored the Sun; those around the equinoxes were governed by the cults of water and the Moon Goddess; and the four left over were dedicated to agriculture, death, the Thunder God (deity of war and of all climatic events), and the goddess of the planet Venus.8 Otherwise, the Inca ways of measuring time were not very different from those of many of their counterparts—imaginative, imprecise, regularly tinkered with, and driven by a mix of solar and lunar observations.

The innumerable Greek states had a whole mix of indices. One, used in Athens in the fifth century B.C., organized itself from the summer solstice, beginning the New Year with the following new moon. In Athens, as in most Greek jurisdictions, the day began at sunset, a natural arrangement with a system that reckoned time by the Moon; but magistrates were free to repeat dates—several times if they so wished, such as having a succession of December twenty-fifths. There were limits to this power, for the year had to end on the right date; if one day was repeated, another had to be omitted, and by the last month there was little freedom left for manipulation. The calendar was finally reformed around 432 B.C. by Meton of Athens, the lunar month being aligned with the solar year by intercalating seven months into every nineteen-year cycle, and by varying months between twenty-nine and thirty days, thus making the lunar month overlong by only two minutes.*
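Meton’s arithmetic can be checked directly. A sketch, assuming the traditional 6,940-day length of his cycle (125 full months of thirty days, 110 hollow months of twenty-nine) and the modern value of the mean lunar month, neither of which appears in the text:

```python
DAYS_IN_CYCLE = 125 * 30 + 110 * 29   # 6,940 days in Meton's nineteen years (assumed)
MONTHS_IN_CYCLE = 12 * 19 + 7         # 235 months: 12 a year plus 7 intercalated
SYNODIC_MONTH = 29.530589             # modern mean lunar month, in days

metonic_month = DAYS_IN_CYCLE / MONTHS_IN_CYCLE
excess_minutes = (metonic_month - SYNODIC_MONTH) * 24 * 60
print(f"{metonic_month:.5f} days per month, overlong by {excess_minutes:.1f} minutes")
# -> about 1.9 minutes: the "only two minutes" of the text
```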

The first significant change in European calendars following the Athenian innovations of 432 B.C. came about as a mixture of practical necessity and political factors (and perhaps a little lust). Under the Roman Republic, the endlessly recurring need for a correcting month was assessed annually, often in the light of blatantly political considerations. Julius Caesar was both pontifex maximus (a senior priest of the state religion) and a proconsul (provincial governor), and in the former role oversaw the calendar. However, he was in grave contention—a quarrel that ultimately passed into civil war—with the faction led by his son-in-law and former ally, Pompey; and during this time the calendar year, deprived of its correcting months, had fallen anarchically short of the 365-day solar span, with January falling in the autumn.

Once he had triumphed over his enemies, Caesar commissioned a Greek astronomer, Sosigenes of Alexandria, to devise a new calendar, and Sosigenes suggested increasing the lengths of some months by one day each and giving February an extra day every fourth year (the “leap year”), creating a new calendar, the Julian, of about 365.25 days per annum, with the year starting on January 1. One difficulty with this was that slightly too many leap days were added with respect to the astronomical seasons. On average, solstices and equinoxes advance by eleven minutes per year against the Julian calendar, causing the Julian to drift backward one day about every 128 years. Caesar happily dubbed 46 B.C. “the last year of confusion,” but with that year lasting a full 445 days, the citizens of Rome wittily reversed this to “the year of confusion.” They were nevertheless jubilant because they believed that Caesar had extended their lives by an extra three months.
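The eleven-minute excess and the 128-year drift are the same arithmetic stated two ways; a quick check, taking the modern value of the tropical year as an assumption:

```python
TROPICAL_YEAR = 365.2422          # modern mean tropical year, in days (assumed)
JULIAN_YEAR = 365.25              # the Julian average, as above

excess_per_year = JULIAN_YEAR - TROPICAL_YEAR
print(excess_per_year * 24 * 60)  # ~11.2 minutes gained per year
print(1 / excess_per_year)        # ~128 years for the error to reach a full day
```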

Several of our month names still reflect the old Roman count, in which the year began in March: the seventh month being September (septem), the eighth October (octo), and so on. The new version mirrored the Greek-Egyptian calendar of 238 B.C., which Caesar was happy to introduce, as at the time he was passionately involved with Cleopatra, and also because before the civil war the Senate had refused to intercalate the extra months that the old calendar needed, as that would have prolonged Caesar’s term of office.

The new calendar was still seriously out of kilter, for it misreckoned every third rather than fourth year a leap year, so that by 11 B.C., a mere thirty-three years after Caesar’s death, the year was starting three days late. Caesar’s great-nephew, adopted son, and more tactful successor as dictator, Augustus, corrected this by skipping three leap years, not having another one until A.D. 8. During his reign, the fifth and sixth months, Quintilis and Sextilis, were renamed July and August in honor, respectively, of Caesar and Augustus.*

The advent of Christianity made new demands on the reckoning of time. The Julian calendar was used to “fix” certain events such as Christmas, the Epiphany, and the Annunciation, and also to determine the movable-feast sequence of Easter, Pentecost, and Lent. Easter is assigned as the Sunday after the first full moon to fall on or after March 21; at least a dozen other festivals follow from that dating. As the Gospels state unequivocally that Jesus was crucified at Passover, Easter depended on the complicated lunar calculations by which Jewish authority set that festival. Many early Christians held that Jesus had died on a Friday and risen two days later; but if one followed the Jewish calendar, there was no assurance that Easter would fall on a Sunday. This led to a major schism between the Eastern Orthodox Church and Rome, the former observing Easter on the fourteenth day of the lunar month regardless of the day of the week. Different faiths still celebrate Easter on different Sundays.
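Finding that Sunday is nowadays pure arithmetic. The sketch below is the widely published “Anonymous Gregorian” computus (a much later, Gregorian-era formula, offered here only as illustration); it encodes the Church’s tabular moon, not any astronomical sighting:

```python
def gregorian_easter(year: int) -> tuple[int, int]:
    """Month and day of Easter Sunday in the Gregorian calendar, by the
    'Anonymous Gregorian' (Meeus/Jones/Butcher) computus. It encodes 'the
    Sunday after the first (ecclesiastical) full moon on or after March 21'
    without any observation of the sky."""
    a = year % 19                        # position in the 19-year lunar cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # locates the paschal full moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7 # days to the following Sunday
    m = (a + 11 * h + 22 * l) // 451
    month = (h + l - 7 * m + 114) // 31
    day = (h + l - 7 * m + 114) % 31 + 1
    return month, day

print(gregorian_easter(2024))  # (3, 31): Easter fell on March 31, 2024
```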

How best to determine an ideal calendar was exhaustively examined in On the Theory of Time-Reckoning, written in 725 by the Venerable Bede, who calculated that the 365.25-day Julian year was longer than the solar year by eleven minutes and four seconds, and that a more accurate measurement was closer to 365.24 days. Still nothing was done, and over time Caesar’s calendar got more and more out of step with the seasons. When at last reform came it took a pope to do it. The story that has been passed down is that when it became apparent that the Easter of 1576 would be particularly ill timed, Gregory XIII went to the Vatican’s Tower of the Four Winds, where a papal astronomer showed him the solar image on the meridian line across the floor of the Calendar Room, and the pontiff could see that the Sun was ten days away from where it had to be to reach equinox on March 20-21. At that moment, it is said, he decided that the calendar must be aligned with the unyielding heavens. Maybe the story is true, but the need for reform had long been recognized: aside from Bede, as early as the thirteenth century Roger Bacon had sent Clement IV a treatise on the calendar’s shortcomings.

In a matter of weeks, a new plan, devised by a well-known Calabrian physician and amateur astronomer named Aloysius Lilius (c. 1510–1576), was presented to the pope by the astronomer’s brother (Lilius himself having recently died). Gregory asked the Jesuit mathematician Christopher Clavius, a Bavarian living in Rome, where he was acclaimed as “the Euclid of the sixteenth century,” to review it. Clavius endorsed the innovations and added some of his own. Over the next few years all Catholic countries were instructed to omit the extra ten days. The pope agreed to 1582 as the changeover year, and October was chosen as the changeover month, as containing the fewest Church feasts, so imposing the minimum disruption.

There was also a good political reason for the choice, for Easter 1583 would thus fall on the same day in both the Julian calendar (March 31) and the Gregorian (April 10), a happy conjunction that would not repeat itself for many years. One might ask why Gregory didn’t delete fifteen days rather than ten, shifting the spring equinox back to its traditional date of March 25. Had he done so, however, the winter solstice would have moved to December 25, by this time a major Christian feast day. As Duncan Steel notes, “By allowing Christmas and the solstice to coincide once more, the Church would have walked into trouble. Christianity had successfully pinched the solstice festival of the pagan religions over twelve hundred years before … and was not about to hand it back.”9

In Spain, Portugal, and parts of Italy the new calendar was taken up immediately; France and the Low Countries followed by the end of that year. In what is now Belgium, the calendar went from December 21, 1582, straight to January 1, 1583, depriving everyone of Christmas. Catholics in Germany followed suit in 1584, Denmark-Norway by 1586—although Sweden held out until 1753. Most other non-Catholic Christians scorned the new calendar; not until 1700 did the German Protestant states adopt it. In Grisons, the most easterly of the Swiss cantons, Catholics and Protestants, even those living on the same street, kept different calendars, a situation that continued down to 1798, when the French invaded and imposed the Gregorian on everyone.

Protestant England was likewise suspicious of any pronouncement from Rome, and although Elizabeth I was not unsympathetic to the proposed change, the Spanish Armada of 1588 effectively scuttled any chance of its being accepted. Voltaire would scoff: “The English mob preferred their calendar to disagree with the Sun than to agree with the Pope.”10 More likely, well aware of the Gregorian calendar’s flaws, English scientists thought that if they developed a superior calendar it would help effect a rapprochement with those European nations fence-sitting in the quarrel with Rome. One interpretation of The Taming of the Shrew is that Katharine represents the Protestant religion, Petruchio pre-Reformation Catholicism; thus his order that she should call the Sun the Moon no longer appears senseless when one considers that at the time the play was written (1592), England was already ten days out of sync with the Roman calendar.*

The 1751 bill bringing in the Gregorian calendar was so unpopular that William Hogarth introduced a stolen Tory placard, “Give Us Our Eleven Days!” (it lies on the floor, under the foot of the man with the knobkerrie), into The Election Dinner (1755), his painting of a tavern meeting organized by Whig candidates, with Tories protesting outside. (Illustration 20.2)

Britain and its colonies delayed almost two centuries before making the adjustment, by which time their calendar had to be moved forward another day, Wednesday, September 2, 1752, being followed by Thursday the fourteenth. When the British Parliament debated the change, rumors spread throughout the country that salaried employees were losing eleven days’ pay and that everyone was surrendering eleven days of their life. The protests were not unreasonable: people who had paid rent for a full month found they received no rebate, while bankers refused to pay their taxes on March 25 and put off payment by eleven days—hence the British tax year still begins on April 5. As recently as 1995 an Ulster politician attacked the Vatican for meddling with the calendar. Yet ultimately it is not the objections that are remarkable, but the fact that there was so much agreement.

Japan waited until 1872 to adopt the Western calendar, when the reform led to peasant riots. Turkey was last, accepting the inevitable in 1927. Russia may boast the most confusing history of all. Until the end of the fifteenth century, new years began every March 1, then on September 1 until 1700, when Peter the Great changed the date to January 1. In 1709 the Julian calendar, as favored by the Orthodox Church, came in, more than 127 years after the Gregorian had been introduced into Western Europe. For most of the nineteenth century, the Department of Foreign Affairs used the Gregorian, as did the expansionist Russian navy; finally, in 1918 Lenin decreed the adoption of the Gregorian throughout the country—but this held only until 1923, when both Gregorian and Julian were discarded in an attempt to eliminate the Christian reckoning. In their place came the “Eternal Calendar” (with special cards distributed to workers, to increase production). By 1929 a week of only five days and therefore months of six weeks were in place, the former remaining so until 1934, when the Gregorian calendar was reimposed, although the seven-day week did not make a return until 1940. The potential for confusion was amply realized.

The new Gregorian calendar had both virtues and drawbacks. It did not perfectly track the equinoxes and solstices, but it did more accurately indicate the time of year in respect to the seasons, being based on the tropical/solar year (i.e., the time it takes the Earth to orbit the Sun, measured between two spring equinoxes) of 365 days, 5 hours, 49 minutes (approximately). As each year is reckoned as 365 days exactly, it means that the following new year begins 5 hours and 49 minutes ahead of the Earth’s completing its circuit, so that after any four years the calendar is four times 5 hours 49 minutes ahead of the Sun. To get back in sync with the Sun, every fourth year, the so-called leap year, an additional twenty-four hours (the intercalary day) is added, yielding a year of 366 days. (Islam makes no such adjustments, and so Ramadan, for example, floats through spring, summer, fall, and winter: particularly confusing since it is from the Arabic word for August—rams, meaning “parched thirst”—that Ramadan takes its name).

In fact, the calendar has really been held back eleven minutes too much each year—forty-four minutes every four years. In the course of a hundred years, this adds up to nearly a full day; if allowed to accumulate, four hundred years would embrace 146,100 days instead of the actual 146,097. Consequently, the three centurial years out of every four whose first two digits are not exactly divisible by four—so far, 1700, 1800, and 1900—are not leap years; by this arrangement the calendar again inches back toward the path of the Sun, and all is well with the world.
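All of this compresses into the familiar leap-year rule, which a few lines of Python can verify against the 146,097-day figure:

```python
def is_gregorian_leap(year: int) -> bool:
    """Leap years: every fourth year, except centurial years not divisible
    by 400 -- so 1700, 1800, and 1900 are common years, while 2000 is leap."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Any four hundred consecutive Gregorian years contain exactly 97 leap days:
days = sum(366 if is_gregorian_leap(y) else 365 for y in range(1601, 2001))
print(days)  # 146,097 -- not the 146,100 that uncorrected 365.25-day years give
```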

Gregory’s calendar was meant to get Easter back to its old sequence—an adjustment that was made for devotional rather than scientific purposes. Yet working out exactly when that festival (and others) will fall continues to engage our attention. In October 2003 I traveled to Heidelberg to see Dr. Reinhold Bien at the Astronomisches Rechen-Institut. Comfortably dressed in brown-checked shirt, pleated pants, and scuffed shoes, he was small and rounded, a little like a friendly hedgehog. At the institute, Dr. Bien measures the heavens, charts the motions of the stars, and each day posts the times for the morrow’s sunrise and sunset. But he also advises the German government as to when certain key celebrations should take place each year. “One can’t frame a calendar for all time,” he patiently explained. “The Earth’s orbit is slowing down, so what is accurate today will not be in the future.” He was soon telling me with gusto stories about Christopher Clavius, and the railleries of John Donne, a great poet but also an Anglican convert, who dismissed Clavius as a glutton and a drunkard. As Dr. Bien defended Clavius, I found my gaze wandering to a small notice pinned just above his cluttered desk. It read, in English: THIS YEAR CHRISTMAS WILL BE ON 25 DECEMBER. A joke; but only just.

THE CALCULATION of the year’s passage is one kind of timekeeping. Measurement of the hours, and of lesser units of time, probably began much later, though nobody knows quite when. So long as people lived by raising crops and herding animals, there was little need to measure smaller units of time. From at least the ninth century on, most cultures measured in weeks, marking out the year with dates drawn from folklore, their own needs and observations, and (in the more sophisticated cities) the liturgical calendar. “The understanding of a savage,” wrote William Hazlitt in 1827, “is a kind of natural almanac, and more true in its prognostication of the future.”12 Leaving aside what is implied by that loaded word “savage,” we know that the Konso people of Ethiopia still reckon their day by function rather than by calculation: 5 P.M. to 6 P.M., for instance, is kakalseema (“when the cattle return home”), with periods of the day named after the activity performed then.*

The simple concept of an “hour” as having a uniform duration took more than two of the five millennia of known history to develop. For the Egyptians, an hour in January and an hour in August, or an hour in northerly Alexandria and an hour in southerly Memphis, had markedly different lengths. The most natural division of time is into two parts—“day” and “night.” The Romans, until the end of the fourth century B.C., divided the former into before midday (ante meridiem) and after midday (post meridiem). Since all court work took place before noon, they posted a civil officer to detect the moment of the Sun’s crossing the meridian and proclaim it in the Forum. They also distinguished between dies naturalis, the natural day that runs from sunrise to sunset, and dies civilis, the civil day that runs from the completion of one Earthly rotation to the next—for them, from midnight to midnight. The word “day” was ever ambiguous.

Finer subdivisions were made: the night into four “watches,” each named after its last “hour” and proclaimed by constables. Were greater accuracy needed, a range of time-based words came into play: occasus solis (sunset), crepusculum (twilight, hence “crepuscular”), vesperum (appearance of the evening star), conticinium (the falling of silence), concubium (bedtime), nox intempesta (timeless night, when nothing is done), and gallicinium (cockcrow), among others.

Before the Industrial Revolution, and the advent of better lamps and lanterns, work hours for most European societies were set by sunrise and sunset. From about the twelfth century on, church bells were sounded indicating when to begin and end toil, announcing curfew (from the French couvre-feu—“cover fire,” i.e., lights out), and so on. For centuries in Europe, the period between midnight and cockcrow was seen as “dead time,” thus inspiring the phrase “the dead of night.”

Regardless of the era, we have remained transfixed by the instant of noon. Noonday demons were the merciless torturers of the hermit Desert Fathers of the early Christian Church. Both France and Italy have whole regions—the Midi and the Mezzogiorno—named for the noonday Sun. While ancient Rome had timekeepers to shout out its arrival, one Parisian inventor fitted a lens into his sundial to act as a burning glass that precisely at noon set off a small cannon, and a “noon gun” is still fired daily in places as far apart as Cape Town and Santiago, Chile. For years, the lighthouse keeper at Brockton Point, in Vancouver, marked midday by detonating a stick of dynamite. During the nineteenth century, to give navigators a precise visual signal by which to reset their chronometers, certain major harbors would drop a huge time ball at 1 p.m. (not at noon, as that is when observatories took their readings).*

To tell the time—more or less—throughout daylight hours, not just at noon, or sunrise, or sunset, was a challenge, but once again the Sun provided the necessary means. Gnomons (from the Greek word for “indicators”) were first used to measure height, but later employed as proto-sundials, with the length of the shadow cast by the gnomon determining the hour of the day. A gnomon could be anything vertical, including the human figure. As Chaucer writes:

Borneo tribesmen measure the length of the Sun’s shadow at summer solstice with a gnomon. Joseph Needham included this photo in his Science and Civilization in China (1953). (Illustration 20.3)

It was four o’clock, according to my guess,

Since eleven feet, a little more or less,

My shadow at the time did fall,

Considering that I myself am six feet tall.13

To measure the hours after dark, man turned to water clocks, used at night from at least 1450 B.C. on in Egypt, a millennium before they came to Rome. However, they were not particularly accurate until the third century B.C., when the Alexandrian Ctesibius (c. 285-222 B.C.) invented a device to ensure a uniform flow. China had water clocks from at least 30 B.C., and ended by adopting a whole series of small vessels on a rotating wheel. “And so,” writes Joseph Needham, “the great breakthrough in accurate time measurement came about.”14

Besides clepsydras, notched-candle clocks (invented by Alfred the Great, says legend), sandglasses (the necks of which became worn from repeated use, passing the grains too quickly, thus shortening their hours), fire clocks, the Egyptian merkhet (made from plumb line and palm leaf), incense clocks banded in different aromas (by which one could “smell” the hour), and a score of other methods all served to reckon the time. But the most popular was the sundial. Made of stone or wood, it fixed a piece of metal parallel to what we now know to be the Earth’s axis and at right angles to a circle divided by as many lines as were deemed necessary, so that when the Sun shone, the shadow of the metal blade moved around the circle (the “heliotrope”—sun-turning) according to its motion through the sky.

The earliest known example is Egyptian, dated to about 1500 B.C. By the sixth century B.C., sundials were being used in Greece, and it was Anaximander who invented the discipline of “gnomonics” as a science. For at least the next ten centuries the sundial would be the world’s most accurate timekeeping device,15 but it took time for it to spread, partly because it was not always used properly. No one seems to have understood, for example, why a dial looted in the sack of Syracuse (latitude about 37° N) in 212 B.C. failed to show the right time when shipped to Rome (about 42°N).16 Dials, it began to be realized, had to be specially made for different latitudes, because the Sun’s altitude decreases as one moves poleward, producing longer shadows. “In order to make the fallen shadow even approximate the correct time,” explains Dava Sobel, “the dial must be laid out with regard to latitude north or south of the Equator where it is to be used, respecting the changing high point of the sun in the sky from day to day over the course of the year and the variable speed of the Earth’s movement around its orbit. There is nothing obvious about the construction of a proper sundial.”17 Of course, for centuries it was not realized that these were the transactions involved.*

Not until about A.D. 1371 would there be a polar-oriented sundial—at the Great Mosque in Damascus—tilting its gnomon at an angle corresponding to its latitude and thus, as we now know, to the curvature of the Earth. Time was now measured not by the length of the gnomon’s shadow but by its angle. This was a major breakthrough, but one not actually needed until mechanical clocks were invented. Until then, almost everyone used unequal hours.
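For such a polar-pointing gnomon on a horizontal dial, the hour lines obey a simple trigonometric rule; the formula in the sketch below is the standard modern one, given here as illustration rather than as anything the Damascus builders wrote down:

```python
import math

def hour_line_angle(latitude_deg: float, hours_from_noon: float) -> float:
    """Angle in degrees between the noon line and a given hour line on a
    horizontal dial whose gnomon parallels the Earth's axis.
    Standard rule: tan(angle) = sin(latitude) * tan(15 degrees per hour)."""
    hour_angle = math.radians(15.0 * hours_from_noon)  # the Sun moves 15 degrees an hour
    latitude = math.radians(latitude_deg)
    return math.degrees(math.atan(math.sin(latitude) * math.tan(hour_angle)))

# Why the dial looted from Syracuse (about 37 degrees N) told the wrong time
# in Rome (about 42 degrees N): its hour lines sit at visibly different angles.
for h in (1, 2, 3, 4, 5):
    print(h, round(hour_line_angle(37, h), 1), round(hour_line_angle(42, h), 1))
```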

The approximations used to design ancient portable dials could introduce errors of up to a quarter of an hour. The inconsistency of measurement between two or more sundials may be guessed from the words Seneca (c. 4 B.C.-A.D. 65) puts into the mouth of one of his characters, talking of the death of the emperor Claudius: “I cannot tell you the hour exactly: it is easier to get agreement among philosophers than among clocks.”18 Inaccurate as they were, sundials struck some people as only too modern, changing their relationship to time in ways that made them nostalgic for the “happy life” that existed before time could be measured. A play attributed to Plautus (c. 254–c. 184 B.C.) has one character exclaim:

The Gods confound the man who first found out

How to distinguish hours! Confound him, too,

Who in this place set up a sundial,

To cut and hack my days so wretchedly

Into small pieces! When I was a boy,

My belly was my sundial—one surer,

Truer, and more exact than any of them.

Evidently, sundials were still something of a novelty. But it was not long before people learned to love them, and appreciate what they brought.

Around 1400 the chiming clock was invented, and by the 1600s it had become so pervasive that theatergoers might not even have registered the anachronism in Julius Caesar, when, in response to Brutus’s “I cannot, by the progress of the stars / Give guess how near to day,” and asking the time, Shakespeare has his fellow conspirator Cassius reply, “The clock hath stricken three.” A similar mistake appears in Cymbeline, when again a clock strikes three; but in Richard II it is a dial, not a clock, that Shakespeare has his king invoke, to make the passing of time a visible movement:

I wasted time, and now doth time waste me;

For now hath time made me his numb’ring clock:

My thoughts are minutes; and, with sighs, they jar

Their watches on unto mine eyes, the outward watch,

Whereto my finger, like a dial’s point,

Is pointing still, in cleansing them from tears.19

The increasing number of weight-driven clocks intensified the preoccupation with time, paradoxically sparking a boom in sundials, which became so profitable a business that design methods were closely guarded. The “art of dialing” formed an important branch of mathematics and was the subject of many textbooks.20 The making of sundials remained the province of astronomers rather than clock makers, as one had to take into account the Earth’s rotation and elliptical motion as well as the angle of its axis.

Even the arrival of precision timekeeping in the form of pendulum clocks and the balance spring did not dent the sundial’s popularity. As Dava Sobel says, “A clock or watch may keep time, but only a sundial can find time [by interrogating the external world]—a distinctly different function.”21 Charles I (1600–1649) carried a silver sundial, which he entrusted to an attendant on the eve of his execution as a last gift to his son, the duke of York (after whom New York is named). Thomas Jefferson in old age would distract himself from his chronic rheumatism by calculating the hour lines for a dial. Instead of a watch, George Washington used a silver pocket dial given him by Lafayette.

Over the years sundials have been T-shaped, portable, perpendicular, sunken, cubical, and flat (the common or garden variety). Vitruvius, architectural theorist of ancient Rome and a contemporary of Julius Caesar’s, counted at least thirteen styles already in use in Greece by 30 B.C., and says he could not invent any new types, since the field was exhausted. But that was by no means the case. Universal sundials, adjustable for use in any latitude, came in during the eighteenth century. Design became ever more demanding as standards of clock making rose; many became prized works of art.

The Samrat Yantra, the giant sundial at the Jaipur Observatory, one of a family of massive instruments built under Maharajah Jai Singh II (1686–1743). These devices had no telescopes and relied upon naked eyesight and extremely precise construction. (Illustration 20.4)

Where once there had been nostalgia for the era before sundials, the persistence of the device may have had something to do with the pastoral sense of rustic peace that seemed to attach to them in the era of the clock. Shakespeare’s stricken Henry VI exclaims: “O God! Methinks it were a happy life / To carve out dials quaintly, point by point, / Thereby to see the minutes how they run.”22 “Of the several modes of counting time,” wrote Hazlitt, “that by the sundial is perhaps the most apposite and striking, if not the most convenient or comprehensive. It does not obtrude its observations, though it ‘morals on the time,’ and, by its stationary character, forms a contrast to the most fleeting of all essences.”23

“Morals on the time” is a reference to the custom of sundials being inscribed with mottoes. There are myriad such sayings. An eighteenth-century English verse runs: “Read the riddle that I’ve found, / Come, answer it to me, / What is it travels o’er new ground, / And old continually?” Answer: a shadow. Two others much quoted are “I tell only sunny hours” and “The clock the time may wrongly tell, / I never if the sun shines well,” although the latter, while celebrating its dial’s accuracy, underlines its one drawback: it works only when the Sun is out. Nonetheless, when NASA landed its two Mars Exploration Rovers in January 2004, each carried a suitable timepiece: an aluminum sundial, no larger than a human palm, built into the rover and bearing the motto “Two Worlds, One Sun.”24

Back in the 1930s, the flamboyant movie mogul Sam Goldwyn visited his New York bankers and spied a sundial. Swinging around on his companions, he exclaimed: “What will they think of next?” Not a joke; but only just.

* Joseph Needham, Science and Civilization in China (Cambridge: Cambridge University Press, 1959), p. 189, and Christopher Cullen, Astronomy and Mathematics in Ancient China: The Zhou Bi Suan Jing (New York: Cambridge University Press, 1996), p. 6. Cullen, who currently oversees the Needham Research Institute in Cambridge, reckons that Needham badly underestimated the importance of calendrical astronomy in China; his own writing makes only a partial case.

* Chinese rulers had it both ways. Foreign calendars were regularly introduced, an Indian one arriving during the Tang Dynasty (A.D. 618-906), a Muslim one under the Yuan (A.D. 1279–1368), and the Gregorian in the seventeenth century. But the Chinese systems were not officially supplanted until 1912, and even today most Chinese calendars give both the Gregorian year and the year in the old, sixty-year cycle.

* If one takes the year 432 B.C. as reckoned in the Gregorian calendar, it is daunting to realize that this same year appears in other calendars as follows: Ab urbe condita (from a massive history of Rome, c. 753 B.C.), 322; Bahai, –2275/4; Berber, 519; Buddhist, 113; Burmese, –1069; Byzantine, 5077-78; Chinese (sexagenary cycle), 2205-66; Coptic, –715-14; Ethiopian, –439-38; Hebrew, 3329-30; Hindu, Vikram Samvat, –376-75; Hindu, Kali Yuga, 2670-71; Holocene, 9569; Iranian, 1053-52 BP; Islamic, –1085-84 BH; Korean, 1902; and Thai, 112. This list omits the Armenian, Japanese, Mayan, Aztec, and Inca calendars, which do not cover such early dates.

* Over the centuries other attempts to rename the months have been made, but all have failed. In 1793, for example, the French Republic declared a new Revolutionary calendar, with twelve thirty-day months, beginning with Year One of the Revolution, September 22, 1792. The new months were Vendémiaire, Brumaire, Frimaire, Nivôse, Pluviôse, Ventôse, Germinal, Floréal, Prairial, Messidor, Thermidor, and Fructidor. Every tenth day was a holiday, and there were five sans-culottides (intercalary) days (named after the Revolutionary lower classes, who did not wear fashionable culottes). It lasted just over a decade before it was abolished, since a ten-day week gave workers less rest; also, every new year started on a different date (because fixed to the equinox), a major source of confusion for almost everybody, not least because it was incompatible with the secular rhythms of trade fairs and agricultural markets. The Revolution also attempted to introduce a decimal clock, with each heure nearly two and a half times as long as the traditional pre-Revolutionary one. On Fructidor 22, Year XI (September 9, 1803), this innovation too was repealed. (More successfully, the English, from 1880 on, would yoke together the last day of the Christian week—Saturday—and the first—Sunday—into “the weekend,” which, after the Great War, more and more countries adopted both as a term and as a practice.)

* According to the Gregorian version, Shakespeare died on Friday, May 3, 1616, his fellow writer Cervantes on Tuesday, April 23 (said to be Shakespeare’s birthday). But by the older Julian calendar that England used, Shakespeare had died on its April 23, Saint George’s Day. Although the Spaniard’s death was thus ten days earlier than Shakespeare’s, the two men are often said to have perished together. In honor of this conjunction, UNESCO established April 23 as International Day of the Book. Anyway, Cervantes probably died on April 22 (Gregorian), but was buried on the twenty-third.

The great variety of calendars over history was to provide Tolkien with rich material. The Lord of the Rings devotes a seven-page appendix to the “Shire Year,” with its twelve months Afteryule, Astron, Afterlithe, Winterfilth (winter–full moon), Solmath, Thrimidge, Wedmath, Blotmath, Rethe, Forelithe, Halimath, and Foreyule. Each year begins on a Saturday, and Mid-Year’s Day has no weekday name. “In Middle-Earth the Eldar also observed a short period or solar year, called a coranar or ‘sun-round’ when considered more or less astronomically,” and so on; J.R.R. enjoying himself.11

* Many sense small passages of time naturalistically. I have corresponded with several blind people about their experiences of the Sun, and certain responses seem widespread: the aviator Miles Hilton-Barber, for instance, uses the feeling of the Sun on his face to help when flying (but not at the controls) to get an idea of time. The writer Ved Mehta told me that when he left his native India in 1949, aged fifteen, to attend a college for the blind in Arkansas, many at the school didn’t wear a Braille watch, as they preferred to reckon the time from the atmosphere, “and the Sun of course is part of the atmosphere.”

In court proceedings, klepsydrae, or clepsydras (meaning “water thieves,” in which a pierced, measured vessel sank in a tub of water), were used like egg timers, and their span measured the time that advocates could speak: the phrase aquam dare, “to grant water,” meant to allot time to a lawyer, while aquam perdere, “to lose water,” meant to waste time. If a speaker in the Senate spoke out of turn or for too long, his colleagues would shout that his water should be removed.

* The duel fought out in the American West was traditionally held at “high” noon so that the Sun’s glare would not be in either man’s squinting eyes: part of the myth of fairness. Fred Zinnemann’s classic 1952 film High Noon eschews images of dwindling shadows while Sun and tension mount, and instead we see inexorable clock hands as time drains away—a sign of how these machines had usurped the Sun’s ancient role.

* Even so, Boy Scouts learn to tell direction by their watches, holding them face upward and pointing the hour hand toward the Sun. In the Northern Hemisphere, the south alignment will then lie halfway between the hour hand and twelve o’clock: a rough but ready reckoner.