The word clock strictly refers to a timepiece with a bell (from the French cloche). Before such devices, the running of sand and water was used to mark time – but the daddy of them all is the simple sundial. Had these first been developed in the southern hemisphere, clockwise would denote rotation to the left. Sundials also explain why clock faces are traditionally round; true, they present problems around the equator and at the poles, but nothing is perfect!
However you choose to carve up the day, clocks have always been used to mark cultural trends rather than dictate them. For example, although noon is now twelve p.m., its name coming from the Latin nona hora (ninth hour), for much of medieval Europe it fell at the hour we now call 3 p.m., as the working day started at 6 a.m., with the first break nine hours later. Plays and other entertainments put on in the local castle at 2 p.m. were called matinees, from the French matin (morning), but as the working day became less arduous and noon moved back to its present position, matinees were marooned in the new afternoon.
In hotter countries the day still started at 6 a.m. but, due to the midday heat, the workforce rested six hours later, at 12 p.m. (noon). Termed the sexta hora (sixth hour) in Latin, this evolved into the modern ‘siesta’.
DAYS, HOURS, MINUTES AND SECONDS
Yet another hand-me-down from the Babylonian obsession with astrology.
The Babylonians revered the numbers twelve, sixty and three hundred and sixty; the first two multiplied together gave the third, which was the number of days in their lunisolar year – the solar year comprised 365.24 days and the lunar year 354.38: adding the two together and dividing by two gives 359.8, which was rounded up to three hundred and sixty for convenience. In keeping with their notion of twelve signs of the zodiac, the Babylonian year was subdivided into twelve months.
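For the numerically curious, that averaging can be checked in a couple of lines – a modern sketch, of course, not anything a Babylonian scribe would recognize:

```python
# Rounding the Babylonian "year": the mean of the solar and lunar years.
solar_year = 365.24   # days in the solar year, as quoted above
lunar_year = 354.38   # days in twelve lunar months

mean_year = (solar_year + lunar_year) / 2
print(round(mean_year, 2))    # 359.81 -- the 359.8 of the text
print(round(mean_year, -1))   # 360.0 -- rounded to the nearest ten
```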
Because they believed their lives were subject to the influence of their twelve zodiacal signs, the Babylonians decided that both the span of daylight and that of the night should be divided into twelve equal parts, which explains why we now have twenty-four hours to each day. Furthermore, to keep in step with their beloved number sixty, each hour was divided into sixty minutes and each of those minutes into sixty seconds.
The ush is the only Babylonian time unit we have not adopted. A unit of four minutes, it was most likely invented because each day comprised 1,440 minutes, which divided by four takes us back to three hundred and sixty. That said, the earth takes twenty-three hours and fifty-six minutes – four minutes short of twenty-four hours – to rotate on its axis and bring an observer back under the same star or constellation, but the Babylonians could not have worked that out – could they?
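The arithmetic behind the ush, and its eerie near-match with the sidereal day, runs as follows (a quick sketch using the figures above):

```python
# The ush: a four-minute unit that ties the day back to 360.
minutes_per_day = 24 * 60          # 1,440 minutes in a day
ush = 4                            # one ush = four minutes
print(minutes_per_day // ush)      # 360 ush per day

# The sidereal day (one rotation relative to the stars) is about
# 23 h 56 min -- exactly one ush short of the 24-hour solar day.
sidereal_shortfall = 24 * 60 - (23 * 60 + 56)
print(sidereal_shortfall)          # 4 minutes
```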
Because three hundred and sixty was thought to represent the circle of the years, in Babylonian maths the circle was likewise divided up into three hundred and sixty degrees and each of those degrees subdivided into sixty minutes.
THE SUN STANDS OVER THE YARDARM
On the old square-rigged ships, the upper horizontal spars, or yards, ended in yardarms, and these were used as a rough guide to judge the right time to pour out the first rum issue of the day. Obviously, this depended greatly on the ship’s latitude and direction of passage but, as a general rule in the North Atlantic, where the expression and practice originated, anyone standing on the main deck would see the sun stand above the topmost yardarm at about 11 a.m.
UP IN THE AIR
Another problem of human attempts to measure time arises from your position in relation to the planet – not in which time zone you stand, but your altitude.
In 2010 Dr James Chin-Wen Chou of NIST, the American National Institute of Standards and Technology, conducted experiments with an atomic clock. Accurate to one second in 3.7 billion years, it ran on the ‘ticking’ of a single aluminium ion flicking between its two energy states a million billion times a second.
It has long been known that time accelerates the deeper into space you go, as you move further and further from any gravitational influence. Gravity generates a form of time dilation, so the further you travel from the source of such force, the weaker its influence and the faster time runs. A person on top of a mountain therefore not only weighs less than they do at the foot of the peak, they also age fractionally faster. Dr Chou, however, was able to measure a minute acceleration of time when the clock was raised a mere twelve inches off the laboratory floor. This variance, checked and double-checked, worked out at ninety billionths of a second over seventy-nine years, so it is unlikely to have a profound effect on anyone’s lifespan, but it can now be said that your head ages faster than your feet and that going upstairs to bed will shorten your life.
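For the curious, the standard weak-field formula – a fractional speed-up of roughly g×h/c² for a clock raised by height h – reproduces the quoted figure. The numbers below are rough assumptions of mine (a lift of about a third of a metre, standard gravity), not NIST’s own workings:

```python
# Gravitational time dilation for a small lift in height.
g = 9.81            # m/s^2, standard gravity (assumed)
h = 0.33            # metres raised -- roughly the foot quoted above (assumed)
c = 299_792_458     # speed of light, m/s

fraction = g * h / c**2                    # fractional rate change, ~3.6e-17
seconds_in_79_years = 79 * 365.25 * 86_400
gained = fraction * seconds_in_79_years    # extra seconds over 79 years
print(round(gained * 1e9))                 # ~90 billionths of a second
```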
It had also been known for decades that time slows as you approach the speed of light – the basis of the famous Twin Paradox. If one twin gets into a rocket and accelerates away from earth at close to the speed of light for a while before returning in the same fashion, he or she will be younger than the sibling who stayed on the ground.
But Dr Chou proved that you do not need rockets or warp drive to experience the effect. When transported at a mere twenty miles per hour (32 km/h) the clock slowed by several billionths of a second. So, drive slower and live longer.
For those of us with our feet on the ground, such variances in time caused by altitude and speed are mildly interesting but of no discernible impact; the same cannot be said for the GPS satellites feeding data to the irksomely monotone satnavs in our cars.
As mentioned above, the faster you travel in space the slower time passes, and the further you are from the earth’s gravitational pull the faster the passage of time, so the clocks on a GPS satellite are subject to these two opposing influences. Due to the satellite’s speed alone – about 8,700 miles per hour (14,000 km/h) as it circles the planet twice a day – its time runs slower by 7,200 nanoseconds every day. Its altitude of 12,000 miles produces the counter-effect of time passing 45,900 nanoseconds a day faster. Deducting one from the other leaves the satellite’s clock running 38,700 nanoseconds fast every day.
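Taking the figures above at face value, the bookkeeping is simple enough; the conversion into ranging error at the end is my own addition, since position is computed from light-travel time:

```python
# Net relativistic drift of a GPS satellite clock, using the figures above.
slow_due_to_speed    = 7_200    # ns/day lost to velocity time dilation
fast_due_to_altitude = 45_900   # ns/day gained from weaker gravity

net = fast_due_to_altitude - slow_due_to_speed
print(net)                      # 38,700 ns/day fast

# Left uncorrected, each nanosecond of clock error is ~30 cm of range error:
c = 0.299_792_458               # metres travelled by light per nanosecond
print(round(net * c))           # 11,602 metres -- over 11 km of drift per day
```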
Although such minuscule variance is unlikely to be responsible for our satnavs carefully leading us into dead ends and farmyards, it does have to be accounted for, as the roll-up over several years would have a discernible effect.
In the general perception, GPS satellites sit in a fixed, geostationary position above a single spot on the earth’s surface, whereas this is only possible directly above the equator. In reality there are twenty-four GPS satellites, each of which ‘rises’ and ‘sets’ twice a day. Divided into groups of four, they travel six separate orbits, arranged so that about eight are visible from any position on earth at any one time.
Far from our dashboard satnavs being guided – or misguided, as the case may be – by a single satellite directly above, information is being constantly fed to a GPS receiver by the visible eight. This data is combined and averaged out to feed satnavs a reading accurate to about ten metres.
DECIMALIZATION OF TIME
The first to challenge the time structure imposed on us by the sixty-mad Babylonians were the revolutionary French, who decided to decimalize both the clock and the calendar, launching such lunacy in 1793 under a calendar rebranded to begin again from Year One.
Try as they might, the members of the committee appointed by the Revolutionary Council to oversee the change could not equally divide three hundred and sixty-five days to come up with a ten-month structure, so they opted instead for twelve months of thirty days, each divided into three ten-day blocks called decades, with the five or six days left over at year’s end given over to festivals. The tenth day of each decade was the day of rest, which did not go down well with the working populace, who hankered after the old Gregorian structure in which they got one day off in seven rather than one in ten.
The months were renamed according to their prevailing weather, such as Brumaire (mist) and Thermidor (heat), and the days of the decades numbered consecutively, moving towards a more scientific and secular system. The first day of this new-style year, in Vendémiaire, the grape-harvest month, fell on 22 September 1792, the date of the founding of the First French Republic.
As for the clock, this was redesigned to have a face of ten hours, each hour comprising one hundred minutes of one hundred seconds apiece. It is fair to say that the only people pleased with the new system were calendar printers and clockmakers, who made a fortune. These same people would again make a killing in 1805, when the new Emperor, Napoleon I, decreed a return to the Gregorian calendar.
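Converting between the two systems is straightforward, since a decimal day held 10 × 100 × 100 = 100,000 decimal seconds against the usual 86,400. A sketch (the function name is mine, not a period term):

```python
# Converting French Revolutionary (decimal) time to a 24-hour reading.
# One decimal second = 86,400 / 100,000 = 0.864 ordinary seconds.

def decimal_to_standard(h, m, s):
    """Convert decimal h:m:s into (hour, minute, second) on a 24-hour clock."""
    fraction_of_day = (h * 10_000 + m * 100 + s) / 100_000
    total = round(fraction_of_day * 86_400)   # ordinary seconds into the day
    return total // 3600, (total % 3600) // 60, total % 60

print(decimal_to_standard(5, 0, 0))    # decimal 5:00:00 -> (12, 0, 0), noon
print(decimal_to_standard(10, 0, 0))   # a full decimal day -> (24, 0, 0)
```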
TIMEKEEPING
Responsible for setting the precedent of circular clock faces, the sundial could be extremely accurate, with the Egyptians erecting them on a massive scale in the form of obelisks as early as 3,500 BC. These first public clocks cast their shadows across hour markers built into the paved squares in which they stood; Cleopatra’s Needle on London’s Victoria Embankment, its twin in New York’s Central Park and the Luxor Needle in the Place de la Concorde, Paris, are three surviving examples of these gigantic timepieces.
As sundials were pretty useless after dark, the ancients tried to set up moon dials. These proved more trouble than they were worth, as they only read true on the night of the full moon. For each night of the fortnight following the full moon, such ‘clocks’ fell a further fifty minutes or so behind, and over the fortnight leading up to the next full moon they ran fast by the same nightly amount, so a week either side of the full moon a moon dial is getting on for six hours slow or fast. Useless perhaps, but these dials provided the first proof that the orbit of the moon was anything but circular.
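Taking the popular figure of fifty minutes a night (the true lunar lag averages a shade under that and varies with the moon’s orbit), the cumulative error stacks up quickly:

```python
# Cumulative error of a moon dial, assuming the moon transits about
# 50 minutes later each night -- a rounded, simplified figure.
lag_per_night = 50                        # minutes

week_out = 7 * lag_per_night              # error one week from full moon
print(week_out, "minutes")                # 350 minutes
print(round(week_out / 60, 1), "hours")   # about 5.8 hours slow or fast
```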
For a sundial to give accurate readings it must be properly aligned, which means that in the northern hemisphere the dial’s shadow-caster, or gnomon, must point to true north, and to true south in the southern hemisphere (where the dial will read anti-clockwise). On or near the equator they are a nightmare to set up, the gnomon having to lie parallel with the earth’s axis of rotation.
As stated above, sun- and time-obsessed Egyptians were frustrated by their inability to tell the time at night and, having tried the abortive moon dial, invented the water clock.
Around the middle of the sixteenth century BC we find the first reliable mention of a water clock, in the ancient temple complex at Karnak in Luxor, Egypt. There are claims of functioning water clocks existing as early as 4,000 BC in India and China, but these cannot be substantiated.
Like all good ideas, the first examples were blissfully simple: the flow from a main reservoir was regulated to fill a receiver with graduated markings up its inner surface. Although it still had to be calibrated against a sundial, the water clock rapidly gained popularity over its predecessor as it grew more sophisticated.
The Babylonians were quick to realize that, with scales to weigh the rising water in the receiver and gears to turn a dial, you could have something very close to the modern clock. The Greeks and Romans refined things further by extending the use of gearing and adding the first escapement, so that by the fourth century BC water clocks featured alarms, little doors opening on the hour to reveal various gods – the lot. It was this last refinement that would be copied by German clockmakers of the 1620s to produce the first cuckoo clocks, later hijacked by the Swiss.
Both Greek and Roman courts used water clocks to limit the time for which anyone could speak, and Athenian brothels used them to call time on clients. In fact, water clocks were so good that when they were finally replaced across seventeenth-century Europe by the admittedly more accurate pendulum clock, the main difference was the method of drive – from the weight of water to falling weights and springs: the complicated gearing remained much the same.
SANDGLASS
Hourglass is a slightly misleading term in that the contents could be calibrated to any span of time. Nor, for that matter, were they routinely filled with sand, a material far too coarse and angular for smooth transfer from one bulb to the other, and too inconsistent in grain size to give an accurate reading. In Renaissance times the better examples were typically filled with marble dust harvested from the masons’ workshops, or with pulverized eggshell; although sand is today sometimes used for decorative hourglasses, this tends not to be beach sand but the finer-grained sand found on riverbeds.
First noted in Alexandria in about 150 BC, the hourglass did not find its way into Europe until the fourteenth century, when it proved popular with navigators irked by the condensation and motion-induced inaccuracy of their water clocks at sea. Immune to both problems, the hourglass became the first means by which navigators could calculate longitude with reasonable precision.
The device found equal favour on land, for everything from measuring cooking times to curbing sermons: clergy were obliged to turn one in the pulpit at the beginning of a sermon to stop them droning on ad infinitum. In fact, they are still put to similar use in the Australian Parliament in Canberra, which uses either a four-minute or a two-minute glass to goad members into deciding which way they intend to vote by joining the appropriate lobby in the event of a division.
INCENSE CLOCK
More a feature of Eastern timekeeping, these comprised a lantern housing an incense stick of calibrated dimensions, with wire-and-bell markers at each hour or division thereof. As the stick burned down, each thin wire would snap in turn, dropping its bell into a copper drum below and giving an audible alert of the passing time to those present.
Until 1924, when cost-conscious clients demanded that the duration of a visit be timed by modern clocks, this remained the standard Japanese method of calculating the fee owed a geisha for his or her attendance: the charge depended on the number of bells in the drum or, for longer engagements, on the number of complete incense sticks consumed.
Entertainers of great artistic skill and education, geisha are much misunderstood in the West, where there is no equivalent. Female geisha were unheard of until the middle of the eighteenth century when the first few provoked mixed reactions in Japanese culture: not until the turn of the nineteenth century did female geisha start to outnumber their male predecessors. Male geisha still function in Japan, but are more usually called taikomochi.
TIME IN YOUR POCKET
The increasing demand for timekeeping on the move forced clockmakers towards portable timepieces for the night watch, a kind of early police force in cities, and for those on watch duty at sea – hence ‘watch’.
Timepieces intended to be worn first emerged in sixteenth-century Germany, but these were so cumbersome that they had to be hung about the neck on a rather strong chain. We do know that Robert Dudley, first Earl of Leicester (c. 1532–88), presented Elizabeth I of England with a watch ‘to be worn about the arm’ in 1571, and the golden rule seems to have been wristwatches for women and pocket watches for men, for whom the waistcoat was developed in the 1670s as a garment with small pockets in which to keep their timepieces.
The wrist/pocket gender divide arose from the fact that small timepieces were until quite recently prone to fouling in damp weather and, generally, men followed outdoor pursuits while ladies stayed at home. Wristwatches remained decidedly unmanly until the 1880s, when British Army officers became the first men to wear them, needing a hands-free way to synchronize attacks on horseback in conflicts such as the Boer War (1899–1902).
These military timepieces were basically pocket watches mounted on a strap, but the creeping barrage tactics of the First World War (1914–18) created the demand for purpose-built, robust wristwatches. Artillery and infantry officers very much needed to know the precise timing of each other’s actions to reduce appalling levels of casualties due to so-called friendly fire.
THE CLOCK WITH NO HANDS
Although it was not until 1958 that the Romanian-born chronobiologist Franz Halberg (1919–2013) coined ‘circadian’ to describe rhythmic biological cycles recurring at approximately daily intervals, people had for centuries noticed that all living things go through a repeating pattern of behaviour in the course of twenty-four hours.
With its main triggers being dawn and dusk, the circadian clock in humans seems to re-set at 4 a.m., when body temperature is at its lowest, perhaps giving rise to the myth that more people die at that hour than at any other time of day (11 a.m. is the real Hour of the Wolf). Either way, the circadian clock’s ability to re-set itself according to the rhythm of light and dark is what lies behind the jet lag experienced by people travelling so fast and so far that, by the local time of their destination, they land before they have even taken off.
An altogether more severe demonstration of circadian disruption is being studied in Greenland, which has the highest known suicide rate in the world. You would have thought that the three-month nights would be the major cause of depression, but it is the relentless periods of daylight that coincide with peaks in suicides and homicides.
To prove the predictable reliability of circadian rhythm, eighteenth-century Swedish botanist Carl Linnaeus (1707–78) laid out the design for a flower clock, using clusters of plants known to open and close at precise times of the day. His design incorporated around fifty different flowers, ranging from goat’s beard (Tragopogon pratensis) that opens around 3 a.m., to the daylily (Hemerocallis) that closes around 7 p.m. Had Linnaeus actually planted this device, it would have proved largely accurate.
Prior to the advent of radio and its familiar pips on the hour, cannons were the first time signals heard in every city and port. Typically fired at noon, these allowed ships’ captains to set all timepieces used to calculate longitude prior to slipping anchor. That said, Vancouver installed a gun in 1894 to be fired at 6 p.m. on Sundays to call for a cessation of all fishing: it still fires nightly at 9 p.m., but only as a time signal.
Other Canadian guns, such as those installed in Halifax and Quebec City, still fire at noon, as do those in Cape Town and Santiago in Chile. The Roman gun, introduced in 1847 by Pope Pius IX (1792–1878) to coordinate the ringing of church bells in the city, still fires at noon on the Janiculum, a hill on the west bank of the Tiber.
With light, as a general rule, travelling faster than sound, cannon were replaced by visible signals such as the time ball, invented in 1829 by the British Admiral Robert Wauchope (1788–1862). A brightly coloured metal sphere released at a fixed hour to drop down the pole on which it was mounted, the time ball on top of the Greenwich Observatory still drops every day – at 1 p.m. rather than noon, the astronomers being busy observing the sun at midday itself – a device copied by New York since 31 December 1907, the date of the first Times Square ball drop to signal the New Year.
With the passing of time being an abstract concept which everyone tried to measure by the position of the sun at noon, the imposition of a standard system across an entire country or even continent made the task of getting an angry cat into a shoebox look easy.
Russia, for example, has eleven time zones ranging from GMT+2 to GMT+12, with a time difference spanning ten hours. China had five time zones that were abolished in 1949 by the fledgling People’s Republic, which decreed the whole country would run on Beijing Time, equating to GMT+8. Even on an island as small as the UK, until 1880 there was no straight answer to ‘What time is it?’
As in every other country in Europe, every town in early nineteenth-century Britain was running on its own local time, dictated by the zenith of the sun in its own back yard. Few journeyed very far in such times and coach trips were so slow that carriage clocks could easily be re-set, or ignored completely, so there were few time-travelling problems to contend with – but then came the railways.
Parts of the UK were anything up to half an hour out of sync with GMT, so when the railways started to move people about at speed, problems arose. These were not much helped by the various railway companies agreeing among themselves to run on GMT, leaving their timetables hopelessly out of step with local times of arrival and departure.
Such matters came to a head on 25 November 1858 at the Dorchester Assizes, convened to hear the case of Curtis v. March at 10 a.m. that day – but the court clock was set to GMT and the town clock to local time. When the defendant and his lawyer failed to appear at 10 a.m. GMT, the bench found in favour of the plaintiff. March appealed to a higher court, which overturned the decision, ruling that 10 a.m. meant local time and not this irksome ‘railway time’, as GMT was then popularly known, having been first employed by the Great Western Railway.
This ruling remained in place until 1880 when GMT was officially adopted as a blanket timescale for the UK. All except the British Royal Family, that is, who ran on what was called Sandringham Time – thirty minutes ahead of GMT.
SANDRINGHAM TIME
With male members of the British Royal Family passionate about hunting on the Sandringham estate, all the clocks there were set half an hour fast to conserve daylight for field sports. But after this confusion of clocks led to the ‘murder’ of King George V (b. 1865) in 1936, the royal observance of Sandringham Time was abandoned.
George was certainly on his last legs at Sandringham on the night of 20 January that year, and his attending physician, Lord Bertrand Dawson (1864–1945), was anxious that he die in time for the announcement to make the morning edition of The Times instead of ‘less appropriate evening journals’. Knowing the newspaper’s deadline for last-minute alterations to the front page – and confused by being surrounded by clocks giving the wrong time – Dawson decided to hurry things along by giving the king a lethal overdose of morphine so that he would die in time to make the right headlines. Before he acted, Dawson telephoned his wife in London and told her to ring The Times and tell them to hold the front page.
Later that same year, Dawson opposed a motion to legalize euthanasia brought before the House of Lords – a stance which, in the light of the revelations in his personal diaries, made public in 1986, smacks of hypocrisy to say the least.
GMT
Although many presume GMT to be a super-accurate measure of time, it is no such thing, as the M stands for ‘mean’, as in average. Because the earth’s orbit is elliptical, its speed variable and the planet tilted on its axis, the sun only stands at its zenith over the Greenwich Meridian at exactly 12.00 GMT four times a year; on any other day, true solar noon at the Greenwich Observatory can fall anything up to sixteen minutes before or after the clock’s noon.
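The gap between sundial noon and clock noon is called the equation of time, and a standard textbook approximation (not the Observatory’s own method, and accurate only to a minute or so) captures the swing:

```python
import math

# Approximate "equation of time": minutes by which sundial (solar) noon
# runs ahead of (+) or behind (-) mean clock noon on a given day of year.
# The coefficients and the day-81 offset are standard textbook values.
def equation_of_time(day_of_year):
    b = math.radians(360 * (day_of_year - 81) / 365)
    return 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)

print(round(equation_of_time(310), 1))  # early November: about +16 minutes
print(round(equation_of_time(45), 1))   # mid-February: roughly -14.5 minutes
```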
As the GMT day originally marked the time it took the earth to rotate from noon to noon, it found little favour in the scientific community, which in 1935 opted for Universal Time, marking out its days from midnight to midnight. In 1972 Universal Time was further refined into Coordinated Universal Time, which included the insertion of leap seconds to allow for the fact that the earth not only wobbles on its axis but is also slowing in its rotation due to gravitational drag. The most recent leap second – the twenty-sixth since 1972, taking the total offset between atomic time and earth-rotation time to thirty-six seconds – was added at midnight on 30 June 2015.
All atoms vibrate and, in 1967, the second was redefined as 9,192,631,770 oscillations of the radiation given off by a caesium atom flipping between two energy states, this becoming the new standard for keeping Coordinated Universal Time with atomic clocks. Today, the quoted International Atomic Time is the arithmetic mean of readings taken from four hundred atomic clocks spread across fifty countries. As the eccentricities of the earth’s orbit cause the year to gain or lose the odd nanosecond here or there, in the rarefied atmospheres of astronomy and astrophysics a year is no longer 365.24 days but 290,091,200,500,000,000 caesium oscillations long – but who’s counting?
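Anyone who is counting can check the arithmetic in a few lines; the tropical-year length of 365.2422 days used below is my assumption, which lands within a whisker of the figure quoted above:

```python
# Length of the year expressed in caesium oscillations.
caesium_hz = 9_192_631_770       # oscillations per SI second (1967 definition)
tropical_year_days = 365.2422    # mean tropical year (assumed)

seconds_per_year = tropical_year_days * 86_400
oscillations = seconds_per_year * caesium_hz
print(f"{oscillations:.6e}")     # about 2.900912e+17 oscillations per year
```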
GREAT PARTY, WRONG YEAR
In the April of 1997 an atomic clock was set up by Accurist on the meridian at Greenwich to give the public a visual display of the thousand-day countdown to the turn of the second millennium.
This was not the only such gimmick, but the various sponsors could not agree over the inclusion of leap seconds. Some had them built into the programming, while others left them out until they were actually inserted into Coordinated Universal Time. So, at any one point, the millennium countdown clocks differed by anything up to three seconds.
Be that as it may, when the Accurist display at Greenwich flicked over to a bank of zeros at the turn of 31 December 1999 to 1 January 2000, those in attendance popped their champagne corks to toast the new era, blissfully unaware that the clock they had been so avidly watching was slightly out (by exactly one year) due to the mistake made by Dennis the Humble – remember him?
As explained previously (see here), Dennis was the monk who created the AD and BC system in the sixth century after picking a year at random and pronouncing it to be the year of Jesus’s birth. But, back in the sixth century, no one in Europe had any knowledge of the mathematical concept of zero, so he called it year one. With the count starting at one, the first hundred years were not complete until the end of the year 100, and by the same token the second millennium did not end at the turn of 31 December 1999 to 1 January 2000; it ended at the turn of 31 December 2000 to 1 January 2001. But what the hell – it was a great party.
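The off-by-one can be captured in a two-line helper – a modern convenience, obviously, and not anything Dennis would have recognized:

```python
# With no year zero, millennia run from ...001 to ...000 inclusive.
def millennium(year):
    """Return which millennium a given AD year falls in."""
    return (year - 1) // 1000 + 1

print(millennium(2000))   # 2 -- the year 2000 still belongs to the 2nd millennium
print(millennium(2001))   # 3 -- the 3rd millennium only begins in 2001
```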