CHAPTER 30
Ada Byron, Countess of Lovelace: coding pioneer a century before Alan Turing
An increasing number of films and programmes credit Alan Turing with inventing the ‘drum machine’ type of computer used to crack the Nazis’ Enigma code. In fact, it was invented a hundred years earlier, and its first programmer was Ada, Countess of Lovelace, also famous as Lord Byron’s daughter. This piece was published on her birthday.
Benedict Cumberbatch’s portrayal of a halting, frustrated, brilliant Alan Turing in The Imitation Game (2014) has beamed the Cambridge mathematician’s unique contributions to modern digital computing onto the Hollywood big screen.
However, the impression it leaves of Turing as the visionary who dreamed up an unprecedented, futuristic machine of computational cogs and wheels is misleading. The story of groundbreaking English ‘drum’ computers started over a hundred years earlier.
Mechanical machines have, of course, been performing complex computational functions for centuries. In 1901, divers pulled a rusty box of cogs from the depths of the Aegean near the island of Antikythera. International experts have deduced from the oddity’s 30-something gearwheels and countless astronomical inscriptions that it was part of a precisely engineered machine for crunching the mind-boggling mathematics needed to model the positions of the sun and moon and to predict solar eclipses. According to research published in the last few weeks, the Antikythera Mechanism dates to 205 BC, over 1,200 years before mechanical clocks appeared in Europe.
Today is the 199th anniversary of the birth of Ada Byron (1815–52), Countess of Lovelace, only legitimate child of the poet Lord Byron. When she was 17, she met Charles Babbage (1791–1871), the Devonian genius who arguably invented computers. Their partnership was perhaps the true birth of computer science: he invented the hardware, while she has gone down in history as the first programmer. (The US Department of Defense’s programming language is named Ada in her honour.)
Like Turing, Babbage was a Cambridge mathematician. Turing was a fellow at King’s, while Babbage was the Lucasian Professor, a post held by both Sir Isaac Newton and Stephen Hawking.
Babbage’s first foray into mechanized mathematics was his Difference Engine, a calculating machine cranked by a handle. He was never able to build more than a seventh of it (which worked magnificently), but the Science Museum finally completed the machine, finishing the calculating section in 1991 and the printer in 2002, using only materials and engineering tolerances available in Babbage’s day. It weighs five tons and operates exactly as Babbage predicted.
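The Difference Engine’s trick was the method of finite differences: primed with a handful of starting values, it could tabulate any polynomial using nothing but repeated addition, which is exactly what a hand-cranked train of gears does well. A rough modern sketch of the principle in Python (an illustration of the mathematics, not of Babbage’s mechanism):

    # Tabulate f(x) = 2x^2 + 3x + 5 by finite differences. A few true
    # values prime the table; every further value then needs only
    # additions -- the mechanical equivalent of turning the crank.
    def initial_differences(values):
        """First- and second-order differences from the priming values."""
        diffs, row = [], values
        while len(row) > 1:
            row = [b - a for a, b in zip(row, row[1:])]
            diffs.append(row[0])
        return diffs

    def f(x):
        return 2 * x * x + 3 * x + 5

    value = f(0)
    d = initial_differences([f(x) for x in range(3)])  # [first, second]

    table = []
    for _ in range(8):        # one loop iteration per turn of the handle
        table.append(value)
        value += d[0]         # add the running first difference
        d[0] += d[1]          # add the constant second difference
    print(table)              # [5, 10, 19, 32, 49, 70, 95, 124]

Every new table entry costs only two additions, however high the polynomial’s values climb.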
But Babbage’s major scientific leap came in 1834, when he started work on a steam-powered machine to perform an infinite variety of programmable operations. He called it his Analytical Engine, and it is recognizably the first design for a modern digital computer. Like the Polish ‘bomba’ machine that Turing took as his starting point at Bletchley, and like Turing’s subsequent designs for the ‘Universal Turing Machine’ and ‘Automatic Computing Engine’, Babbage’s computer used vast banks of rotating drums. The design was truly unprecedented and visionary, allowing it to store 1,000 numbers, each stretching to 40 decimal digits. Just like a modern computer, it incorporated a separate processor (which Babbage called the ‘mill’) and memory (the ‘store’), looping, and conditional branching. It even had a printer. Sadly, Babbage only had funding to assemble a few parts of it.
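Looping and conditional branching are what separate a computer from a mere calculator: the machine can decide, part-way through a run, what to do next on the basis of its own intermediate results. A trivial modern illustration in Python (the logic only; it bears no resemblance to the Engine’s own programme notation):

    # Euclid's algorithm: the loop runs for as long as a run-time test
    # says so, and a conditional branch chooses which value to reduce.
    # A fixed sequence of additions cannot express this; a machine with
    # looping and branching can.
    def gcd(a, b):
        while a != b:          # loop: the exit depends on the data
            if a > b:          # conditional branch
                a -= b
            else:
                b -= a
        return a

    print(gcd(1071, 462))      # 21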
Interestingly, away from designing computers, Turing and Babbage had something else in common: both were covert government code-breakers. From 1938 to 1945, Turing was at the Government Code and Cypher School at Bletchley Park, where he first ran Hut 8 (Naval and U-boat cyphers), before eventually becoming consultant to all Bletchley’s operations. Babbage’s wartime role is not so well known, but during the Crimean War he cracked the enemy’s Vigenère autokey cypher, although the British government never allowed it to be known for fear of losing the intelligence advantage.
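The autokey cypher Babbage defeated extends a short keyword with the message itself, so the key stream never visibly repeats, which is why the Vigenère family was long nicknamed ‘le chiffre indéchiffrable’. A compact Python sketch of the cypher itself (Babbage’s attack on it is another matter, and is not reconstructed here):

    # Vigenere autokey: a short keyword is extended by the plaintext
    # itself, so the key never cycles the way a plain Vigenere key does.
    A = ord('A')

    def autokey_encrypt(plaintext, keyword):
        plaintext = ''.join(c for c in plaintext.upper() if c.isalpha())
        key = (keyword.upper() + plaintext)[:len(plaintext)]
        return ''.join(chr((ord(p) + ord(k) - 2 * A) % 26 + A)
                       for p, k in zip(plaintext, key))

    def autokey_decrypt(ciphertext, keyword):
        key, out = keyword.upper(), []
        for c in ciphertext:
            p = chr((ord(c) - ord(key[len(out)])) % 26 + A)
            out.append(p)
            key += p           # each recovered letter extends the key
        return ''.join(out)

    c = autokey_encrypt('ATTACKATDAWN', 'CRIMEA')
    print(c, autokey_decrypt(c, 'CRIMEA'))  # CKBMGKAMWAYX ATTACKATDAWN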
Ada Lovelace collaborated with Babbage (he called her his ‘enchantress of numbers’), and it was the most significant work of her short life. She was taught mathematics by her mother, who hoped to keep her well away from poetry. Ada was clearly gifted, and after they met Babbage asked her to translate a French account of his Analytical Engine written by Luigi Menabrea, a future Prime Minister of Italy. The work engrossed her, and her fame rests primarily on one of the many notes she added for Babbage (Note G), in which she proposed an algorithm for the Analytical Engine to calculate a sequence of Bernoulli numbers. This algorithm is arguably the first true piece of computer code.
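Her Note G procedure translates naturally into modern terms. A Python sketch using today’s standard recurrence for the Bernoulli numbers (a paraphrase of what the Engine was to compute, not a transcription of Ada’s operation-by-operation table, which also followed an older indexing convention):

    # Bernoulli numbers from the recurrence
    #   sum_{k=0}^{m} C(m+1, k) * B_k = 0  for m > 0, with B_0 = 1.
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        B = [Fraction(1)]
        for m in range(1, n + 1):
            acc = sum(comb(m + 1, k) * B[k] for k in range(m))
            B.append(-acc / (m + 1))
        return B

    print([str(b) for b in bernoulli(8)])
    # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']

Exact rational arithmetic matters here: the numbers grow intricate quickly, and rounding would swamp them, which is one reason a 40-digit mechanical store was not as extravagant as it sounds.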
English innovations in computing have not slowed since Babbage, Lovelace, and Turing. In June, a chatbot called ‘Eugene Goostman’ won a competition at the Royal Society marking the 60th anniversary of Turing’s death. Its simulation of a 13-year-old Ukrainian boy was hailed by many as the first programme ever to pass the ‘Turing Test’, a sacred benchmark in artificial intelligence requiring a computer to deceive more than 30 per cent of its human judges into believing it to be human over a five-minute conversation.
But not everyone views modern computing developments positively. Tim Berners-Lee, English inventor of the World Wide Web, has spent much of 2014 voicing increasing concern about what the internet has become. Understandably so. He could never have foreseen the criminality, terrorism, and state-sponsored cyber warfare now travelling through its servers. And, on the more theoretical side, in a moment of uncharacteristic pessimism, Professor Stephen Hawking last week dramatically predicted that the development of full artificial intelligence ‘could spell the end of the human race’.
As no civilization or technology lasts indefinitely, perhaps one day, far in the future, a diver will find a hard drive containing the code for the Eugene Goostman chatbot, and be as baffled by it as we are by the Antikythera Mechanism.