I’ve used the word digital throughout this book in one of its most recent and vaguest senses: to describe technologies that involve digital data, and the phenomena associated with these technologies. Yet the word itself is far older than such a definition suggests.
By the OED’s reckoning, the first written record of the word digital appeared in English around 1425, describing “a whole number less than ten.” Spelled digitalle, the word came from the Latin digitalis, meaning “measuring the breadth of a finger” (today, digitalis is probably best known in English as the genus of purple flowers commonly called foxgloves, the Latin name a reference to the way the flowers seem perfectly sized to fit over the tip of a human finger).
When was digital first used in the context of computing? By the 1940s, the term had begun to be used in electronics to describe discrete, as opposed to continuous, values. It wasn’t until 1945, however, that the notion of an electronic digital computing machine was both born and named.
The machine in question was the Electronic Numerical Integrator and Computer (ENIAC), whose completion in 1946, at a cost of over six million dollars in today’s money, gave the world its first general-purpose electronic computer. Thanks to a machine that weighed over 25 tons and consumed 150 kW of power, the age of digital computing had begun.
What really drove digital into the mainstream was not computing alone, but the steady development, over the second half of the twentieth century, of digital formats to replace analog storage media for everything from sound and images to video and text. Over the last two decades of the century, Compact Discs (better known as CDs) and Digital Versatile Discs (DVDs) largely replaced vinyl records and videotapes. Then, over the first decade of the twenty-first century, the direct distribution of digital files via the internet (known, inevitably, as “digital distribution”) began to replace these storage media in turn, dispensing with everything but pure information itself.
Today, “digital” is everywhere. We watch digital television; we analyze the digital economy; we speak about digital culture and digital trends. Increasingly, though, digital seems at risk of becoming the victim of its own success. We live in a thoroughly digital age—which may well mean we stop needing the word at all. We no longer say “digital computer.” Soon enough, we may no longer say “digital” anything else.
Except, of course, if we’re talking about fingers or foxgloves: two uses of the word that are likely to remain even when ones and zeros no longer need to be mentioned.