The words we use say more about us than we usually realize. In a sense, they also use us—and never more so than when we’re speaking about what it feels like to be us.
Take an innocuous human term like “memory.” The word itself has been with us in English for a good eight hundred years, arriving from Latin via French (memoria and memorie, respectively) in the mid-thirteenth century, and across the several thousand years of its longer lineage there was little essential change in its significance.
In 1946, however, memory stopped being a strictly biological business: for the first time in its history, the word was applied to something inanimate, an early electronic computer.
Today, using memory to describe the physical microchips that store a computer’s data is as familiar as using it to describe the human process of remembering. It’s also an implicit analogy that has had a significant impact on the way we think about ourselves.
Over the last half-century, computers have become a dominant metaphor for the way we describe our own minds. From talk of processes and calculations to sub-routines, modules and components, accounts of our brains no longer feature the homunculi of early-twentieth-century illustrations or the metaphysical humors and passions of classical thought. Instead, we turn to hardware and software for analogy—complete with all the baggage that analogy brings.
Consider what it means for a machine’s memory to function well. It should be large, free from errors, rapid, searchable, easy to expand or wipe clean, and categorized into comprehensive and unambiguous sections. The bigger, cleaner, brighter, and faster it is, the better.
Speak of memory today and these are some of the associations that will be summoned, whatever the context. Yet they bear little resemblance to the architecture of a human mind—in which recall is serendipitous, embedded in a unique personal history, entwined with feelings, places and beliefs, and constantly shifted by the mind’s churning present tense.
Just as steam-powered machinery left its metaphorical mark during the Industrial Revolution, the language we bring to bear on our own minds is increasingly shaped by computing: from talk of “processing” and “downloading” ideas to acts like “rebooting” our attitudes, “reprogramming” our thinking or even “rewiring” our brains.
We seek to understand ourselves, as we must, with the words we have. In digital technology, we possess a unique kind of mirror for self-reflection, but also an analogy for intelligence to whose imperfections we must remain alert. Computers may help us to remember and to record, but there’s a world of difference between forgetting and deletion, or between a moment recalled and one merely recorded.