COMPUTERS

“Computer” was a vocation before it was a thing. From the seventeenth century onward, a computer was an individual tasked with carrying out mathematical computations. In the first half of the twentieth century, most computers were women who worked in science or industry. Computers in the more familiar sense of inanimate information technologies are associated with two qualities above all others: they are understood to be mechanical or automated; and they are understood to perform operations with numbers. Yet this cold, mechanistic idea of what computers are fails to capture the intimacy they have come to claim in our daily lives, where a computer may nestle in our pocket or balance on our lap. Computers as we now know them—sleek, lightweight, and stylish products that afford us access to software, entertainment, and social networks—seem to have little to do with numbers as such. They are platforms for delivering media for personal consumption; instruments enabling us to communicate over great distances; and tools for enhancing productivity and creativity through the discovery, management, and sharing of information. Only in certain cases are computers still used for crunching numbers, as in the kind of work an accountant or a research scientist might enlist one to perform.

Even the word itself can sound distant and out of place in everyday speech: nowadays it is just as common to say phone or laptop or even machine as to say computer. But the term computer retains an aloofness that can serve to remind us of why computers have come to be so pervasive: the one device that has replaced so many others. Paul Ceruzzi defines the essential characteristics of computers through three basic functions: calculation, as we have seen; automated control via sequences of instructions (what we more commonly think of as software, or a program); and the representation of stored data that are encoded in ways tractable to manipulation by calculation and control by automation. Note that all these functions are abstracted from any particular material implementation. Computers have been designed, and in some cases built, out of gears and rods, switches and relays, Tinkertoys, water and plumbing, paper, and, of course, electrical circuitry miniaturized and mounted on silicon wafers or “microchips.” Also absent from this imagining is the iconic array of peripherals we tend to associate with computers, like display screens, keyboards, and the mouse, or storage formats like magnetic disks or tape. Ceruzzi’s computer, by contrast, is a computer simply because it is a machine capable of following instructions (including forks and branches in those instructions) in order to manipulate stored data through arithmetical operations, whether or not the data themselves finally take the form of numbers.
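
Ceruzzi’s three functions can be made concrete in a few lines of code. The minimal sketch below, in Python, is a deliberately toy machine whose instruction names and memory layout are invented for illustration (they are not drawn from any historical design): it holds a short program and the data that program operates on in one shared memory, and it multiplies two numbers using nothing but addition and a conditional branch.

```python
def run(memory):
    """Follow the instructions stored in memory until HALT is reached."""
    pc = 0                                   # program counter: address of the next instruction
    while True:
        op, a, b = memory[pc]                # fetch the instruction at the current address
        if op == "ADD":                      # calculation: memory[a] += memory[b]
            memory[a] += memory[b]
            pc += 1
        elif op == "JUMP_IF_ZERO":           # control: branch to address b when memory[a] == 0
            pc = b if memory[a] == 0 else pc + 1
        elif op == "HALT":                   # stop and hand back the whole memory
            return memory

# Program and data share one memory: multiply 7 by 4 through repeated addition.
memory = [
    ("JUMP_IF_ZERO", 7, 4),   # 0: if the counter (cell 7) is zero, jump to HALT
    ("ADD", 5, 6),            # 1: result += multiplicand
    ("ADD", 7, 8),            # 2: counter += -1
    ("JUMP_IF_ZERO", 9, 0),   # 3: cell 9 is always zero, so always jump back to 0
    ("HALT", 0, 0),           # 4: done
    0,                        # 5: result
    7,                        # 6: multiplicand
    4,                        # 7: counter
    -1,                       # 8: the constant -1
    0,                        # 9: the constant 0
]
print(run(memory)[5])         # prints 28
```

Branching, iteration, and the scheme’s indifference to what the numbers finally stand for are all already visible in these few memory cells.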

From these characteristics (calculation, automated control, and encoded data) come all the functions of computers as we know them, from doing actual math to sharing a picture of a cat. Whereas once calculation was its own end, it is now a means to an end, a method for manipulating symbolic values so that they can form the basis for other types of representations (like a sound or an image). Because sequences of programmed instructions can also branch and repeat themselves, computers are efficient not just at addition and multiplication but also at iteration: the ability to repeat the same set of instructions, with or without variation, over and over again. In practice, computation is performed through iteration, and it is through vast quantities of iterations (we expect them to be lightning fast) that computers gain the awesome power to emulate and extend the world around us. Alan Turing presented the mathematical foundation of these principles in his 1936 paper “On Computable Numbers.” John von Neumann expressed them in what has come to be their most influential formulation, the 1945 First Draft of a Report on the EDVAC, which contributed the idea of storing both data and the instructions for operating on those data in the same memory within the computer. Charles Babbage and Ada Lovelace had clearly understood the essence of the computer’s principles a century earlier.
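
As a small illustration of calculation serving representation, the sketch below (plain Python; the 440 Hz pitch and the output file name "tone.wav" are arbitrary choices) iterates a single arithmetic formula 44,100 times to produce one second of audible sound. Nothing in the arithmetic is specific to sound; only the decision to write the resulting numbers into a WAV file makes them a tone.

```python
import math
import struct
import wave

RATE = 44100                                  # audio samples per second
frames = []
for n in range(RATE):                         # iteration: the same instructions, 44,100 times
    t = n / RATE                              # time in seconds
    value = math.sin(2 * math.pi * 440 * t)   # a 440 Hz sine wave, as a number between -1 and 1
    frames.append(struct.pack("<h", int(value * 32767)))   # scale to a 16-bit sample

with wave.open("tone.wav", "wb") as f:        # interpret the numbers as one second of sound
    f.setnchannels(1)                         # mono
    f.setsampwidth(2)                         # two bytes per sample
    f.setframerate(RATE)
    f.writeframes(b"".join(frames))
```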

So much for what computers are, in theory. As actual artifacts (working things), computers have a long and messy history. Indeed, as with many exemplars in the history of information, simplicity in concept is realized only through extreme ingenuity in implementation. Whereas once computers filled entire rooms, now they fit in our pocket (or indeed, in our bloodstream). Computers are often assumed to be interchangeable with *digital systems or technologies, but this is a matter of convenience in the engineering; likewise, we associate computation with the binary numeric system of 0 and 1, but this too is just a matter of convention for the more elemental condition of presence or absence (on or off) that is the most efficient schema we have yet devised to encode information and allow for its manipulation. Because digital systems operate with discrete symbolic or numeric tokens, error checking can be formalized and the inevitable imprecisions of the analog world can be modulated through discrete values with high tolerances for ambiguity. As W. Daniel Hillis puts it, “computers must produce perfect outputs from imperfect inputs, nipping small errors in the bud.” Digital systems serve to mediate between the mundane stuff out of which computers are made and the formal requirements of mathematical computation, effectively ensuring that a given value is always unambiguous and that the result “checks out.”
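
The formalized error checking described above can be as simple as a single parity bit. The minimal sketch below uses even parity to catch any one flipped bit in a byte; real systems rely on far richer schemes (checksums, cyclic redundancy checks, error-correcting codes), but the principle of policing discrete values is the same.

```python
def parity(bits):
    """Return 0 if the bits contain an even number of 1s, 1 if odd."""
    return sum(bits) % 2

def encode(bits):
    """Append a parity bit so the total number of 1s is always even."""
    return bits + [parity(bits)]

def check(received):
    """A valid codeword has even parity overall."""
    return parity(received) == 0

codeword = encode([1, 0, 1, 1, 0, 0, 1, 0])
print(check(codeword))        # True: nothing amiss

corrupted = codeword.copy()
corrupted[3] ^= 1             # a single bit flips in transit
print(check(corrupted))       # False: the error is caught
```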

Computers thus support an illusion, or call it a working model, of an immaterial or “virtual” environment: copies are indistinguishable from originals, keystrokes can be backspaced and corrected without a trace, email flashes around the globe in the blink of an eye. In reality, however, these effects are carefully designed efficiencies. To achieve them, computers must have sources of energy, much of which still comes from fossil fuels. They generate heat, which must be dissipated with fans and air-conditioning. Computing’s infrastructure has changed the landscape (including the ocean floor and low earth orbit), and, with much else, it is contributing to changing the climate. The raw materials out of which computers are made (gold, silver, copper, coltan, silicon, and palladium) must come from somewhere and are often extracted and processed under unjust labor conditions. For all the precision and perfection we expect from computers, they are a part of the world, not apart from the world.

Computers have often been associated with predictive functions, whether related to the natural world, theoretical physics, economics, society, or warfare. In 1952 American television audiences were treated to the spectacle of a computer correctly predicting Eisenhower’s victory over Adlai Stevenson in the presidential election. Displays like that undoubtedly contributed to the mythology around computers as autonomous or artificial intelligences, with interests potentially inimical to our own—a theme mined by science fiction ever since. (The first popular book about computers, published in 1949, was entitled Giant Brains, or Machines That Think.) In recent years such anxieties have increased again with new public awareness and concern over *“big data,” privacy, surveillance, and algorithmic decision making.

But a computer, on its own, can do little besides compute. Its inputs and outputs are always embedded in a human context. Initially those contexts were determined by the government entities that possessed the resources to design, build, and operate computers: the military and scientific establishment. Computers became commercialized in the second half of the twentieth century, contributing to automation in a wide range of industries and professions; in the early 1980s, computers became affordable to an upper-middle-class segment of the population in the global North, and a consumer marketplace quickly took hold. Computers thus became associated with office work, household tasks, and recreation (games). At about the same time, computers started to become integrated into national and international telecommunications networks. The combination of telecommunications, individual ownership, and increasingly sophisticated audio and graphical capabilities ensured that by the twenty-first century computers would become the foundational technology for media production, distribution, and consumption. For many of us, our computer is also our telephone, typewriter, calculator, daybook, bookshelf, atlas, almanac, *encyclopedia, newspaper, television, cinema, stereo, family photo album, game room, mailbox, and soapbox. Computers as we know them are thus a realization of Turing’s principles for what he termed a “universal” machine. As such, they are instruments of politics, culture, and commerce as much as they are of science and engineering.

As computers approach physical limits in both their size and their speed, the technical frontier has shifted to so-called quantum computers, which, if achievable, would allow for exponential increases in processing power. Nonetheless, even quantum computers would still be machines that iterate through discretely encoded units of information along programmable and controllable paths. In 1803 the poet and painter William Blake expressed a desire to “see a World in a Grain of Sand.” Blake, of course, was not writing about computers. But the line captures the contrast between the microscopic silicon chips on which today’s computers are built and their enormous capabilities and consequences for nearly every segment of human society. By virtue of innumerable individual operations repeated in immense quantities with exquisite accuracy in a mere instant, computers, and the ways we have all chosen to use them, have changed how we see the world.

Matthew Kirschenbaum

See also accounting; data; digitization; encrypting/decrypting; error; files; media; networks; programming; quantification; social media; storage and search; telecommunications

FURTHER READING

  • Janet Abbate, Recoding Gender, 2012; Edmund Berkeley, Giant Brains, or Machines That Think, 1949; Chris Bernhardt, Turing’s Vision, 2016; Martin Campbell-Kelly et al., Computer, 2014; Paul Ceruzzi, Computing, 2012; idem, A History of Modern Computing, 1998; Thomas Haigh et al., ENIAC in Action, 2016; Marie Hicks, Programmed Inequality, 2017; W. Daniel Hillis, The Pattern on the Stone, 1998; Noam Nisan and Shimon Schocken, The Elements of Computing Systems, 2005.