QUANTIFICATION

Over the last century, one key definition of information has come to depend on the idea that anything and everything can be represented by a *binary number sent humming through wires or optical fibers. Quantification has been built into the foundations of information technology. But digitization in the “information age” is only one episode in a much longer history of filling the world with numbers while making sense of the world through numbers.

The earliest examples of quantification date back to near the dawn of agriculture and well before the advent of written history—indeed some scholars argue that quantification acted as a stepping stone toward writing. In the beginning, categories were narrow, and each came with its own means of counting. Looking at the ancient Near East between 7500 and 3500 BCE, Denise Schmandt-Besserat explains that every class of goods had to be represented by a particular token. A person counted grain with grain tokens and garments with garment tokens. With the rise of cities came clay tablets bearing marks for counting and keeping track of debts. On those tablets, the symbols for quantities of grain—themselves already standardized—broke free of their specificity and became abstracted numerals fit for counting within any and all classifications. Techniques for quantification laid claim to people and their labor as well. The discussion of Inka khipus in this volume explains how twisted and knotted cords were employed to track thousands of hours of work owed as tribute to the empire.

In the modern era, quantification became inescapable and often industrial in character. For example, one of the largest life insurance companies in the world in the early twentieth century developed a technique for assessing the risk of individual lives and called it the “numerical method.” The numerical method’s “human treadmill” began working each day when the company’s mailroom received a flood of standardized forms filled in by applicants, agents, or examining physicians—each already abounding in numbers from earlier quantifying processes and sorted by gender, race, and class. Those forms eventually made their way to white women in white blouses and white men in vests, jackets, and ties sitting at row upon row of desks in a New York City skyscraper. Those workers disassembled each application into its key questions and determined how to classify each answer to each question, assigning a point value (often derived from consulting a table of printed numbers) to the resulting classifications. The workers then added and subtracted these preliminary ratings according to a fixed algorithm to determine the extent to which each applicant departed from a standard set for healthy, white men. The resulting number stood in for the insurance applicant and in most cases justified, on its own, the company’s decision to accept or reject the applicant, or to impose a penalty. What is striking in this description is not only the vast scale of the quantifying operation, but also how many numbers had to be made and how many others consulted in the process. A person—or a commodity, or a physical phenomenon—became a number, but only after becoming a number many times before.
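To make the mechanics of such point-based rating concrete, the sketch below imagines a toy version of a numerical method in Python. Every detail is invented for illustration: the classifications, point values, standard rating of 100, and decision thresholds are hypothetical, not drawn from any company’s actual tables or rules.

```python
# Hypothetical rating table: debits raise the rating above the standard,
# credits lower it. All classifications and point values are invented.
RATING_TABLE = {
    "build": {"normal": 0, "overweight": 15, "underweight": 10},
    "occupation": {"clerical": 0, "railroad brakeman": 25},
    "family history": {"unremarkable": 0, "early heart disease": 20},
    "habits": {"abstainer": -5, "moderate": 0},
}

STANDARD = 100  # benchmark assigned to the "standard" applicant


def rate_application(answers):
    """Classify each answer, look up its point value, and sum the debits
    and credits to measure departure from the standard."""
    rating = STANDARD
    for question, classification in answers.items():
        rating += RATING_TABLE[question][classification]
    return rating


application = {
    "build": "overweight",
    "occupation": "railroad brakeman",
    "family history": "unremarkable",
    "habits": "moderate",
}

rating = rate_application(application)

# Invented thresholds: a rating near 100 meant acceptance at standard rates,
# higher ratings a surcharge, and the highest ratings rejection.
if rating <= 125:
    decision = "accept"
elif rating <= 150:
    decision = "surcharge"
else:
    decision = "reject"
print(rating, decision)  # 140 surcharge
```

The point of the sketch is only to show how a long chain of classifications and table lookups collapses an applicant into a single number that can then justify a decision on its own.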

The processes that filled the modern world with numbers depended on widespread observation mediated by specialized instruments and new techniques for managing and presenting numerical data. When the naturalist Alexander von Humboldt traveled to South America in 1799, determined to reveal the laws linking life to its surroundings, he carried with him an observatory’s worth of thermometers, barometers, chronometers, and sextants borne on the backs of mules guided by Indigenous workers. To make sense of all his measurements, Humboldt drew maps on which he could plot his figures and trace “isolines” connecting them. The subsequent invention of graph paper in the early nineteenth century facilitated further reliance on observational data and the use of graphical techniques to make sense of proliferating observations. Astronomers and actuaries alike, in the name of science or business, defined natural laws with curves drawn by hand through plotted numbers.

In the early nineteenth century, European bureaucracies responsible for reforms meant to hold off revolution precipitated an “avalanche of printed numbers,” according to Ian Hacking. With more numbers came the possibility of looking for and discovering patterns or regularities, many of which became new objects of study in themselves. Hacking’s key example was sickness, a state of being that came to be seen as explicable by statistical laws in the context of the avalanche and, once in the grasp of standardizing bureaucrats attending international statistical congresses, produced a host of new and newly defined kinds of illness. Over the second half of the twentieth century, for instance, high blood pressure evolved from an indicator of high risk for later heart disease into something like a disease in itself, requiring sustained medical treatment.

Corporate offices set off their own avalanches of printed numbers in the mid-nineteenth century. Railroads and, later, telephone companies built vast networks for coordinating the flows of both goods and information. They—and the mass marketers who grew along with them—sent agents to far-flung peripheries, built branches, and advertised widely. Quantification offered a means for disciplining dispersed employees. Home offices that did not trust their field divisions to keep the corporation’s best interests at heart forced them to fill in complicated blank forms, many of which accepted only numbers. Relying more on figures from account books or objective instrumental measures also meant relying less on the judgment of ill-trained or deceitful agents. States responded to corporate expansion with their own systems for measuring the honesty and probity of big business and, with the advent of cost-benefit analysis and elaborate twentieth-century budgeting systems, of government itself. Of course, those who provided the figures could still doctor them, or game them, to win some advantage. And quantification caused new distortions when it encouraged governance based on numbers that, even if faithfully rendered, privileged whatever was most readily quantified rather than what experts, bureaucrats, or ordinary people judged to be most significant. It was in the context of a heavily regulated corporate network—the Bell telephone system—that Ralph Hartley and Claude Shannon made “information” itself into a reliably quantifiable entity, as Paul Duguid explains in this book (see chap. 12).

The twentieth century nurtured bigger, nimbler networks of observers and instruments supported first by empires, then by new international governance bodies (like the League of Nations and the United Nations), and later by Cold War alliances. Such networks invented new globe-spanning numbers that helped make debates about international politics look more like engineering problems. With the calorie serving as a standard unit, for instance, national and global indexes of hunger could be created and used to justify humanitarian interventions or the distribution of food aid (which also subsidized agriculture in rich countries). To take another example, world population figures eventually fed major efforts by nation-states teamed with philanthropic foundations to curtail or control fertility. But determining such a figure after World War II required first reconciling national censuses of varying quality, coverage, and organization. Similar challenges confronted efforts to represent global climate or assign and then track annual average global temperatures. To meet those challenges, researchers across fields integrated computer-powered data models into their quantifying treadmills. In the early twenty-first century, much quantification results from people and things that, via the *internet, act as unceasing observers of themselves and their surroundings. Some of the numbers they produce remain private, some combine to form risk profiles in various national security databases, and many are stored by a new breed of corporations for whom such data acts like capital.

Over the last few hundred years, quantification has been closely associated with objectivity. Such associations have proven particularly useful to bureaucrats and scientists alike, who sacrificed a modicum of professional judgment in favor of granting numerical criteria—from credit scores to risk ratings to p-values—the power to make more defensible, because less subjective, decisions for them. Focusing on the way quantification constrains subjectivity, however, distracts from one of the most important explanations for quantification and its effects. A complex range of emotions drives people to make numbers. Nathaniel Bowditch, for example, embraced the task of calculating numerical tables that American sailors used to find their longitude at sea in the early nineteenth century. He ferreted out errors in others’ tables and made his own extraordinarily precise. While others considered such work drudgery, Bowditch approached his long hours as an aesthetic or spiritual pursuit, according to Tamara Plakins Thornton.

Numerical tables could also communicate emotion and aesthetic pleasure to readers, such as those described by Jacqueline Wernimont, who found a glimpse of the sublime and hints of human mastery even in John Graunt’s tallies of those who died from plague in seventeenth-century Britain. Or, as Caitlin Zaloom discovered, commodity traders in Chicago and London at the end of the twentieth century lived in a world awash with market numbers, but most learned to interpret them less as individual *facts than as a mass whose motions had to be watched and judged, all as part of getting a feel for or telling a story about the “market.” On a larger scale, numbers like those produced by pollsters and marketers often shaped how ordinary people, like those discussed by Sarah Igo, understood themselves, their communities, or their nation. Numbers are not inherently objective, and to assert that they are only further hinders a full understanding of how quantification works.

Numbers often seem entirely natural and quantification an elementary process, but only because so many tools, techniques, institutions, and infrastructures have developed over millennia—and especially over the last few hundred years—to churn out numbers and make the world intelligible through them.

Dan Bouk

See also accounting; bureaucracy; commodification; data; error; governance; khipus; observing; surveilling; surveys and censuses

FURTHER READING

  • Ian Hacking, “Biopower and the Avalanche of Printed Numbers,” Humanities in Society 5, nos. 3–4 (1982): 279–95; Sarah E. Igo, The Averaged American, 2007; Emily R. Merchant, “Prediction and Control,” PhD dissertation, University of Michigan, 2015; Theodore M. Porter, Trust in Numbers, 1995; Denise Schmandt-Besserat, “Tokens and Writing,” Scripta 1 (2009): 145–54; Tamara Plakins Thornton, Nathaniel Bowditch and the Power of Numbers, 2016; Jacqueline Wernimont, Numbered Lives, 2018; Caitlin Zaloom, Out of the Pits: Traders and Technology from Chicago to London, 2006.