A HILL whose height remained unknown was an insult to the intelligence, muses Alexander von Humboldt in Daniel Kehlmann’s novel Measuring the World (2007). There were many surveyors in the nineteenth century—and not only during this period. Cartography was part of the movement to measure the world, as was Linné’s botanical taxonomy of species or Franz Joseph Gall’s phrenology, which promised—building on Johann Kaspar Lavater’s physiognomy—to infer a person’s character from the dimensions and shape of his or her skull. And then there was the prospect of a glimpse inside the skull, into the depths of the unconscious, thanks to Sigmund Freud’s dream-work and psychoanalysis. To measure, to count, to weigh, to unearth, and to uncover was the stimulus of modernity, and rationality was its guide.
No wonder that once rationality was radicalized as formal logic through computation, it would lead to excesses of measurement—especially since surveyors today no longer need to risk their lives as von Humboldt did. Decoding the human genome and analyzing the brain take place within the safe confines of the workplace or lab, as does the deciphering of human behavior, which is made possible because networked computers not only create a huge capacity for computation but also subject the social realm to the blanket collection of data. Visits to websites, searches, ticket purchases, transcriptions of verbal exchanges—nothing that happens online escapes the attention of algorithms. Thanks to smartphones, GPS, and RFID, what happens in RL—“real life,” in contradistinction to the virtual life of cyberspace—is mirrored on the Internet as well.
Facebook’s Gross National Happiness Index illustrates the excess and absurdity of big-data mining: it counts the use of positive and negative words in Facebook status updates. Another example is Hedonometer.org, which scours Twitter, New York Times articles, Google Books, and music lyrics for keywords. The knowledge derived from Facebook’s research is that people are generally happier around Christmas, that Catalans are happier on Sant Jordi, and that Americans are happier on Super Bowl Sunday.1 The insights from Hedonometer are more interesting, since it also analyzes the places from which tweets are sent. It thus discovers that people feel altogether happier when they are not at home—which, depending on one’s perspective on life, is either surprising or not surprising at all. The interpretation of data, however, is not relative to human perspectives alone. The danger of obvious misunderstanding is ever present when the analysis is undertaken by artificial intelligence. Such misreadings delight the investment company Berkshire Hathaway, which is happy every time the actress Anne Hathaway appears in a new film, because the value of its shares rises—and all the more, the more scantily she is dressed. The shares do not rise because the actress owns the company but because trading algorithms assume that the more often the name Hathaway appears, the more people will be interested in the investment company.2
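To make the counting mechanics concrete, here is a minimal sketch of keyword-based happiness scoring in the spirit of such indices. The word scores and sample posts below are invented for illustration; the actual instruments work with large, crowd-rated word lists.

```python
# Illustrative sketch of keyword-based "happiness" scoring, in the spirit of
# Hedonometer or Facebook's index. Word scores and posts are invented;
# real instruments use large, crowd-rated word lists.

# Hypothetical scores on a 1 (sad) to 9 (happy) scale.
WORD_SCORES = {
    "happy": 8.3, "love": 8.4, "win": 7.6,
    "sad": 2.4, "hate": 2.2, "loss": 3.0,
}

def happiness(text):
    """Average the scores of all scored words in a text (None if none match)."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    scores = [WORD_SCORES[w] for w in words if w in WORD_SCORES]
    return sum(scores) / len(scores) if scores else None

for post in ["So happy about the win!", "I hate this loss."]:
    print(f"{happiness(post):.2f}  {post}")
```

Averaged over millions of posts per day, nothing more sophisticated than this yields a national “mood curve”—which is precisely why the method is so cheap to scale and so blind to irony.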
But matters are more serious than these examples might suggest. In the first place, there is a lot of money to be made with data. Empirical studies confirm a connection between Facebook’s happiness score and stock-market values. Anyone who thoroughly analyzes social-media communication about financial markets will understand the “psychology of the stock market” and may even be able to predict its future development.3 This creates high expectations—and leads to many new startups. What is the subject of the moment on Twitter? No problem. Who sends the most tweets and receives the most retweets? No problem. Which video is the hottest, which piece of music, which politician, which country, which website, which search, which electronic book, which passage in that book? No problem at all! The data traces we create become the shadow of our online existence—and a paradise for statisticians, sociologists, corporations, and intelligence agencies.
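What such a “connection” amounts to, statistically, is a correlation between two time series. A minimal sketch, with invented numbers standing in for a daily sentiment index and daily market moves:

```python
# Illustrative sketch: a "connection" between a sentiment index and the
# market is, statistically, a correlation between two time series.
# Both series below are invented.
from statistics import correlation  # Python 3.10+

happiness_index = [5.9, 6.1, 6.0, 6.3, 5.8, 6.2, 6.4]    # hypothetical daily scores
market_returns  = [0.2, 0.5, -0.1, 0.8, -0.4, 0.3, 0.9]  # hypothetical daily % moves

print(f"Pearson r = {correlation(happiness_index, market_returns):.2f}")
```

A high coefficient on historical data is what feeds the startups’ promise; whether it survives out of sample is another question.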
This shadow can only be shed by those who act like the hero of the German writer Adelbert von Chamisso’s Peter Schlemihl (1814)—namely, by getting involved with the underworld: by doing whatever is necessary to become anonymous and traveling the net more or less incognito. Even then, valuable data can still be gathered, since information on how long somebody lingers somewhere allows for the compilation of behavioral patterns and indices of normalcy even if that somebody remains anonymous. Admittedly, statistical analysis becomes more concrete as soon as the identity of the user is known. But the pressure to optimize that follows from such analysis—on the editorial team of an online journal, for example, or on any web offering—operates just as effectively with numbers that are indisputable even when derived from anonymous sources.
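A minimal sketch of this kind of anonymous profiling, with invented log records: a stable session identifier and a dwell time per page suffice to compile behavioral patterns, no name required.

```python
# Illustrative sketch: behavioral patterns from anonymous logs. Only a
# session id (no identity) and dwell times per page are needed; all
# records are invented.
from collections import defaultdict
from statistics import mean

log = [  # (anonymous session id, page, seconds spent)
    ("a1f3", "/politics", 210), ("a1f3", "/sports", 15),
    ("9c2e", "/politics", 30),  ("9c2e", "/culture", 340),
    ("a1f3", "/politics", 180),
]

dwell = defaultdict(list)
for session, page, seconds in log:
    dwell[(session, page)].append(seconds)

# Average dwell time per anonymous session and page: a behavioral
# profile with no name attached.
for (session, page), times in sorted(dwell.items()):
    print(f"session {session}  {page:<10} avg {mean(times):5.1f}s")
```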
To make processing easier, measurement-friendly forms of communication such as views, likes, and shares were created, allowing the opinions generated by Web 2.0 use to be translated into a measurable currency. The binary operational mode of digital media—zero/one, yes/no—thus becomes effective beyond the interface. From this point of view, the medium’s message is that society is constructed of dichotomies; to a certain extent, society is also infantilized, as complex matters dissolve into the nuance-free, fairy-tale alternatives of good or bad. At the same time, this dichotomizing is a shrewd strategic move that attempts to solve the problem of interpretation at the point of entry: instead of laborious, ambivalent—let alone ironic—commentary, we get unequivocal, measurable votes.
This is exactly what makes statistics so attractive. Numbers appear factual, economical, mercilessly neutral, and universally understandable, even to observers outside the discipline. Communication in numbers generates trust efficiently and remains intelligible even to the clueless. On the basis of numbers, managers without any background in criticism or cinema can judge which films should stay in the schedule and which should be dropped. Quantification allows for decision making without having to make a decision. This is the end of cultural rule by experts—not through an opening up of discourse to everyone but merely through a switch from word to number. The statistical view of society establishes a numeratocracy.4
The evocation of Chamisso’s Schlemihl carries more weight than intimated above. Schlemihl bargained with the devil, trading his shadow for a purse permanently filled with gold, and was thereby excluded from bourgeois society. He who has no shadow has no identity. Herein lies the future danger currently discussed under the rubric “I post, therefore I am”: those who create no data trace do not exist. This is already true of social networks, whose providers push anyone who resists the constant requirement of transparency to the margins of social communication. And, as everyone knows, data traces are indispensable in other areas as well. Without a credit history there is no credit; without a history of incidents there is no insurance protection. One cannot assume that more effective technologies of identification will decrease the demand that human subjects identify themselves.
Data acquisition will increasingly become a social duty, perceived as serving the public good, to the extent that one will not be able to withhold personal data without consequences. Public access to all medical records, for example, is desired by interested parties on Internet portals such as PatientsLikeMe.com, whose visitors are greeted with this tagline: “Making healthcare better for everyone through sharing, support, and research.” The rhetoric of sharing implicitly labels those who withhold their data as egoistic and asocial. In the era of big data, concern for our fellow men no longer shows itself in the sharing of one’s coat but in the sharing of one’s data. “Sharing is caring” reads the slogan at the Circle, a combination of Google and Facebook in Dave Eggers’s dystopia The Circle (2013)—with the inevitable flip side: “privacy is theft.” To participate in society, we must have a data shadow. Those who withhold it will one day find themselves cut off from medical care, just like Violet, the rebellious protagonist of M. T. Anderson’s dystopian science-fiction novel Feed (2002).