SHOULD one happen to search the Internet for pictures of “computers,” a series of cartoons will appear showing computers with faces and arms. Often the face is smiling while the arms wave invitingly. In one case the computer is standing at attention, holding a piece of paper and a pen, an image that at different points in time might have inspired different interpretations. Twenty or thirty years ago it might have seemed to allude to a new electronic secretary or to our new paperless writing processes. In the era of Web 2.0 and big data, one might well interpret the image instead as a computer noting everything that we do, be it typed or clicked on: opening a website, viewing a video, sharing a link, dialing a number, buying a book, shortening a URL, retweeting a tweet. The “note-taking” computer has metamorphosed from an electronic secretary into a digital panopticon. Just as in Jeremy Bentham’s eighteenth-century concept of the prison, in which a single guard was able to look into each cell from his tower without being seen, our activities on the net today are potentially and perpetually the object of observation. Now that laptops and smartphones are everywhere, everyone, as Zygmunt Bauman puts it, carries, just as “snails carry their homes,” her own “mobile, portable single-person mini-panopticon.”1
The metaphor of the panopticon is almost inevitable in the context of the current discourse on surveillance. However, it is only obliquely useful, because the analogy between the net and the panopticon breaks down in both spatial and temporal terms. First, there is no local and hierarchical separation of the monitored inhabitants in their individual cells from the invisible guard who surveys them all: on the Internet everyone can monitor everyone else without any previous sentencing or detention that would legitimize or facilitate this activity. If there is a watchman left in this constellation, a watchman who knows everything about everyone, it is the algorithm. Second, this watchman is not required to detect any possible misconduct of those under observation in real time, since whatever happens on the Internet is also stored and can be monitored after the fact. This asynchrony is, moreover, an unavoidable necessity of searching itself: before something can be googled, it must have been stored.
For these reasons and others, surveillance research has introduced terms that modernize our conception of the panopticon with respect to digital media and their social networks. The “banopticon,” for example, identifies unwanted groups (immigrants or unprofitable consumers, say) in order to cut off their access to certain sites, that is, to exclude them from further observation with regard to marketing. Another example is the “synopticon,” which indicates a reversal of the more usual act of observing: the many watching the few rather than the few watching the many, as in the mass media.2 However, the most important difference between current processes and the traditional terminology lies in their function. The current panopticon does not serve to discipline; it serves to market, analyzing in order to classify.3 It observes in the interest of customer care; it cares rather than surveils. In a way, it returns to the heart of another highly influential trope: that of the all-seeing God.
This image has its origin about four thousand years ago in the Egyptian desert into which Hagar, Abraham’s slave, who was pregnant with his child, had fled to escape the fury of Abraham’s wife, Sarah. In utter desperation, Hagar finally experiences the appearance of an angel who commands her to return, promising many descendants. Hagar answers, “You are the God who sees me” (NIV, Genesis 16:13). The gratitude for the care expressed by her words lives on in Psalm 139, which invokes God as omniscient and all-caring. Later, clerical discourse would recast God’s eye, replacing the caring eye with a watchful one, the eye of the law securing the enforcement of social norms through the assertion of all-encompassing surveillance backed up by future reckoning. Bentham’s panopticon, and above all Foucault’s discussion of it in Discipline and Punish, secularizes this “harsh” eye of God. Google, on the other hand, as an incarnation of the all-noting computer, harks back to the caring eye of Psalm 139: “You have searched me, LORD, and you know me.”
Google’s self-presentation as an advisor in times of need, as an anchor point of orientation in the flood of information, operates rhetorically with a certain tendency to resacralize. It does so not by comparing itself to God explicitly but implicitly, through its unofficial company motto: “Don’t be evil.” This appeal rules out those business practices that, in the service of short-term gains, place the interest of customers (in the advertising business) above that of the users (of the search engine). The biblical care presented here gains prophetic dimensions in Google’s ambition not only to answer our questions but also to tell us what we should do next.4 Admittedly, the commitment of the search-engine giant has by now become a warning to all Internet users. It is Google’s efficiency—strengthened by social and technical support in the form of tagging, login coordinates, social bookmarking, and image-recognition software—that ensures that none of our activities on the Internet remain hidden. This will be underlined all the more once Google Glass (or some similar successor technology) finally succeeds in storing and monitoring our every eye movement. To the extent that Google is becoming the all-seeing eye of God, we—the Internet users, including Google’s employees—have become the new chosen people of the “don’t be evil” imperative. This became clearest when Google’s then CEO Eric Schmidt publicly declared in 2009, “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place,” and threatened in 2010, “We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.”5
In consumer society, care is expressed in the form of customer care, realized, for example, by way of Google’s scrutiny of its users and through various forms of big-data mining. The object of data love—according to the formula quoted at the outset—is always the human being behind all the data and the enhancement of his or her economic, social, or health situation. One does not have to be an adherent of critical theory to object to the many unknowns in this formula. On which conception of the human being is this love based? Which economic improvement, and at what price? Which economic interests are involved? What constitutes social progress? What alternatives are there? Even in the cases of traffic control and healthcare the matter is not straightforward. Not everyone is ready to divulge information about their movements, even anonymously, and not everyone wants to know about their predisposition to this or that illness. But it becomes truly questionable when data analysis is performed under the banner of consumption and when data protection is compromised for this reason. A hardly surprising side effect of this process is the semiotic hostage-taking of old anxiety concepts like “Big Brother,” which for many today represents no more than a reality-television program.
Against this background, snooping around in social networks can only ever be seen as a trivial offense, as long as it is in the interest of enhancing customer loyalty. The surprise campaign by the Dutch airline KLM is an early example. In November 2010 it gave out little presents to its passengers before their departure: a heart-rate monitor for the lady who wanted to climb mountains, a New York City guidebook for someone who wanted to visit New York. These personalized presents were made possible by KLM’s perusal of customers’ Twitter and Facebook profiles: KLM knew the travel plans and the personal interests of the individual passengers sitting in its airplanes. It was a very successful campaign and earned KLM the epithets “cool” and “media competent.” There was no mention of Big Brother.
The next step in this direction was a changed perspective on Big Brother’s toolset. In the 2013 Super Bowl commercial “Security Cameras,” Coca-Cola flipped the negative image of surveillance cameras by turning them into witnesses of goodwill. Now they capture surprising and moving scenes: “music addicts” (the caption for a man who dances uninhibitedly in front of street musicians), “people stealing kisses” (a couple on a bench, the boy spontaneously kissing the girl), “friendly gangs” (four Arab men helping jumpstart a car), “rebels with a cause” (someone holding up a poster that reads “NO TO RACISM”), etc. With their ironic allusions to typical subjects of surveillance (“addicts,” “stealing,” “gangs,” “rebels”), the little microcosms of these scenes achieve what the commercial as a whole aims to do: the reinterpretation of fear-laden terms and symbols as sympathetic. This reinterpretation is reinforced by Supertramp’s superhit “Give a Little Bit,” with a pointed readjustment of its central line: “Now’s the time that we need to share / So find yourself” becomes “Now’s the time that we need to share / So send a smile.” In place of self-discovery we are supposed to smile for the surveillance camera. This somewhat offbeat, sugarcoated perspective on the problem of surveillance perfectly exemplifies the advertising maxim of “thinking different” and thus assumes a mask of cool contemporaneity. However, in this context, its pseudosubversive cool plumbs the depths of cynicism.6
But this is exactly the rhetorical context that big-data business needs. Parallel to the defusing of fear-laden terms came the ideological exploitation of concepts with politically positive connotations: “social,” “share,” “transparency,” “public sphere,” “participation,” etc. One example is Facebook, Inc.’s decision in October 2013 to allow young people between thirteen and seventeen to share their posts with all Facebook users instead of just their friends or friends of friends. The purpose can hardly be mistaken: to open channels of communication between a potential consumer group and commercial interests while that group was still easy to influence. Outwardly, Facebook, Inc., proclaims that young people should be given more credit for self-determination and prudent behavior, a stance accompanied by the argument that socially active teenagers in particular, such as musicians and “humanitarian activists,” should be offered the same unlimited access to the public sphere that Twitter and weblogs provide. Who would want to deny the activists among teenagers necessary access to a public sphere? This is how youth-protection advocates, educators, and child psychologists who point to the immaturity of those they want to protect are checkmated: by political correctness in the service of the market economy.