It’s possible that future accounts of history will tell of a dispute that took place in 2023. Back then, they’ll say, the German Ministry of the Internet (founded just a few years after the 2013 NSA surveillance scandal) sought an injunction against the Association of Data Privacy Activists. The association’s so-called white block had long urged people to create pockets of noncommunication, by deactivating the GPS on their smartphones, for example. By 2023, deactivation was no longer possible, but owning a smartphone was not yet required by law. Now the Ministry of the Internet wanted to change that.
The Ministry of Transportation proposed establishing presence technology (which displays the position of your vehicle with an accuracy of five centimeters) in traffic law. That was important, it was said, because cars were now silent. All kinds of collisions could be avoided without seeing or hearing a thing, simply through automatic warning signals and application of the brakes if the distance separating the position coordinates of two objects equipped with such sensors fell below the minimum limit. The data privacy activists had no arguments against this procedure, which seemed safe, but they demanded anonymization, since preventing a collision between a car and a bike doesn’t require the operators to be identified.
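The braking rule imagined here amounts to a simple distance check between two position-reporting objects. A minimal sketch, assuming illustrative names and thresholds (the minimum distance and coordinates below are invented for the example, not drawn from any actual proposal):

```python
import math

# Hypothetical minimum separation, in meters, below which the
# system issues a warning and applies the brakes.
MIN_DISTANCE_M = 2.0

def separation(pos_a, pos_b):
    """Euclidean distance between two (x, y) positions in meters."""
    return math.hypot(pos_a[0] - pos_b[0], pos_a[1] - pos_b[1])

def should_brake(pos_a, pos_b, min_distance=MIN_DISTANCE_M):
    """True if two sensor-equipped objects are closer than the limit."""
    return separation(pos_a, pos_b) < min_distance

# A car and a bike on converging paths:
print(should_brake((0.0, 0.0), (1.0, 1.0)))   # True: under 2 m apart
print(should_brake((0.0, 0.0), (10.0, 0.0)))  # False: safe separation
```

Note that the check consumes only coordinates, never operator identities, which is precisely the activists’ point: collision prevention of this kind does not require identification.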
The Ministry of the Internet did not agree with this perspective, because cutting-edge data mining could calculate the likelihood of a collision based on insights into the operators’ physical and mental states, their daily routines, the vehicle models, and other available data, and trigger effective preventative measures accordingly. Since traffic safety is not a personal matter, citizens couldn’t refuse to be identified. The appeal to create holes in cybernetic communication was deemed a danger to transportation, even labeled a terrorist act by some, and ultimately prohibited by the judiciary.
Science fiction? What part of it? Presence technology and smart things are now just as much a topic in the newspapers as the use of GPS is a widespread reality. Algorithms have long been poring over big data for hidden connections, more and more self-trackers willingly contribute to the data pool with observations about their own daily behavior, and rationalizations were a tool of modern state governance long before the digitization of society. Is it science fiction that people will soon tell their wearables to find information about the people sitting at the next table over?
When Google Glass was about to be introduced to the market, a blog comment prophesied that the wearers of such surveillance devices would have higher insurance premiums because of all of the injuries and broken noses they could expect from their surroundings. That prophecy is the real fiction. On the contrary, Google Glass—or whatever the marketable equivalent will be called—will be as successful as the iPad, and its users will receive all sorts of discounts if they let the insurance companies look through their lenses.
The NSA scandal turned data privacy into a news topic during the 2013 summer slump, and Orwell’s 1984 into vacation reading. Cries to save basic civil rights and the Internet as it once was (a place for free expression of opinion, without surveillance or commerce) put many people in a fighting mood. Few comments emphasized that loss of data privacy was only in part a product of the secret services; ignorance, avarice, and convenience are stronger factors.
It is ignorant to say, “But I don’t have anything to hide.” Who knows what patterns the algorithms will find in your behavior, and the degree to which these might be a disadvantage? How can you exercise your basic right to informational self-determination if modern analysts don’t even tell you what they’ve discovered, nor ask you if they can make public use of it?
Such a statement also lacks solidarity, which is made clear by a report about the first marriage between two men in a Protestant church in Germany in August 2013, right at the peak of the privacy debate that followed Snowden’s revelations. What was forbidden or taboo twenty years ago is now recognized by society and blessed by the church, so that many who yesterday had something to hide are now considered to have been good people all along. Anyone who sings the hymn of transparency turns a cold shoulder to the troubles of minorities and is politically naive, assuming, on the one hand, that their society’s current laws and moral attitudes are up to date and, on the other, that they are sacrosanct. Jews, too, once thought that they had arrived in mainstream German society, and after Trump’s victory, many US citizens fear that they will be driven back out of their society’s mainstream again. As long as history cannot protect progress against setbacks, it must be the duty of all citizens to protect the right to hide things by practicing it themselves.
It also demonstrates a lack of solidarity when you hand over your personal data for financial gain. It starts when you divulge individual consumer habits in exchange for discounts, or allow Google’s complimentary email service to read your correspondence. Anyone who pockets insurance rebates by allowing their driving style or body movements to be monitored ultimately sees to it that those who exceed the calculated average (or don’t want to provide proof of their averageness) must pay more. Personal permissiveness with data has consequences for others.
But the most powerful engine for loss of data privacy is convenience, the delegation of as many tasks as possible to software-equipped everyday objects that communicate with one another: the swimming pool that heats itself when a barbecue is on the calendar, the refrigerator that places orders when the beer is gone or the milk has expired, the car that checks traffic updates and construction schedules to automatically adjust the route. The media scholar Marshall McLuhan described media as the “extensions of man.” There are extensions for the arms (hammers, pistols), the legs (bicycles, cars), the eye (telescopes, microscopes), the memory (writing, photography). The Internet of things allows computers not just to compute for us, but also to observe and evaluate our environment, that is, to think. It is unsurprising that for McLuhan, the shadow of the extension of man is amputation; we know that driving cars doesn’t make our legs any stronger. The Internet of things doubles these shadows. It doesn’t make our thinking any more resilient, and it functions all the better the more it knows about us.
The communication of things, history “books” of the future will tell us, was a triumph of artificial intelligence and human convenience over the remaining efforts at data privacy in the early twenty-first century. It made human life easier, and also easier to control because it subjects nearly every action to data analysis. A paradise for doctors, city planners, and traffic controllers. The resulting big data pool is not a symptom of a zealous secret service, but rather the legacy of the Enlightenment, which viewed every unmeasured hill as an insult to reason. Society’s digitization is the extension of this yardstick into the social.
Loss of data privacy is thus the product of a cultural and technological disposition, which in McLuhan’s perspective is identified as second-order technical determinism: “We shape our tools, and thereafter our tools shape us.” It’s an old story: machines take command, like the supercomputer HAL in Stanley Kubrick’s 2001: A Space Odyssey. A sort of “Sorcerer’s Apprentice 2.0,” except that unlike in Goethe’s ballad, the master also doesn’t know how to get rid of the spirits he’s summoned.
In his 1979 book, The Imperative of Responsibility: In Search of an Ethics for the Technological Age, the philosopher of technology Hans Jonas described the problem as powerlessness in the face of success: the catastrophe for humanity is the triumph of homo faber, the maker, who ultimately becomes “the compulsive executor of his capacity.” In summer 2013, “YES WE SCAN” was written on a poster protesting the NSA’s surveillance practices, which, together with the famous phrase that accompanied President Obama into office, characterizes the fatal relationship of modernity to technology: yes we can analyze big data better and better, and we do.
Have we already been defeated by these technical capacities—“programming” or “analyzing” in current terms—or can we still exercise the power that we have over algorithms before they force their postulates on us? The answer also depends on the consciousness of the problem that the weakening of data protection leaves behind, and where people suspect the enemy might lie: in the secret service and the state, in the Internet giants and programmers, or in ourselves, since (for example) accident prevention through algorithms is such a relief to us that we don’t just accept the necessary exposure, but also force it on the remaining “enemies of progress.”
Government ministries also recognized long ago that collecting data makes their work much easier. Though the 2016 “Law on Digitizing the Energy Transition” (Gesetz zur Digitalisierung der Energiewende) enacted by the German Bundestag doesn’t compel identification via presence technology, it does promote the installation of smart electricity meters. This will generate energy savings, so the justification goes, of at least 1.5 percent. Who would want to bring up data privacy—as the Green Party and Left Party did when they voted against the law—in opposition to ecological successes, or pedantically point out security risks, given that smart meters, as an interface to the “smart home,” could allow unwarranted access to it?
As one politician from the center-left Social Democratic Party put it, the old “uncommunicative” technology of electromechanical meters must finally be replaced with digital technology. A politician from the center-right Christian Democratic Union affirmed this longing for technological progress: “We can’t always keep holding ourselves back with data privacy.”1 Occurrences like these make it clear that loss of data privacy is neither science fiction nor something that democratically elected parliaments would rather leave to dictatorial systems.