Did Facebook really make Trump possible? Who belongs to his secret “Facebook army”? How much of the guilt falls on Zuckerberg? And what about Twitter? The media fell all over themselves to point the finger, turning every scrap into a story. They accused Facebook of spreading lies in Trump’s favor, providing a gathering place for his supporters, and making it possible for him to reach them. They feverishly tried to hold social media responsible for Trump’s victory, after having unscrupulously fawned over the forty-fifth president themselves, back when he was merely one ridiculed but scandalous (and therefore ratings-boosting) candidate among many.
Then everyone was talking about the filter bubble, which cocoons Facebook users in a like-minded circle. This bubble ultimately produced so many hateful comments and violent threats that the Office of the Munich Public Prosecutor opened preliminary investigation proceedings against Zuckerberg for allegedly aiding and abetting hate speech. In doing so, the prosecutors assumed that Facebook holds a certain influence over society, which it certainly does, but they accused the right person of the wrong thing.
You can hardly reproach Zuckerberg for fake news, or for giving the wrong people a place to gather. All the rash accusations, which extend to Trump publicity posted by Macedonian youths, nearly make a person want to come to Zuckerberg’s defense. And yet, the suspicion remains that Facebook paves the way for demagogues like Trump, simply by virtue of the fact that it is what it is. If public discourse were to open an investigation into Zuckerberg—and it absolutely should, at least as a thought experiment to better understand our media society—its charge against him would have to be inciting the people to stupidity rather than inciting them to hate speech. The prosecution’s chief witness would still be the filter bubble, but for different reasons than you might think.1
Facebook didn’t invent the filter bubble, of course. The human desire for cognitive consistency has been among the basic insights of psychology since the 1950s. Long before Facebook, we already knew that the control the Internet gives people over their communication processes doesn’t really do them any good. Facebook’s much-derided algorithms basically just improve on human impulse by means of technology—at least if you reduce the filter bubble to its content.2
The problem of the filter bubble is bigger than generally acknowledged because the bubble is bigger than you think. It contains all the little bubbles the media were so eager to talk about: right-wing and left-wing bubbles, Brexit and anarchist bubbles, bubbles for neoliberals and Marxists, Slavoj Žižekists, and probably even one for postmodernists. The filter bubble is as big as Facebook itself because it doesn’t exist on but rather as Facebook; Facebook is the bubble. In other words, the bubble is a framework of technical and social conditions that largely determine the communication that occurs in its area of influence. Quantity, dualism, and speed are the pillars of this framework.
Quantity is the currency of the popular, which rules Facebook. We judge the value of the people and posts we see there based on their number of friends, shares, and likes. The question isn’t which friends, or what the likes were for, but rather how many. The opportunity for written comments doesn’t help much, since (1) they usually only amount to a few words; (2) their number pales in comparison to single-click assessments; and (3) anyone who has ever posted a nuanced text on Facebook knows that such posts get very few likes. Numerical appraisal is the standard on Facebook, with dubious consequences for politics.
Reduced complexity and automatic judgment can only be avoided with language, because only those who express their own opinions in language ask themselves which words are best suited for the task. It’s no guarantee—comments generally devolve into absurdity and insult—but it is the precondition for forming a critical opinion beyond knee-jerk partisanship. “When they go low, we go high,” said Michelle Obama during the election campaign in reference to Trump: itself a powerful slogan, yet one that promises to rise above the level of slogans. A hopeless prospect, if quality is determined in numbers instead of words. When the highest number is what counts, the “low” always comes out on top.
Numeric populism is related to postfactual emotionalism: unjustified likes are the technical version of the mantra-like repetition of empty assertions. Just as a lie that is told often enough in real life becomes a truth for many, a Facebook post gains credence if it is given credence. People always click on the option with the highest number, reinforcing its top status. The number is an emotional plea, because so many can’t be wrong, especially if your best friends are among them.
The basic principle of the filter bubble is antagonistic: someone or something belongs or doesn’t belong. The opposition implies inside/outside or us/them thinking at every possible position on the political spectrum. This antagonism is reminiscent of binary code, which is the basis for the Internet and every computer. But can you blame the interface’s back-end 0/1 binary for the polarization trend on its front end? Of course you can, insofar as the computer’s operational logic aims to organize information into databases and to reduce opinions to the either-or of a like or dislike button.
The consequences of eliminating complexity are highly political, but the elimination itself is rehearsed in every interaction, even the most unpolitical, whenever we quickly like or dislike things: books, films, photos, recipes, makeup tips, Tinder dates, and newspaper articles. Whether we express it with a click on a like or dislike button, on a thumbs-up or thumbs-down icon, or by swiping left or right, the attitude is always to pursue one of two possibilities, wordlessly and without justification: yes or no, friend or enemy, true or false, in or out. Gradually, you forget how to count to three.
The theoretical foundation of the thinking machine may be “yes or no,” but that doesn’t necessarily make it as stupid as many of its users. What the binary model lacks in complexity the computer makes up for in speed, by reducing complicated logical operations to many single steps, each ultimately based on a binary formula. People are slower than computers, and since everything moves so quickly on the Internet, they have less and less time for complicated things. In the end, that keeps them more ensnared in the dualistic mode than binary computers are.
The central hallmark of click culture is instantaneousness. Nothing in Facebook’s news feed is as old as a post from this morning. Reactions must be just as instantaneous, which is why posts by good friends often receive a like before we’ve had time to look at them. And since new posts have already accumulated by evening, we hardly have time to process the previous ones. If you know the author, you confirm without scrutiny, developing a culture of partisanship and blind trust that doesn’t simply go away when the posts become political.
Speed is also an enemy of depth. When time is short, material that is supposed to garner likes must be short on depth. Since everyone likes to be liked, they filter complexity and seriousness out of their posts. And so the hunger for affirmation Facebook instilled in us produces a glut of intellectual fast food. Dumbing things down out of loneliness is the dialectical flipside of the connectivity Zuckerberg has brought into the world.
Zuckerberg defended himself against accusations in the wake of Trump’s victory with the claim that Facebook gave umpteen million people the opportunity to form a political opinion in the run-up to the election. This self-congratulation demonstrates how little the accused was troubled by a sense of guilt. In acting this way, he can appeal to those who still cling to the founding myth of the Internet as a tool for emancipation and democracy, and who associate Facebook with the Arab Spring, new communication opportunities for minorities, and various forms of activism-via-click.3
It’s true that Facebook offers a new gathering space where people can communicate uncensored by the “discourse police.” It’s also true that Facebook gives all people a voice, not just the “big guys,” as Zuckerberg likes to emphasize. But a closer look reveals that this doesn’t so much solve the problem as exacerbate it. The means that Facebook gives its users to form and express opinions dismantle the very psychological foundations on which the serious formation of opinion rests.
It isn’t just alternative political perspectives that fall by the wayside in Facebook’s communication bubble, but also the effort required for something as complicated as politics can be. Complex arguments are jettisoned in favor of simple slogans, text in favor of images, laborious attempts to understand the world and the self in favor of amusing banalities, deep engagement in favor of the click. Facebook’s crime doesn’t consist in tolerating fake news and hate speech, but rather in creating conditions for communication that make people susceptible to such posts.
Zuckerberg likes to emphasize that Facebook is not a media company but a tech company, allotting it just as little responsibility for the content shared as Apple would bear if a murder were planned via iPhone. Critics are right to retort that Facebook has become the only newspaper that many people read, which demands a corresponding level of responsibility for the content presented there. This responsibility entails not treating information impartially as a technology company would, but rather separating news of parliament from that of our best friends, and prioritizing the former over the latter. But the critique has to go further than that and cut deeper. Facebook isn’t just an information provider; it is an institution of socialization that handles information in a way that makes it easier for demagogues like Trump to win—as well as for those who will imitate him.
The investigation into Zuckerberg would have to summon more witnesses against Facebook than the filter bubble, and it would have to accuse it of more than quantity, dualism, and speed. Disinformation begins with the range of information in the Facebook bubble, which replaces the political with the private, and replaces intelligence and education with banality and spectacle. Zuckerberg does all he can to expedite this process when he deprives traditional media of subscribers and advertising clients, backing them into a corner economically. He has pushed them so far by now that they grit their teeth and seek refuge on Facebook, courting an audience with Instant Articles like a voice in the wilderness.
But that isn’t all. He can be charged with more counts of inciting the people to stupidity. Day after day, click after click, Facebook squanders the mental resources of an educated public: patience, skepticism, concentration, interest, a certain respect for experts, and the willingness to work hard enough to understand them. All of this will be discussed below.
First of all, the question of liability must be resolved, because it is doubtful that one man and one company have the power to change the world’s cultures. The defense will argue, in Zuckerberg’s own words, that Facebook is just picking up on trends. The prosecution will be correct to retort that Facebook reinforces or initiates these trends for base motives, sacrificing deep reading for copious clicks simply owing to its business model. The defense will counter the accusation of personal gain by pointing to the Chan Zuckerberg Initiative, which the prosecution … and so on.
It all blew over in a few weeks. Not fear of Trump, but the allegations against Zuckerberg. The topic was exhausted, according to the press, while media scholars hoped it would finally be explored in depth. Other questions were now causing a stir: Trump’s victory as a resounding defeat for political consultants and polling agencies, the significant number of Latinos who voted for Trump, Trump’s transition team, his son-in-law. Facebook only made it past the media’s news-value filter again when massive ad purchases by Russian agents were detected.
Nevertheless, the anxious probing of Facebook’s societal influence and the clueless superficiality of the criticism made the need for discussion apparent. Now everything depended on keeping the question alive beyond the daily business of the old scandal-oriented media and on bringing it to all corners of society. The criticism must go deeper and gain a broader foothold. It must be developed at universities, it must reach politicians, it must go into the schools, it must lead to media literacy that doesn’t just ask how we can effectively use media, but also how new media change the conditions of our existence. What social, cultural, and political consequences do they have? Is that what we want? Can we stop it?
In his first statement on the election results, Zuckerberg explained (under the notable headline “Feeling Hopeful”) that his project was larger than any presidency and that progress doesn’t occur along a straight line. His mission is to bring people closer together, as he often points out. Facebook isn’t supposed to reconcile only Americans; it should unite the world. However, the considerations presented above suggest that the conditions for communication on Facebook tear the world apart because people are not prepared to handle opposing political views in a constructive way, or to view their own skeptically. Why is Zuckerberg so hopeful? What does he have in mind? Could it be that we’ve completely misunderstood him up to now? Don’t we have to think of the filter bubble in much bigger terms than those used here? Does Zuckerberg want to solve the problem of political aggression by driving the political out of communication? Does he want Facebook to be the enormous party of banalities, free of all explosive political material? To save the world through a culture of the banal—stupidity as remedy? It’s an absurd thought, which we’ll save for the end of the book.4