AFTERWORD

“Never before has an age been so informed about itself, if being informed means having an image of objects that resembles them in a photographic sense.” Kracauer immediately contradicts this praise of photography when, a few lines later, he characterizes it as a “strike against cognition,” because reproducing reality mechanically makes it superfluous to grasp it consciously. Kracauer’s commentary on photography prompted our discussion of the cognitive achievement of social networks and the informativeness of Facebook society. What we found was that the photographic form of self-representation delegates individual experiences to the social network and suppresses narrative forms of perception. This conclusion may seem less threatening in light of the critique levied against the narrative mode and given the corresponding advantages of forgetting. In the context of this critique, the episodic, phatic model of communication on Facebook was advanced as practicing a model of community that transcends divisive narratives and identity constructions. This unorthodox perspective, then, took yet another turn with the call for “weak thinking”: a habitual tolerance that results from “working through” conflict, as opposed to the unreliable indifference of the phatic model of communication, which avoids conflict altogether. “Weak thinking” responds to discredited narratives not with opposing stories but with a “story” that is opposed to stories. It is the alternative, less popular reaction to the vacuum left by the loss of traditionally meaning-creating stories: an alternative to self-satisfied communication on Facebook, which enjoys the popularity it does because it is experienced as liberation from communication that is weighed down by meaning. This liberation is the site of the “go-for-broke” gamble of history, which Kracauer, ninety years ago, wrote about in relation to photography.

From his description of the increasing popularity of photography as a loss of society’s knowledge of itself, Kracauer derived a surprising prognosis. Photography, he said, makes society fall silent because in it the material itself, bypassing its meanings, speaks as a “barren self-presentation of spatial and temporal elements.” In doing so, photography, he claimed, frees consciousness from the narrative orders given to things by human beings and brings it into direct contact with nature. This liberation makes it possible to reframe meaning, in order to “awaken an inkling of the right order of the inventory of nature,” presuming that “a society that has succumbed to mute nature” does not persist. The risk, then, lies in muteness persisting after all, which, for Kracauer, would mean the “eradication” of consciousness: “The turn to photography is the go-for-broke gamble of history.”1

A medium as a game of chance? Is this metaphor anything more than slyly formulated cultural pessimism? Does it anticipate Walter Benjamin’s “positive concept of barbarism,” with which, five years later, he would greet the gambling away of the “human heritage” for the “small change of the ‘contemporary’ ” as “making a new start”?2 Is history, today, once again betting everything on a single card—with objective forms of self- and world-representation that elevate the “foundation of nature devoid of meaning” characteristic of photographic documentation—going all in on the operating mode of Facebook society?

The barbarism of the new lies in gambling away narrative, reflective consciousness, which is increasingly being suppressed by numerical, visual, and automated forms of communication and types of information processing. Central factors in this development are the methods of quantification found in self-tracking and Big Data mining, along with visualization technologies like Snapchat and future VR/AR technologies like Facebook’s Oculus. The future is “frictionless sharing,” without reflection and ultimately also without control by the sharers. This is precisely what, to some, promises the end of the distortion produced by the subjective bias of narrators and the compulsory coherence imposed by their narratives. For these observers, the outcome promises access to a “mystery of being” that transcends narrative concepts of explanation; for them, the episodic model of identity is welcome as the end of exclusion and heteronomy through collective narration.

The exclusion of the I from first-person narrative is the paradoxical equivalent of the self-presentation of the material in photography. With the new technologies, the self-presentation of the I occurs without its conscious participation. This posthuman self-abnegation of humanity is comparable to the nature that, as Kracauer has it, sits down at the table consciousness has just vacated. If it were to become the new head of household, then, in Kracauer’s analogy, history would have lost everything. For the outsourcing of narrative to alien authorities also means the abandonment of the practice of reflection—a loss that should not be mistaken for a technical translation of the complex and hard-to-convey concept of “weak thinking” but that instead destroys any foundation for it. Could our hope lie in the return of the old, to resume its seat at the table alongside the new?

The linking of the numerical and the narrative, of algorithmic analysis and hermeneutical techniques, is the contemporary topic in the realm that falls most essentially to hermeneutics and narration: the humanities. The catchword for a humanities that would be dedicated to algorithmic methods of analysis is “digital humanities”; the anxiety-inducing words are “distant reading” and “the end of theory.” A hint of reconciliation is bruited about in the concepts “algorithmic criticism” and “ecology of collaborating.”3 There are, somewhat simplified, three camps: those who hold fast to the process of interpretation as the most integral and essential method of the humanities, those who want to produce authoritative knowledge by means of quantifying data analysis, and those who expect data mining to produce new approaches to the business of interpretation. The third camp also sees the new means of data collection as a source of challenges and opportunities for our understanding of rationality, consciousness, and self-experience. This group remains committed to narration but does not exclude counting from recounting. It allows itself to be inspired, but not corrupted, by the new. Finally, it defends the position of “weak thinking” and “nihilist hermeneutics” against the “spatial appearance” and “barren self-promotion of ‘facts’ ” and tries, when it comes to social networks, to combine the insights of posthuman self-representation with aspects of the cognitive activity of conscious self-description.

The digitalization of the humanities is itself part of a “transformation of the human” that adds a wholly new dimension to such general humanities themes as reason, consciousness, and self-understanding. Two of this transformation’s catchwords were already mentioned in the context of Facebook’s posthuman, algorithmic autobiography: “nonconscious cognition” and “distributed cognition environments.”4 The outsourcing of narration and recollection and the shift from narrative to numbers on social networks and in the humanities are phenomena and building blocks of a development that originated long before Facebook and that points far beyond the concept of Facebook society. Material things, which today—in both nature and society—are beginning to present themselves in a way that bypasses people, no longer consist only of the objects in a photograph but also of the data in the feedback loops of cybernetics. The “barren self-presentation of the spatial and temporal elements” that Kracauer saw in photography is now happening in the logic and from the perspective of cybernetics, which is a logic of computation and decision making, of analysis and control, of conditionality and lack of discussion—it is the paradigm of logocentrism in the form of the numerical. What is fundamentally being gambled with in the twenty-first century is mathematics.

This new risk has a long prehistory. Half a century ago, framed as “technocratic rationality,” it was a popular target of critique in the humanities and already formed the basic theme of Dialectic of Enlightenment (1944), in which Theodor W. Adorno and Max Horkheimer discussed the transformation of rationality from a means of human emancipation into a means of its reification. If we translate this sociological observation from the past into the technological determination of the present, which gushes enthusiastically about cybernetic recursivity and “deep learning” algorithms, reification amounts to the control of humankind by the artificial intelligence that humans themselves have created. This disempowerment was already illustrated by Stanley Kubrick’s 1968 film 2001: A Space Odyssey, in which a computer locks man out in space, and more recently in Alex Garland’s film Ex Machina (2015), in which the robot locks the human being in a room. The new Turing test consists not in the computer convincing us that it is a person but in its convincing us that, as a machine, it is nevertheless acting like a human. If we believe it—at least this is the upshot in Ex Machina—we have lost and are lost.

Benjamin’s one-time praise of barbarians is expressed, in the current constellation, as enthusiasm for the “technical intensification of complexity” or “disenchantment of the Anthropocene’s control fantasies” and as critique of any “negativistic media theory” that complains about the “cybernetization of the means of existence” as being equivalent to human heteronomy and the mathematical reduction of humanity. What is astonishing about this position, which does not trouble itself about the shift in control among humans (catchwords: cybercrime, hacked control systems), is not so much the joy elicited by this “fourth insult to humanity, following Copernicus, Darwin, and Freud” as its timing. Humanity’s exceptional position is called into question at the moment of its greatest triumph, when it has advanced the capacity for thought given to it by nature to such a degree that it can now pass this capacity on. This passing on, the delegation of the tasks of control to the environment and the application of artificial intelligence, can only be understood as a loss of power and an insult to humanity if one suspects that operational accidents—the shutting out or locking up of humanity—are the rule, a position that reveals the individual who thinks this way to be either a cultural pessimist or, if the accompanying feelings of joy are to be taken seriously, a cynic.5

The promise inherent in the gamble becomes clearer if “environmental cybernetics” is seen as an “epistemological and ontological correction” not of humanity’s predominance but of the human dilemma: the dilemma of being entangled in narratives, in “perspectives” or “views” “in view of which we have disfigured humans [les hommes] and driven them to despair,” in Nancy’s formulation. Can cybernetics, if it actually replaces politics with technology and does not just dress politics up technologically, correct for this entanglement, since its mode of operation, which does not recount but only counts, can’t be influenced by perspectives and views? Can we imagine that in the so-called state of nature of cybernetic control circuits, autonomous artificial intelligences will neither include nor exclude humans but merely act as partners and educators that help humans to be “nothing but earth and human,” to borrow Nancy’s description of the alternative? If we, then, start by thinking the technological determinant together with Nancy’s concept of community, does the historical point of cybernetic transhumanism consist in humanizing the human by technological means?6

The twofold outcome of the gamble is the topic of differing philosophies of technology. Heidegger’s enframing (Ge-stell) is a scaffolding that offers support but also limits movement; Stiegler’s Pharmakon, depending on how it is used and on its dosage, is either poison or medicine; even cybernetics has a dual value, as “left” or “right” cybernetics, depending on whether it is viewed as static and system preserving or as creative, learning-friendly autopoiesis.7 The future will show how the new go-for-broke gamble of history can be won and how the “cybernetic state of nature” that is evoked (its structure, mode of operation, information sources, knowledge criteria, levels of complexity, and rules of recursion) can be thought concretely. In any case, as this book has discussed, a life that is surrounded and besieged by numbers (as we might tendentiously describe “ubiquitous computing”) is already the permanent object of archiving and surveying. Already today, algorithms are filtering a closed system of knowledge out of life. Perhaps at some point, using AR/VR technologies, they will be able to avoid culturally determined conflicts by cleverly interposing individual parallel worlds. Artificial intelligence, in popular forms like Siri, Alexa, or Jibo, is already part of our activities and will only become more and more active thanks to the input of social media.

If we turn from speculation about future technological constellations to an analysis of contemporary media interactions, the result of our discussion of Facebook and Facebook society can be summarized as follows:

First, permanent talk about oneself on social media is flight from the events occurring in a person’s life; we are exhibitionistic not because we are narcissists but because we cannot bear ourselves and the present. Sharing on Facebook should be understood as a stopgap; it gives us a decent option for delegating our own experiences to others. Second, self-representation on Facebook happens less in a way that is narratively reflective than as a spontaneously episodic and documentary event. The outcome is a quasi-automatic autobiography whose central narrative authority is the network with its algorithms. The self-image that is presented by Facebook is pointillist, postmodern, and posthuman. Finally, information management on Facebook and on the internet suppresses collective memory. With its lack of narrative points of reference in the framework of phatic communication, it creates a quasi-cosmopolitan community that transcends cultural values and national barriers. However, at the same time, the avoidance of discursive interaction prevents the development of skeptical, metareflexive thought as long-term security against new forms of assertive dogmatism.

Let me add one more thing. Even a description of society from the perspective of cultural studies can hardly avoid being drawn in by the perspective of its speakers and the force of coherence that their narrative exerts. No analysis of the present can escape the analyst’s past. Thus, in the end, I may appear more critical than I wished to be when I began. But at the latest, when ten thousand kilometers away my students enthusiastically report (grinning, it is true, but ultimately without any awareness of guilt) how much time they have spent on Weibo with their “idols and celebrities” and in sharing bits of knowledge like “How you can freeze a can of Coke very fast,” I see clearly that outside my own culture of thought there is an entirely different relationship to the world and its media—an agreement in principle, a childlike enthusiasm even, unburdened by social-political ambitions and by left melancholy. Then, at the latest, I cannot avoid the sense that everything can be viewed entirely differently. Then, at the latest, it is high time to give the word to the voice that opened this analysis:

The food arrived and, as always, this was the most exciting moment. Everyone grabbed their camera; together they rearranged the plates for the perfect photo. “Seung gei sik see” is the saying in Hong Kong: The camera eats first. The camera is the modern saying of grace, the sharing of “bread” in its symbolic representation. For now, the first thing that happens is the passing around of the result, the exchanging of the best images, their sharing on Weibo, Facebook, Instagram … Then the eating can begin. The last morsels haven’t yet been eaten, and already the first responses are coming in. This, too, is cause for all kinds of conversation. Everyone knows the same friends, who now send their “foodies” from the places where they are eating, alone or together. A complex dialogue emerges among the images and texts, full of revelations and inside jokes. The foodie is not proof of loneliness in a crowd, as cultural pessimists like to think, but a vehicle for communicative action that is full of fun and deeper meaning among different groups, in different places, via mobile media and social networks. This communication accompanies every bite and later continues for days, in commentaries on Facebook or Weibo. It is engaged and generous, for it includes pizza as much as oysters. The foodie demonstrates food not as an object but as an action. It creates community through its link to the most essential things, to that which joins all human beings without regard for their political position and cultural values. What the mass media manage to do with murder cases and reports of catastrophes, the social networks achieve with everyday banalities. The only blood that flows here comes from the steak.