Photography grasps what is given as a spatial (or temporal) continuum; memory-images retain what is given only insofar as it has significance.
—Siegfried Kracauer, “Photography,” 1927
Dear diary, it’s only now that I get around to writing to you again because all week I was reading a book that was so fascinating I didn’t find time to write anything down. It was Hermann Hesse’s Steppenwolf, which Christian recommended. Christian said if I want to know more about breaking free of boring, petit-bourgeois life I should read this book. He was right! The book is full of sentences I’d like to hurl at so many people! For example this one: “In reality, however, no I, even the most naïve one, is a unity, but instead it is an extremely diverse world, a little heaven full of stars, a chaos of forms, levels and states, of inheritances and possibilities.” I think this is exactly the way it is, because …
Except for the salutation, everything in this diary excerpt is authentic. It was Christian who recommended the book twenty or thirty or forty years ago, and reading it actually did keep the author from writing in his diary for a whole week, resulting in an entry that was that much longer—ten pages or more, enough to keep him busy for a whole evening. There are some crossed-out passages and a lot of exclamation points, and afterward the writer was able to quote the mentioned passages repeatedly over the years, if not always word for word. Those were the days, when along with the things people had experienced they also wrote down their feelings and ideas! When things that had happened were endowed with meaning at the special moment when they were being written down—thoughts that came to completion in the act of writing. Insights that helped a person grow. Does anyone, anymore, on a Sunday evening, try to reflect on the week that has just passed? Are there still people who keep diaries?
If you are conscious of the fact that diaries, unlike poetry albums, were never a mainstream phenomenon, and if you think you can count words as you do numbers, you might conclude that the twenty-first century is the dawn of a new era of self-knowledge, for never has there been so much writing as there is now.1 But if you judge the situation less quantitatively, you will also want to know what kinds of words are being uttered, in what circumstances, and what form they take. In principle, it is clear that when you are writing on social media like Facebook you express things differently than you would in a diary and that a hundred status updates don’t add up to a deeply felt and thoughtful comment. The more the communication form formerly expressed in the diary is occupied (that is, displaced or redefined) by social networks, the less we can look forward to the kind of self-reflection the diary can offer. When autobiographical writing is hailed as the “master form of the 21st century,”2 we should therefore ask how self-representation and self-knowledge are related to each other in the context of the technical and social frameworks that now shape our social communication.
A first answer to this question is that social networks are not only used for conscious self-branding; they also prompt their users to engage in unconscious self-revelation. The self-revelation occurs more implicitly than explicitly, depending on the network—naturally, it is more carefully designed on a network for business contacts like LinkedIn than on Facebook. Showing replaces saying (or writing), as photos bear witness to more or less spontaneously experienced moments, and likes more or less spontaneously express personal preferences.3 Even textual updates and comments, when they happen as part of spontaneous communication, are no guarantee that the act of making something visible to others also promotes self-knowledge. Indeed, precisely those comments that are less controlled can be harbingers of potential self-knowledge, because they lack any subjective communicative intent and are therefore free of the rules unconsciously imposed on the construction of the self. Users are free, after some time has passed, to sift through the photos, likes, and comments on their pages in search of themselves. The question, however, is: How many people take the time for this self-encounter, and does it go beyond mere wall cleaning and the removal of old postings that have become embarrassing?
Along with spontaneous showing, there is also an automatic showing, which takes place when the mechanism that inserts specific actions by internet users—visits, likes, shares, or comments—directly into their timeline is activated.4 When this happens, individuals no longer describe themselves more or less implicitly, through their actions, but instead it is the actions that describe the individuals. The subject’s “internal automatism” is replaced by the external automatism of the system the subject has become a part of. A popular example of this automatism is self-tracking, which promises self-knowledge through numbers instead of words. Here, the self’s body produces data that is independent of the person’s own consciousness (even though she initially agreed to it) and that makes statements about the person independently of her self-understanding and the values that have been internalized in her self-construct. This is generally true even when the data must be entered by hand into the relevant app. As long as it is not a matter of describing one’s own mood but of recording factual indicators (physical movement, foodstuffs consumed, sleep behaviors), there is little incentive for reflection.
Facebook’s interest in acquiring as much data as possible for statistical computations and its creation of individual profiles in the interest of effective advertising are well known, and while occasional revelations about the extent of secret data mining may be shocking, they don’t come as a surprise.5 The fact that self-representation on social networks like Facebook produces economically and socially exploitable knowledge about the social body is not what is decisive here. More important is the question of what effects the context of this self-representation has on the subject’s own self-perception and self-knowledge. Here, it is important also to take account of the downright avant-garde aesthetics that this context creates: the paradoxical phenomenon of a simultaneously actionist and postactive, automatized autobiography, one more lived than narrated by its subject and “author.”
From a media-theoretical perspective, social media, as a new genre of autobiography, mean a shift from writing to photography. This is not because there are more images than text on Facebook but because the reporting takes place in the mode of mechanical reproduction typical of photography. Activities on the internet are reproduced on Facebook exactly as they occur. From a historiographical perspective, storytelling on Facebook marks a return to the rule of numbers. This suggests that we should turn our attention to a source of history writing for which the simple fact of events was more important than their narrative coherence.
PHOTOGRAPHIC TEXTS
In the beginning was the number. This characterization of the history of historiography would be permissible if we skip over Homer, Herodotus, and Tacitus and start with the medieval annals, which order history not by events but by years. “Hard winter. Duke Gottfried died” is an example from the Annals of St. Gall, part of the Monumenta Germaniae Historica for the year 709. Two events without any context or narrative setting. And when it states, for the year 710, “Hard year and deficient in crops,” there is no explanation of the causes of the poor harvest.6 What mattered were not events and the relations among them—not at all. Famine and war were not the actual event but merely witnesses to the passage of time in “the fulnes [sic] of the ‘years of the Lord.’ ” The passage of time was news enough, for it enacted the connection between the beginning (the birth of Christ) and the end (the Last Judgment). Time was thought cosmologically; the years, in their numerological uniqueness, were its phenomenological element. And because time came from God, the latter was the real hero of the events. This, not war and hunger, was what history was meant to proclaim. Thus it is quite natural that the Annals of St. Gall also include the years in which nothing happened and finally conclude with an abbreviated listing: “1057. 1058. 1059. 1060. 1061. 1062. 1063. 1064. 1065. 1066. 1067. 1068. 1069. 1070. 1071. 1072.”
From ascribing value to the dates of the years themselves, it was a long way to writing history without naming any years at all. In between came the chronicles, as a method of recording events that were both contemporary and part of a history—of a city, a royal family, a business, an individual. Narrative history, by contrast, offers a finalized, retrospectively recounted and coherent assemblage of events that, as a kind of “historical law of the conservation of energy,” are anchored in a solid network of causes and effects. During the eighteenth century, the accumulation of facts without perspective became the object of critique, and the history of isolated events gave way to the universal history of coherently connected happenings, a history that, in the end, was quite logically conceived as a “history without years and names.”7
Historiography became narrative at the very time—the end of the eighteenth century—when it was proclaiming itself a science by projecting the narrative forward, from the level of reception, where the connection between a hard winter and the failure of the harvest was always potentially available, to the level of production. Chains of causality were now part of the business of historiography. To offer “only what actually happened,” the era was convinced, “would be to sacrifice the actual inner truth, well-founded within the causal nexus, for an outward, literal, and seeming truth.”8 Once “inner” truth became the crux, as a way to view the past from the perspective of the present, a paradoxical equation emerged: “The historian is one who prevents history from being merely history.”9 This shift in the methodology of writing history raises the question of how dense it is possible for description to be without it becoming narrative and how porous it has to be to avoid the risk of unraveling the chosen narrative. The German historian Johann Christoph Gatterer gave an astonishingly honest answer to this conundrum in his programmatic text from the year 1767, Vom historischen Plan und der darauf sich gründenden Zusammenfügung der Erzählungen (On the historical plan and the connection of the narratives based on it): “Events that do not belong to the system are now, for the writer of history, so to speak, non-events.”10
The disappearance of actual events from the spirit of a narrative, along with a preference for “real” truth in lieu of the “literal” kind, are themes that would return a century later, at a time when both painting and literature were being defended against the new medium of photography. In the mid–nineteenth century, literary realism was judged to have “daguerreotypical similarity” to the representation of reality; people accused it of “idolatry of the raw material.” Of course, literature can never be as objectivist as photography, and of course photography is less a comment-free representation of reality than the representation of a personal relationship to reality, as expressed in the choice of motif and the moment when it is captured, as well as the perspective used and focal length employed, not to mention the choice of camera and type of film. Nonetheless, painting and photography represent two essentially different methods of producing an image. The distinction is therefore justified and ultimately applies to literature as well: Painters (writers) must decide how to represent the object that, at times, exists only before their inner eye. In other words, they follow their own understanding of the “truth” of a thing. In photo-graphy, on the other hand—this is the origin of the name—the object writes itself into the photo with light, a situation that inspired the photography pioneer William Henry Fox Talbot to speak of the “pencil of nature” and led the philosopher of semiotics Charles Sanders Peirce to define an indexical type of sign, arguing that a photo is the direct, physical consequence of whatever was in front of the lens, just as smoke is a consequence of fire or a footprint the result of a step.11
This physical relationship between sign and signified is also, for the most part, characteristic of Facebook on a grand scale. The written comments that are automatically presented in the news feed and activity log, referring to articles recommended, videos viewed, and music listened to, are indexical from the perspective of semiotics because they are the direct result of the action represented by the sign, not a retrospective description or, at least, an announcement that the action has occurred. They are the “smoke signals” of our online existence. This indexical relationship also basically applies to the texts we compose ourselves, the status updates and comments that appear in our own Facebook Timeline exactly as they were expressed on the homepage. Here, there is no corresponding later entry, as there was in the case of the traditional diary, where it says: “Talked to Christian today about the book and told him that …” Now, what we say is already stored at the instant when it is expressed. Facebook’s Timeline is a “photography” of events, including communicative acts. In the programmed feedback mechanism of the social network, the event reports itself in real time: it is the report.
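To make this mechanism concrete, consider a minimal sketch in Python (all names are invented; this illustrates the principle of indexical, automatic reporting, not Facebook’s actual code). The timeline entry is produced as a side effect of the act itself, so that report and event coincide:

```python
import time

timeline = []  # stands in for the activity log at the back end

def self_reporting(verb):
    """Hypothetical decorator: performing the action *is* the report.
    The timeline entry is a direct side effect of the act, not a
    retrospective description of it."""
    def wrap(action):
        def wrapper(user, obj):
            result = action(user, obj)
            # The entry writes itself at the instant of the event.
            timeline.append({"user": user, "verb": verb,
                             "object": obj, "at": time.time()})
            return result
        return wrapper
    return wrap

@self_reporting("listened to")
def play_song(user, song):
    return f"{user} is playing {song}"

play_song("Anna", "some song")
print(timeline)  # [{'user': 'Anna', 'verb': 'listened to', ...}]
```

No retrospective entry is ever composed; whatever “narration” there is has already happened by the time the action returns.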
The technical shift from conscious description to automatic recording means a return from the “real” truth to a “literal” one, in the service of even those events that, as Gatterer noted, do not belong to the system. The indexical nature of the entries prevents the past from being made present again in ways that are emphatic, related to the present, or guided by theory. In 1927, the German essayist and media theorist avant la lettre Siegfried Kracauer formulated this specific quality of photography as a loss of meaning: “Photography grasps what is given as a spatial (or temporal) continuum; memory-images retain what is given only insofar as it has significance.”12 The French philosopher Jean Baudrillard would later dramatize Kracauer’s declaration of loss by invoking a conflict between the object, as given, and the perceiving subject: “Against the philosophy of the subject, of the gaze, of distance to the world in the interest of comprehending it better, stands the anti-philosophy of the object, the decoupling of objects from each other, the philosophy of the aleatory sequence of partial objects and details.”13 Photography allows objects to prevail, in their discontinuity and momentary quality, independent of the subject’s gaze and in principle also in opposition to his perspective and interest. As described by Baudrillard, photography, inasmuch as it eliminates the distance that is necessary to understand the world better, forestalls the contemporaneousness that Agamben invokes.
Facebook—and this is even more true of an app like Snapchat—makes Baudrillard’s finding paradoxically more acute. Now, thanks to the sequence of fragmentary experiences and expressions of the Facebook user, all of which are registered as disconnected moments, each a here and now cut off from what preceded it, the “philosophy of the subject” is confronted by the “antiphilosophy” of the very same subject. The distanced gaze is lacking, since Facebook, and Snapchat to an even greater extent, permit no distinction between the subject who experiences and the subject who reports. Distance, now, is still possible only in the receptive mode, which corresponds nicely to Zuckerberg’s ideal of “frictionless sharing”—of everything, all the time, and above all automatically. The immediacy and nonsubjective character of the report reinforces the moral imperative of authenticity and radical transparency that defines the Facebook community’s sharing ethos. Authenticity—in an about-face from earlier historiographic and media-theoretical positions—is viewed as consisting in the automatism of documentation, which is used as a weapon against the “distortion” of retrospective reporting.
The consequence of this departure is a fundamental change in the philosophy of information, from information as an affirmative act to information as the consequence of a quasi-unconscious sharing automatism. The song you hear on Spotify, the film you watch on Netflix, and the article you read online are not communicated to your Facebook friends because you liked them but because you heard, saw, or read them. The communication loses its subjective stamp and hence its value as something that, from the perspective of the sender, is worthy of being communicated. But precisely for this reason, from the perspective of the database behind all the Facebook pages, there is a gain in reliability. As the ex post facto weighing of experience in diary mode is replaced by “insular” reports in real-time mode, even the strategic temporal placement of an update becomes subject to the law of attention economy. The subjective description of the events gives way to their mechanical reproduction.
NARRATIVE IDENTITY
Just as, for Kracauer, it was not the photograph but the remembered image that constituted a person’s “actual history,” so, in the theory of the French philosopher of narrativity Paul Ricœur, time is only human time to the extent that it is “narratively articulated.”14 Humans—this is the basic idea in both cases—must weave the events of their lives into a coherent, meaningful web of relations in order not to feel homeless in them. From this perspective, neither the isolated representation of an event nor the chronological accumulation of many events makes sufficient sense. Moreover, for Kracauer, Ricœur, and likeminded thinkers, the primary addressee of an autobiographical narrative is the narrator herself, not (only) when she later reads what she has written but (above all) in the here and now of the writing process itself. The identity value of the autobiographical act lies less in its documentary than in its performative effect. As the I speaks about itself, it creates itself—this is the core conviction of narrative psychology: “our experience of human affairs comes to take the form of the narratives we use, for we use them not only to tell, but also, and first of all, to form them.”15
It follows that the evolution of Descartes’s formula from “cogito ergo sum” to “I narrate, therefore I am” cannot be reduced to the subject’s construction of content. The second meaning, which is ultimately central here, has to do with the practice of linguistic and analytical competencies. On the syntagmatic level, narration requires the storyteller to work through contingencies and to create plausibility, synthesis, and conclusion. On the paradigmatic level, it requires formal, aesthetic considerations in the choice of words. The questions “Why?” or “What for?” and the ordering of parts into a before and an after strip events of their episodic nature, in which they rise up out of the past without explanation and disappear into a future without consequences. The questions and the ordering of the parts compose time and give the subject, as both narrator and reader, orientation. They are the source of the plausibility that must be narratively achieved through the use of causal connectors and intentional markers—a plausibility that on occasion can also synthesize apparently heterogeneous elements. Awareness of the storyteller’s unique perspective is also sharpened formally, as she wrestles with convention in her struggle to find the right linguistic expression. More simply stated, you come to understand yourself in the process of grasping what occurred and why.
The surplus value of “I narrate, therefore I am,” in comparison to recently popular self-representation formulas like “I post, therefore I am” or “I share, therefore I am,” lies in the cognitive activity that is involved. On Facebook and other social media, cognitive competencies, including analysis, synthesis, and formulation, are required only in specific situations and in a rudimentary way. They remain entirely unused when the report is automatic: “Roberto has shared a link … Michael, Antje, Eric, and 20 others like this.” From the perspective of narrative psychology, such automatism leads to an utterly absurd variation on Descartes’s formula: “It posts, therefore I am.” The model of an insular and partly mechanical self-representation leaves behind a void that—this is the point here—is no less problematic than the economic and political exploitation of the accumulated data. The individual appears, above all, as the object of her history—and scarcely at all as the subject of its narration.
This constellation reminds us of the diagnosis of the late-modern individual’s relationship to the self, according to which our era creates “an intensified compulsion to focus on the self and an urgently felt need for articulation, accompanied by a poverty of expression,” and favors the “ ‘de-temporalization’ of life in favor of situational practices in regard to time and the self.”16 The preference for episodic over narrative self-perception that is suggested here will concern us in what follows. First, though, we need to ask to what extent social networks, as a means of identity management under conditions of accelerated relations to the world and the self, are the technological expression of this very contradiction between the compulsion to focus on the self and the failure of articulation. To what extent do social media make it possible, in a tradition-starved, radically flexible world, to decouple self-reflection, with its attendant gains in meaning, from self-representation? Or, to put it a different way: Does Facebook, as a technology for the permanent archiving of situations, cause the present to disappear by demanding and permitting no distance from it? The related counterquestion is: Does a narrative self-understanding that sees events (only) from the perspective of a given “system” really create sufficient distance from the present to be close to it in Agamben’s sense of “true shared contemporaneousness,” or does it, rather, fail to perceive the present in its actuality precisely because of its pregiven narrative perspective?
Before trying to answer this question, it is necessary to interrogate the claim that there is a discrepancy between self-representation and self-reflection. Wasn’t it Facebook that first made self-description a central factor in our lives, even beyond the 5 or 10 percent of individuals who kept a diary anyway, especially when they were young? Don’t millions of people experience themselves more intensely since Facebook appeared, starting with the choice of profile picture and welcome statement and responses to the list of questions and repeated with every status update and comment on the updates of other people? Doesn’t the chronological sequence of events on a person’s Facebook page already produce a certain narrative unity that possibly also sharpens her awareness of the way it all coheres? Doesn’t Facebook itself, with its automatic “Say Thanks” collages and “Year in Review,” with its monthly sequence of the most popular images and associated texts, teach our pictures themselves to tell a story? Shouldn’t we be talking about a “narrative turn” instead of about the end of narrative?
The concept “narrative turn” points to the rediscovered power of stories before Facebook and irrespective of Facebook’s role. It has been used in the business world for company identity and product marketing; in politics, as an effective means of earning voters’ loyalty; in sociology, as a methodological defense against empirical social research; in medicine, as an alternative/traditional method of diagnosis; and in television, which hadn’t dared allow itself this much focus on narrative for a long time. Even in literature, stories are once again being told.17 The question that remains is how narratives in politics, economics, medicine, and research relate to the process of emerging (self-)consciousness in the sense of narrative psychology. Is there, along with the narratives of the specialists and the narrative-identity creation of businesses, also a return of narration in the concrete behavior of individuals toward one another and themselves?
All of this suggests—indeed, this is what is being argued here—that the popularity and ease of information exchange on digital media are displacing traditional forms of communication that have a strongly narrative character: the reflective diary, the letter reporting on events, the story that unfolds while looking at a photo album or watching an evening slide show. The thesis is that both the social and the technical dispositif of social and mobile media have a deleterious effect on storytelling. The smartphone’s keyboard does not invite lengthy writing, and the logic of attention economy does not permit long reports on the social network.
In other areas besides social media, we also find phenomena that work against the trend to rely on narratives. Some of these are once again a consequence of the new media. In participative journalism (tweets, smartphone videos, live blogs), for example, when the “truth of ordinary witnessing” replaces the voice of the reporter, it is undoubtedly a move away from the narrative style of the New Journalism. News is then no longer what actually happened (and, above all, why); it is how participants experience it. Narrative work is replaced by “truth beyond doubt,” for no matter how inexpertly witnesses may report, the moments are always authentic—after all, the eyewitnesses are not actually reporters; they are the report! In media-ontological terms, what is occurring here is fundamentally a reorientation from the subjective mode of textuality to the objective mode of photography. At the same time, these changes can also be understood as a shift from the narrative principle to the database principle: The narration of a reflective reporter is replaced by disconnected mininarratives—a timeline of events unaccompanied by interpretation.18
The newsfeed principle of “always-on” journalism brings us back to Facebook, where the livestream also replaces comprehensive narratives. The shunting aside of narrative journalism relativizes talk of a “narrative turn.” Further research is needed to determine whether narratives, today, play a greater or lesser role in individual lives.19 We need to ask whether the explosion of communication via mobile media and social networks fosters or discourages the narrative turn in everyday situations. We need to ask whether the social and technological dispositif of these technologies and forms of communication fosters a culture of conversation in which listening and responsive questioning occur. Facebook, which for well-known reasons prefers information that is suitable for the database, must also be interrogated. We need to explore how the various forms of self-representation, from self-reports on questionnaires to status updates and automatic documentation of activities, are related to the process of narration. Do they encourage the communicative form of narration, or are they, as is suspected here, the expression of an antinarrative turn dressed up as narrativity?
Anyone can see that the protocols for self-description, with their forms to be filled out, do not require or encourage narrative competencies. They merely ask you to respond to the questions on the form (education, employment, home address, family members, favorite quotes, and “some details about yourself”). The self-description on the form is subordinated to the authority of the form, with its assumptions about what constitutes identity. In the case of Facebook, this includes statements about a person’s favorite books, films, music, and athletes but (for example) not her favorite number, color, animal, mineral, season, or time of day. Every occasion for self-description that goes beyond the empty pages of a diary contains culturally determined implications and aims. To the extent that an identical questionnaire is employed for significant numbers of people, it aims at standardization. Autobiographical self-discovery, then, is thwarted in principle by the collective compulsion of the frame in which it takes shape.20
Compared to the multiple-choice segments, the fill-in segments “About You,” “Favorite Quotes,” “Religious Views,” “Political Views,” and “Life Events” naturally afford a certain freedom of self-description. Here, too, however, the parameters reveal how the employees of Facebook understand identity. For example, the heading “Life Events” contains five sections: “Work & Education,” “Family & Relationships,” “Home & Living,” “Health & Wellness,” and “Travel & Experiences.” Each rubric contains further subdivisions. For “Family & Relationships,” they are “First Meeting,” “New Relationship,” “Engagement,” “Marriage,” “Anniversary,” “New Family Member,” “New Pet,” “End of Relationship”—and more recently also “Create Your Own.” These segments almost read like the stations of a very normal life story, with the profound (and undoubtedly unintended) irony that neither the birth of a child nor the acquisition of a new pet could prevent the breakup of a relationship. In the segments themselves, there are blanks to fill in for “Who,” “When,” “Where,” and “With Whom” and for more specific information (such as the name, type, breed, and gender of your pet). There is also the option of uploading photos and—“optional”—of telling a story about the event. The life events are—somewhat hidden away—listed chronologically in the “About” section but are not connected or connectable in any other way. It is evident that even the section on life events, despite its partially narrative option, represents, above all, yet another form of data request, to be input in a database-friendly form.
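What such a form amounts to at the level of data can be suggested by a hypothetical, much simplified record (field names and values invented for illustration; no actual Facebook schema is reproduced here). The biographical episode arrives as a set of structured fields, with the story literally optional:

```python
# A hypothetical "Life Events" record: database-friendly fields first,
# narrative last and optional.
life_event = {
    "category": "Family & Relationships",
    "type": "New Pet",
    "who": "Anna",
    "when": "2015-06-01",
    "where": "Berlin",
    "with_whom": ["Ben"],
    "details": {"name": "Caro", "type": "dog", "breed": "beagle",
                "gender": "female"},
    "photos": ["IMG_2041.jpg"],
    "story": None,  # the only narrative field, and it may stay empty
}
```

Every field but the last is immediately countable and comparable across millions of users; the story, if told at all, remains opaque to the database.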
A less formalized possibility for self-description is offered by the status updates and comments, which can be understood as “small stories” and from which, in some cases, the plotline of a bigger story could be pieced together, especially by the “friends” of the Facebook user who are acquainted with her offline.21 However, to the extent that the updates usually take the form of spontaneous snapshots and are limited to unannotated information about places or activities, they do not cohere with Ricœur’s concept of narrativity, and neither can they be understood as a kind of pointillist self-portrait, as has been claimed. For the problem here is not the narrative style (or lack of it) but the uncertain authorship of this kind of portrait. The sum of the collected “small stories” does not result in a pointillist self-portrait, which, despite the particular style of brushstrokes it employs, has been created intentionally with an eye to the result. Instead, they are more like what Zygmunt Bauman describes as “moments into which the pointillist time of liquid modernity is sliced.” These moments are less narrated than noted as they pass, in part without the user even remarking on them, since they are automatically registered by the technical frame.22
The “pointillism” of “narration” on Facebook is not just an extension of the episodic perception that Bauman already remarked on when it comes to postmodern subjects. It is simultaneously an inevitable consequence of the technical frame. The interface constrains coherent storytelling by neither providing for nor permitting internal links between the events on a person’s own Facebook page. This omission is quite astonishing. After all, links are part of the fundamental technological structure of the internet, and “connecting” is integral to its philosophical self-concept. On the other hand, it is not really so surprising that Facebook does not encourage any narrative activities on its front end, if the central goal of all this user activity is the data analysis being carried out at the back end. Thus, instead of causal relationships created by autobiographical narrators, we find only chronological connections based on the temporal sequence of the events. The technical dispositif creates a situation in which the individual subject/object of the updates no longer creates the narrative order of her life while writing but more or less unconsciously produces it while living it. The work of narration is thus simultaneously actionistic and postactive: By letting actions speak for themselves at the very moment when they occur, the narration no longer occurs at the level of presenting data but instead at the level of its production. This reduction in narrative consciousness was advanced early on by Facebook’s decision, in December 2007, to abandon its original status-update prompt “Username is …” This meant that users no longer had to assume the rather distanced and reflective posture required by the—quite unusual (almost avant-garde)—practice of speaking about themselves in the third person.
From the perspective of narrative psychology, such undermining of narrative consciousness appears to be a loss, but according to the logic of the participation culture that characterizes the Web 2.0, it is just another example of democratized communication processes. The shift in narration from the autobiographical subject—including the self-censorship that goes along with it—to the more or less automated collection of status updates and reports and the contributions by the networked public means the end of rule by experts, even when it comes to autobiography. What remains to be ascertained is the likelihood that this narrative activity will occur on the part of the public. While we cannot exclude the possibility that Facebook users create an overarching narrative image based on the various data available on a Facebook site (and offline), we do need to ask to what extent the habitual practice of isolated statements obstructs the creation of narrative structures during the reception process and whether, in principle, it encourages an antinarrative view of the world.
The question of the psychological implications of the technical dispositif should also be posed with reference to other forms of everyday storytelling: letters, telephone conversations, or shared conversations in physical proximity. If self-representation, which in face-to-face personal contact is dialogical, reflective, and narrative, is increasingly shifting to the realm of digital communication, it is also increasingly subject to the rules of attention economy and self-management that apply there. The next question—whether status updates on the social network make personal stories in the offline realm superfluous or, on the contrary, serve as occasions for responsive inquiry—requires empirical investigations that are outside the scope of this study. But the topic of delegated narration should be further explored.
DATABASES AND MECHANICAL NARRATORS
Facebook outsources narration to readers only potentially and in a limited way. The actual narrator does not sit at the front end of the interface, where users read, write, post, and imagine, but at the back end, where algorithms analyze the data that has been collected. At the back end, Facebook, as is sufficiently well known, is a giant database that collects data sets for every user under some sixty categories.23 There, the data points that have been disaggregated by the questions on the forms are reassembled twice: into a profile of the person in question and into networks of relationships among the many. The best basis for this work is not the subjective construction of a person’s own history but its “raw material,” as the objects of photography were dismissively termed in the nineteenth century. In the jargon of Big Data, this is celebrated as “raw data.” Any narrative treatment of the data at the front end of the interface represents an obstacle for this project, since, from an information-theoretical perspective, narrative attempts to extract the “real” truth of the whole from the “literal” truth of details can only result in distortion. A complete lack of narrative connection is therefore the best guarantee that a given data set is all-embracing: If there is no story to be told, there also cannot be any data that fall outside it.
The busywork of algorithmic narration reaches all the way to the front end, where user data is accumulated according to specific criteria. The “activity log” includes the user’s likes and comments, together with other people’s photos on which the user is tagged, while the function “Suggested Friends” collects messages to or from a specific Facebook friend. There is also the “Say Thanks” service and the “Year in Review.” In assembling all this data, so far, the system does not exhibit very much narrative energy; it merely presents the explicitly declared (and marked) links between two individuals, not the “deeper” relationship of two Facebook friends who, for example, have watched the same video and afterward read the same article. We should hardly assume, however, that the system is incapable of producing this kind of correlation. Instead, we should ask when applications are going to make the more complex forms of data gathering that are taking place at the back end available to users on the front end.24
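How little machinery such a correlation would require can be indicated with a small sketch (Python; the event log and names are invented). Counting the items two users have both acted on already yields the kind of undeclared “relationship” described above:

```python
from collections import defaultdict

# Hypothetical back-end event log: (user, action, object).
log = [
    ("anna", "watched", "video:42"), ("ben", "watched", "video:42"),
    ("anna", "read", "article:7"), ("ben", "read", "article:7"),
    ("carl", "read", "article:7"),
]

# Group each user's actions, then count what every pair shares.
items = defaultdict(set)
for user, action, obj in log:
    items[user].add((action, obj))

users = sorted(items)
shared = {(u, v): len(items[u] & items[v])
          for i, u in enumerate(users) for v in users[i + 1:]
          if items[u] & items[v]}

print(shared)  # {('anna', 'ben'): 2, ('anna', 'carl'): 1, ('ben', 'carl'): 1}
```

Nothing in this log was declared by anyone; the “relationship” is computed entirely from indexical traces.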
Facebook’s algorithmic storytellers are symptomatic of the development toward postactive narration in Facebook society. Other technologies produce images without any personal involvement and reports without any human author. Narrative Clip, for example, is a small portable camera that can be worn on the lapel and produces a photograph every thirty seconds, thus relieving the subject of the need to make any decisions during the taking of autobiographical photos—all she needs to do is make a selection of usable photos from among the day’s yield. The process encourages the exposure of the optical unconscious that Walter Benjamin identified as a characteristic of photography. Another program of this kind is Narrative Science, which embeds concrete data within a narrative scaffolding and produces reports (for example, in the realm of sports or finance) scarcely distinguishable from those prepared by human authors. The question that concerns us here is, naturally, whether at some point a program of this kind could also be adapted for Facebook, employing existing tags and visual references (which can be collected by facial-recognition software) to tell a story that informs the relevant subject what she has actually experienced and who she really is. Could the technology make the objectivity of the story, after having extracted it from the control of the subject, valid and persuasive for that person? Or is the “narrative turn” on Facebook, Narrative Clip, Narrative Science, and other sites offering automatic narratives a turn toward narration that no longer has anything to do with its hero, because its actual goal is the investigation of proclivities, interests, and activity models?
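The principle behind such machine-written reports can be suggested in a few lines (a sketch only, with invented teams and scores; Narrative Science’s actual system is of course far more elaborate). Concrete data is slotted into a narrative scaffolding whose phrasing varies with the numbers:

```python
# Template-based "robot journalism": data in, prose out.
def game_report(home, away, home_pts, away_pts):
    # Ties are ignored for brevity.
    winner, loser = (home, away) if home_pts > away_pts else (away, home)
    margin = abs(home_pts - away_pts)
    verb = "narrowly edged" if margin <= 3 else "cruised past"
    venue = "on its home floor" if winner == home else "on the road"
    return (f"{winner} {verb} {loser} {venue}, "
            f"{max(home_pts, away_pts)}-{min(home_pts, away_pts)}.")

print(game_report("Springfield", "Shelbyville", 101, 98))
# Springfield narrowly edged Shelbyville on its home floor, 101-98.
```

The “narration” never leaves the scaffolding; what varies is only the data.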
Are people, at least, still narrating their own history with the help of the diary apps now being offered in increasing variety by the market? These apps are not merely technical enhancements of the journal, adding images, sound, and options for copying and sharing. Nor are they merely the next step after Word, which, while it functions multimedially and easily allows copies, basically operates on the model of the blank page. Diary apps are the negation of the diary because they relocate the act of self-description outside narration. The limited space for text, the uninviting writing tools, and the user context already speak against the contemplative pause and personal report that traditional diaries once offered. For example, the smartphone’s small screen and narrow keyboard are hardly conducive to lengthy writing. Besides, in the apps’ interfaces the reporting function generally plays a lesser role than photo uploads and factual reports. Moreover, the installation of this kind of app on a smartphone favors a kind of diary “on the go,” which corresponds to the acceleration of modern life, rather than countering it with a moment of stillness.
Naturally, the app marketers don’t find anything amiss in a “diary” that requires no leisure, instead praising the relatively effortless and thoughtless report on the Now as an advantage of their product: “With Momento in your pocket you can write your diary ‘on the go’ ” (momentoapp.com); “beautifully automated. Effortlessly remember every single day of your life” (roveapp.com); “makes remembering effortless, beautiful & fun” (hey.co). It is ironic that precisely the app that at first blush seems most invested in speed is the one most likely to encourage retrospection: “1 Second Everyday” asks users to add a single image or brief video each day to a history of the month or the year; thus it at least requires some reflection as to which image best represents the day. The focus of diary apps, as with social networks, is on database-friendly information (time, place, participants, tags) linked to photos and a brief text. Instead of the coherent narration of experience, we get an episodic report; instead of a reflective diary, a logbook for short answers to preprogrammed questions that can be responded to in the midst of other activities: How did you sleep? What are you doing? How do you evaluate your current creativity, on a scale of 1 to 100?
The database logic behind all this, advertised with a view to the possibility of keyword searches, demands activity that is less narrative than identifying. Users are meant to mark places, people, and events: “Food,” “Dreams,” “Business,” “Friends,” “Vacation,” “Love,” “Joy,” “Idea,” and “Movie” are all set to go on Diaro, with the option of adding your own criteria. The idea is to organize activities into pregiven categories: Optimized offers “Creativity,” “Routine,” “Pleasure,” and “Health,” without the option of associating a single event with more than one category. Like Facebook, some of these apps (for example, Momento) also automatically integrate external communications (on Twitter, YouTube, Facebook) and reactions to them, repeating the actionistic, postactive method of storytelling described here. Like Facebook, these apps permit no links between entries. Here, too, technical updating of the old self-observation technology of the diary takes place in the interest of the database paradigm, without concomitant updates in the interest of the narrative paradigm. Facts supersede links—a technical decision with explosive social impacts.
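The asymmetry can be made concrete with a hypothetical sketch (no actual diary app’s schema is reproduced here). Tagged records support keyword search by design, while a narrative link between two entries simply has no place in the data model:

```python
# Hypothetical diary-app storage: tagged, timestamped records.
entries = [
    {"date": "2016-03-01", "text": "Ran 5 miles.", "tags": ["Health"]},
    {"date": "2016-03-02", "text": "Dinner with Ben.", "tags": ["Food", "Friends"]},
    {"date": "2016-03-05", "text": "New project idea.", "tags": ["Idea", "Business"]},
]

def search(tag):
    """Keyword lookup: trivial, because the schema was built for it."""
    return [e for e in entries if tag in e["tags"]]

print(search("Friends"))  # finds the entry, but cannot say why it mattered
```

There is no field in which one entry could refer to another, let alone explain it.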
The answer to the question previously posed is: On diary apps, an individual is still the narrator of her own story only in significant entanglement with the logic of the database. The goal of these apps is not to work reflectively through lived events as part of hard-won life experience; it is instead a maximally clear, factual report, a kind of spontaneous eyewitnessing of oneself. Thus, the app Reporter markets itself with the slogan “Snapshot your Life,” promising that by filling out brief daily questionnaires (What are you doing, when, with whom, for how long?), its users will shed light on the nonmeasurable aspects of their lives. Its name already harks back to the phenotype of “cool” photographic observation as it has been described, promising a database-centered self-description shorn of any narrative self-deception. It is a self-description that, after the information “This is your relationship to …” in the “Say Thanks” collage and “This is what your year looked like” in the “Year in Review,” proceeds purposefully toward the result: This is who you are.
The next stage in nondescriptive witnessing is already foreseeable in new software and hardware that promise to record and share every single thing a person sees and experiences: Twitter’s Periscope, Google’s Glass (or whatever its future equivalent will be), and Facebook’s Oculus Rift. Zuckerberg is convinced that virtual reality (VR) technology will bring the next great turn in the media ecosystem, comparable to the effect of the smartphone on desktop computing. This time, he wants Facebook to play a defining role in the change. At F8, Facebook’s developer conference, in March 2015, the integration of immersive 360-degree videos via the Oculus Rift (“spherical video”) was one of the main projects for the future. This recalls scenarios of the unbroken, automatic recording, replay, and sharing of experience as envisioned in late 2011 by the British science fiction TV series Black Mirror, whose episode “The Entire History of You” depicted technology that let people record their inner experiences audiovisually, as if with an external camera that also captured things the individual wasn’t consciously aware of. Users could play these experiences back on an external screen and share them with others. Zuckerberg’s plans in this regard have been clear at least since his Q&A session on July 1, 2015:
We’ll have AR and other devices that we can wear almost all the time to improve our experience and communication. One day, I believe we’ll be able to send full, rich thoughts to each other directly using technology. You’ll just be able to think of something and your friends will immediately be able to experience it too if you’d like. This would be the ultimate communication technology.25
If all goes according to Zuckerberg’s plans, the future of self-presentation will transcend any conscious (linguistic) representation. Whatever is to be communicated communicates itself without the distortion of passing through the reporter. This is why Facebook is simultaneously working on artificial intelligence systems that recognize all the elements in an image. In this way objects—particularly if the automated image is generated quasi-unconsciously—can present themselves while bypassing the subject. It is certainly not surprising that this loss of self-observation—as observation by the self—is presented by Zuckerberg as a gain in experience and communication. The automatization of the report and the expulsion of the subject from self-narration are the logical consequences of the transparency doctrine: Human beings, consciously and unconsciously, always want to conceal something; only machines have an objective interest in knowledge.26
The real-life precursor of the future Oculus automatism is the increasingly popular app Snapchat. However reliable the promise may be that the images sent via this app disappear a few seconds after they are viewed, it has already contributed significantly to a shift of communication toward the nonverbal realm of image sharing. For many users of Snapchat, the guarantee that the snapshots will self-delete is less important than the possibility of engaging in visual communication that is as spontaneous and banal as possible. Instead of saying what you are doing and how you feel, you send a snapshot. The descriptive-communication model of language gives way to the indicative model of images; the individual perspective on things is reduced to an effect of the camera. With corresponding image-recognition software, it is possible to analyze billions of situations while bypassing their actors and reporters. As long as the images are not also erased from the back end of the interface (and we should assume that they are not), this method of communication represents a goldmine for data collectors, which may explain why the two founders of Snapchat did not sell their app to Facebook in 2013, although they were offered three billion dollars for it, but instead took the company public in 2017 at a valuation of some twenty-nine billion dollars.
The irony in all of this is that an application that started out as a means of protecting the private sphere is helping impose the transparency doctrine, in the sense of the self-reporting of objects and events. It is equally ironic that although the snapshots of the most recent twenty-four hours are collected under the title “My Story,” this “history” is less available to the person who created it than it is to the system. By evening, committed Snapchatters scarcely know anymore what they have photographed and shared during the day. What they do recall is merely the pattern that guides their actions in each case: The “pre-gym selfie” is followed by “ready for the gym” and “in the gym” pictures and then by “after running 5 miles” and “post-gym food.” To the extent that Snapchatting replaces texting—not to mention diary writing—it means not just the disappearance of the images but also the evaporation of the person’s own history. Now only the algorithms at the back end know better—and anyone who has access to them.27
As silly and banal as the results of the “Say Thanks” collages and Narrative Clip may be at present, however innocent an app like Snapchat may seem at the moment, we are experiencing the beginning of something that in a few years may fundamentally change our narrative self-understanding. The tendency is toward an automatic immediacy of reporting, which is actionistic and simultaneously postactive. As with “news as it happens,” events are recorded the moment they occur, and, as with instant journalism, the lack of distance of this type of autobiographical “writing” bars the path to a reflective view of things. Today, there are undoubtedly more people filling in a Facebook page or a diary app with data about their life than there once were people writing in a diary. But the quality of self-observation on social networks is significantly different from that of diary writing. Subsequent reflection is replaced by spontaneous reports, mostly oriented to externally imposed questions and criteria. The famous self-discovery of the diary writer moves away from reflective exploration and turns into the question of how the individual fits into a prescribed pattern of recommendations and expectations—assuming the report is not automatic anyway and hence completely removed from consciousness.
The contradiction and active opposition between the database and narrative models of information processing were already subjects of discussion fifteen years ago. At that time, the database version was seen as the form of self- and world perception that was most adequate for our era.28 That the structure and orientation of the database are equally determined by cultural assumptions can already be illustrated, in an odd way, by Facebook’s questions about life experiences, where the rubric “weight loss” is not accompanied by a parallel rubric “weight gain,” and the question is asked “with whom” (one lost weight) but not “for whom” or “against whom.” Self-tracking apps show similar oddities. It is evident that categorizations have a decisive influence on how reality is perceived and are therefore neither epistemologically nor politically innocent. “Categorization is a powerful semantic and political intervention: what the categories are, what belongs in a category, and who decides how to implement these categories in practice, are all powerful assertions about how things are and are supposed to be.”29 But the “narrative pollution” of the database changes nothing about the fact that in the twenty-first century this explanatory model is becoming ever more central to our perception of ourselves and our world. Facebook is an example of this tendency not only because it prefers to tabulate the results of questionnaires and blocks narrative forms of presentation at the front end of the interface. The algorithm at the back end that decides whether status updates from our friends appear in our newsfeed also prefers photos, videos, and links, rather than textual entries, which are less compatible with databases.
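The claim about the newsfeed’s preferences can be illustrated with a deliberately toy scoring rule (weights invented; it echoes the publicly discussed EdgeRank idea of affinity, type weight, and time decay, not Facebook’s actual, unpublished algorithm):

```python
# Toy newsfeed scoring: the media type enters as an explicit weight.
TYPE_WEIGHT = {"photo": 1.0, "video": 1.2, "link": 0.9, "text": 0.5}

def score(post_type, affinity, age_hours):
    return affinity * TYPE_WEIGHT[post_type] / (1.0 + age_hours)

print(score("video", 0.8, 2.0))  # 0.32
print(score("text", 0.8, 2.0))   # 0.13...: the textual entry loses
```

Whatever the real weights may be, the structural point stands: a ranking function needs comparable, typed inputs, and prose is the least comparable type of all.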
The key question is what problem the move to databases is designed to answer. The Russian-American media theoretician Lev Manovich throws three names and slogans into the ring: the death of God (Nietzsche), the end of grand narratives (Lyotard), and the arrival of the World Wide Web (Tim Berners-Lee). The world, he argues, appears as an endless unstructured collection of images, texts, and information; hence it is only logical for us to model it as a database.30 Manovich actually fails to provide the underlying reasoning that would support his thesis, but his list clearly indicates that the database is to be seen not merely as a consequence of technological inventions (the web) but also as the result of cultural development (end of grand narratives). We may certainly see the database as the overthrow of postmodernism’s relativist model of knowledge or as the victory of quantitative certainties over narrative, theoretical, or ideological constructions. More paradoxically, in a formulation that harks back to the origin of the concepts: The database is the return of narrative as number.
This supposition is supported by the recognizable effort to break information down into calculable units, something that, as the example of nanopublications illustrates, can take verbal as well as numerical form.31 The shift from narrative to numbers is even more evident in forms of contemporary self-knowledge such as the Quantified Self movement, whose slogan “Self-Knowledge Through Numbers” shows an unmistakable mistrust of narrative self-observation. Self-tracking, or “scanning,” has also been viewed as an extension of confession or psychoanalysis and as another form of “egotistical cultural practices” that humans invent in order to discover their “true self.”32 This, however, would be to deny the difference between a process of self-observation through scanning and one that is reflective, or, more pointedly, to confound the self-exploration of an athlete with that of Augustine. It would be equally problematic to see the Quantified Self movement as an expression of the “care of the self” that Foucault, in the early 1980s, discussed as the ancient Greeks’ art of living and recommended to his contemporaries under the rubric of an “aesthetics of existence.” The objective of this care and this aesthetics was a self-conscious and self-determined life under the sign of a person’s own needs and values. The care was directed equally toward body and soul, whereby exercising the body also always serves to care for the soul: “as physiotherapy that in truth is psychotherapy.”33
The primarily physical self-optimization of the Quantified Self movement turns Western culture’s hostility toward the body into hostility toward the mind and spirit when it allows the commandment “Know thyself” (once an inscription on the Temple of Apollo at Delphi and later the main ambition of self-exploratory seminars and trips in the 1970s) to degenerate into an obsession with self-measurement. Corresponding apps and social networks provide the necessary technologies for collective control of jogging and pulse rates or of movement, sleep, and eating behaviors. One of the movement’s heroes is Nicholas Felton, whose “Annual Reports,” appearing on the internet since 2005, present important data from his life with statistical exactitude and a pleasing design that indicates how often he took a subway, taxi, bus, airplane, ferry, or ski lift; how often he visited a museum or the gym; and how many books he read, with how many pages. Felton is an extreme symbol of the shift from reflective narrative (about the themes of books or the experiences had on the trips) to detailed numbers (calculations of pages read and miles driven or flown), for which Felton employs the concept “numerical narratives.”34
The self-knowledge favored here is based on numerical values and correlations that, while they must ultimately also be interpreted, are nevertheless—at least this is what the self-trackers assume—more reliable than self-description. “It is possible that the data-mapped, virtual self offers a more accurate picture of who we really are than the subjective stories we tell,” one observer holds, suggesting we should befriend the “encountered” alter-ego: “We can learn to love the data-mapped self that reveals our real behaviours, in all their complex, contradictory, hypocritical glory.” If this encounter is described not as alienation but as the consolidation, using physical means, of a psychically destabilized ego—“In self-tracking, we are literally trying to keep track of the body, to rephysicalize it, in an adaptive reaction to the ungrounding of the self in contemporary life”—then the assumption that objective data is reliable is also simultaneously revealed as a method for managing anxiety during times of rapid changes and increasing uncertainty.35 The obsession with data becomes an ersatz action that the individual expects will provide a new source of orientation. The identity crisis that results from the loss of narrative forms and formats (and, above all, from the lack of trust in them) is managed by a methodical change in the epistemological model from words to numbers. Against the reintegration of the individual in narrative wholes, the self-trackers bet on self-assertion through self-quantification, with the idea that they can thus avoid being implicated in any grand narratives other than that of the number itself.36
Naturally, the methodical point of departure for numerical self-knowledge is also culturally determined. The practice of self-measurement is consistent with a broader social trend, serving as a control mechanism suited to the era of digital media. The self-quantifiers’ so-called body hack is therefore not an act of rebellion against a governmental or economic system but an attack on their own body, undertaken in order to create more data about it and put that data at the disposal of scientific but ultimately also governmental and economic interests. To avoid the foreshortened view of conspiracy theories, both of the motivating factors behind this control-friendly hacking should be recognized. On the one hand, there is naturally an interest “from above” in the biometric datafication of the subject, not least because this data can be applied to the post-Fordist labor process, under the euphemism of gamification, as an imperative both for the collective struggle to get ahead and for individual self-optimization. On the other hand, however, there is also an interest “from below” in technologies of self-tracking that make it possible to realize traditional values, such as healthy nourishment and physical exercise, with the help of external methods of measurement and motivational assistance. Thus, the datafication of the subject, like her “wiring” into the “internet of things,” is a phenomenon that belongs to the cultural logic of modernity. It is not, as a rule, enforced against the individual’s will, nor is it always contrary to her interests.
The turn to numbers is, admittedly, only complete once the instruments of measurement are directly connected to the body of the subject, bypassing her awareness. Only the automatic collection of data protects it from manipulation. The examples already cited are “frictionless sharing” on Facebook and the objective snapshots of the Narrative Clip. Additional forms of machine-generated self-representation include the Foursquare app Swarm, introduced in 2014, which offers a “Neighborhood Sharing” function to automate the “Check-in” that communicates a person’s location to the social network. Then there is the app SpreadSheets, which automatically records data on sexual activity using an accelerometer and a microphone (its predecessor, Bedpost, required entering this information by hand).37 The two apps exemplify the trend toward documenting individual behavior based on data provided by the user’s body. Cultural content, as expressed in words, is negated by the nature of the body, which offers information about itself via technologies of digital measurement that bypass consciousness. Thus we produce traces that are unavailable to us personally, even if we did, at one point, turn the mechanism on and were conscious of our actions in each concrete situation. But in the end, it is the algorithm that can provide information about what places we visited three years ago, what we did a year ago on this or that day, or which status updates we “liked” a month ago. It is the algorithm that knows what we are up to, to the extent that knowing means knowing about the external data that our life yields.
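The logic of such consciousness-bypassing recording can be sketched as follows. This is not the actual code of Swarm or SpreadSheets; the sensor stub, file name, and coordinates are hypothetical stand-ins. What matters is the structure: once switched on, the log accumulates without any act of the subject, and only a later query over the data can answer what we did on a given day.

```python
import datetime
import json
import random

def read_location():
    """Hypothetical sensor stub; a real app would query the device's GPS."""
    return {"lat": 52.52 + random.uniform(-0.01, 0.01),
            "lon": 13.405 + random.uniform(-0.01, 0.01)}

def log_automatically(logfile="trace.jsonl"):
    """Append a timestamped reading with no action by the subject.
    The trace grows below the threshold of awareness."""
    entry = {"time": datetime.datetime.now().isoformat(),
             "location": read_location()}
    with open(logfile, "a") as f:
        f.write(json.dumps(entry) + "\n")

def on_this_day(month, day, logfile="trace.jsonl"):
    """The algorithmic answer to 'what did we do a year ago today?':
    only the accumulated trace, not the subject's memory, can say."""
    matches = []
    with open(logfile) as f:
        for line in f:
            entry = json.loads(line)
            t = datetime.datetime.fromisoformat(entry["time"])
            if (t.month, t.day) == (month, day):
                matches.append(entry)
    return matches
```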
POSTHUMAN SELF-DESCRIPTION
Jean-François Lyotard problematized grand narratives long before the internet and databases had become symptoms of cultural change. Accordingly, Lyotard’s reception, at first, did not include references to the new media, although it did focus on the work’s philosophical and narratological implications. Thus, the German-Korean philosopher Byung-Chul Han characterizes Lyotard’s accentuation of the that (something occurs) rather than the what (occurs) as a “turn towards being”: “In the age characterized by narration and history, being retreats into the background in favour of meaning. But when meaning retreats in the course of de-narrativization, being announces itself.” The “dissolution of the narrative chain” frees perception from the “chains of narration,” that is, from narrative coercion, and opens it to events in the proper sense of the term. The event—contradicting Gatterer’s bon mot of 1767—remains the event even when it is not part of any (narrative) system.38
In the context of the narrative theory of meaning, which Han represents, the moment that has been freed from the fetters of narrative possesses “profoundness of being” but no “profound meaning”: “its profoundness only concerns the pure presence of the There. The moment does not re-present … The There is all it contains.” This pure presence, this phatic communication with no aim other than itself, is precisely the problem for Han. Against Lyotard’s opinion that the “end of narrative time” makes it possible to come close to the “mystery of being,” Han points to the “nihilistic dimension” of such a perspective: “The decay of the temporal continuum renders existence radically fragile. The soul is constantly exposed to the danger of death and the terror of nothingness, because the event which wrests it from death lacks all duration.”39 Modern humans’ experience of time, Han continues, is a “rugged, discontinuous event-time”: fullness without direction. We are constantly starting over, channel-hopping through “life possibilities,” precisely because we are no longer able to carry a possibility through to the end: “The time of a life is no longer structured by sections, completions, thresholds and transitions. Instead, there is a rush from one present to the next.”
The finding can be expanded phylogenetically, as the problem not only of the individual but of humanity as a whole: “The end of history atomizes time into point-time … history gives way to information. The latter does not possess any narrative width or breadth.” Precisely because information is a phenomenon of “atomized time” or “point-time,” it must strive all the more hysterically to fill the voids between these points, which can no longer be experienced as part of a narrative line. Thus, perception is always supplied with new or drastic materials. Atomized time permits no contemplative lingering. Not knowing where we are going leads to a narrative stasis, which is camouflaged with a flood of events and which, with Heidegger, could be disqualified as an absence of dwelling (Aufenthaltslosigkeit) in a meaning-deprived, sped-up sequence of mere happenings: We are channel-surfing ourselves through the world. Since the death of God, humans naturally no longer redeem themselves from the lack of dwelling in earthly existence by means of cosmological experience, for example by living toward the fulfillment of divine time. Yet at the same time, since the “end of history” has also rendered world-changing heroism impotent, humans are no more heroes of time than the monks in the Annals of St. Gall once were. Time, fragmented into its individual moments, is no more, now, than what must be experienced or, depending on one’s perspective, endured.40
For Han, distracted busyness is no more a solution to the crisis of existence than was Lyotard’s existential profundity. Han turns back to Heidegger’s construction of contemplative lingering and votes for a “return-to-self”: “The end of narrative, the end of history, does not need to bring about a temporal emptiness. Rather, it opens up the possibility of a life-time that can do without theology and teleology, but which possesses a scent of its own. But this presupposes a revitalization of the vita contemplativa.”41
Han’s construction repeats a figure of thought that is found not only in Heidegger but also in Kracauer, who, at the time of the publication of Heidegger’s Being and Time, was writing about “metaphysical suffering from the lack of a higher meaning in the world, a suffering due to an existence in empty space.”42 For Kracauer, there were three kinds of alternatives to the “cult of distraction” that emerged as a reaction to metaphysical homelessness: principled skeptics (also called “intellectual desperados”), “short-circuit people” (who fled “headlong” into a new belief), and “those who wait” (and who tried to achieve a “relation to the absolute” by means of a “hesitant openness, albeit of a sort that is difficult to explain”).
Kracauer did not specify what “those who wait” were aiming at with their “hesitant openness.” But elsewhere he made clear that their stance was preceded by a refusal of distraction and a willingness to be bored: “The world makes sure that one does not find oneself. And even if one perhaps isn’t interested in it, the world itself is much too interesting for one to find the peace and quiet necessary to be as thoroughly bored with the world as it ultimately deserves.”43 Kracauer’s early critique of the quality of being “interesting,” which, long before Facebook, was evidently presenting and imposing itself as something that it really was not, is worth noting. His suggestion of a way to fight back prefigures Picard’s praise of quiet and implies abstinence from the media: “But what if one refuses to allow oneself to be chased away? Then boredom becomes the only proper occupation, since it provides a kind of guarantee that one is, so to speak, still in control of one’s own existence.” Kracauer’s mandate refers to the passage from Nietzsche’s Thus Spake Zarathustra, with which Han also concludes the German edition of his book, in which Zarathustra criticizes all those “to whom rough labour is dear, and the rapid, new, and strange”: “If ye believed more in life, then would ye devote yourselves less to the momentary. But for waiting, ye have not enough of capacity in you—nor even for idling!”44
The world to which Han and Kracauer recommend waiting and laziness as the royal road to contemplation has lost its familiarity with the phylogenetic stories that could once have offered the individual an ontogenetic foothold. Postmodern man no longer experiences himself as part of a social project. He is not a pilgrim on the “path of progress” toward himself and the deeper meaning of life; he is a tourist who doesn’t want to be determined by the past or constrained by the future, a “flexible” man with a “situational identity” who “lives at the vanishing point of individualization and acceleration” and has forfeited the “claim to (diachronic) continuity and (synchronous) coherence.” He lives under the “impression of racing stasis: things change, but they do not develop.”45 This is the more recent summary of the postmodern subject’s loss of narrative that Bauman similarly observed twenty years earlier: “The overall result is the fragmentation of time into episodes, each one cut from its past and from its future, each one self-enclosed and self-contained.”46
What is decried here as the loss of a narrative home for the self, other commentators celebrate as the lightness of a “thin subject”—a radically ephemeral self that not only exists separately from its lived experiences but essentially also vanishes along with the event that has just been experienced, and reappears with the next one. This perspective undermines the theory of psychological and ethical narrativity according to which humans only experience their life when they tell it to others and themselves, allowing them to develop a responsible personality. It also challenges the model of the diachronic self, which emerges experientially in the coming together of past, present, and future, by positing an episodic type, which always lives in and understands itself exclusively in relation to the present: “One has little or no sense that the self that one is was there in the (further) past and will be there in the future, although one is perfectly well aware that one has long-term continuity considered as a whole human being.” This does not mean that the episodic type lives without any memory of the past. But the memory occurs without any narrative passion. Yesterday is unreflectively present in today the way the past rehearsals of a musician are present in an actual performance.47
The critique of the identity concept of “ethical-historical-characterological developmental unity,” which in a sense transfers the modern concept of development to the individual, puts paid to Heidegger’s assertion that episodic personalities are necessarily “inauthentic” in their experience of temporal existence: “But I think that the Episodic life is one normal, non-pathological form of life for human beings.” The proposal that is developed in response—of a life lived in the moment—is troubling to all those who, with the help of traditional criteria like identity, authenticity, or coherence, seek to describe a society in which other values (hybridity, change, momentariness) have long since come to determine the actions and self-understanding of individuals. It opens a positive perspective on the “presentism” of the (post)modern subject, on life in the now in the context of mobile media and social networks, and it resists the model of psychoanalysis (which provides the foundation for the ethical-narration thesis), according to which moral growth happens through reflection that simultaneously entails the overcoming of the narcissistic id by the social ego. Thus, not least of all, the construction of an episodic identity relativizes the critique of digital media’s antinarrative dispositif.48
In a 2014 essay, the French grandmaster of autobiographical research Philippe Lejeune bemoans the decline of autobiographical identity in the era of acceleration and predicts coming forms of autobiographical writing on social media that will be incoherent, hypertextual, and multimedial.49 Autobiography on Facebook is, in fact, incoherent, hypertextual, and multimedial. It is simultaneously posthuman, on all three levels of possible authorship: users, network, and algorithms. The sovereignty of the autobiographer is already fundamentally compromised when, following the “authority of the form,” users adhere to the value assumptions and standardizations on lists of questions and categories. It is further weakened by the montage that is created when a person’s own status updates are combined with the comments and status updates of friends. It is utterly lost when the algorithm becomes a “ghostwriter” with plans of its own.50
Naturally, Facebook is not the first to challenge the sovereignty of the autobiographer. The discourse of postmodernism already introduced external entities as the actual actants and affirmed the “death,” or disappearance, of the author, since the ego is not the sovereign source of its feelings and thoughts but merely the point of intersection of its discourses. In both postmodernism and posthumanism, the subject lacks agency and self-determination. Yet we should not overlook the different natures of the heteronomy at work here. In the postmodern context, the subject’s competitors for sovereignty are human actors (relatives, acquaintances, shapers of discourse past and present), while in the posthuman context the competitors of the subject are technological: algorithms. While postmodern autobiography is determined (or “distorted”) by the internalized perspectives of the culture to which a person belongs, in the case of posthuman autobiography “alien authors” (the network and algorithms) take over the writing, which then occurs outside the consciousness of the subject. Unlike the posthuman subject, which transfers its role as an autonomous actor to software whose decisions it cannot control, the postmodern subject, despite its heteronomy, maintains authority over its actions and identifies with the perspectives that are presented to it to the extent that it adopts them. The difference is the internalization of the heteronomy in writing (postmodern), as opposed to the outsourcing of the writing itself to heteronomous sources (posthuman).
Rather than making a premature claim of continuity, we must therefore emphasize the new quality of the subject’s disempowerment, which is now a disempowerment of rather than by culture. The dethronement of the autobiographical subject (as narrator) by the algorithmic narrator is simultaneously a “liberation” of the autobiographical subject (as narrated) from cultural heteronomy. For the algorithmic narrator operates independently of the predetermined assumptions of cultural value that are inevitably manifested in the reports of human narrators. It is true, as software studies emphasize, that codes and protocols are, in principle, culturally determined, but the example of the Foursquare app Swarm sheds light on the difference at stake here: While the human narrator will skip the “Check-in” at certain points because he finds it unimportant or discrediting, the automatic place identifier ascribes no values and allows no concealment—unless, of course, the user has programmed it to do so. The “algorithmic auto/biography” that comes into being in the posthuman mode of writing is the “blackboxing of the self.”51
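The contrast can be stated almost schematically. The following sketch is an invented illustration, not Swarm’s actual logic, and the place names are hypothetical: the human narrator filters by internalized value judgments, while the automatic recorder conceals nothing unless explicitly programmed with an opt-out.

```python
# Hypothetical value judgments the human narrator has internalized.
DISCREDITING = {"casino", "fast-food restaurant"}

def human_checkin(place):
    """The human narrator skips places he finds unimportant or discrediting."""
    if place in DISCREDITING:
        return None  # silently omitted from the self-narrative
    return {"checkin": place}

def automatic_checkin(place, blocklist=()):
    """The automatic place identifier ascribes no values and allows no
    concealment -- unless the user has programmed it to do so."""
    if place in blocklist:
        return None  # concealment only by explicit prior configuration
    return {"checkin": place}
```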
This automatism, it must be said, is a liberation only if the report’s precision and accuracy are valued more highly than the activity of making it—in other words, if the performative act of narrating (with its practice of reflection, necessarily accompanied as it is by distortion) is factored out in favor of greater objectivity. The discussion of narrative psychology pointed to the problematic nature of this approach and emphasized the necessity of narration as a praxis of linguistic and analytic competencies. This praxis, too, should be further explored and problematized as a form of heteronomy of the subject—a process of disciplining enforced by means of cultural expectations that are to be fulfilled. The more negative the resulting judgment turns out to be, in this respect, the more readily some people will welcome posthuman narration. This response has been prepared, in turn, by the critique of narration as a form of world- and self-representation.
Parallel to the death of the sovereign author, around the middle of the twentieth century narration began to be accused of betrayal. A coherent story, it was claimed, creates the illusion of order and reduces reality to pure logic. This insight, injected into the literature of French existentialism (Camus’s The Stranger, Sartre’s Nausea), was radicalized by the nouveau roman, which fragmented reality and made it appear incoherent. Impelled by similar concerns, Marxist and poststructuralist writers criticized the conservative character of the genre of autobiography as the false appearance of an individually determined, coherent, and meaningful life. In this situation, when narrative incoherence, discrepancy, and confusion are promoted as being more true to life, the development of historiography, as it was described at the beginning of this chapter, makes another about-face. Where it had previously evolved from an accumulation of facts lacking perspective into a causal concatenation of events, historiography now turns back again, as narratives cease to be understood as “actual, inner” truth, the way Wilhelm von Humboldt had seen them, and now appear to be nothing but “euphoria” and the creation of “serenity” through order.52
In light of this critique of the illusion of coherence, posthuman practices of narration on social media can also be understood as a radicalization of postmodern poetics. The author—in postmodern thought still identifiable as an enunciating web of quotations from innumerable cultural sites—is further reduced until it becomes a merely mixing web of experienced events, composed of data administered by mechanical narrators. If the portrait that emerges in this fashion is experienced as an alien self, this, in turn, recalls postmodern concepts of identity that conceive the encounter with one’s “own foreigner” as fostering the disintegration of the self in a way that encourages tolerance. The disempowerment of the self on Facebook—initially problematized as a loss of narrative engagement, then relativized by the rehabilitation of the episodic type of identity—ultimately appears as rescuing the self from narration’s techniques of self-deception through the use of “unimpeachable” methods of data collection. This “human” aspect of the posthuman will continue to occupy us in the following chapter, as we turn to the negative aspects of narrativity for individual and collective identity formation.53
Let us, for the moment—less as a conclusion than as a prognosis—keep in mind the three-step evolution leading up to the posthuman narration of the self: (1) from words to numbers, when description is replaced by statistical information, as demonstrated by the example of the Quantified Self movement; (2) from mechanical to automatic processes, when the inputs are no longer consciously entered by the subject but are involuntarily provided by the body or, as in the case of Snapchat, emerge spontaneously and more or less unconsciously; (3) from option to duty, when the creation and analysis of data is no longer initiated by the person who produces it but is forcibly imposed or secretly undertaken by employers, insurance companies, or government agencies.
Let us also keep in mind the challenge that the episodic identity type, as confirmed by many contemporary observers, poses for the narrative self-understanding of previous generations. In Douglas Coupland’s 1991 novel Generation X, one of the main characters says, “it isn’t healthy to live life as a succession of isolated little cool moments. ‘Either our lives become stories, or there’s just no way to get through them.’ ”54 What Generation X still cared deeply about—giving life, which at the time already seemed like an aimless collection of insignificant events, meaningful coherence—may have forfeited all relevance for Generation Y, the “Facebook generation.” A dispassionate review must ask what the consequences are likely to be when the narrative wholeness of life disappears, history degenerates into mere information, and existence becomes nothing but the rush from one present to the next. The question is how the loss of individual and collective storytelling changes the way not just individual people but humanity as a whole deals with the past and the future and with others. It is a question about the political consequences of episodic identity.