If codes—say, the alphabet—were created to break the world down into distinct units and rework them into legible reality, then this process amounts to a kind of digestion. The code decomposes the world into discrete building blocks, places them in sequence along an arrow of time, and makes them readable, interpretable, and, ultimately, programmable. Like a spider’s prey injected with gastric juices, the world dissolves. From this point on, it can be shaped and molded. In other words, the alphabetic code breaks the world down into distinct elements (peptides) and reworks them into reality, into usable matter. Inasmuch as this demiurgic principle dissolves the riddle of Creation, the principles of use and exploitation—that is, extracting, potentiating, and transforming isolated elements—come to be posited absolutely and taken to be second nature. Enigmatic Creation is replaced by simple, docile matter. It is no accident that money came into being at the same time as the alphabet. Money is another peptide, which helps break down the world into discrete parts that can then be enriched, accumulated, and put back together in practically unlimited ways.
Today, the digital code is eating up the linear, alphabetic code. If the alphabet once served to change the body of the world (soma) into a sign (sema), the digital code is now going still further and dissolving the world of signs (sema) into digits (bits). Thomas Hobbes wrote, “In the philosophy of nature, I cannot begin better … than from privation; that is, from feigning the world to be annihilated”; in this light, the liquidation of signs, their digitization, signifies a second-order annihilation of the world. This recoding proves all the more radical—this holds for DNA, too—in being able to erect a system of signs where a body formerly stood. It is not by chance that, wherever the world is transfigured into digital copy, ideas of a second genesis—optimized Creation—emerge. In the digitized realm, as in synthetic biology, the body is not broken down and recombined in its material form; instead (as in the case of designer babies), it is drafted as a kind of wish formation. But when the very construction of the body becomes a matter of design, the body (as a destiny) falls mute and can be experienced only to the extent that it constitutes a social body. Digitization disintegrates the body as a physical phenomenon and transfers it (big data) into a world of signs; there, it loses all definition. The body turns into pure information, a site of transition, where intentions, actions, and ideas manifest themselves only in passing. Even when one particular body proves lucky enough to have been preferred to a host of rejected possibilities, it remains—as per genetic law—just one particular body among all other bodies (xⁿ → x); alternatively, it amounts to a particular body standing in for the population as a whole (x ← xⁿ).
The simultaneous presence of all that has ever been created, medieval theology taught, is Hell. Under these auspices, the Internet may be understood as a site of world constipation: a digital inferno. The Net never forgets. And even when the party in charge of one server or another takes care to keep his digital lawn free of unseemly waste, the prevailing logic of proliferation ensures that each and every item of data is destined to last forever. If, in the innocent childhood of the Internet, the scientific community was still inclined to view the swell of stored information as proof that universal knowledge was growing, it is now clear that apathy, malice, and superstition are spreading as much as before, if not more. In Hell, of course, all such depravity is part of universal knowledge, too. Toward the end of his life, Flaubert toyed with the idea of writing an encyclopedia of stupidity. If he rose from the grave today, Flaubert would have to admit that this work has long since been done—worse still, that it’s constantly being rewritten and expanded. Information means advertising, philosophy, photos, threads, posts, and shops. It includes decapitation videos, funny cats, vanilla sex, hate-filled comments, high-end liquor sales, and front-page news. The whole landscape of Dante’s Inferno opens wide—except that the Net has done away with the circles that once provided orientation for the medieval pilgrim. Moreover, systematic falsification and disinformation—which, in this deluge, are no longer identifiable as such—cause tsunami-like waves of agitation. Marketing firms and governmental agencies make virtuosic displays of how such quakes in social networks can serve the interests of propaganda.
The most powerful effect of information overkill, perhaps, is that users feel they can lay hold of this flotsam at any time and, consequently, no longer bother committing anything to memory. Indeed, they fall deaf to temporal depth—to historical consciousness. How did Alexander Kluge put it? “The attack of the present on the rest of time.”
Since the start of the computer age—since 1970—money has no longer resided in (a) capital; rather, as a free-floating formation, it has become placeless. Once upon a time, the steel walls of Fort Knox, housing a great reserve of gold, fed the illusion that money is not an arbitrary sign but instead represents a concrete value. But now, the headless markets of world finance determine the value of any given currency. In the same way that our economy has become fluid, money has liquefied and become an electronic, digital sign. Indeed, money exemplifies the digital sign: it violates borders, oozes in, and penetrates every last pore of not just the economy but even the way we perceive the world. On the one hand, this development gives rise to hopes for exorbitant profits; on the other hand, the promise of added value yields the threat of inflation (xⁿ). Such instability—which finds expression in increasingly common stock-market and currency crises—is not an aberration to be blamed on the greed of particular individuals; instead, it is inherent in digital logic. Inasmuch as money, in keeping with the logic of x = xⁿ, operates at the speed of light, conflict necessarily arises between capital and labor: between the global, atopic sign and backward nation-states. As a result, the handmaidens of capital (as they used to be called) needn’t even be particularly malicious. They need only carry out the rationality inherent in digital logic—when decisions aren’t just left to algorithms for high-speed trading or risk minimization. Capital then flows to wherever the lowest tax burden holds and the greatest profits may be expected. The most recent financial crisis made plain the effects of this mechanism: money was (or is) no longer invested in the “real economy,” but in the financial sector. As such, money spawns more money, while the “real economy”—the relic of an age in decline—is sucked dry. Today, revenue in the financial economy—that is, the economy that operates with placeless signs—is roughly seventy times greater than revenue in the “real economy.”
If money, transformed into an electronic sign, is no longer underwritten by anything but itself, then viewing its function as a matter of measuring and storing value becomes problematic: the threat of inflation looms, and its harbingers arrive more and more often in the form of speculation bubbles. How can we preserve money’s scarcity function? The answer is as follows: in the digital world, anything and everything is infinitely multipliable—except for the consumer’s attention. It’s impossible to read two books or watch two movies at the same time. In this context, the attention paid by the user/consumer takes the place of natural scarcity—say, the scarcity of gold. Such transubstantiation has given rise to the attention economy. When so-and-so many people click on a YouTube video, it acquires a value. If, formerly, value was registered by an act of purchase, now it is registered by a “hit” or a click—some trace signaling that users have devoted attention to this particular object, site, or event. From here on out, value derives from the total number of clicks (xⁿ) generated. In the attention economy, we encounter a strange dialectic: money is not covered by any equivalent within the framework of the formula, but is possible only given a natural limit. Only because the consumer’s time is limited (in contrast to any and every digitized object) can the attention economy command any credibility. The analog (in other words, the human being, insofar as she or he is not x) underwrites the digital. The shift from a standard based on natural scarcity to one based on finite perception signals a far-reaching change. Value is no longer generated when something is wrested from nature; instead, it arises where, against the backdrop of superabundance, choosing one commodity or another counts as a significant act—as a display of belonging, lifestyle, narcissistic aggrandizement, or what-have-you. Once upon a time, commodities were material in nature. But today, opinions, perceptions, and emotions can be marketed—capitalism has taken root in psychic life. Capitalist rationality has not shifted so much as exploded. Wherever the economy runs on ratings, any and every object becomes a matter of shareholding. In this sense, all of today’s talk about “Me, Inc.” is hardly misguided: individuals stand exposed to permanent pressure from polls and quizzes—ratings rule. But at the same time, this means that we are facing a divided scheme of logic: on the one hand, we remain natural beings; on the other hand, we dissolve into digits that, in keeping with the logic of the formula, can be spread and multiplied at the speed of light.
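To make the scarcity argument explicit, here is a minimal sketch, not the author’s own formula: assume a finite attention budget A shared by all users, while digital objects can be copied without limit. The value registered by any one object i then depends not on its cost of production but on its share of that budget:

\[
\mathrm{value}(i) \;\approx\; A \cdot \frac{\mathrm{clicks}(i)}{\sum_{j} \mathrm{clicks}(j)}
\]

The numerator can be inflated at will (xⁿ), but the budget A cannot grow; in this sense, finite perception takes over the scarcity function that gold once performed.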
That said, the image created by this process possesses value only inasmuch as it is covered by a personality standard with the same function as the gold standard of old. Indeed, Marx anticipated this problematic division when he observed that only human beings create added value. Wherever the machine comes into play, the profit rate tends toward zero; added value is possible only insofar as a human being introduces a plus ultra, which the machine cannot provide. But for all that, it is not really enough to declare that value comes about when and where the individual ≠ x. Value results when a human being does something that a machine cannot do—it results from action eluding the logic of the formula (even though this same logic can help to convert it into a mass product). It is no accident, then, that the personality standard is not an abstract quantity, but instead shines forth in the face of celebrities: human beings offering performances that, while ultimately incomparable, are generally valued.
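The Marxian claim invoked here can be compressed into the classical formula for the rate of profit; the notation below is the standard one from Capital, not the author’s, and serves only as a sketch:

\[
r = \frac{s}{c + v}
\]

Here c stands for constant capital (machines and materials), v for variable capital (wages paid for living labor), and s for the surplus value that, on Marx’s account, only living labor creates. As automation drives v toward zero, s vanishes with it, and the rate of profit r tends toward zero; whatever added value remains must come from the plus ultra that no machine provides.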
It does little to help our understanding of contemporary capitalism to invoke the truism that capital is a voracious monster. The term new economy is much better suited to describing the change of the capitalist order: it signals that we’ve undergone a tectonic shift. But what, exactly, is this shift? For one, digitization has made it plain that the age of mass production is over. If unit labor costs for manufacturing further products tend toward zero, the formula x = xⁿ sounds like a farewell to the classical economy. Anything that exists in surfeit is difficult to sell. The very success of capitalism has killed it. It is not just that products have become worthless patterns of uniformity, but that all the structural elements of classical capitalism, from human labor to the means of production, have undergone the same process of devaluation and hollowing out. The patient lying in intensive care is being kept alive by artificial means: scrapping incentives, aid packages, debt relief, and financial injections. If, in metropolitan temples of finance, the triumphant gestures of self-proclaimed Masters of the Universe once built towers rising to the heavens, economic crisis has revealed that these edifices are in fact giant tombstones, whose only real purpose is to cover up a historical nadir. The very phrase real economy points to the gaping abyss behind the Potemkin village. In a decisive sense, capitalism has drifted into the virtual sphere—where everything that happens involves simulated labor and descriptions of products and services. Even when it’s still possible to exploit the rules of classical capitalism, profits tend toward zero. After all, perfect markets are entropic: the profit rate approaches nothing. That’s why the magic word of the postcapitalist world is “disruption.” Only causing disturbances in the market—be it by developing an entirely new service (e.g., cloud computing) or by digitizing an inherited business model (say, moving from taxis to Uber)—promises any gains.
Even though innovations and shocks of rationalization may elicit shouts of joy from portal operators, it’s worth noting that even portals (= markets) used by hundreds of millions of people do not operate at a profit. If firms without a product manage to be successful all the same—by tying a significant number of users to their services—this means that the user has become a product. Accordingly, Facebook bought WhatsApp for 19 billion dollars—even though the company wasn’t in the black and had no business model for monetization. In other words, Facebook handed over about forty-two dollars for each of WhatsApp’s 450 million users at the time. In fact, portal operators are interested not in the market—where assorted vendors engage in competition—but in achieving a monopoly. The network effect can be exploited profitably only when the provider enjoys users’ unrestricted attention. This involves a fundamental shift in the relationship between capital and labor: to harness the activities of followers (crowdsourcing), the provider must take the stage as a perfect administrator of the public interest, a kind of superuser. Even if he alone derives profit from the work that users provide, the social illusion must be preserved. As such, feudalistic thinking returns: following this logic, users serve as members of the provider’s retinue. Providers no longer have anything to do with industrial barons. Their motto is “Don’t be evil!” Their mental world is neo-feudalistic. Their religion? The gift.
If transformation is understood in terms of reproduction, one paradigm for the proliferation that the digital promises is agriculture. Inasmuch as it is concerned with increased yield above all, and since x = xn is a genetic formula, agriculture represents a kind of digital sphere avant la lettre. There used to be three cows in the barn, now it’s a thousand. Five pigs were wallowing in the sty, now there are ten thousand. Seven or eight hens pecking away multiplied into one hundred thousand.
Genetic technology suggests that life is nothing more than a process of reading and writing. When the code in DNA is recombined or changed by human intervention, it counts as a kind of authorship. Thus, it is only logical that such “intellectual property” be patented. Indeed, federal judges in the United States have declared the work of copyists who transcribe and alter the original text to be a form of invention (which does not even presume full understanding of genetic structure). This lack of philological precision means that since the late 1970s, patents have been granted for genetically modified animals and plants. With this tool, companies like Monsanto have managed to bring whole markets under their control. Strictly speaking, modified genetic matter is patented not to protect a specific product, but to gain mastery over its distribution. As such, we can say that biotechnology is trying to apply the mad proliferation promised by the Boolean formula—x = xⁿ—to the “real economy.” When genetic material is patented, the documents might concern an “invention,” but in fact the real point is to patent the Boolean formula itself.
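For readers who want the formula itself spelled out, here is a minimal restatement, on the assumption that x = xⁿ is meant in its Boolean sense, where x takes only the values 0 and 1:

\[
x \in \{0,1\} \;\Longrightarrow\; x \cdot x = x \;\Longrightarrow\; x^{n} = x \quad \text{for every } n \ge 1
\]

Read this way, the formula says that no amount of multiplication changes the value: the copy is indistinguishable from the original, and copying costs nothing, which is precisely the proliferation that such patents seek to capture.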
If trademarking forms of life represents the ultima ratio of modern society, it rests on a questionable equation of life and information. As such, it reflects the proof of the existence of God that comes with digital logic. If genetics has usurped the place of God, a brief look at the field’s history reveals that the claim to have decoded the whole genome is a marketing trick: affirming authorial rights in spite of the fact that science has taken the very idea to absurd limits—and for some time now. After all, the exceedingly complex processes involved in translating the genetic code into phenotypical articulations and the interactions with other organisms that this entails hardly admit scientific description, much less juridical evaluation. What would happen if jurisprudence not only awarded rights to genetic copy-pasters, but also made them answer for the damage caused by their monsters? What court in this world would allow God to stand accused for the faults in Creation?
Capitalism made the work ethic the basis of our self-understanding. Work is the ladder one climbs to reward and blessedness. That said, the history of capitalism also teaches that the height of the fall grows in proportion to the ascent. What’s more, humankind has created a companion and competitor whose abilities already surpass the understanding and potential of most people: the machine. Whenever it is euphemistically declared that the human being has been liberated, the messy truth still holds that people, once set free, often plunge into moral nullity. Stripped of a station, the individual by turns resembles a useless creature of luxury or mere tracks in the dirt. If one tries to join the system again, it often means facing up to the fact that one has no prospects: the machine’s working memory can recall the catalog of labors already performed, and it operates at a speed exceeding the human capacity for reaction. The formula x = xⁿ heralds a world where we interact with constructs without any humanity: no fatigue, boredom, or moods. A computer isn’t a colleague. It’s actually a foreign body—and precisely insofar as it knows how to incorporate human labor and intelligence. If Foxconn, the Taiwanese company that assembles iPhones for Apple in mainland China, has replaced its suicide-prone workforce with an army of robots, such measures follow from the logic of working memory, according to which the human being fundamentally amounts to a defective entity.
For ages, society made a point of equating human fulfillment with industria. Now, however, the bankruptcy of this virtuous image is plain. If we understand the logic of x = xⁿ as capitalism’s operating system, it is evident that added value arises only where the individual performs something surpassing the production level of machines. Since it’s impossible for a single person to compete with the speed of working memory, all that remains for him or her are spheres where the machine has not (yet) achieved domination. Of course, there are still niches that elude digital logic; all the same, there’s no doubt that these habitats (as sociotopes) are growing smaller and smaller. Even for somebody who—unlike most people—knows how to program the operating system of x = xⁿ, there is nowhere to hide. Once a task has been programmed, it lands in the museum of labor—and is devalued accordingly. What Joseph Schumpeter called creative destruction boils down to a split. In the realm of the formula, we are faced with asymmetry: the human being, in keeping with human nature, can think the equation, but she or he cannot comply with it physically. Though the spirit is willing, the flesh is weak.