Chapter Four
“Signs of Signs”
Watching the End of Modernity at the Cineplex
Reading for the Ending
Robert Bork and William Bennett—Mrs. Cheney, too—were of course railing against a culture that reached beyond the campuses. Their purpose was to reform (or debunk) the university and thus to redeem that larger culture. But the evidence suggests that Americans needed no prodding from tenured radicals as they moved in the 1980s and 1990s toward acceptance of equity between races, classes, genders, and sexual preferences, on their way to bemused tolerance of almost anything, including animal rights.
To be sure, education as such remained the object of cultural critique from self-conscious conservatives. Pat Robertson, for example, the televangelist and presidential contender—he participated forcefully in the Republican primaries and debates of 1988—claimed that the public school system in the United States was “attempting to do something that few states other than the Nazis and Soviets have attempted to do, namely, to take the children away from the parents and to educate them in a philosophy that is amoral, anti-Christian and humanistic and to show them a collectivist philosophy that will ultimately lead toward Marxism, socialism and a communistic type of ideology.” Jimmy Swaggart, another televangelist, was more succinct: “The greatest enemy of our children in this United States . . . is the public school system. It is education without God.”
Even so, the larger culture exhibited unmistakable signs of rapid, irreversible, and enormous change. The educators themselves, high and low, were being educated by that change. How, then, should we summarize that change—how should we gauge its symptoms?
One way is to import Samuel P. Huntington’s notion of a “clash of civilizations” as the characteristic divide of the late twentieth century. Huntington was the Harvard political scientist who made his political bones back in the 1960s by planning “forced draft urbanization” in Vietnam—if the peasants aren’t out there in the countryside helping the guerrillas (the Viet Cong) because you have removed them to existing cities or concentrated them in newly constructed “urban centers,” he surmised, you can then stage direct military confrontations between American soldiers and communist insurgents. In keeping with his policy-relevant duties in the aftermath of Vietnam, he suggested in a 1996 book that impending global conflicts would turn on cultural (read: religious) divisions rather than the older divisions of political economy, which had placed capitalism and socialism at the opposite extremes of diplomatic decisions and developmental options. The domestic analogue would be the so-called culture wars, which dispensed, for the most part, with arguments about economic arrangements and instead engaged the problems of moral values, civic virtues, and familial integrity—“cultural indicators,” as Bennett called them in a flurry of articles and books.
Another way to summarize the same great divide is to enlist Daniel Bell and to propose that the “cultural contradictions of capitalism,” as he calls them, reached their apogee in the early 1990s, when the so-called culture wars got formally declared. In the sequel to The Coming of Post-Industrial Society, Bell argued that the American social structure—the mundane routine of work, family, and daily life—“is largely bourgeois, moralizing, and cramped,” the arid domain of “traditional values,” but meanwhile the culture “is liberal, urban, cosmopolitan, trendy, fashionable, endorsing freer lifestyles, and permissive.” In other words, the bourgeois values inherited from the nineteenth century became problematic if not merely obsolete in the postindustrial rendition of twentieth-century consumer capitalism. To borrow the terms invented by Raymond Williams, the residual (bourgeois) culture was still committed to deferring gratification, saving for a rainy day, and producing character through hard work, whereas the dominant (capitalist?) culture was already animated by a market-driven hedonism—a “consumer culture”—in which such repressive commitments seemed quaint.
But notice that the Cultural Left ensconced in the universities was aligned with the bohemian values validated by a postindustrial consumer capitalism, whereas the New Right enfranchised by the churches and the think tanks was opposed to these same values. In this sense, once again, conservatism in the late twentieth century was not a blanket endorsement of what free markets make possible; like the radicalism of the same moment in American history, it was a protest against the heartless logic of the market forces created and enforced by consumer capitalism.
From either standpoint, however, Huntington’s or Bell’s, we witness a nineteenth-century version of self, family, and nation competing with a twentieth-century version. From either standpoint, bourgeois society struggles to survive against the global tentacles of postindustrial consumer capitalism. Perhaps the impending conclusion of this struggle, the impending extinction of bourgeois society, is what we mean—and is all we can mean—by the end of modernity. The modern world, the “era of the ego,” was, after all, created by bourgeois individuals eminently capable of deferring gratification.
But most Americans were not reading Huntington or Bell in the 1980s and 1990s. Nor were they using Judith Butler’s poststructuralist vocabulary to understand what was happening to them. How then did they experience and explain the end of modernity? The question can be asked in more specific ways. Were these academic theorists just making it up? Or were they making sense of new realities—of fundamental changes? Was there a colloquial, vernacular idiom in which these changes were anticipated, recorded, codified? To answer, let us revisit some hugely popular movies of the late twentieth century—let us see what they require us to experience and explain—and then, in the next chapter, turn to some equally popular cultural forms, TV and music.
Big Movies, Big Ideas
To begin with, let us have a look at The Matrix, Terminator II, and Nightmare on Elm Street, each a part of a movie “franchise” in which increasingly intricate—or ironic—sequels retold the same story from new angles. The preposterously complicated plot of the original Matrix (1999) is almost beside the point. But for those of you who haven’t seen it, here goes. In a postholocaust future that resembles the scorched earth of the Terminator movies, machines have taken over the world: technological hubris has finally put an end to progress. Human beings have been reduced to dynamos whose metabolism is converted into the energy the machines need to—what?—go about their evil business. These benighted human beings just think that they’re going to work on those familiar city streets (the “city” looks like Chicago). In fact, they’re only holograms projected by the machines to keep their energy source happy. As in the Nightmare franchise, appearance and reality are identical, at least in the beginning.
But an underground movement exists to wake these unwitting creatures up by bringing them out of the Matrix and teaching them how to fight the power on its own holographic terms. This movement recruits Neo, a young blank slate of a man—played of course by Keanu Reeves, a young blank slate of a man—onto whom the underground leader has projected his millennial ambitions. Neo (his screen name) turns out to be the “chosen one” after all; he quickly surpasses his teacher, the leader, and becomes a virtual martial artist who kicks virtual ass.
Neo learns to enter and disable the Matrix, thus revealing the awful reality beneath the normal, hopeful images that sustain the physical life of the dynamos down on the energy farm. The assumption here is that human beings can’t stay alive without hopes and dreams: if they knew that they were merely cogs in a vast energy-producing machine, they would surely die. By the same token, if they lived in a perfect world, they would know from their experience of Western religion—which insists that you can’t get to heaven until your body expires—that they were dead. In both settings, they would be unhappy, but their hopes for a brighter future that is somehow different from the abiding present would keep them alive; the evil designers of the Matrix introduce imperfection into the grid when they realize this simple truth of human nature.
In The Matrix, the artificial finally overpowers the real, or rather the natural; meanwhile, the expectation of an end to the illusions of the holographic world finally becomes a religious urge that displaces any residual pretense of science fiction. The monstrous agents of the Matrix are shape-shifting, indestructible machines that inhabit and impersonate human beings. But Neo has no oppositional force or effect against them unless he’s “embodied” as a slice of computer code and inserted into a holographic “reality”—until he’s “embodied” in recognizable human form as a part of a machine. And his triumph over these agents of dehumanization is a result of his belief in himself as the messiah (the “chosen one”), which requires first a consultation with “the Oracle”—a woman who, by the way, inhabits the Matrix, not the scene of resistance—and then the loss of his corporeal form. At any rate, the laws of gravity and mortality no longer apply to our hero by the end of this movie: he has become a godlike creature who can soar like Superman.
Terminator II (1991; the original was in 1984) has no less of an appetite for biblical gestures and sacrificial rites. But the cyborg from the future who helps save the world from the bad machines isn’t an immaterial, possibly immortal presence like Neo. He’s always embodied. And even though he’s mostly machine—his apparent humanity is only skin-deep—he’s a better father to young John Connor, the leader of the coming rebellion against “Skynet,” than anyone else in view. “This machine was the only thing that measured up,” his mother, Sarah, says while watching the son and the cyborg do manly, mechanical things under the hood of a car.
In the original installment of the franchise, Sarah Connor is impregnated by a soldier sent back from the postapocalyptic future to protect her from the cyborg intent upon killing her; somehow everybody knows that her offspring will someday lead the rebellion against the machines. In Terminator II, the stakes are even higher. Sarah wants to abort the apocalypse, and her son pitches in with the help of the same model of cyborg that, once upon a time, came after his mother. In doing so, she is of course relieving her son of his heroic duties in the dreaded future—in the absence of real fathers in the flesh, after all, mothers have to do what’s right.
The apocalypse is finally aborted in three strokes. The Connors and their protector destroy the computer chip from the original cyborg of Terminator I, which has fueled research and profits at the malevolent corporation that invented Skynet, the digital universe of knowledge to be captured by the bad machines on August 29, 1997. Then they defeat a new, more agile and flexible cyborg sent back to kill young John by dipping the thing in molten metal—the end of the movie is shot in what looks like a cross between a foundry and a steel plant, both throwbacks to an imaginary, industrial America where manly men worked hard and earned good pay (Freddy Krueger stalks his teenage victims in a strikingly similar dreamscape, as if what haunts them, too, is an irretrievable and yet unavoidable industrial past). Finally, the old, exhausted, even dismembered protector cyborg lowers himself into the same vat of molten metal that had just dispatched his robotic nemesis, thus destroying the only remaining computer chip that could restart the train of events that led to Skynet.
So the narrative alternatives on offer in Terminator II are both disturbing and familiar: Dads build machines—or just are machines—that incinerate the world, or they get out of the way of the Moms. Like the cowboys and outlaws and gunfighters of the old West, another imaginary landscape we know mainly from the movies, such men might be useful in clearing the way for civilization, but they probably shouldn’t stick around once the women and children arrive.
The endless sequels to the original Nightmare on Elm Street (1984) follow the trajectory of the Terminator franchise in one important respect—the indomitable villain of the 1980s evolves into a cuddly icon, almost a cult father figure, by the 1990s. But the magnificent slasher Freddy, who punctures every slacker’s pubescent dreams, always preferred the neighborhood of horror, where apocalypse is personal, not political: it may be happening right now, but it is happening to you, not to the world.
Here, too, however, the plot is almost irrelevant because it turns on one simple device. It works like this. The violence and horror of your worst nightmares are more real than your waking life; the dreamscapes of the most insane adolescent imagination are more consequential than the dreary world of high school dress codes and parental aplomb: welcome to Columbine. Freddy teaches us that the distinction between appearance and reality, the distinction that animates modern science—not to mention the modern novel—is not just worthless, it is dangerous. If you don’t fight him on his own postmodern terms, by entering his cartoonish space in time, you lose your life. If you remain skeptical, in the spirit of modern science or modern fiction, you lose your life.
The enablers of every atrocity in sight are the parents and the police (the heroine’s father, for example, is the chief of police), who are complacent, ignorant, and complicit, all at once. They killed the child molester Freddy years ago when he was freed on a legal technicality—or at least they thought they killed him—and so his revenge on their children seems almost symmetrical: the vigilantes in the neighborhood are now victims of their own extralegal justice. And their hapless inertia in the present doesn’t help the kids. In fact, their past crimes have disarmed their children. The boys on the scene aren’t much help either—they’re too horny or too sleepy to save anybody from Freddy’s blades, even when the girls explain what will happen if they don’t stand down, wake up, and get right with their bad dreams.
The Cultural Vicinity of The Matrix
So what is going on in the cultural vicinity of these hugely popular, truly horrific scenarios? At least the following. First, representations are reality, and vice versa. The world is a fable, a narrative machine, and that’s all it is. The directors of The Matrix make this cinematic provocation clear by placing a book in the opening sequences—a book by Jean Baudrillard, the French theorist who claimed a correlation between finance capital and the historical moment of “simulacra,” when everything is a copy of a copy (of a copy), not a representation of something more solid or fundamental. At this moment, the reproducibility of the work of art becomes constitutive of the work as art: nothing is unique, not even the artist, and not even you, the supposed author of your own life. Everything is a sign of a sign. The original Nightmare had already proved the same postmodern theorem with more gleeful ferocity and less intellectual pretension, but it performed the same filmic experiment and provided the same audience experience. Terminator II accomplishes something similar by demonstrating that the past is just as malleable as the future: again, the world is a fable waiting to be rewritten.
Second—this follows from Baudrillard’s correlation of finance capital and the historical moment of simulacra—the world is, or was, ruled by exchange value, monopoly capital, and their technological or bureaucratic offspring. The apocalypse as conceived by both The Matrix and Terminator II is a result of corporate-driven greed (in the latter, the war that arms the machines is fought over oil). An ideal zone of use value beyond the reach of the market, a place where authentic desires and authentic identities are at least conceivable, is the coast of utopia toward which these movies keep tacking. The movies themselves are of course commodities that could not exist without mass markets and mass distribution, but there is no hypocrisy or irony or contradiction lurking in this acknowledgment. Successful filmmakers understand and act on the anticapitalist sensibilities of their audiences—none better than Steven Spielberg. Even so, they know as well as we do that there’s no exit from the mall, only detours on the way.
Third, the boundary between the human and the mechanical, between sentient beings and inanimate objects, begins to seem arbitrary, accidental, inexplicable, and uncontrollable. Blade Runner (1982) and RoboCop (1987), perhaps the two best movies of the 1980s, are testaments to this perceived breakdown of borders, this confusion of categories: the good guys here are conscientious machines that are more human than their employers. That these heroes are victims of both corporate greed and street gangs does not change the fact that, like the tired old cyborg of Terminator II, their characters and missions were lifted directly from the Westerns of the 1930s and 1940s—they’re still figuring out what it means to be a man while they clean up Dodge City, but now they know that machines, not lawyers, might do the job better. Again, the artificial overpowers the natural and remakes the world. A fixed or stable reality that grounds all representation and falsifies mere appearance starts to look less detailed and to feel less palpable than the imagery through which we experience it; or rather the experience is the imagery. So the end of Nature, conceived by modern science as an external world of objects with its own laws of motion, is already at hand, already on display. The world has been turned inside out.
That is why the eviscerated world on view in these movies seems “posthistorical”: technological progress can no longer look like the horizon of expectation, not even for the citizens of the most advanced capitalist nation on the planet. Even here the machines are taking over, downsizing every sector, but particularly manufacturing, making good jobs in the factory or the foundry—or for that matter in the back offices—a thing of the past. When the machines do everything, the prospect of getting a better job than your father (if you have one) becomes unlikely, and the prospect of human civilization looks no better than bleak. Put it another way. If the future of Man doesn’t look so good because the difference between sentient beings and inanimate objects has become arbitrary, accidental, inexplicable, and uncontrollable, the future of men looks even worse.
Fourth, the self, the family, and perhaps the nation are at risk in a world ruled by simulacra—that is, where you can’t specify the difference between appearance and reality, between machines and men, or when you realize that everything, maybe even your own self, is a sign of a sign. We have already noticed that John Connor’s adoptive father is a cyborg; and we’ve noticed that the parents in the original Nightmare are a big part of the problem our pubescent heroine faces.
We should also notice that only two of the small band of heroes that recruits Neo to the cause have been born outside the Matrix—you can tell because they don’t have metal inserts in their necks and arms—but there’s no explanation of who Mom and Pop are, except that, like the leader, they’re African American. This is a family? We must assume so, because these two are designated as “brothers.” Meanwhile the others are trying to figure out where they begin and the computer code ends (we in the audience are as well, especially when the traitor decides he wants to go back into the Matrix and eat virtual steak). Their creaky old craft—it, too, looks like a remnant of industrial America—is named Nebuchadnezzar after an ancient king of Babylon who had conquered Judaea in accordance with a cranky God’s wishes, but the key to their survival is “Zion,” the mainframe that unites the resistance.
This naming of the thing that keeps them together is significant because it is shorthand for a nation that is imminent but not extant—it’s an idea whose time has not yet come, as in the “promised land.” The question it raises is, how are we to imagine a productive relation between these three categories (self, family, nation) now that we have put them at risk, that is, in motion, in cyberspace, where the weakened condition of a fixed, external reality loosens all ties?
Generic Panic
So the end of modernity was not the intellectual property of academics isolated in their ivory tower, lacking any connections to the “real world.” It was deeply felt and widely perceived in the popular culture organized by film (and by TV and music, of course, which we’ll get to later). One way to measure the breadth of this feeling, this perception, is to notice how it informed really bad movies as well as really good ones and how it reanimated—in the most literal sense—the politics of cartoons. Or, to put it in the terms proposed by Carol Clover, the brilliant analyst of horror films, one way to measure the widespread panic induced by the end of modernity is to watch how the thematics and sensibilities of truly awful movies entered and altered the mainstream.
Let’s begin with the panic.
Many historians and critics have pointed to the profound sense of an ending that permeated American film of the 1970s, 1980s, and 1990s. But it was not just the American century that was waning in Oscar-winning movies like The Deer Hunter (1978), which dramatized the military defeat of the United States in Vietnam as a crushing blow to American manhood. The fin-de-siècle feeling built into the approach of a new millennium was compounded and amplified by realistic reports—and hysterical fears—of pervasive criminality, random yet universal violence, maybe even ineradicable evil; by the decline of patriarchy, which accompanied the decomposition of the traditional nuclear family and the deindustrialization of the American economy; by the rise of the new “postfeminist” woman whose bodily integrity, moral capacity, and sexual autonomy were validated by the Supreme Court in the Roe v. Wade decision of 1973, then contested by the emergence of the Religious Right; by corporate malfeasance and government corruption—incessant scandal, public and private—from Watergate to Gary Hart on toward Iran-Contra and the dangerous liaisons of the Clinton years; by damning revelations of the uses to which American power had been put during and after the Cold War from Iran to Chile to Nicaragua, where revolutions in the 1970s were designed to discredit and disarm the Great Satan, the Whited Sepulchre based in Washington, D.C.; and by the public, determined, sometimes flamboyant display of homosexual affection and solidarity in the name of gay rights, a movement both complicated and magnified in the 1980s by the eruption of a deadly new sexually transmitted disease, HIV/AIDS.
When everything—law and order, manhood, fatherhood, womanhood, family, heterosexuality, even national honor—is ending, the apocalypse is now. At any rate that is the feeling that permeates the atlas of emotion etched by American culture in the late twentieth century. To illustrate this feeling, let us take a look at what happens generically in American film from the late 1970s to the late 1990s.
Probably the most important trend is the ascendance of the horror genre, in all its weird permutations (slasher, possession, occult). It remained a lowbrow, B-movie genre from the early 1930s into the 1970s, but then, with the rapid expansion of the Halloween (1978) and Friday the 13th (1980) franchises in the 1980s, it became the stuff of blockbuster box office. As Mark Edmundson and others have noted, when The Silence of the Lambs (1991), a tasteful, muted, sublimated—almost stuffy—slasher film, won the Oscar for Best Picture, horror had become the mainstream of American film. It had meanwhile reshaped every other genre, even Westerns, for example, Clint Eastwood’s High Plains Drifter (1973).
Another important trend is an integral part of the ascendance of the horror genre. Where once female protagonists were hapless victims of violence unless they could rely on their fathers, husbands, and brothers—or the law—to protect them from the slashers, psychopaths, and rapists, they now take the law into their own hands and exact a new kind of revenge on a world of pervasive criminality coded as male. Here the thematic movement “from the bottom up,” from truly awful to pretty good movies, is unmistakable. A terrifically bad movie called I Spit on Your Grave (1978) first installs the female victim of rape in the role of single-minded avenger, for example, and it thereafter presides, in spirit, over more polished, upscale films like The Silence of the Lambs.
Yet another important trend in late-twentieth-century movies is the hypothesis that the family as such is dysfunctional, perhaps even destructive of social order and individual sanity. As Robin Wood has argued, the horror genre is the laboratory in which this indecent hypothesis has been tested most scientifically, from The Texas Chain Saw Massacre (1974) to The Omen (1976) and Poltergeist (1982), all movies about families permeated or penetrated by unspeakable evil—families confused by the modern liberal distinction between private and public spheres. But the return of the repressed gangster, begun by The Godfather cycle in the 1970s, magnified in the 1983 remake of Scarface (the original appeared in 1932), and completed by The Sopranos on cable TV in the late 1990s, also demonstrated, in the most graphic terms, that strict devotion to family makes a man violent, paranoid, and finally unable to fulfill his obligations to loved ones.
If all you inhabit or care for is your family, both these genres keep telling us, you are the most dangerous man alive. At the very least you’ll forget your loyalties to a larger community, contracting your commitments until they go no further than the boundary of your own home; at that point, you will have destroyed your family and broken the rules that regulate life out there where everybody else lives. But how do you situate yourself in relation to a larger community—to the state, the nation—in the absence of this middle term, the family? It was an urgent political question in late-twentieth-century America, as the decomposition of the traditional nuclear family accelerated, and it was raised most pointedly on screen, by a culture industry supposedly out of touch with “traditional values.”
A fourth important trend in the movies of the late twentieth century is an obsession with the ambiguities and the consequences of crime. Film noir of the 1940s and 1950s was predicated on such ambiguities and consequences, of course, but the sensibility of that moment seems to have become a directorial norm by the 1980s. The difference between the good guys and the bad guys is at first difficult to discern in the battle between the criminals and the deputies staged by Arthur Penn in Bonnie and Clyde (1967), in part because it is clear from the outset that our heroes are deranged. It gets more and more difficult in the 1970s and 1980s, when Clint Eastwood’s Dirty Harry franchise makes the detective less likable than his collars; when drug dealers, pimps, and prostitutes become lovable characters in “blaxploitation” movies (Sweet Sweetback [1971], Shaft [1971], Superfly [1972]); when gangsters become the unscrupulous yet dutiful bearers of the American Dream (The Godfather [1972]); when Custer’s Last Stand becomes a monument to imperial idiocy (Little Big Man [1970]), even as the Indians become the personification of authentic America (Dances with Wolves [1990]); when the origin of civic renewal is a crime that appears as both domestic violence and foreign policy—it begins as incest and ends as the colonization of what was once exterior to the city fathers’ domain (Chinatown [1974]); and when the assassination of a president becomes comparable to the “secret murder at the heart of American history” (JFK [1991]: this is the district attorney talking to the jury!).
That not-so-secret murder is of course the American Dream itself—the dream that allows you to become father of yourself, to cast off all the traditions and obligations accumulated in the “Old World,” to treat the past as mere baggage. If you are father to yourself, you don’t have a father except yourself: you don’t have a past to observe or honor or, more importantly, to learn from. But when you’re on your own in this fundamental sense, as Americans like to be, you lean toward radical visions of the future and radical resolutions of problems inherited from the past. As D. H. Lawrence noted in his studies of classic American literature almost a hundred years ago, the masterless are an unruly horror.
And when you know that every cop is a criminal—and all the sinners saints—sympathy for the devil becomes your only option as a viewer of movies. The lawful and the unlawful intersect in startling ways in this social and cultural space. So do the natural and the supernatural, as witness Quentin Tarantino’s easy transition from Reservoir Dogs (1992) and Pulp Fiction (1994)—movies about the redemption of the most callous criminals—to the vampire flick he wrote for director Robert Rodriguez, From Dusk Till Dawn (1996), a movie that mixes so many genres it seems as contrived as a cocktail invented in SoHo. Witness as well the epidemic of celestial messengers, angry demons, impossible conspiracies, and talented witches on TV after Tony Kushner, an avowed Marxist, won the 1993 Pulitzer Prize for his two-part Broadway play Angels in America. Buffy the Vampire Slayer was waiting in the wings, stage left. The X-Files entered earlier, stage right.
One more important trend, which tracks the other four quite closely, is the remarkable increase in spectacular violence done to heroes, victims, and villains alike. The analogy with video games is not very useful on this score, however, because the recipient of excruciating violence in the movies of the late twentieth century is typically a female who then exacts revenge (I Spit on Your Grave, Ms. 45 [1981]) or a male who revels in the physical torture he’s “taking like a man,” presumably because this debilitating experience equips him with the moral authority he will later need to vanquish the enemy without ceremony or regret. The Rocky (1976) and the Rambo (1982) franchises sponsored by Sylvester Stallone are the founding fathers of the latter movement, in which masochism finally becomes unmistakably male.
The Lethal Weapon (1987) franchise animated by Mel Gibson’s jittery impersonation of Norman Mailer’s “White Negro”—Gibson’s cop character has to teach his African American partner (Danny Glover) how to live in the moment, how to be existential if not suicidal—is the parallel film universe in which guys get crazy because they have to, because the world has excluded them from the theater of good wars and good jobs, where boys once learned how to be men. Fight Club (1999) is the final solution to this fear of male irrelevance and the apogee of male masochism at the movies. In its moral equivalent of war, men keep trying to mutilate themselves, but we know it’s okay because they use their bare hands: until the ugly and inexplicable ending, they’re purposeful artisans, not mindless machine herds.
Experience and Explanation at the Cineplex
Let us work backward in this list of filmic trends of the late twentieth century to see if we can make historical sense of them, to see if they have anything in common. The increase in spectacular violence at the movies has of course been explained as a result of the recent decline in the median age of the audience—adolescents, it is said, have always experienced the onset of their pubescence and then their reluctant graduation to adulthood in the unholy images of dismemberment. More scenes of carnage, more rivers of blood are what these hormone-fueled maniacs need and what Hollywood gladly delivers. It is an argument that works pretty well until you realize that adults still buy more tickets than the teenage crowd and that the violence on view increased exponentially in every genre toward the end of the twentieth century, to begin with in Westerns and war movies—for example, The Wild Bunch (1969), Apocalypse Now (1979), Platoon (1986), and Saving Private Ryan (1998)—where teenagers did not tread unless accompanied by their parents.
The better arguments are offered by film theorists who suggest that the extreme fury inflicted on the human body in the movies since the 1970s should be understood in terms of a general unsettlement of subjectivity—of selfhood—and who suggest that by the late 1980s, the signature of this unsettlement had become male masochism. In The Philosophy of Horror, a groundbreaking book of 1990, Noel Carroll suggests that the ever more elaborate violence visited upon the characters of his favored genre constitutes an “iconography of personal vulnerability.” Horror as such, he insists, is “founded on the disturbance of cultural norms.” The late-twentieth-century festival of violence in movies is, then, a visual depiction, a pictorial externalization, of the anxieties necessarily attached to the end of modernity, when “an overwhelming sense of instability seizes the imagination in such a way that everything appears at risk or up for grabs.” But the crucial cultural norm in question is the father of himself—the modern individual, the American Adam.
That is why Carroll correlates the “death of ‘Man’” postulated by postmodern theory with the “demotion of the person” expressed by the extraordinary violence of a recent horror film—the popular, colloquial, vernacular version of academic elocution can be seen at the Cineplex, he suggests, long before (or after) you are forced to read Foucault and Derrida by your demented professors. Carroll summarizes his argument as follows: “What is passing, attended by feelings of anxiety, is the social myth of the ‘American’ individualist, which, in the case of horror, is enacted in spectacles of indignity, [and is] directed at the body.” What is passing, right before our very eyes in the artificial night of the local theater, is that remnant of the nineteenth century, the bourgeois proprietor of himself. It is a violent business, this cinematic execution of our former self, and it can never be finished. No wonder we want to prolong the agony on screen.
What is also “passing” in the torrent of violence that floods every genre in the late twentieth century is manhood as it was conceived in the “era of the ego,” circa 1600 to 1900, as it was then embalmed in the canonical novels and the literary criticism of the 1920s—Ernest Hemingway and Lewis Mumford come to mind—and as it was reenacted in movies, mainly Westerns, of the 1930s, 1940s, and 1950s. The strong, silent types who inhabited that imaginary American space west of everything give way, by the 1980s and 1990s, to male leads who are anything but. All they want is to talk about their psychological afflictions, as if we—the audience—can cure them. Tony Soprano is the culmination of this cinematic species. And it is no accident that the backstory informing every episode is Tony’s search for meaning in a world turned inside out by race and gender (“Woke up this morning, the blues [that is, the blacks] moved in our town,” as the song goes over the opening credits). For it is here, in the world of therapy and thus the language of psychoanalysis, that the problem of male masochism at the movies becomes visible and, in the most old-fashioned sense, remarkable.
Kaja Silverman and Carol Clover are among the accomplished film theorists who have deployed the language of psychoanalysis to interpret the systematic abuse and abjection of males in late-twentieth-century movies (by then, a film theorist who did not trade in the currency of psychoanalysis was an anomaly, something like a chaperone at a bachelor party; Noel Carroll resisted the urge and found a voice by falling back on the Marxoid rhythms of Fredric Jameson). Like their counterparts—David Savran and Barbara Creed are probably the most prominent among them—both Silverman and Clover rely on two famous essays of 1924 by the founding father, Sigmund Freud, in which masochism is defined as the psychological space that permits, maybe even demands, male experimentation with an imaginary femininity.
In all the clinical/case studies Freud cites, it is men who are being masochistic, but the passivity that allows their penetration, laceration, and so forth, is coded as female. “In the case of the girl what was originally a masochistic (passive) situation is transformed into a sadistic one by means of repression, and its sexual quality is almost effaced,” he declares. “In the case of the boy,” on the other hand, “the situation remains masochistic.” For he “evades his homosexuality by repressing and remodeling his unconscious phantasy [of being beaten, penetrated, by his father]; and the remarkable thing about his later conscious phantasy is that it has for its content a feminine attitude without a homosexual object-choice.” In these psychoanalytical terms, masochism on screen looks and feels like men trying to be women—men trying to identify as women—but without cross-dressing and without coming out of a closet to renounce heterosexuality. Again, it is the psychological space in which an imaginary femininity becomes actionable.
At any rate, it is the cultural space in which the mobility—the increasing instability—of masculinity can be experienced. Clover has shown that the predominantly male audience for crude horror films like I Spit on Your Grave is not indulging its sadistic fantasies by identifying with the rapists, as pious mainstream critics would have it; instead, that male audience is placing its hopes and fears in the resilient character of the Last Girl Standing, the young woman who ignores the law because she has to, the gentle female who comes of age by killing the slashers and the psychopaths. Violence is the cinematic medium in which this transference, this out-of-body experience, gets enacted. Violence is the cinematic medium in which male subjectivity gets tested, in other words, and is finally found wanting except as a form of emotional solidarity with the female character who outlasts her tormentors.
So male masochism at the movies looks and feels bad—it is hard to watch, particularly when Mel Gibson’s William Wallace is getting tortured in Braveheart (1995), when Sylvester Stallone’s Rocky is being beaten to a pulp, or when Brad Pitt is begging for more punishment in Fight Club—but it accomplishes something important. Its violent sensorium lets us experience the end of modernity as the dissolution of male subjectivity and the realignment of the relation between what we took for granted as feminine and masculine (keeping in mind that this realignment may well prove to be regressive and destructive). Freud was on to something, then, when he suggested that by way of male masochism, “morality becomes sexualized once more [and] the Oedipus complex is revived.” Translation: the identities we discovered as we detached ourselves from a primal, physical, emotional connection to our parent(s)—as we worked through the Oedipus complex—are perturbed and perplexed, perhaps even reconstructed, by the horrific experience of masochistic violence at the movies. These identities now become fungible, divisible, negotiable, recyclable—in a word, scary.
The criminal element of late-twentieth-century film is of course related to the increase of spectacular violence done to heroes, victims, and villains alike. The American fascination with crime runs deep because rapid change is normal in this part of the world—here “crisis becomes the rule,” as a famous philosopher, John Dewey, once put it. His admirer Kenneth Burke explained that “any incipient trend will first be felt as crime by reason of its conflict with established values.” It’s hard to distinguish between criminals and heroes because they both break the rules and point us beyond the status quo, and they’re always urging us to expect more (the heroes of sports are heralds of this type, from Bill Russell and Mickey Mantle to Michael Jordan). They’re like the revolutionaries of the college textbooks—Max Weber’s “charismatic” leaders—but they’re more rooted in everyday routine, in what we call “practice.” They’re more visible, more approachable, more likable than, say, Lenin, Mao, Castro, or Che, because they don’t want to change the world, they want to change the rules.
So crime, like violence, is as American as apple pie. But the late-twentieth-century filmic rendition of criminal behavior departs from its antecedents, especially in depicting gangsters. Where such felons were once represented as deviations from a norm of manhood and domesticity, and thus as a threat to the peace of the city and the integrity of the nation—think of Paul Muni, Jimmy Cagney, and Edward G. Robinson in their founding roles of the early 1930s—by the 1970s and 1980s, Gangsters R Us. By the 1990s, accordingly, crime as committed at the movies became the origin and insignia of everything American. This is a useful notion, mind you. It forces us to acknowledge that the Western Hemisphere was not a “new world” when Europeans invaded America and that the idea of original sin still has explanatory adequacy. At any rate, it lets us know that our country was not born free: it is no exception to the rules of history, no matter who—whether Marx or Freud or Weber—wrote them up as the laws of motion that regulate modernity.
It also lets us know that the private space of contemporary home and family is not exempt from the public atrocities of the political past. Poltergeist (1982) and the remake of The Haunting (1999) suggest, for example, that the extermination of Indians on the frontier of American civilization and the exploitation of child labor during the Industrial Revolution cannot be forgotten, not even by the most ignorant individuals and the most insulated, intimate, social organisms of the present—they suggest that the return of the repressed is always already underway from within the family. The political is personal in the late-twentieth-century United States. Otherwise it is almost invisible.
The “traditional” family was, after all, breaking down in the late twentieth century, and crime rates were, in fact, climbing in the 1960s and after: for many observers, such as George Gilder and Charles Murray, the relation between these two phenomena was clearly and simply cause and effect. And the extrusion of women from the home—from familial roles and obligations—looked to them like the proximate cause of the cause. Feminism signified “sexual suicide,” in Gilder’s hysterical phrase. It takes on new meanings in the context of awful yet rousing movies like I Spit on Your Grave and Ms. 45. Here the female protagonists, young professional women who are victims of brutal and repeated rape, decide to kill the perpetrators instead of waiting on the law made by their fathers, husbands, or brothers. In doing so, they broaden the scope of their vengeance to include male supremacy itself. They’re carefully killing off the idea that men should have control of women’s bodies. So the figurative mayhem on view is not suicide but homicide—it is patricide, the adjournment of the law of the father, the inevitable result of giving women the weapons they need to protect themselves against violent men.
And that brings us back to where we began, to the ascendance of the horror genre, wherein violence, crime, and family are typically represented, and sometimes rearranged, by the suffering of women at the hands of men. What links our five filmic trends of the late twentieth century, in this sense, is gender trouble—that is, “the disturbance of cultural norms” which derives from the social (and thus political) problem of the new, “postfeminist” woman and which redraws the perceived relations, the effective boundaries, between males and females.
Cartoon Politics
A similar disturbance, a similar problem, meanwhile recast the politics of cartoons. In the next chapter, we’ll get to the excremental visions of Beavis and Butt-Head and South Park and to the liberal tilt of The Simpsons. There we will note, once again, that even in these small-screen suburbs of Hollywood, popular culture in the United States kept moving to the left after Jimmy Carter, after Ronald Reagan, after George H. W. Bush, and, yes, after Newt Gingrich, leaving the learned scribes and earnest scolds from the New Right to wring their hands until the second coming of Bush in 2000. For now, to conclude this chapter, we’ll look at The Little Mermaid (1989), Beauty and the Beast (1991), and Toy Story (1995) on the big screen—for these were the movies that woke Disney Studios from its long artistic slumber and reminded us of the possibilities residing in the comic abstractions of animation.
Females are never very far from the leading roles in Disney’s feature-length productions: the studio’s memorable hits before the 1960s and 1970s, when it began cranking out musical extravaganzas starring dogs, cats, and mice, were Snow White and the Seven Dwarfs (1937) and Sleeping Beauty (1959), both driven by their female leads’ search for true love in a world dominated by incestuous envy or anonymous violence (and what you remember about Bambi [1942] is the death of the mother). The character of Cruella de Vil in One Hundred and One Dalmatians (1961) affirms this precedent, but she has no human content or counterpart—she’s all cartoon. Something new happens in The Little Mermaid, a movie produced with the computer technology and political sensibility promoted at Disney by Jeffrey Katzenberg, a young visionary who, like Matt Groening over at FOX TV, wanted to make comics that mattered.
The questions they kept asking in the 1990s were, What are the functions of families if females have been freed from an exclusive preoccupation with domestic roles (as wives and mothers)? What can we do with a past we’ve outgrown? What are parents good for, anyway? And where’s Dad? In Disney films, however, as in fairy tales, these familiar antecedents are never given by the past—they’re not just there because they happened in the past, because they somehow preceded and produced us. Instead, they get created by fantastic narratives that erase them, reinstate them, and reinvent them at inappropriate times, according to the bizarre but rule-bound logic of contradiction that regulates cartoons (to wit, every category or distinction sustained by common sense is now subject to violation by the principle of plasticity). They’re always there, in short, only not so that you would notice. They’re mostly missing, like absent causes or like real parents who can’t seem to show up when it counts but who shape your lives anyway.
Ariel, the little mermaid, is the “postfeminist” woman par excellence: she has Promethean curiosity and energy and ambition. “What’s a fire,” she asks, “and why does it, what’s the word, burn?” She wants to escape from the fluid, formless, watery world where the everyday objects that signify modern human civilization—things like eating utensils—have no name and no purpose and where fire, the origin of civilization as such, is impossible. She wants the forbidden knowledge available only to the people who walk around on land and use wooden ships to skim her ocean. Above all, she wants freedom from her father’s stifling supervision—she wants to be the father of herself.
Ariel introduces her agenda of desires by singing about all the things she’s found in the shipwrecks she explores. She calls this stuff treasures, wonders, gadgets, gizmos, whose-its, what’s-its, thingamabobs. She doesn’t understand what any of it is for—how or why it gets used by people “up there”—but she’s sure that all of it somehow fits together in a way of life that is very different from hers. When Ariel calls herself “the girl who has everything” (she’s using a phrase common in the 1980s), she’s really complaining that by themselves, as simple objects, her things don’t have any significance. So when she goes on to sing “I want more,” she doesn’t mean more stuff. What she wants is the way of life in which the stuff makes sense.
That’s why she doesn’t sing about the stuff in the rest of the song. Ariel sings instead about seeing, dancing, running, jumping, strolling—about walking upright as the prerequisite of “wandering free.” What would I give, she asks herself, “if I could live out of these waters?” But it’s obvious that she can wander more freely in the water than anybody her age can on dry land. So she must have some ideas about freedom that involve more than moving about in space. She makes those ideas clear in the next part of the song. “Betcha on land,” she sings, “they understand—bet they don’t reprimand their daughters.” She’s singing about how her father has scolded her, has tried to keep her in her proper place, under water, where nothing seems to matter. Then she identifies herself with all the other ambitious girls who don’t want to stay in that same old place where men get to make all the decisions: “Bright young women, sick of swimmin’, ready to stand.” Ready, that is, to stand up to their fathers (and husbands and brothers), to face them as equals on their own two feet.
Ariel’s agenda is eventually contained by her devotion to the human being she falls in love with (yes, he’s a handsome prince) and learns to want to marry. All the dangers of her proposed departure from her undersea world—all the sexual tensions and fears this departure from her family, this deviation from her father’s wishes, might create—are eventually tamed by the idea of marriage. When Ariel is paired off with the prince, the possibilities of her ascent become familiar: her goal becomes the start of a new family that marriage symbolizes.
But this use of the familiar let Katzenberg and his animators take chances with their mermaid story without offending or troubling their audience, which of course included millions of unsophisticated children as well as educated adults. The movie did contest the common sense of its time—it did contest the related notions that women should not want what men do and that women are so naturally different from men that equality between them is unimaginable. Even so, it ends up suggesting that marriage is the way to answer all the questions raised by those “bright young women” who were, then as now, “sick of swimmin’” under the direction of their fathers (and husbands and brothers). And that is a way of suggesting that the important questions of our time can be answered from within the family.
You may have noticed that Ariel seems to have no mother—no one to help her father, Triton, the king of the oceans, decide what’s best for her. But look closer. The only character in the movie who is the king’s equal is Ursula the Sea Witch (she’s the devil who gives the girl legs in contractual exchange for her voice, thus putting her on land in range of the prince, but also casting the little mermaid—that would be the teenager—as the Doctor Faustus of the late twentieth century). Ursula competes directly with Triton for control of Ariel’s future, as if she were the mermaid’s mother. And in her big number, she shows this “daughter” how to get the man she wants. She’s the closest thing to a mother Ariel has.
Now Ursula seems simply evil because what she’s really after is the power of the king—she used to live in Triton’s castle, she tells us in passing, and she wants to move back in. So it’s clear that, once upon a time, just like the devil himself, she challenged the king’s powers from within his own home, his own castle, and got kicked out as a result. No matter where we look in the movie, then, it seems that females, both mothers and daughters, have to leave the castle if they’re going to stand up to the king as his equal. This departure is either cause or effect of conflict with the father who rules that castle, but it happens to both Ariel and Ursula.
Yet we’re supposed to be able to tell the difference between them, apart from the obvious differences of age, size, shape—and species (the devil is a huge octopus). We’re supposed to know that Ariel is good and Ursula is evil. So we have to ask, what does Ursula want that makes her evil? What makes her rebellion so awful? And why is Ariel’s rebellion acceptable in the end, when Triton relents and uses his powers to give his daughter legs and a human husband? Both stand up to the father. But one is killed by Ariel’s husband-to-be; the other is rewarded with entry into the enlightened world of men on earth. One is driven out of the castle; the other chooses to leave. One tries to challenge the inherited (“normal”) relation between father and mother; the other wants to re-create this relation in a new family. One breaks the law of the father by defying him; the other upholds it by doing the same thing. How so?
If we read this movie as a comic retelling of the ancient Oedipus cycle, we can see that the law of the father still works only if Ursula’s rebellion gets punished by death. Only if the sea witch is removed can Ariel pair off with the prince by calling on her father’s great powers. But remember that Ursula is in effect the mermaid’s mother. And remember that she is killed by the prince—the future son-in-law of the father, the king—with the bowsprit of the shipwreck Ariel found at the very beginning of the movie. Everybody is cooperating, in the end, to aim this shaft, this phallic device, at the belly of the beast; everybody is cooperating to kill the mother, to remove her from the scene she’s tried to steal. By doing so, they preserve the law of the father and let Ariel ascend to earth.
That is what must happen if we believe that the important questions of our time can be answered from within the family—something’s got to give if we’re confined to this small social space. The Little Mermaid suggests, accordingly, that either the law of the father or the mother herself will give way. All those “bright young women” still “sick of swimmin’” need new ground to stand on, but if that ground can be found only within the family, father or mother must go. No wonder single-parent households were the fastest-growing demographic of the late twentieth century.
Where’s Dad?
Beauty and the Beast was the politically correct sequel to The Little Mermaid. It solidified Disney’s reputation as the place to be if you wanted to experiment with computerized animation—Pixar soon lined up with everybody else—but the movie was a boring love letter to the narcissistic nerds, the high school geeks, who wrote it as their revenge on the athletes who, once upon a time, got the girl. Here the female lead rejects the dumb jock and chooses the hairy intellectual with the big library, all in the name of her hapless father, the absentminded professorial type who invents useless, even dangerous machines. He’s the mad scientist without portfolio, without purpose—he’s the father figure who, like Homer Simpson, remains benign because he has no power, no purchase on the world. His inventions explode like fireworks, not like bombs.
So the law of the father still works in this monotonous movie, disabling the cartoon principle of plasticity by presenting as given, as frozen, the categories that were at issue, in question, in motion, in the larger culture. The first Pixar contribution to the new Disney mix of the 1990s, Toy Story, won’t stand for that kind of authentic abstention and its cynical attendant, resignation from the world as it exists. Like The Simpsons, it forces us to play with the idea of the father as an absent cause of everything that’s both funny and sad at the end of the twentieth century.
In any event, it makes us want to ask, where’s Dad? Mom is in plain sight from the standpoint of all the toys who open the movie by planning a parallel birthday party for her son, their beloved Andy. Dad is invisible from beginning to end unless we read their story, the toys’ story, as a treatise on how to reinsert fathers into the family romance of our time. His absence is the premise of the movie, in other words, but the writers don’t leave it at that; they urge you to believe that Toys R Us by suggesting that Woody and Buzz, the cowboy and the astronaut who compete for the son’s undivided attention, are the father we all need.
Woody (Tom Hanks) is the distant echo of the Western hero who resists progress and fears the future because he knows they mean the eclipse of his independence, his unique and self-evident position in a fixed moral universe. He recalls Sebastian the Crab from The Little Mermaid, another unwitting opponent of development, who wants Ariel to stay down under “de water” because up there on dry land, they work too hard: “Out in de sun, dey slave away.” But Woody also stands at the end of a long line of white-collar middle managers who crowded TV screens in the 1950s and 1960s, pretending that “father knows best” (think of the staff meeting that Woody holds before Andy’s birthday party). So he represents a hybrid of male images drawn from film and TV, the characteristic cultural media of the twentieth-century United States.
Buzz (Tim Allen) is Thorstein Veblen’s engineer, Vince Lombardi’s shoulder-padded poet, and Tom Wolfe’s astronaut rolled into one—he’s the ex-jock with the right stuff who hates the past and believes almost religiously in the technological armature of progress. He wants to go to the moon and thinks he already has the equipment he requires. He represents “the machine,” the favorite metaphor of American writers in the twentieth century, but he reminds us of cheesy Saturday-morning cartoons, too, and not just because he recognizes himself in a TV commercial.
Together, and only together, Woody and Buzz are the missing father the son needs. At the outset, Woody is simply afraid of the future—he might be displaced by a new toy—and Buzz is just contemptuous of the past, when space travel was science fiction. By itself, neither side of the modern American male can make the family whole again by restoring the father to his proper place at its head. And each, by himself, is vulnerable to Sid, the nasty boy next door who captures them.
This boy is another mad scientist, a young Frankenstein, but he means business—he’s perverted modern technology to the point of absurdity if not criminality by turning beloved toys into ugly and malevolent machines. Like the folks who are exploring the human genome, he takes things apart and reassembles them as if he’s trying to invent brand-new species (think of the disfigured baby doll’s head glued to Erector Set spider legs). He scares everybody, including his sister, but mainly he scares the parents in the audience, who still like to think they’re the origin of the next generation.
For this kid’s goal is to downsize Dad by dividing him up and dispersing his parts among many new contraptions. Sid represents the hard side of Microsoft—he’s bound for glory in Silicon Valley by depriving dads of the good jobs they had when baby boomers were dutiful sons. Like the audience, and like dads everywhere, Woody and Buzz experience this threat as dismemberment, perhaps even as impending castration. No kidding. Susan Faludi’s best-selling book of 1999 quotes Don Motta, a middle-aged middle manager laid off by McDonnell Douglas in 1992, as follows: “There is no way you can feel like a man. You can’t. It’s the fact that I’m not capable of supporting my family. . . . I’ll be very frank with you. I feel I’ve been castrated.”
Only by joining forces, only by coming together and trading on each other’s strengths, can Woody and Buzz—the two sides of American manhood (thus fatherhood)—defeat the perversions of technology and familial relations that Sid represents. Only then can they resurrect the mangled toy soldiers, the heroic fathers from World War II, who were buried by Sid in the sandbox—buried, that is, in the post-Vietnam memories of their postwar children. Only then can they reappear, courtesy of old-fashioned rocket science, at their son’s side, just as his family reaches the horizon of no return.
So Toy Story was a great deal more complex and frightening than another parable of “more work for mother” or another evasion of the question, where’s Dad? This movie spoke directly and productively to the sense of loss—the loss of jobs, the decomposition of “traditional” families, the fear of the future—that permeated American culture back then. The sequel did not.
In Toy Story 2, Woody still fears the future in which Andy will outgrow his favorite fatherly toy. He fears it so much that he has decided to retire to a museum of TV memorabilia, the cartoon equivalent of a nursing home, where he won’t have to beg for attention. Buzz understands Woody’s fear, but he’s willing to let go, even to be put on the shelf where childish things languish and get forgotten until the garage sale—he’s willing to let this son grow up and decide for himself what’s worth keeping and caring for. Before Woody can leave for the museum, however, he gets kidnapped by a crazed antique dealer. Buzz then leads the other toys on a successful rescue mission that ends with Woody’s return to the familial fold, just in time for Andy’s return from summer camp. The newfangled father apparently knows best, even though his purpose is merely to retrieve the wanderer, to keep Woody where he belongs, in the bosom of his—or is it the?—family.
The original Toy Story taught us that a usable model of fatherhood and family—of selfhood as such—can’t be imagined by committing ourselves to the past or the future, as if these are the terms of an either/or choice. It taught us that we exile ourselves from the present and from our families when we try to stay in the past along with Woody or when we try to flee the past along with Buzz. In the end, these two understood that each had to adopt an attitude toward history which allows for both previous truth and novel fact—an attitude that lets each of them change the other and so makes their cooperative effort, their unison, greater than the sum of its parts. They taught us that the point is to keep the conversation going between the past and the future, not to choose between them. The point is to live forward but understand backward.
So the sequel is less hopeful, more elegiac (more funereal, more nostalgic) than the original. It spoke directly, but not very productively, to the sense of an ending that our strange millennial moment afforded us. For it suggests that you can’t teach an old toy new attitudes. All you can do is get older and watch as your kids grow up, move out, move on. And pretty soon you’ll be on the shelf like your own father, downsized not by technology but by time—by the growing gaps of memory you share with the people who say history is bunk.
Such weary resignation probably seems realistic as the baby boomers start thinking about how to finance their retirements. Even so, Toy Story 2 inadvertently advertises another and more useful truth. It goes like this. Only when our attitudes toward history become fixed do the past and the future look the same, that is, impervious to change. When those attitudes are unhinged by watching the end of modernity, when we understand that the world is a fable waiting for our editorial revisions, we know better.
You might now ask, Ye gods, how and why did cartoons become so serious and so central in American culture at the end of the twentieth century? Good question. We start with a tentative answer in the next chapter.