And their children wept, & built
Tombs in the desolate places,
And form’d laws of prudence, and call’d them
The eternal laws of God.
—WILLIAM BLAKE, THE FIRST BOOK OF URIZEN (1794)1
The fate of our times is characterized by intellectualization and rationalization and, above all, by the disenchantment of the world.
—MAX WEBER, SCIENCE AS A VOCATION (1917)2
“Don’t write about romanticism.”
In the course of working on this book, I heard that advice—if not always in those words—from my wife, a brilliant writer and editor; my actual editor (who cleaved the original manuscript in half); and any number of trusted friends. Sometimes they just said it with their eyes. There’s something about the word “romanticism” that elicits both an eye roll of the soul and a reflexive crouch of the mind. What is romanticism again? I had to learn about that in college, but I never really got it.
The academics haven’t helped. It’s a term that has been stretched and twisted like taffy. No word, not even “fascism,” has proved more difficult to define, particularly among the scholars who study it.3 For starters, every country has its own kind of romanticism, because romanticism expresses itself in the language and culture in which it appears. Thus, romanticism is often found everywhere and nowhere depending on whom you listen to. Adding to the problem, romanticism often manifests itself most acutely as a rebellion against definitions, distinctions, and classifications. The romantic is usually quick to say, “Don’t label me, man.”
Even Isaiah Berlin began his seminal 1965 A. W. Mellon lectures on the “Roots of Romanticism” by declaring he wasn’t going to make that blunder. “I might be expected to begin, or to attempt to begin, with some kind of definition of Romanticism, or at least some generalisation, in order to make clear what it is that I mean by it. I do not propose to walk into that particular trap.”4 In 1923, the philosopher Arthur O. Lovejoy gave a lecture to the annual meeting of the Modern Language Association of America titled “On the Discrimination of Romanticisms.” After getting some laughs by recounting all of the different people who’d been declared “the father of Romanticism”—Plato, Saint Paul, Francis Bacon, Immanuel Kant, etc.—he then listed all of the different kinds of romanticism and declared: “Any attempt at a general appraisal even of a single chronologically determinate Romanticism—still more, of ‘Romanticism’ as a whole—is a fatuity.”5
In his acclaimed book Bobos in Paradise: The New Upper Class and How They Got There, David Brooks argues that modern American culture is suffused with romanticism. But perhaps because he was better at taking editorial advice, he rejected the term in favor of “bohemianism.” “Strictly speaking, bohemianism is only the social manifestation of the Romantic spirit,” Brooks writes. “But for clarity’s sake, and because the word Romanticism has been stretched in so many directions, in this book I mostly use the word bohemian to refer to both the spirit and the manners and mores it produces.”6
Given the success that book had, maybe I should have listened to those warning me off romanticism. But I’m sticking with it because, as a historical and social phenomenon, it’s the better term. No one talks about “Bohemian nationalism.” So herewith I will offer some notes toward a definition.
As we saw in the early discussions of Rousseau, romanticism is often described as a rebellion against reason, and it often is just that. Others describe it as the primacy of emotions and feelings. That gets closer to it, I think. The emphasis on feelings certainly explains why romanticism’s first and most powerful expression was artistic.
The Age of Reason was a revolution not just of the mind but of the whole of society. And for many, it felt like an invasion. In Germany and large swaths of Europe, the invasion was literal. Romanticism, Joseph Schumpeter noted, “arose almost immediately as a part of the general reaction against the rationalism of the eighteenth century that set in after the revolutionary and Napoleonic Wars.”7 The poets—and painters, and novelists—were the frontline soldiers in the human soul’s great counteroffensive against the Enlightenment. William Blake, the great romantic poet, loathed everything that John Locke and Isaac Newton bequeathed to the world. When Blake proclaimed, “A Robin Red breast in a Cage / Puts all Heaven in a Rage,”8 the cage he had in mind was the Enlightenment. It wasn’t quite the humans versus the machines; it was more like the champions of the soul against the promoters of machine thinking.
I happen to think the romantics were on to something. Even sticking to my promise to keep God out of this book, I believe that there is more to life than what can be charted, graphed, or otherwise mathematized. And these things are important. What we imbue with significance is in fact significant; there is meaning to the meaning we impose on things.
It is my argument in this chapter that the romantic era in our culture—including that slice we call popular culture—never ended. It has ebbed and flowed, I suppose, but it has always dominated much of what we call popular culture. And today it essentially defines our shared culture. In fact, “shared culture” might be a better term for popular culture because popular culture is seen as for the masses, when the truth is that almost everyone, rich and poor, goes to the movies, watches at least some popular TV shows, and is at least somewhat familiar with pop music. Class differences explain far less than age differences when talking about different tastes in popular entertainment.
The main reason I think the romantic era never ended is that it wasn’t an era; it was a reaction. Until the scientific revolution and the Enlightenment, humans did not, as a rule, divide the world into secular and religious, the personal and political, reason and superstition. Science was magic and magic was science for most of our time on this earth. The ancient Roman priests who studied the entrails of birds to predict the future weren’t asking anyone to make a leap of faith. This was sound science. And it was magic. And it was religion. Scholars have debated the strange and often beautiful relationship between magic and science for a very long time. Was medieval magic rational? Anti-rational? Non-rational? Were the alchemists just the first chemists?9
The Protestant Reformation, the Enlightenment, and the scientific revolution are widely credited with giving birth to a more secular and less superstitious world. All of that is true enough. But the process was more complicated than it may seem. The image at the heart of the Enlightenment is light—the idea being that science and reason banished the shadows of ignorance. But that metaphor is misleading when it comes to the human mind. The Enlightenment was really more like the Unbundling. In the medieval and primitive mind, science, magic, religion, superstition, and reason were all more or less fused together. The pieces started to separate with the Protestant Reformation, when magic and religion started to be considered unrelated things. Then science and religion drifted apart from each other. Then, after the Enlightenment, traditional religion and politics split off.
The pre-modern mind was like an enormous iceberg entering new waters, and as it got closer to the modern period, huge chunks calved off. But here is the important point: The chunks don’t melt away, at least not completely. They just become smaller icebergs. The scientific revolution did not get rid of religion. The Age of Reason did not banish superstition. One need only look around to see that religion and superstition (not the same thing) endure. The triumph of reason didn’t mean turning us all into Vulcans from Star Trek, slaves to cold logic. (Indeed, even Vulcans have emotions; they just work very hard to keep them in check.) Rather, after the Enlightenment, priority was increasingly given to reason in law, public arguments, and most institutions, at least most of the time.
That’s what is so funny about the at times visceral hatred for Isaac Newton among the early romantics (second only to their hatred for John Locke). To Blake and Coleridge and other romantics, Newton’s physics had demystified the cosmos and our place in it, thus paving the way for a mechanistic and soulless universe. But for all of Newton’s monumental contributions to science, he was, at heart, a mystic. He was more interested in alchemy than in gravity. He saw himself as an explorer of the occult, determined to rediscover the lost magical secrets of the ancients. As John Maynard Keynes once put it, Newton wasn’t the first scientist; he was the last of the magicians. Thomas Edison and Alexander Graham Bell regularly attended séances. Edison tried to invent a phone to talk to ghosts. Guglielmo Marconi, the inventor of the radio, wanted to do the same thing using radio waves.
Today, neuroscientists and psychologists fill their days documenting the ways the human mind acts irrationally.10 Our animal brains have programs and subroutines designed to keep us alive, not to determine the truth. The ability to reason is an important tool for survival. But is it more important than fear? Anger? Loyalty? Remember, for primitive man, survival was a collective enterprise, and the cognitive tools we developed were far more varied and complicated than simply rational. For instance, a superstition or taboo about the importance of cleanliness may be passed from one generation to another for a thousand years without a single shaman, priest, or parent making any reference to microbes or germs. But the group that follows prohibitions against eating unclean food has an evolutionary advantage all the same. Similarly, groups that adhere to notions of retributive justice—both internally for traitors and externally for strangers—will be more likely to pass on their genes. More broadly, groups that have a coherent vision of group meaning—religious, political, social, etc.—will likely be more successful at cooperating, and cooperation is the core evolutionary adaptation of humanity.
The Enlightenment didn’t erase these apps from our brains. They’re running all the time, generating emotional, instinctual responses to events and ideas that we sometimes recognize as part of our baser natures, and that we sometimes mistake for some higher ideal. Romanticism’s emphasis on emotion and the irrational, the significance of that which cannot be seen or explained through science but can be felt intuitively, is the tribal mind’s way of fighting its way back to the center of our lives. My argument here is that popular culture gives us the clearest window into the romantic dimension that we all live in. To demonstrate this, I will focus on some of the classic hallmarks of romanticism. But it needs to be emphasized that popular culture isn’t romantic because of the lasting influence of some romantic writers and poets (though some influence is surely there). Popular culture is romantic because Enlightenment-based society naturally invites a romantic reaction; we crave the unity of meaning that we have lost, and we yearn for the enchantment purged from everyday life.
I believe I could make this case almost entirely by looking at rock and roll—and rock and roll alone. My claim is not so much that there are elements of romanticism in rock and roll, but that rock and roll is romanticism. In fact, I suspect one could say something similar of popular music generally, so as to include hip hop and country music as well. What are the key themes of rock and roll and these other genres? Any list would include: defy authority and throw off the chains of “the Man,” true love, damn the consequences, nostalgia for an imagined better past, the superiority of youth, contempt for selling out, alienation, the superiority of authenticity, paganism and pantheism, and, like an umbrella over it all, the supremacy of personal feelings above all else.
Rock and roll, from its most commercial forms to its most authentic, fancies itself as outside “the system.” It claims a higher or truer authority rooted in feeling, and, like the poets of earlier generations, it defies the tyranny of the slide rule and the calculator. Its more grandiose champions put rock on a par with all of the higher forces, like a Titan or a god in eternal battle with the tyrannical deities of the system. “Christianity will go,” John Lennon assured us. “It will vanish and shrink…We’re more popular than Jesus now—I don’t know which will go first, rock and roll or Christianity.”11 “You see,” U2 guitarist The Edge tells us, “Rock and roll isn’t a career or hobby—it’s a life force. It’s something very essential.”12
Robert Pattison, in his The Triumph of Vulgarity: Rock Music in the Mirror of Romanticism, argues that rock and roll is vulgar both in the classic sense—“vulgar” is derived from the Latin vulgus meaning crowd or the common people—and in the snobbish sense of being crude. Because rock is democratic, it appeals to us all and makes no pretension to higher culture or higher ideals. It speaks to the gut, the pantheistic primitive in all of us. By now the reader should have a good sense of what the primitive is, but pantheism requires a bit of explication. Pantheism, from the Greek pan (all) and theos (god), is the belief that all of reality is divine, and that God (or gods) suffuses us and everything around us. Earth is heaven and heaven is earth.
Is there any art form that is more successful at re-enchanting the world, to invert Max Weber’s phrase, than music? Who hasn’t had that feeling of being elevated or transported from the mundane world by music? “Music expresses that which cannot be put into words and that which cannot remain silent,” observed Victor Hugo.13
Put on your headphones and walk down a busy city street; the world seems set to music. Or watch people listening to music on their iPods as they walk past you, giving new literalism to the line widely—and falsely—attributed to Nietzsche: “And those who were seen dancing were thought to be insane by those who could not hear the music.”14 This feeling is the conceit behind countless movies that use music to transport us to that feeling of isolated oneness with the world around us. (The recent film Baby Driver is a good example of the genre.)
Think about how humans first enjoyed music. A primitive band sits around the fire pounding drums and singing its treasured folk songs and chants. No doubt this practice was entertaining, but it was also a way to commune with the gods or with fellow members of the troop, or a means to honor revered ancestors or mourn fallen warriors, or a method of warding off evil spirits—or some combination of the above. It was democratic and personal, divine and worldly, all at once.
Rock and roll is the primitive’s drumbeat hooked up to killer amps. It ties together meanings we are taught to keep separate; it ratifies the instincts we are instructed to keep at bay. It tells us, in the words of Jethro Tull, “Let’s bungle in the jungle,” because “that’s all right by me.”
Nowhere is the romantic mixture of pantheism, primitivism, and the primacy of feelings more evident than in rock’s appeal to inner authority and authenticity. Despite the fact that we may be surrounded by thousands of fellow fans dancing or head banging in syncopated unison, rock still tells us that we must move to the beat of our own drummer. For Hegel, romanticism could be summarized as “absolute inwardness.” This idea that the artist is a slave solely to his own irrational muse is no doubt ancient, but its obvious modern echoes can be found in the romantic philosophy of Friedrich Nietzsche’s early writings. “Nietzsche,” writes music historian Martha Bayles, “echoed the robust Romantic view that the only worthwhile use of reason in art is to confront, wrestle with, and finally incorporate the irrational.”15
It is no accident that drugs and rock and roll are so linked in the popular imagination. Both promise to take us out of the realm of daily concerns and rational priorities. They are both forms of escapism from the workaday, from the shackles of the here and now. The ancients celebrated wine, women, and song. Today the mantra is “sex, drugs, and rock and roll”—and so long as we remain human, it will be ever thus.
Nor is it a coincidence that rock appeals most directly to adolescents. Your teenage years are the time when the civilized order and your inner primitive are most at war. It is when glandular desires are most powerful and our faculties of reason are the most susceptible to all manner of seductions. Everyone who has experienced teenage angst—which is to say everyone who has lived long enough to legally buy booze and cigarettes—knows full well that the romantic revolution and the Enlightenment wage war in every teenage heart.
It is also no coincidence that the post–World War II era of peace, prosperity, and conformity largely created the idea of the teenager. The buttoned-down 1950s gave adolescents something to rebel against. Similarly, the peace and prosperity of the post–Cold War world created the adolescent forty-year-old. The comfort of prosperity leads, in Schumpeterian fashion, to a cultural backlash against the established order and bourgeois values.
Now let us move on from rock and look instead to popular culture more generally. For it is my contention that the same romantic impulses that define rock and roll also define much of the rest of our culture as well.
The simplest place to start: monsters. Primitive man believed in all manner of monsters, broadly defined. The Dungeons & Dragons geek in me wants to distinguish between monsters qua monsters and, say, dragons, spirits, orcs, and the like. But we’ll stick with the broadest understanding of monster: unnatural creatures that terrify us. The primitive mind creates monsters to personify fears, and fear is one of the greatest defense mechanisms in the state of nature. The growl we hear from the back of a cave causes the mind to race to the worst possible scenario because the credo “Better safe than sorry” is written into nearly every animal’s DNA. Young children have to be taught that there are no monsters lurking under their beds because humans are born with an innate sense of their vulnerability. In adults, fear of monsters endures, usually to manifest our anxiety about the unknown. The frontiers of medieval mapmakers’ knowledge were marked off with the words “Here there be dragons.”
From the late Medieval Period to the present day, we still worry that if we press the boundaries of the known or if we trespass on God’s authority, we will find—or create—monsters. Part of the romantic indictment of science and reason is hubris, which not only means arrogance but in the original Greek means prideful defiance of the gods and their plan. How dare we try to tame nature or disenchant the world? In this way, monsters serve as instruments of a revenge fantasy against “the system.” The monster that tears it all down is the ultimate radical.
At the end of Drew Goddard and Joss Whedon’s screenplay of The Cabin in the Woods, two world-weary millennials, fed up with the hypocrisy of the world, willingly allow ancient evil Titans called “the Old Ones” to destroy the world rather than sacrifice their own lives. When told that five billion people will die if they don’t kill themselves before the sun comes up in eight minutes, one of the youths replies, “Maybe that’s the way it should be. If you gotta kill all my friends to survive, maybe it’s time for a change.” In other words, for the millennial who plays by his own rules, planetary genocide is a just rebuke for not getting one’s way.
The most influential monster story of all time is, of course, Frankenstein. Mary Shelley based Frankenstein’s monster on the ancient Jewish legend of the Golem, a creature brought to life from inanimate material by magic. Dr. Frankenstein was a man of science, not a magician, but the morals of the story are largely the same: hubris, playing God, mucking about with nature, finding the divine spark in worldly things. It is not difficult to understand why Shelley’s story of the mad scientist dabbling with mighty forces beyond his ken captivated the imaginations of millions of readers in the early 1800s.
Rousseau’s romantic indictment of progress mirrors the biblical story of man’s fall. Defying the natural law—i.e., God’s commandment—Adam and Eve partake of the forbidden fruit of knowledge and, ever since, man has been living in sin, cast out from Eden. In Rousseau’s version, when man embraced property and the division of labor, he left the happy life of the noble savage who lives in harmony with nature. The story of Frankenstein’s monster follows the same pattern.
The original title of Shelley’s story is largely forgotten: Frankenstein; or, The Modern Prometheus. In Greek mythology, the Titan Prometheus creates man out of clay and water—just like the Golem. Prometheus also gives man fire, against the will of Zeus, who famously punishes him for it by chaining him to a rock where an eagle would eat his liver every day, only to have it regenerate overnight.
The similarities between Dr. Frankenstein and Prometheus are too obvious to explore further—which is why Shelley invokes the Titan in her title. But it is interesting to note that electricity, then still a magical and miraculous phenomenon, played much the same role that fire did for the ancient mind. Indeed, it was Immanuel Kant who dubbed Benjamin Franklin the modern Prometheus for his experiments with electricity.16 For had not Franklin plucked the symbol of godly power—lightning—and yoked it to the reins of science?
Was this not a great act of hubris?
When news of Franklin’s experiments in the New World reached the Old World, the shock was akin to the news of the first detonation of an atomic bomb.17
Since we’re on the topic, the atom bomb also unleashed its own wave of monster stories. The etymology of “monster” is relevant: warning, portent, demonstrate, show. Consider Godzilla, King of Monsters (and one of the most enduring pop culture icons in the world, not just in Japan). The first Godzilla movie was released in 1954, less than a decade after the bombings of Hiroshima and Nagasaki, and just two years after the formal end of the American occupation of Japan—amidst an enormous controversy over a Japanese fishing boat damaged during American nuclear testing at Bikini Atoll.
But most important, Godzilla was also a kind of Frankenstein’s monster, created by the invisible, seemingly magical force of atomic radiation. Deformities and mutations—precisely the kinds of conditions that gave birth to the original meaning of the word “monster”—were a very real consequence and omnipresent concern after the bombings. The fear that the atomic age would unleash unimaginable horrors was common around the world, but understandably acute in Japan.
“Godzilla has long mirrored public thinking in Japan,” writes Chieko Tsuneoka in the Wall Street Journal. “The monster’s origin as the mutant product of nuclear tests reflected Japan’s trauma from the atomic bombings of World War II and its anxieties over postwar American H-bomb testing in the Pacific. In the 1970s, as Japan choked with industrial pollution, Godzilla fought the Smog Monster. In the early 1990s, when U.S.-Japanese trade frictions intensified, Godzilla fought King Ghidorah, a three-headed monster sent by a foreign-looking group called the Futurians to prevent Japan from developing into an economic superpower.”18
The most recent Godzilla movie, 2016’s Shin Godzilla, captures growing nationalist sentiments in Japan as the country is agonizing over whether to remilitarize in the face of Chinese and Russian aggression and perceived American unreliability. The idea is wrenching because Japan turned its back on nationalist militarism after World War II in favor of a market democracy and pacifism. (Indeed, the first Godzilla movie sixty-two years earlier was a pacifist allegory.) The twin fears facing Japan are, on the one hand, that bringing back nationalism and militarism will awaken old demons, and, on the other hand, that it may be necessary to do so for Japan to survive. In Shin Godzilla, the beast returns to his original role as villain, and the heroes are the politicians and the military, figures traditionally portrayed as well-intentioned but hapless fodder for monster feet. Once again they play to type, but eventually they rise to the challenge, finding the will to defeat the beast (for now—there will always be sequels).
William Tsutsui, author of Godzilla on My Mind: Fifty Years of the King of Monsters (2004), writes that “Shin Godzilla leaves no doubt that the greatest threat to Japan comes not from without but from within, from a geriatric, fossilized government bureaucracy unable to act decisively or to stand up resolutely to foreign pressure. Indeed, this movie could easily have been titled ‘Godzilla vs. the Establishment,’ as Tokyo’s smothering quicksand of cabinet meetings, political infighting, and interagency logjams make Mothra, Rodan, and King Ghidorah seem like remarkably tame adversaries.”19
Shin in Japanese can mean new, divine, or true, but the filmmakers refused to disclose which meaning they had in mind—which surely suggests they intended all three.20
Of course, Godzilla and Frankenstein barely scratch the surface of the monsters that populate popular culture and the warnings they carry. Indeed, there’s a whole subliterature on what the villain-monsters in sci-fi movies really represent.
For example, one of my favorite horror films—and I don’t like many—is The Exorcist, in part because it is not really a horror movie at all. The Exorcist tells the story of an innocent young girl who is possessed and befouled by a demon. It is a brilliant piece of theological and psychological commentary.
In the early scenes, when the scientists and doctors are trying to figure out what is wrong with the little girl, Regan, we are made to feel the limits of modern, sterile technology. Later, when the priests try to expel the demon from the child, we are asked to grapple with the very existence of evil. The younger priest, Father Damien Karras—a psychologist who prior to these events had largely lost his faith to secularism—asks: Why this girl? Father Merrin responds, “I think the point is to make us despair. To see ourselves as animal and ugly. To reject the possibility that God could love us.”21 Though I prefer Father Merrin’s fuller response in the original novel:
Yet I think the demon’s target is not the possessed; it is us…the observers…every person in this house. And I think—I think the point is to make us despair; to reject our own humanity, Damien: to see ourselves as ultimately bestial; as ultimately vile and putrescent; without dignity; ugly; unworthy. And there lies the heart of it, perhaps: in unworthiness. For I think belief in God is not a matter of reason at all; I think it is finally a matter of love; of accepting the possibility that God could love us…22
There are many themes to The Exorcist: the limits of reason and technology, the power of faith, the reality of evil, and the very deliberate glorification of religion in both the book and the screenplay, both written by William Peter Blatty. The monster that takes over Regan is a warning against the dangers of nihilism, secularism, and even capitalism, wrongly pursued.
While obviously a supernatural thriller, the film can better be understood as part of, and a response to, a dark turn in American movies in the early 1970s. Whatever idealism there had been in the 1960s had largely turned to dross, as the costs of free love and sticking it to the Man mounted up. The loss of faith in politics, along with international and domestic turmoil, contributed to a very bleak—if well-executed—time in American cinema. The Exorcist came out the same year as American Graffiti, Mean Streets, Serpico, The Last Detail, Soylent Green, Walking Tall, and Magnum Force, the sequel to the first Dirty Harry film.23 The following year’s top movies included: Death Wish, Chinatown, The Godfather: Part II, The Parallax View, and Lenny. The year after that included One Flew Over the Cuckoo’s Nest and The Stepford Wives. What was the one thing these movies, including The Exorcist, all had in common? The idea that contemporary life was out of balance and off-kilter, inauthentic, or oppressive, and that elites and the system itself were broken, corrupt, or inadequate to the task of making life right.
The idea that the world—this world—is…wrong, off balance, fake, fraudulent, unnatural, has been one of the dominant themes of art since the Enlightenment. It is what motivated the romantic poets to fight back against what they saw as the mechanization of natural life. It is the central conceit of the Matrix films, in which a technologically oppressive system parasitically feeds off humanity. It can also be found in a host of baby boomer midlife crisis and middle-age anxiety movies and TV shows, such as Lawrence Kasdan’s 1991 Grand Canyon. In what was supposed to be “The Big Chill of the 1990s” (that was how it was marketed), the film focuses on how a diverse set of characters are lost in the chaos of modern American life, lacking any shared experiences or mutual empathy and desperately looking for a sense of control or meaning. As Danny Glover says in one famous scene, “Man, the world ain’t supposed to work like this.”
In fairness, the same trope can be found in every generation. So-called Gen-X films were also full of generational angst. Winona Ryder and Ethan Hawke spent much of the 1990s making films dedicated to the proposition that the system is a succubus draining out the authenticity of life and the souls of youth. Here’s Hawke in Reality Bites:
There’s no point to any of this. It’s all just a…a random lottery of meaningless tragedy and a series of near escapes. So I take pleasure in the details. You know…a Quarter-Pounder with cheese, those are good, the sky about ten minutes before it starts to rain, the moment where your laughter becomes a cackle…and I, I sit back and I smoke my Camel Straights and I ride my own melt.
In another scene, Hawke literally answers the phone: “Hello, you’ve reached the winter of our discontent.”24
Interestingly, the 1990s may have been the high-water mark of this genre. One can speculate as to why. The 1980s had been a time when conformity and prosperity were on the rise. The end of the Cold War and the subsequent triumphalism of Western democratic capitalism appalled many of the artistic souls who had to endure it.
The execrable film Pleasantville was an extended metaphor on the horror of conformity. And so was the even more execrable 1999 film American Beauty. “I feel like I’ve been in a coma for the past twenty years. And I’m just now waking up,” declares Kevin Spacey playing Lester Burnham, an updated version of the man in the gray flannel suit desperate to break the chains of conventional morality and selling out to the consumer culture. He commences a “self-improvement” regimen that includes all of the staples: sexual obsessions, pot smoking, flipping off the Man.
“Janie, today I quit my job. And then I told my boss to go fuck himself, and then I blackmailed him for almost $60,000. Pass the asparagus,” Lester tells his daughter at the dinner table.25
In Point Break (1991) a small band—a tribe, if you will—of surfers living off the grid dedicate themselves to ripping off the system by donning masks of dead presidents and robbing banks. But Bodhi (Patrick Swayze) explains: “This was never about money for us. It was about us against the system. That system that kills the human spirit. We stand for something. To those dead souls inching along the freeways in their metal coffins…we show them that the human spirit is still alive.”26 The terrible remake of the film was even more ham-fisted in its treatment of these themes.
The romantic spirit is often at its least subtle when expressing its hatred for capitalism and the market. As John Steinbeck writes in The Grapes of Wrath, “The bank is something else than men. It happens that every man in a bank hates what the bank does, and yet the bank does it. The bank is something more than men, I tell you. It’s the monster. Men made it, but they can’t control it.”27
The brilliant TV series Mr. Robot offers the most recent exposition of these themes. Set in contemporary New York, the show follows the mentally unstable computer programmer savant Elliot, hauntingly portrayed by Rami Malek. Elliot seems to live half in dream, consumed with an ongoing dialogue with the ghost (for want of a better word) of his dead father, a Rousseauian rebel determined to take down the system. As Elliot explains to a therapist:
Oh, I don’t know. Is it that we collectively thought Steve Jobs was a great man, even when we knew he made billions off the backs of children? Or maybe it’s that it feels like all our heroes are counterfeit? The world itself’s just one big hoax. Spamming each other with our running commentary of bullshit, masquerading as insight, our social media faking as intimacy. Or is it that we voted for this? Not with our rigged elections, but with our things, our property, our money. I’m not saying anything new. We all know why we do this, not because Hunger Games books make us happy, but because we wanna be sedated. Because it’s painful not to pretend, because we’re cowards. Fuck society.28
(We later learn that he didn’t, in fact, say anything at all to his therapist. It was just another inner monologue narrated by his authentic self.)
Elliot and his tribe of hackers, “F-Society,” set out to tear down E Corp, which quickly becomes “Evil Corp.” That might sound didactic, even propagandistic, but the show’s creator, Sam Esmail, deftly avoids such pitfalls. The series is almost an allegorical tale of Rousseauians, who want to “save the world” by restoring it to something more human and natural, and the capitalistic Nietzscheans, who run the system through force of will and nihilistic disregard for morality. What both factions have in common is the romantic conviction that the only legitimate source of truth is found within oneself. In the first season, Tyrell Wellick, a brilliant corporate climber, explains what he felt after killing someone:
Two days ago I strangled a woman to death just with my hands. That’s a strange sensation. Something so tremendous done by something so simple. The first ten seconds were uncomfortable, a feeling of limbo, but then your muscles tense, and she struggles and fights, but it almost disappears in the background along with everything else in the world. At that moment it’s just you and absolute power, nothing else. That moment stayed with me. I thought I’d feel guilty for being a murderer, but I don’t. I feel wonder.29
One of the most remarkable aspects of the series is how it exemplifies the romantic spirit in the technological age. Romanticism always speaks in the language of its time. That’s partly why we think the romantic era ended: The language changed with the times.
But the two most egregious films of this neo-romantic genre must be Fight Club and Dead Poets Society. In Fight Club, which opened the same month as American Beauty, Edward Norton plays a young professional driven to madness by the cage of modern capitalism. The film is a riot of Rousseauian and Nietzschean vignettes masquerading as primal yawps. The premise of Fight Club is that young men are orphans of the system, forgotten, exploited, and downtrodden. They were born free but live in chains. “Like so many others,” Norton explains, “I had become a slave to the Ikea nesting instinct.”
The only way to rediscover the freedom and meaning bleached out of them by the system is to rekindle their primal, tribal inner flames and band together, first to fight each other and then to fight the system itself.
Norton’s alter ego, Tyler Durden, explains:
Man, I see in fight club the strongest and smartest men who’ve ever lived. I see all this potential, and I see squandering. God damn it, an entire generation pumping gas, waiting tables; slaves with white collars. Advertising has us chasing cars and clothes, working jobs we hate so we can buy shit we don’t need. We’re the middle children of history, man. No purpose or place. We have no Great War. No Great Depression. Our Great War’s a spiritual war…our Great Depression is our lives. We’ve all been raised on television to believe that one day we’d all be millionaires, and movie gods, and rock stars. But we won’t. And we’re slowly learning that fact. And we’re very, very pissed off.
Another member of Fight Club advises, “Reject the basic assumptions of civilization, especially the importance of material possessions.”30
And then there’s Dead Poets Society, in which a group of young men at a straitlaced boarding school seek to break the chains of convention and defy the authority of the oppressive system they are destined to inherit. How do they do this? By embracing the work of the romantic poets (and a hodgepodge of later transcendentalists) who had declared war on the Enlightenment two centuries earlier!
The film begins with the students learning poetry by formula, plotting its “perfection along the horizontal of a graph” and its “importance” on the vertical in order to find the “measure of its greatness.” It is almost as if the system has found its soulless answer to poet August Wilhelm Schlegel’s question “What can a poem prove?” John Keating, the students’ new charismatic teacher, played by Robin Williams, tells the boys to rip the introduction from their poetry textbooks.
He exhorts his charges to stop and to look inward for meaning and authority: “Boys, you must strive to find your own voice. Because the longer you wait to begin, the less likely you are to find it at all. Thoreau said, ‘Most men lead lives of quiet desperation.’ Don’t be resigned to that. Break out!”
When one of the students discovers in an old yearbook that Mr. Keating was a member of something called the Dead Poets Society, Mr. Keating tells him, “No, Mr. Overstreet, it wasn’t just ‘guys,’ we weren’t a Greek organization, we were Romantics. We didn’t just read poetry, we let it drip from our tongues like honey. Spirits soared, women swooned, and gods were created, gentlemen, not a bad way to spend an evening, eh?”
So inspired, the boys get into all manner of trouble. Neil, the leader of the band, takes to heart Thoreau’s dictum, “To put to rout all that was not life; and not, when I had come to die, discover that I had not lived.” He decides he’s going to be an actor, against the wishes of his father, who wants him to dedicate himself to becoming a doctor. “For the first time in my whole life I know what I wanna do. And for the first time I’m gonna do it! Whether my father wants me to or not! Carpe diem!”
It all goes badly and Neil ultimately commits suicide out of despair at the prospect of having to sell out to the system. Mr. Keating is fired, but the surviving boys pay tribute to him by standing on their desks, shouting, “O Captain! My Captain!”
Throughout the film, we are always supposed to take Mr. Keating’s side in every dispute. When the stodgy headmaster chastises him for his unorthodox techniques, Mr. Keating shoots back: “I always thought the idea of education was to learn to think for yourself.” The headmaster responds, “At these boys’ age? Not on your life. Tradition, John. Discipline. Prepare them for college and the rest will take care of itself.”31
We are supposed to roll our eyes at this. But the headmaster is right, or at least less wrong than Mr. Keating. To be sure, Mr. Keating has something to teach the fogeys about how to make education interesting and entertaining. But what he is not doing is teaching the boys to think for themselves. He is teaching them to embrace the romantic imperative of finding truth—or at least the only truth that matters—within themselves. In other words, he is not teaching them to think for themselves; he is teaching them not to think at all. Dead Poets Society is a rock-and-roll song minus the rock and roll.
All of this matters, because films like this do not merely reflect our culture but also shape it, giving it voice and validation. Voted the greatest “school film” of all time, Dead Poets Society’s influence has been profound, not just on how normal people view education, but on how educators view themselves.32
I should note that this trope of a vampiric political or economic order sucking the life force out of humanity is usually described as left-wing, and in the hands of Hollywood it often is. But in other cultures and at other times, this romantic spirit has taken different—or allegedly different—forms. In the American South, the Southern Agrarian poets and writers were a decidedly conservative bunch, but their critiques of capitalism, democracy, and mass culture could easily be described as romantic. In the Soviet Union, the writers and intellectuals who yearned to restore the romantic and religious spirit of Mother Russia were not necessarily lovers of the free market. What they wanted was to restore the glory of soil, nature, church, and tradition to the sanitized world of Marxism. The Nazis were drenched in romanticism and romantic notions going back to gauzy myths of their pre-Christian Teutonic forefathers. In India, Hindu nationalists do not fit easily into our left-right schema (as far as I can tell), but in both their historic and contemporary desire to elevate ancient notions of folk, custom, and nation over “foreign” concepts like capitalism and socialism, they seem to fit perfectly well inside the romantic tradition. And, as will be discussed in another chapter, today romantic nationalism is in full, if noxious, flower amidst the fever swamps of the American right.
The classic character of the romantic novel is the Byronic hero. In many of Byron’s works, most famously Childe Harold’s Pilgrimage, the protagonist is a rebellious soul plagued with the memory of wrongs he’s committed in the past and determined to set things right, or at least atone for them. One can think of countless stock characters in film and television who fit this description. The brooding vampire-with-a-soul archetype (Angel, The Vampire Diaries, Twilight) is one. Martin Blank in Grosse Pointe Blank is a classically Byronic figure trying to do right after a career of doing wrong. Brad Pitt, Clint Eastwood, and Mel Gibson routinely play Byronic characters in such films as Legends of the Fall, Fury, Unforgiven, and Lethal Weapon.
One of the central traits of the Byronic hero is that he “plays by his own rules.” This theme has come to nearly define what we mean by a hero. A fascinating example of this can be found in the changing view of the Muslim prophet Muhammad during the romantic era. In Christian Europe, martyrdom was always held in high esteem. But giving your life was laudable when you were sacrificing it for a capital-T Truth, most specifically for the Christian faith. (Giving your life for your country was also highly valued, but this was often seen as just another form of religious self-sacrifice. See “Arc, Joan of.”) But Isaiah Berlin notes that by the 1820s “you find an outlook in which the state of mind, the motive, is more important than the consequence, the intention is more important than the effect.”33
In Voltaire’s play Muhammad, the prophet emerges, in Berlin’s words, “as a superstitious, cruel and fanatical monster.”34 Voltaire probably didn’t care much about the Islamic faith one way or another; he was trying to get around the censors, to attack organized religion, specifically Catholicism as practiced in France. By the 1840s, the height of the romantic period, Muhammad becomes a heroic man of will. In Thomas Carlyle’s On Heroes, Hero-Worship, and the Heroic in History, Muhammad is “a fiery mass of Life cast up from the great bosom of Nature herself.” Carlyle couldn’t give a fig about the tenets of faith found in the Koran either. What he admired was Muhammad’s radical commitment. The Muslim prophet’s example served as an indictment of what Carlyle considered to be a “withered…second-hand century.”35
Today, this fetishization of strength and will pervades the culture. It explains so much, from Donald Trump’s cult of personality to fandom for countless athletes and hip-hop icons, not to mention the hints of grudging admiration one often hears for Muhammad’s extremist followers.36
From Rebel Without a Cause to cooking shows, the man—or occasionally woman—who plays by his own code, even if that code is evil, has become the stock character of American popular culture. Batman, a.k.a. the Dark Knight, is not evil, but he is a vigilante who plays by his own rules. The pioneering comic book character Wolverine’s slogan? “I’m the best there is at what I do, but what I do best isn’t very nice.” In the cult classic comic Watchmen, the character Rorschach’s motto is “Never compromise. Not even in the face of Armageddon.”37 In the novel and Showtime TV series Dexter, we meet a brutal serial killer who has found a way to live with himself by following “the Code of Harry,” named after his dead father (who appears as a ghost throughout the series). According to his code, it’s okay for him to murder so long as he’s murdering other serial killers (and, on rare occasions, people who might bring him to justice). Omar in the HBO series The Wire insists that he will only rob and kill other drug dealers and gangsters, because “a man’s got to have a code.” The Hound in Game of Thrones explains that, while he’s fine with slaughtering innocent people, he won’t steal “because a man’s got to have a code.” He later steals, but the audience doesn’t care.
In Breaking Bad, arguably the best television series ever made,38 Vince Gilligan, the show’s creator, set out to chronicle one man’s descent from decency to decadence. The idea was to show how Mr. Chips could turn into Scarface.39 Gilligan succeeded, but not before he seduced and corrupted the viewing audience too: By the time the story ended, fans no longer minded that Walter White had become a homicidal drug dealer. They rooted for him anyway.
Many of these Knights of the Self, warriors for their own code, end up dying in these stories. They are martyrs to the idea of “I did it my way.” It would all look very familiar to the original romantics, who understood heroism as guided by an inner-directed light.
What was once considered the only noble motivation for a hero, a conception of good outside himself, has been replaced by what Irish philosopher David Thunder calls “purely formal accounts of integrity.” According to Thunder, “purely formal accounts essentially demand internal consistency within the form or structure of an agent’s desires, actions, beliefs, and evaluations.” He adds that, under purely formal integrity, a person “may be committed to evil causes or principles, and they may adopt principles of expediency or even exempt themselves from moral rules when the rules stand in the way of their desires.”40
In other words, if you stick to your code, no matter what you do, you can be seen as a hero. It’s this sort of thinking that has led Hannibal Lecter, a character who barbarously murders and eats(!) innocent people, to be seen as something of a folk hero. In the film The Silence of the Lambs, he’s a charming monster who has no problem with eating people but says that “discourtesy is unspeakably ugly to me.” In the TV series Hannibal, the audience marvels at the cannibalistic gourmand, who cares not a whit for bourgeois morality, preferring instead a Gothic-gastronomic overlay to the laws of the jungle: a simultaneously barbaric and noble savage who does his own thing.
Why do movies and other modern myths find purchase in our imaginations? Anyone with experience in even high school theater probably knows the phrase “willing suspension of disbelief.” The term was coined by the poet Samuel Taylor Coleridge in his collaboration with fellow poet William Wordsworth on their groundbreaking work, Lyrical Ballads, widely seen as marking the birth of the English romantic movement. The division of labor behind the willing suspension of disbelief, Coleridge explained, was twofold. His own contribution was to give voice to the “inward nature” of our irrational imagination, to make the supernatural characters seem real enough to the reader. Wordsworth came at the project from the opposite direction. His task was “to give the charm of novelty to things of every day, and to excite a feeling analogous to the supernatural.”41
In other words, Coleridge was tasked with making the supernatural seem real, while Wordsworth was assigned the job of making the real seem supernatural. Combine these two approaches and you get not only pantheism but the whole gamut of romantic art. The workaday is magical, and the magical is all around us.
But what interests me about the willing suspension of disbelief is the unwillingness of it. No one walks into a theater or opens a book or plays a song only after rationally committing to suspending their disbelief. The “poetic faith” is already there as a feature of our inward nature. The poetic faith, in this sense, is no different from any other form of faith. When the faithful enter a church, mosque, or synagogue, they do not rationally argue themselves into believing; the program for belief is already up and running. Our faith is like our senses of sight or touch or hearing: We don’t turn it on and off; the engine is always running. It is primal, hardwired.
What fascinates me is how our moral expectations in the world of art differ from our expectations in the real world around us. The people we are at work, at the grocery store, play by one set of largely artificial rules: the rules of civilization. But beneath—or perhaps beside—the person of manners, custom, and law resides a different being. We’ve all heard the expression that some movie, novel, or piece of music “transported” us. Perhaps “transported” is the wrong word. Perhaps “liberated” gets closer to the mark. Comedians and pop psychologists often talk about our “inner caveman.” The reason this stuff appeals to us is because we sense there’s a healthy dose of truth to the idea. Beneath the layers of outward civilization lurks our more primal self, who finds the world around us complicated and artificial. Our primal self isn’t a noble savage, but he does feel like a more authentic person than the one who works hard and plays by the rules of modern society.
The moral universe of cinema sometimes mirrors the real world, but just as often the actors on the screen play roles more consistent with the moral universe of our inner savage. It’s like a scene in some science fiction movie where the protagonist develops a roll of film and finds that the people he photographed are different from those he saw with his naked eye.42 Art captures a reality that we tend to deny in the “real world” around us. In novels, movies, TV, rap music, video games, and almost every other realm of our shared culture, the moral language of the narrative is in an almost entirely different dialect from the moral language of the larger society.
For instance, we are rightly taught not to hit, steal, or torture. These rules, and ones like them, form the bedrock of virtually every halfway decent civilization. And yet, almost every time we go see an action movie, we cheer people who violate these rules. I am a sucker for a heist movie, but I don’t think robbing banks is laudable. As a general rule, I stand foursquare against using violence to settle disagreements or respond to insults. But a John Wayne who didn’t deck someone who insulted him wouldn’t be John Wayne.
Consider an extreme example: torture. For the last two decades, America has been roiled by an intense and passionate debate over the use of what critics call torture and defenders call “enhanced interrogation.” Many opponents of torture implicitly argue that it is worse than homicide. After all, almost no one disputes that there are times when the state has the power and authority to kill. But torture? Never. Not even in a ticking-time-bomb situation. In order to hold to this extreme position, torture opponents find they must argue that torture “never works.”
This is dubious in what we call real life, but it is flat-out lunacy according to famously liberal Hollywood. Steven Bochco’s NYPD Blue broke a lot of television taboos, but the least appreciated was its open endorsement of beating the truth out of suspects. In Patriot Games, Harrison Ford shoots a man in the knee to get the information he needs. In Guarding Tess, Secret Service agent Nicolas Cage blows off a kidnapper’s toe with his service weapon. In Rules of Engagement, Samuel L. Jackson executes a prisoner to force another to talk. In Pulp Fiction, we delight at contemplating the “short-ass-life-in-agonizing-pain” of Ving Rhames’s rapist (“I’ma get medieval on your ass.”). In the TV series 24, Kiefer Sutherland would routinely resort to torture if it meant thwarting some impending threat. And each time the audience cheers.
When we suspend disbelief, we also suspend adherence to the conventions and legalisms of the outside world. Instead, we use the more primitive parts of our brains, which understand right and wrong as questions of “us” and “them.” Our myths are still with us on the silver screen, and they appeal to our sense of tribal justice. We enter the movie theater a citizen of this world, but when we sit down, we become denizens of the spiritual jungle, where our morality becomes tribal the moment the lights go out.