Our subjectivity is so completely our own.
—SPIKE JONZE
Parallel with academia’s embrace of postmodernism was the blossoming in the 1970s of what Christopher Lasch called “the culture of narcissism” and what Tom Wolfe memorably termed the “Me Decade”—a tidal wave of navel-gazing, self-gratification, and attention craving that these two authors attributed to very different causes.
Lasch saw narcissism as a defensive reaction to social change and instability—looking out for number one in a hostile, threatening world. In his 1979 book, The Culture of Narcissism, he argued that a cynical “ethic of self-preservation and psychic survival” had come to afflict America—a symptom of a country grappling with defeat in Vietnam, a growing mood of pessimism, a mass media culture centered on celebrity and fame, and centrifugal forces that were shrinking the role families played in the transmission of culture.
The narcissistic patient who had become increasingly emblematic of this self-absorbed age, Lasch wrote, often experienced “intense feelings of rage,” “a sense of inner emptiness,” “fantasies of omnipotence and a strong belief in [his] right to exploit others”; such a patient may be “chaotic and impulse-ridden,” “ravenous for admiration but contemptuous of those he manipulates into providing it,” and inclined to conform “to social rules more out of fear of punishment than from a sense of guilt.”
In contrast to Lasch, Tom Wolfe saw the explosion of “Me…Me…Me” in the 1970s as an altogether happier, more hedonistic development—an act of class liberation, powered by the postwar economic boom, which had left the working and middle classes with the leisure time and disposable income to pursue the sorts of vain activities once confined to aristocrats—the “remaking, remodeling, elevating, and polishing” of one’s own glorious self.
Economic times would grow considerably darker in the twenty-first century, but the self-absorption that Wolfe and Lasch described would remain a lasting feature of Western life, from the “Me Decade” of the 1970s on through the “selfie” age of Kim and Kanye. Social media would further accelerate the ascendance of what the Columbia Law School professor Tim Wu described as “the preening self” and the urge to “capture the attention of others with the spectacle of one’s self.”
With this embrace of subjectivity came the diminution of objective truth: the celebration of opinion over knowledge, feelings over facts—a development that both reflected and helped foster the rise of Trump.
Three examples. Number 1: Trump, who has been accused of greatly inflating his wealth, was asked about his net worth in a 2007 court deposition. His answer: it depends. “My net worth fluctuates, and it goes up and down with markets and with attitudes and with feelings, even my own feelings.” He added that it varied depending on his “general attitude at the time that the question may be asked.”
Number 2: Asked whether he’d questioned Vladimir Putin about Russian interference in the election, Trump replied, “I believe that he feels that he and Russia did not meddle in the election.”
Number 3: During the Republican National Convention in 2016, the CNN anchor Alisyn Camerota asked Newt Gingrich about Trump’s dark, nativist law-and-order speech, which inaccurately depicted America as a country beset by violence and crime; the former Speaker of the House sharply rebutted her. “I understand your view,” Gingrich said. “The current view is that liberals have a whole set of statistics which theoretically may be right, but it’s not where human beings are. People are frightened. People feel that their government has abandoned them.”
Camerota pointed out that the crime statistics weren’t liberal numbers; they came from the FBI.
The following exchange took place:
GINGRICH: No, but what I said is equally true. People feel it.
CAMEROTA: They feel it, yes, but the facts don’t support it.
GINGRICH: As a political candidate, I’ll go with how people feel and I’ll let you go with the theoreticians.
The tendency of Americans to focus, myopically, on their self-pursuits—sometimes to the neglect of their civic responsibilities—is not exactly new. In Democracy in America, written more than a century and a half before people started using Facebook and Instagram to post selfies and the internet was sorting us into silos of like-minded souls, Alexis de Tocqueville noted Americans’ tendency to withdraw into “small private societies, united together by similitude of conditions, habits, and customs,” in order “to indulge themselves in the enjoyments of private life.” He worried that this self-absorption would diminish a sense of duty to the larger community, opening the way for a kind of soft despotism on the part of the nation’s rulers—power that does not tyrannize, but “compresses, enervates, extinguishes, and stupefies a people” to the point where they are “reduced to nothing better than a flock of timid and industrious animals, of which the government is the shepherd.” This was one possible cost of a materialistic society, he predicted, where people become so focused on procuring “the petty and paltry pleasures with which they glut their lives” that they neglect their responsibilities as citizens; it was difficult to conceive, he wrote, how such people who “have entirely given up the habit of self-government should succeed in making a proper choice of those by whom they are to be governed.”
In the mid-twentieth century, the pursuit of self-fulfillment exploded within both the counterculture and the establishment. Predating Esalen and EST and the encounter groups that attracted hippies and New Age seekers intent on expanding their consciousness in the 1960s and 1970s were two influential figures whose doctrines of self-realization were more materialistic and more attractive to politicians and suburban Rotarians. Norman Vincent Peale, the author of the 1952 self-help bestseller The Power of Positive Thinking—known as “God’s salesman” for his hawking of the prosperity gospel—was admired by Trump’s father, Fred, and the younger Trump would internalize the celebrity pastor’s teachings on self-fulfillment and the power of the mind to create its own reality. “Any fact facing us, however difficult, even seemingly hopeless, is not so important as our attitude toward that fact,” Peale wrote, seeming to promote the doctrine of denial along with the doctrine of success. “A confident and optimistic thought pattern can modify or overcome the fact altogether.”
Ayn Rand, also admired by Trump (The Fountainhead is one of the few novels he has cited over the years as a favorite), won the fealty of several generations of politicians (including Paul Ryan, Rand Paul, Ron Paul, and Clarence Thomas) with her transactional view of the world, her equation of success and virtue, and her proud embrace of unfettered capitalism. Her argument that selfishness is a moral imperative, that man’s “highest moral purpose” is “the pursuit of his own happiness,” would resonate with Trump’s own zero-sum view of the world and his untrammeled narcissism.
As the West lurched through the cultural upheavals of the 1960s and 1970s and their aftermath, artists struggled with how to depict this fragmenting reality. Some writers like John Barth, Donald Barthelme, and William Gass created self-conscious, postmodernist fictions that put more emphasis on form and language than on conventional storytelling. Others adopted a minimalistic approach, writing pared-down, narrowly focused stories emulating the fierce concision of Raymond Carver. And as the pursuit of broader truths became more and more unfashionable in academia, and as daily life came to feel increasingly unmoored, some writers chose to focus on the smallest, most personal truths: they wrote about themselves.
American reality had become so confounding, Philip Roth wrote in a 1961 essay (1961!), that it felt like “a kind of embarrassment to one’s own meager imagination.” This had resulted, he wrote, in the “voluntary withdrawal of interest by the writer of fiction from some of the grander social and political phenomena of our times,” and the retreat, in his own case, to the more knowable world of the self.
In a controversial 1989 essay, Tom Wolfe lamented these developments, mourning what he saw as the demise of old-fashioned realism in American fiction, and he urged novelists to “head out into this wild, bizarre, unpredictable, Hog-stomping Baroque country of ours and reclaim it as literary property.” He tried this himself in novels like The Bonfire of the Vanities and A Man in Full, using his skills as a reporter to help flesh out a spectrum of subcultures with Balzacian detail. But while Wolfe had been an influential advocate in the 1970s of the New Journalism (which put a new emphasis on the voice and point of view of the reporter), his new manifesto didn’t win many converts in the literary world. Instead, writers as disparate as Louise Erdrich, David Mitchell, Don DeLillo, Julian Barnes, Chuck Palahniuk, Gillian Flynn, and Lauren Groff would play with devices (like multiple points of view, unreliable narrators, and intertwining story lines) pioneered decades earlier by innovators like Faulkner, Woolf, Ford Madox Ford, and Nabokov to try to capture the new Rashomon-like reality in which subjectivity rules and, in the infamous words of former president Bill Clinton, truth “depends on what the meaning of the word ‘is’ is.”
But what Roth called “the sheer fact of self, the vision of self as inviolable, powerful, and nervy, self as the only real thing in an unreal environment,” would remain more comfortable territory for many writers. In fact, it would lead, at the turn of the millennium, to a remarkable flowering of memoir writing, including such classics as Mary Karr’s The Liars’ Club and Dave Eggers’s A Heartbreaking Work of Staggering Genius—works that established their authors as among the foremost voices of their generation.
The memoir boom and the popularity of blogging at the turn of the millennium would eventually culminate in Karl Ove Knausgaard’s six-volume autobiographical novel My Struggle, filled with minutely detailed descriptions drawn from the author’s own daily life. Along the way, there were also a lot of self-indulgent, self-dramatizing works by other authors that would have been better left in writers’ private journals or social media accounts. The reductio ad absurdum of this navel-gazing was James Frey’s bestselling book A Million Little Pieces, which was sold as a memoir but which the Smoking Gun website reported in January 2006 contained “wholly fabricated or wildly embellished details of his purported criminal career, jail terms and status as an outlaw ‘wanted in three states.’ ” Frey, who seems to have engaged in this act of self-dramatization to make himself out to be a more notorious figure than he actually was (presumably so his subsequent “redemption” would be all the more impressive as an archetypal tale of recovery), later conceded that “most of what” the Smoking Gun site reported “was pretty accurate.” For some readers, angry that they had been sold a false bill of goods, Frey’s book was a con job, a repudiation of the very qualities—honesty, authenticity, candor—that memoirs are supposed to embody, but other readers shrugged off the distinction between fact and fiction: their response a symptom of just how comfortable people had become with the blurred lines of truth.
Personal testimony also became fashionable on college campuses, as the concept of objective truth fell out of favor and empirical evidence gathered by traditional research came to be regarded with suspicion. Academic writers began prefacing scholarly papers with disquisitions on their own “positioning”—their race, religion, gender, background, and personal experiences that might inform or skew or ratify their analysis. Some proponents of the new “moi criticism” began writing full-fledged academic autobiographies, Adam Begley reported in Lingua Franca in 1994, noting that the trend toward autobiography traced back to the 1960s, to early feminist consciousness-raising groups, and that it often “spread in tandem with multiculturalism: News about minority experience often comes packaged in the first person singular. Ditto for gay studies and queer theory.”
In her 1996 book, Dedication to Hunger: The Anorexic Aesthetic in Modern Culture, the scholar Leslie Heywood used events from her own life (like her anorexia and a humiliating relationship with a married man) to draw analogies between anorexia and modernism, an approach that had the effect of reducing great masterpieces like T. S. Eliot’s The Waste Land to case studies in an anti-women, anti-fat aesthetic.
Personal stories or agendas started turning up in biographies, too. No longer were biographies simple chronicles of other people’s lives. Instead, they became platforms for philosophical manifestos (Norman Mailer’s Portrait of Picasso as a Young Man), feminist polemics (Francine du Plessix Gray’s Rage and Fire, a portrait of Flaubert’s mistress Louise Colet), and deconstructionist exercises (S. Paige Baty’s American Monroe: The Making of a Body Politic).
Arguably the most preposterous exercise in biographical writing was Dutch: A Memoir of Ronald Reagan, a 1999 book by Reagan’s official biographer, Edmund Morris, which turned out to be a perplexing Ragtime-esque mashup of fact and fantasy, featuring a fictional narrator who is twenty-eight years older than the real Morris and who was supposedly saved from drowning in his youth by the future president. Instead of using his extraordinary access to a sitting president and his personal papers to create a detailed portrait of the fortieth president (or to grapple with important issues like Iran-Contra or the end of the Cold War), Morris gave readers cheesy descriptions of his fictional narrator and his fictional family and his fictional or semi-fictional hopes and dreams. Morris took this approach, he explained, because he realized he didn’t “understand the first thing” about his subject—an abdication of the biographer’s most basic duty—and because of his own artistic aspirations. “I want to make literature out of Ronald Reagan,” he declared. He also described his use of a fictionalized narrator as “an advance in biographical honesty,” a reminder to the reader of the subjective element involved in all writing.
This was an argument that echoed the self-serving reasoning of Janet Malcolm, who suggested in The Silent Woman, her highly partisan 1994 book about Sylvia Plath and Ted Hughes, that all biographers share her own disdain for fairness and objectivity—a disingenuous assertion, given that she made no effort to carefully weigh or evaluate material in her book but instead wrote a kind of long fan letter to Hughes, extolling his literary gifts, his physical attractiveness, his “helpless honesty.” She wrote about her “feeling of tenderness toward Hughes,” and how reading one of his letters, she felt her “identification with its typing swell into a feeling of intense sympathy and affection for the writer.”
The postmodernist argument that all truths are partial (and a function of one’s perspective) led to the related argument that there are many legitimate ways to understand or represent an event. This both encouraged a more egalitarian discourse and made it possible for the voices of the previously disenfranchised to be heard. But it’s also been exploited by those who want to make the case for offensive or debunked theories, or who want to equate things that cannot be equated. Creationists, for instance, called for teaching “intelligent design” alongside evolution in schools. “Teach both,” some argued. Others said, “Teach the controversy.”
A variation on this “both sides” argument was employed by President Trump when he tried to equate people demonstrating against white supremacy with the neo-Nazis who had converged in Charlottesville, Virginia, to protest the removal of Confederate statues. There were “some very fine people on both sides,” Trump declared. He also said, “We condemn in the strongest possible terms this egregious display of hatred, bigotry and violence on many sides, on many sides.”
Climate deniers, anti-vaxxers, and other groups who don’t have science on their side bandy about phrases that wouldn’t be out of place in a college class on deconstruction—phrases like “many sides,” “different perspectives,” “uncertainties,” “multiple ways of knowing.” As Naomi Oreskes and Erik M. Conway demonstrated in their 2010 book, Merchants of Doubt, right-wing think tanks, the fossil fuel industry, and other corporate interests that are intent on discrediting science (be it the reality of climate change or the hazards of asbestos or secondhand smoke or acid rain) have employed a strategy that was first used by the tobacco industry to try to confuse the public about the dangers of smoking. “Doubt is our product,” read an infamous memo written by a tobacco industry executive in 1969, “since it is the best means of competing with the ‘body of fact’ that exists in the minds of the general public.”
The strategy, essentially, was this: dig up a handful of so-called professionals to refute established science or argue that more research is needed; turn these false arguments into talking points and repeat them over and over; and assail the reputations of the genuine scientists on the other side. If this sounds familiar, that’s because it’s a tactic that’s been used by Trump and his Republican allies to defend policies (on matters ranging from gun control to building a border wall) that run counter to both expert evaluation and national polls.
What Oreskes and Conway call the “Tobacco Strategy” got an assist, they argued, from elements in the mainstream media that tended “to give minority views more credence than they deserve.” This false equivalence was the result of journalists confusing balance with truth telling, willful neutrality with accuracy; caving to pressure from right-wing interest groups to present “both sides”; and the format of television news shows that feature debates between opposing viewpoints—even when one side represents an overwhelming consensus and the other is an almost complete outlier in the scientific community. For instance, a 2011 BBC Trust report found that the broadcast network’s science coverage paid “undue attention to marginal opinion” on the subject of man-made climate change. Or, as a headline in The Telegraph put it, “BBC Staff Told to Stop Inviting Cranks on to Science Programmes.”
In a speech on press freedom, Christiane Amanpour addressed this issue in the context of media coverage of the 2016 presidential race, saying,
Like many people watching where I was overseas, I admit I was shocked by the exceptionally high bar put before one candidate and the exceptionally low bar put before the other candidate. It appeared much of the media got itself into knots trying to differentiate between balance, objectivity, neutrality, and crucially, truth.
We cannot continue the old paradigm—let’s say like over global warming, where 99.9 percent of the empirical scientific evidence is given equal play with the tiny minority of deniers.
I learned long ago, covering the ethnic cleansing and genocide in Bosnia, never to equate victim with aggressor, never to create a false moral or factual equivalence, because then you are an accomplice to the most unspeakable crimes and consequences.
I believe in being truthful, not neutral. And I believe we must stop banalizing the truth.