Structure and Meaning
WHEN WE TALK ABOUT fiction and poetry in literary discussions, we want to know how the current piece is arranged. What is the starting point? Does it move beginning to end or back and forth or in some sort of curlicue? Does it begin with the ending? The beginning? Some other key moment? In other words, how is it structured? That’s what we do with works that are essentially fictive in nature. So here’s the question to consider: why should nonfiction be any different?
Because it’s not made up? Because the stuff in it really happened?
No need for that to matter. Yes, the events—births, deaths, marriages, discoveries, criminal trials, and all the rest—did actually happen. That doesn’t mean they don’t have to be arranged.
But they happened in a specific order. You know, chronology.
In fact, I do know chronology. But I also know structure. One is a timeline of events. The other is a framework over which those events get stretched. Participants will experience something, a basketball game, say, as moving from beginning through middle to end, opening tip-off to final buzzer. Each one will experience it slightly differently based on his or her own subjectivity—things like team affiliation and how well they like certain players—but in general they will agree that the Blue Demons beat the Red Devils on two late free throws, and that the turning point was when the Red Devils’ star forward fouled out with four minutes to go. The article in tomorrow’s paper, on the other hand, may not be at all interested in the chronology of the game, choosing to focus on the psychology or the drama. We can be sure, for instance, that the lede will not be the tip-off (it almost never is), and it’s a good bet that it will instead focus on something much closer to the end: “Kevin Hamlin looked at the rim and tried to calm himself for the first of two free throws with nine seconds remaining. The sophomore, pressed into service by foul trouble and injuries to the Blue Demons’ backcourt, focused on the process—two bounces of the ball, an easy breath, bent knees, and a smooth release. As he completed the follow-through, the rim hardly seemed to matter anymore, and the ball barely rippled the net; neither did the second shot, which secured first place in the conference for the Demons.” What follows may show almost no interest in the order of events but cling to the drama inherent in the rivalry or the story of an unlikely hero. The dismissal of the opposing star with five fouls may be the next thing mentioned or may not show up until the end of the piece. The difference between the game itself and the story of the game is the structure imposed on telling that story. The question is always, how can I tell this in the most effective way possible? How can I convey the intensity and struggle and keep readers hooked?
John McPhee may or may not be our greatest living writer of nonfiction (hint: he is), but he is almost certainly our greatest thinker about writing nonfiction. Of necessity. He has taught a seminar in nonfiction writing two years out of every three since 1974 at his alma mater, Princeton University. His collected class rosters read like a Who’s Who of contemporary American journalism, although there were also plenty who went on to success in business, the arts, and other lesser endeavors. In 2017 he published Draft No. 4: On the Writing Process, which is virtually that class, now called Creative Nonfiction. So what part of that process obsesses the master of writing?
Structure.
For that obsession, he credits (or blames, depending on your viewpoint) his favorite high school English teacher, Mrs. McKee, who required a “structural outline” for every piece. She did not insist on roman numerals but accepted any honest effort at describing the shape of the writing. Every writer has to impose some organizational rationale on every piece of writing, and there are two ways to get there: outline the structure beforehand or wait till afterward and wrestle an octopus. That’s what teachers are attempting to do when they force students to shake hands with the five-paragraph theme. There are multiple iterations of this wily beast, but they share certain elements: put the thesis statement here (or else there), make each body paragraph have a main point and some supporting evidence, and place something at the end that makes the bloody thing feel finished. This is not the formula for good writing, but it is a method of imposing order on the disorderly adolescent mind, a way of teaching structure to fourteen-year-olds. Well, guess what—forty-year-olds also have disorderly minds. And sixty. And eighty-seven, as McPhee is at the time of this writing. The nature of the order thus exacted is up to the writer, but it is required for the writer, as a means of disciplining thoughts that incline toward dishevelment if not derangement, and for readers, to bring them maximum something: pleasure, understanding, fulfillment, insight.
Here’s the thing writers (and readers who would understand them) need to remember: Chronology does not equal structure. Structure cannot ignore and certainly must not falsify the actual sequence of events, but it need not be a slave to it, either. To illustrate this point, McPhee tells the story of writing his book on Alaska, Coming into the Country (1977). His method is to create and store a very large number of note cards on which pieces of an eventual book are first committed to writing. He then arranges them in patterns on a work table until he finds the organizational design that will serve him best. For this book, he knew early on that he didn’t want that design to depend on the sequence of his own journey in the state. That would make the results about him more than about his subject, a situation that never appeals to him.
With the first section of Coming into the Country, “The Encircled River,” which describes a canoe-and-kayak trip from Alaska’s Brooks Range to the tiny hamlet of Kiana, he encountered a seemingly insurmountable problem: the climax occurred on day one of the nine-day trek. While he and other members of the five-man group took a long hike around the base of a mountain, they encountered a massive grizzly bear feasting on blueberry bushes. As in, whole bushes loaded with fruit. The experience was exhilarating and alarming in equal measure, since the big bears are notoriously unpredictable. This is in a sense the perfect Alaskan moment. To tell the story in straight chronological detail would be disastrous to the narrative; it’s all downhill from there. And he can’t falsify when the bear sighting took place, as he says in Draft No. 4, because this is nonfiction and facts are facts. The solution? Don’t stick to chronology, at least in a front-to-back straight line. He begins on day five shortly after, we later learn, a second bear sighting. Not so huge or scary, but big enough and magical enough: the young bear is playing with a salmon it has killed, tossing it in the air again and again. When it senses the humans floating by, it moves off into the underbrush.
So why begin and end where he does? For one thing, it allows him to begin midstream, as it were, and avoid all that tiresome business of arrival and shifting gear and such. We’re thrown right into the trip because these guys are already on it, dirty and sweaty and somewhat beaten up. But also because he can set up a narrative cycle, which corresponds to the cyclical nature of everything arctic, as he sees it. And because he can end with a bear—not the bear, to be sure, but a bear—which holds the promise of becoming the near-mythic one he sees on day one. Here’s how he structures this narrative. He begins on day five, which he places at nine o’clock if we imagine a clockface. He goes over the top for days five through nine (put day nine at three o’clock) and then flashes back to day one. We have been taught to expect that flashbacks end and flash forward to the “now” of the story. As McPhee explains in Draft No. 4, however, that’s not the plan; he will stay in the flashback until the section ends. That allows him to put the huge bear at the midpoint on his narrative dial, and then walk us through the first four days so that the young bear comes right at the end, as he completes the cycle. It works: the young bear, standing for all that is wild and foreign to our puny human experience, is a fitting culmination to this story.
Okay, but why nine and three o’clock, and not twelve and six?
He doesn’t say, but I think it has to do with visual aesthetics. Lateral opposites feel (which is to say, look) more balanced than the polar kind.
Structure—the design of the piece—is always something to consider as you read nonfiction just as much as with a short story or a novel. The structure of a piece of writing affects its meaning. Which means it pays for us to notice. In any story with a timeline, there are three places to begin: the beginning, the end, and anywhere in between. Doesn’t exactly narrow the field, does it? Each of those options presents a different set of questions. If we begin at the beginning, for instance, the questions revolve around the big one: where is this thing going? At the end: how did we get here? In between: how did we get here, where is it going, and why did we start here and not somewhere else? Just about every manual on short story writing will tell you to begin in medias res, Latin for into the middle of things, but at a point as near as possible to the end. Often, the actual beginning—of events, not of the story—never appears, or else it resides in a quick reference or two to something that in the now of the story lies back there in the mists of time. Even if those mists are only fifteen minutes old. Why do they tell you that? Chiefly to keep beginners from bogging down in exposition, the swamp that eats reader interest.
Nonfiction articles often start the same way, at some moment before the very end (although they start there sometimes, too) but fairly close to it. Why is that? For some of the same reasons. The first-year player standing at home plate or the free-throw line in the biggest moment of the year is dynamic and filled with drama. So is the defendant rising before the judge for a verdict. The moment also contains a complete backstory, which is where the article will likely spend most of its time, before jetting forward into the finale. The near-the-end opening can work and has done so times without number. That said, articles also have begun at the beginning, at the end, and at points in the middle. With success. The only rule here is that the resulting piece needs to succeed, and that is often down to the skill of the writer, not to a fixed policy.
And what about books? Same thing. The question is, what can you, as the writer, make work in this instance? Let’s take two books about the Trump White House, both published in 2018. In Fire and Fury, Michael Wolff begins with the end of the beginning in a prologue dealing with former Fox News chief (and founding CEO) Roger Ailes and Trump’s campaign strategist Steve Bannon at a dinner two weeks before inauguration day. He then launches the tale of the takeover at its beginning, with a first chapter called “Election Day.” Here he wants to strike two themes. First, that Ailes and Bannon saw themselves (and acted) as puppet masters, that the candidate, now president-elect, had to be stage-managed into success, protected against his political naivete and worst instincts while strengthened in certain leanings toward hypernationalism. And second, that as we see in chapter one, the incoming administration was unprepared to assume the awesome responsibility of running the country, unprepared in many cases even for the victory it had just achieved. He’s writing a gang-that-couldn’t-shoot-straight narrative, and the prologue and opening chapter signal that intention.
Bob Woodward’s Fear, on the other hand, begins a good bit later, nine months into the administration. His prologue seizes on a single moment, when Gary Cohn, director of the National Economic Council, allegedly pilfered from the Resolute Desk in the Oval Office a draft letter withdrawing the United States from its free-trade agreement with South Korea (known as KORUS). Cohn and others believed breaking the agreement would be financially and politically ruinous. By contrast, President Trump had announced his opposition to all existing trade deals, largely based on a misunderstanding of the effect trade deficits have on the American economy. Woodward then launches the narrative proper in the first chapter by diving back to the very beginning, maybe even before the beginning, to a meeting Trump had with Bannon and former House investigator and longtime conservative activist David Bossie in 2010, when they vetted him as a potential candidate and offered advice they were sure he would never take. Only then does he move forward to the chronological beginning of his story of the Trump presidency, to the day following Trump’s acceptance of the Republican nomination. Yes, there is a lot of beginning before that beginning, but not for the narrative Woodward is constructing. And why does he land there? Because he wants to focus on an emblematic crisis, a New York Times story about the inability of Trump’s handlers to get the candidate to “control his tongue.” Bannon again shows up, frantic not to be seen as the architect of certain disaster as he talks to anyone who will listen, but chiefly to Robert Mercer, whose arch-conservative ambitions run as deep as his pockets, a man who has devoted millions and millions of dollars not merely to Republican causes but to the right wing of the right wing. From that point on, Woodward will move back and forth among the many passengers and supposed conductors on the Trump train, but he will generally stick to a chronological approach. His goal is not to confuse readers.
In opening as he does, Woodward strikes his own themes: first, that everyone who comes into Trump’s orbit supposes they can control him, and second, that he very much needs controlling. Wolff’s tale is one of a dysfunctional administration; Woodward’s of an administration made dysfunctional by the erratic behavior of its chief executive. Wolff believes that the president is surrounded by crisis actors, Woodward that he turns everyone around him into one.
How late does Woodward begin his narrative? In terms of chronology, the moment of Cohn’s decision ultimately arrives on page 265 (of 357).
A book can even begin long after the events related in it, as works by Daniel James Brown and Margot Lee Shetterly prove. Brown’s The Boys in the Boat (2013) begins many years after the heroics at the Berlin Olympics with the author making the acquaintance of the aging (and ailing) Joe Rantz, the last living member of that elite group, and perhaps the one with the most compelling personal story. Shetterly opens Hidden Figures with a childhood meeting with one of the “human computers,” the black female mathematicians who ran calculations for the early US space program. Unlike Brown’s Joe Rantz, Shetterly’s initial contact, Kathaleen Land, vanishes from the book, but she provides a pathway to other contacts and to the research that led to it. Aside from that difference, the two works share many similarities: a wide-ranging cast of characters whose backstories are filled with challenges and obstacles, the interlacing of momentous achievements (individual races or rocket launches) and external circumstances, whether the Depression and the rise of Hitler or the Jim Crow South and entrenched racial prejudice, and cuts between chapters on different characters and events in ways that insist on readers balancing the various elements of the story.
Having said all this, I should note that some books do, indeed, begin at the beginning. One area of nonfiction that is especially beholden to chronology is the life story. We really do need to follow the life (most of the time) in the order in which it was lived. Some biographers and autobiographers can bounce around between an endpoint and events surrounding it and earlier phases of that life, but they have to be really adept to keep us from bewilderment. That does not inevitably mean that they start at point A, although some do. Stephen E. Ambrose begins his Undaunted Courage: Meriwether Lewis, Thomas Jefferson, and the Opening of the American West with the birth and youth of the title explorer, whose development is critical to arriving at a point where the Corps of Discovery can begin its expedition. True, the book is not a typical biography, concerning itself with the most important journey in American history, but it never takes its eye off Lewis as the primary focus. The trip itself is similarly managed in chronological order as befits its subject. And if following a person’s life is best handled by sticking to the timeline, can that be any less true of the life of a year? In recent decades there have been a number of books that focus on a single year, the best known of which may be David McCullough’s 1776 (2005), which he begins in the fall of 1775, after the Battles of Lexington and Concord had precipitated the Siege of Boston, still ongoing despite the costly British victory at the Battle of Bunker Hill. Indeed, British troops would not be forced to withdraw from Boston until the Fortification of Dorchester Heights the following March. McCullough, relying on a tremendous number of contemporaneous documents, marches readers into and through the eponymous year and all the way to Washington’s famous Delaware crossing and the Battle of Trenton on December 26.
As you have no doubt suspected, straight chronology is not the only way to manage structure even in a life story. In Alexander Hamilton, which served as the basis for Lin-Manuel Miranda’s Broadway smash, Hamilton, Ron Chernow brackets the chronological treatment of the life with a prologue and epilogue that focus on Elizabeth Schuyler (Eliza) Hamilton in great old age, first forty-six years after her husband’s death in the Aaron Burr duel and again at her own death four years later, at age ninety-seven. This structure imparts an automatic long view, a sort of preview of our own distance of two centuries from the death of the great man.
All this focus on histories and biographies is all well and good, but what about books not limited by the march of historical events? Books like, say, this one?
Let’s start by saying that there are comparatively few books written that owe nothing to the passage of time. Even works on science, certainly in the physical sciences but I think in the life sciences as well, have to account for the way that discovery builds on discovery, newer thought on previous thought whether in agreement or opposition, theory upon theory. One major exception can be the how-to and self-help genres. But it is certainly possible to write about science with a minimal adherence to chronology. Take Michael Pollan’s How to Change Your Mind: What the New Science of Psychedelics Teaches Us about Consciousness, Dying, Addiction, Depression, and Transcendence (2018), which, despite the “How to” at the beginning, is emphatically not a do-it-yourself guide but a review of the new science, as the massively long subtitle makes clear. Having said that, though, we must admit that it starts out as if it will be a chronological study of the history of psychedelics. The prologue begins with Swiss chemist Albert Hofmann’s discovery of LSD’s mind-altering effects in 1943 and the discovery by the West in 1955 of a drug long known (in its mushroom form) in indigenous New World cultures, psilocybin. Then, just when we think we see how this will go, the focus switches to Pollan himself, to his comparative ignorance of all things psychedelic, including their history. The first chapter, by contrast, begins not back at the beginning but with what he calls the “renaissance” in psychedelic study in clinical settings, research that stopped cold in the 1960s in response to the counterculture’s enthusiastic embrace of any mind-altering substance. This rebirth had to wait for the new century and, with it, new findings in neuroscience.
The structure of the new-discovery text, as with most other genres, varies from writer to writer and from work to work. Malcolm Gladwell, the contemporary master of new social science discoveries, has a formula he follows in his books. He begins with an anecdote that comes to a surprising conclusion and in turn leads to an insight into human consciousness and behavior. From there, he walks through the psychological, sociological, and economic research and theories that help explain the phenomenon in his opening, with abundant examples to illustrate his major points. In his second book, Blink: The Power of Thinking Without Thinking (2005), that anecdote involves a statue, claimed to be ancient Greek, that was purchased by the still-young Getty Museum. Its scientists performed all manner of tests, at considerable time and expense, and in almost every case the outcome ratified the seller’s claim of authenticity. When art experts came to look at the statue—a young, nude male figure known as a kouros—the story was quite the opposite. The experts declared the kouros a fake, and the average time to reach that decision was measured in seconds. Of course, they turned out to be right. Gladwell uses this example to introduce the theory of thin-slicing, the ability of the unconscious to recognize patterns at a glance based on highly specialized experience. He builds out from this explanation to related concepts and examples, both of successes and of horrible failures, as in the 1999 police killing of the entirely innocent Amadou Diallo in the Bronx, when a case of mistaken identity led to forty-one shots fired, almost half of which hit the victim. While there are warnings about the dangers of “thinking without thinking,” readers generally tend to remember the strengths of the approach and the positive examples. Gladwell’s structure won’t work for every book, not even for every book attempting something similar to his. But he has clearly found an approach to parceling out information that works for him.
The bottom line in all this? Structure matters. And not just because it determines what you read when. It matters because it changes meaning. If the writer presents two timelines in alternating chapters, present following past, that will produce a very different understanding than if she presents the entire past and then moves to the present. If another begins with one bear encounter and ends with another, and neither of those is the beginning or the end of the journey, he’s telling us something by that choice. If the narrative is circular, that will feel different than if it were linear. If a science book moves from discovery to discovery, how will that alter our perceptions from those we would have had if he moved from scientist to scientist? Or if he began with the most recent discovery and moved steadily backward? Hey, it happens. And if structure matters, it behooves us to notice how any given piece is put together.
Fake News and “Fake News”
ONCE UPON A TIME, we knew, or thought we knew, that the news was true. Responsible journalists strove to tell the truth, and mostly they did. When they were wrong, they owned up to their error, as when Walter Cronkite, as near to the voice of God as ever appeared on the nightly news, came back from Vietnam convinced that he had been lied to by the Johnson administration and in particular by Secretary of Defense Robert McNamara about the war and its prosecution, and that he had been complicit in spreading those lies. His broadcast recanting his earlier position was a bombshell that shook the nation’s belief in itself and its government. Journalists’ willingness to admit mistakes is part of why most readers and viewers and listeners have been inclined to trust the media. Until lately.
In the years just prior to the 2016 election, social media became incredibly powerful but also incredibly compromised. It served to spread a great deal of information, and people in increasing numbers said they looked to Facebook or Twitter as their news sources. Alas, it also spread vast amounts of disinformation, much of it purposely designed to deceive. So consider this question: if you rely on a single type of material for your news, and if much of that material is false, whether through sloppiness or malice, how accurate is your “news”?
That’s what I thought.
Throughout 2016 and the early months of 2017, those bogus sources of information acquired a name: fake news. What nearly everyone understood the term to mean was what its users meant by it: a news story deliberately crafted to deceive. For decades, the phrase, if used at all, was applied to grocery checkout tabloids like the National Enquirer. The Enquirer, it should be remembered, paid to bury stories about Trump (a practice known in its in-house parlance as catch-and-kill) while finding liver cancer and two strokes supposedly plaguing Hillary Clinton in the absence of any medical opinions or, evidently, any source at all beyond fertile imagination. The term found new life, according to the BBC, in the summer of 2016 when Craig Silverman, the media editor for BuzzFeed, the internet media and news company, noticed that a tremendous number of utterly false stories having to do with the American presidential election were emanating from a single town, Veles, in the Republic of Macedonia. This seemed odd since Veles is a fair distance from any point in the United States and has a population under 45,000 souls. To have more than 140 websites dedicated to creating fake news that would go viral and dominate news cycles seemed at least peculiar and more likely sinister. That BuzzFeed would notice is hardly surprising; it was created a decade earlier as a sort of laboratory to track viral online content. And this stuff was viral with a vengeance. The content, almost all utterly false, skewed right but also had a certain amount of pro-left nonsense, a curious sort of propaganda in that it appeared to have no dog in this political fight, unless sowing confusion can be said to be canine in nature. And the various sides latched onto the fakery aimed at their sensibilities as onto holy writ. The warring tribes seemed to have discovered their self-justifying myths. If one has decided that the other side is un-American if not inhuman, then a story that claims—out of thin air—that, say, the FBI agent who supposedly leaked the Hillary Clinton emails was found dead in an apparent murder-suicide, well, that’s just too good to be false. Nor was the disinformation campaign blindly sent everywhere; some was targeted with fair precision. A study by Oxford professor Philip N. Howard found that fully half of all Twitter material aimed at voters was junk or fake news, meaning that no more than half was from genuine news sources. So throughout the summer and fall, the internet was filled with fake news and with more sober voices calling it out as “fake news.”
Seems like we got that all sorted out, doesn’t it? If only.
In December 2016 defeated candidate Clinton gave a speech in which she said that the cascade of “fake news” (using the term in its accepted sense) was threatening our democracy. Many listeners and commentators took that to mean she believed she had lost the election because of fake news. She didn’t quite claim that directly, and her chief concern seems to have been the danger to a Washington, DC, pizza parlor posed by a conspiracy theory that was spread by, among others, an alt-right community on the social media platform Reddit (such a community of like-minded individuals is known as a subreddit) and that became known as Pizzagate. The story claimed that there was a child sex ring involving Secretary Clinton and other major Dems in the basement of the Comet Ping Pong pizza restaurant. The trivial fact that Comet Ping Pong lacked a basement proved no impediment to the fertile minds of conspiracy theorists. This all seems preposterous on its face, but right-wing social media lit up and one adherent traveled hundreds of miles with a gun, then opened fire in the restaurant (firing wildly and managing not to hit anyone) while announcing that he was there to get to the bottom of things. That sorry saga was what Clinton had in mind when she referred to lives being endangered by all the fake news stories, but a great many people interpreted her comments as meaning only that she disliked these stories because they cost her the election.
One of those interpreters, evidently, was President-Elect Donald Trump, who seemed to believe that what Secretary Clinton meant was that she deemed fake any news she disliked. And he ran with that. In January 2017, in a remarkable act of linguistic jiu-jitsu, he flipped the term to mean not what everyone else had meant previously but solely that whatever news he didn’t like was “fake.” And what he mostly didn’t like was mainstream media (MSM in the parlance of the cognoscenti). He first used the term against CNN White House reporter Jim Acosta, telling him, “You’re fake news.” From there, he has kept up a steady drumbeat about “fake news” in every speech and at every rally and in a very large number of tweets, thereby effectively changing the meaning of the term from what had formerly been understood to simply “that which displeases Trump.” His maneuver proved a godsend for dictators and strongmen around the world since it gave them a handy way to dismiss all negative coverage (very useful for the corrupt and murderous among world leaders). And he has used the term on an almost daily basis—along with epithets like “failing,” “phony,” and “hack”—ever since to deride respected news sources from the New York Times and Washington Post to National Public Radio and CBS, where Cronkite publicly renounced the government’s lies all those years earlier. Instead, he favors the National Enquirer, whose parent company over the years has paid hundreds of thousands of dollars in kill fees to bury stories unfavorable to Trump, whom it regards as the goose that lays golden “news” eggs, and Fox News, which does virtually no actual news gathering, relying much more on opinion shows, many of which traffic in conspiracy theories and demonstrably false stories from shady sources, not least of those being the Enquirer. From its inception, Fox News has set itself up as an alternative to mainstream news media, following founder Roger Ailes’s distaste for what he perceived as the liberal bias of much of the coverage found there. As a result, the Fox slant is decidedly and, to its credit, overtly right-wing (nothing hidden there), but it frequently allows that partisanship to make truth a casualty. Fox is a fact-checker’s paradise. It made use, for instance, of Pizzagate stories, although usually by laying authority off to some purveyor like InfoWars further upstream in the conspiracy theory watershed.
One upshot of this presidential behavior is that any public figure can now feel free to try a “fake news” defense, at least in the court of public opinion. Such stratagems work better among right-leaning audiences, but even then, the outcomes are uncertain. While Trump has had marvelous luck, others accused of misdeeds, like ousted Alabama Supreme Court chief justice Roy Moore, have lost elections despite living in overwhelmingly Republican states. It’s harder to hide behind claims of fake news when the accusations involve underage girls rather than porn stars and Playboy models.
One of the standard tactics, from Richard Nixon forward, is to dismiss all reporting based on anonymous sources as false. Such denials are more likely to be untrue than the reports they deny, assuming the reports come from reputable news outlets. If not, well . . . During the 2016 Trump campaign, Bob Woodward tells us in his excellent Fear, Steve Bannon roared out during a meeting about Trump’s disgraceful Access Hollywood videotape, on which Trump claimed to be able to sexually assault women with impunity because of his fame and money, that all deep background stories are complete “lies and bullshit.” Here, we can assume, Bannon was speaking from experience: as executive chairman of the website Breitbart News, he would have had extensive experience in managing made-up stories, such as articles claiming that the Obama administration and Hillary Clinton supported ISIS, and attributing them to “unnamed sources.” But let’s look at the historical record for a moment. Nixon and his various hatchet men claimed that the deep background Watergate stories by Woodward and Carl Bernstein in the Washington Post were falsehoods. They went on making that claim right up to the moment they headed off to prison or, for the great man himself, to the helicopter taking him to the political wilderness. But then, Woodward and Bernstein published All the President’s Men, in which a number of witnesses who previously had spoken on deep background allowed themselves to be identified, and the country discovered that these people really did know whereof they spoke. Most significantly, thirty years later, Mark Felt, who had been the number two man at the FBI, came forward to reveal that he had been Deep Throat, the most famous unnamed source in history, which Woodward corroborated, and any shred of doubt about the truth of that reporting vanished forever.
These repeated attacks on the press, often by politicians caught with their hands in cookie jars or other places they didn’t belong, are why major news outlets like the Post insist on rigorous vetting of sources, multiple sources for any important information, and absolute adherence to the truth as best it can be known. And why they punish journalistic fraud so severely. A newspaper or magazine or television program is only as good as its reputation. Without that, audience trust is utterly lost.
Look, we’re never going to agree on the veracity of information. Liberals will always suspect the National Review, conservatives The Nation. Traditionally, however, that has to do with their editorial positions, not the quality of their reporting. That mistrust is simply how things are going to be. But there should be some things that all rational people can agree on: that some sources routinely traffic in falsehoods, that others make mistakes but own up to them, that certain bad actors (often from outside the country) deliberately plant untrue material in the public consciousness, and that we all have to put in some work to dig out informational treasures and throw out the trash. Most of all, we have to admit that our prejudices are not the same as rational judgment and as such need to be held up for inspection.
As a coda to this discussion, on April 18, 2019, Department of Justice special counsel Robert Mueller published the report of his team’s investigation of Russian interference in the 2016 general election and possible Trump campaign/administration staff involvement. The report found that the Russians, as everyone understood by that point, had attempted in a variety of ways to interfere in the election on Trump’s behalf, but while there was a good bit of contact between various persons in the Trump circle and Russian actors, some of whom may have been agents of the government, that contact did not rise to the very high bar required for charges of conspiracy to be brought. On the matter of President Trump’s possible obstruction of justice, the report said that he couldn’t be prosecuted, chiefly because of a DOJ policy against indicting sitting presidents, but that the evidence would not allow Mueller to declare him innocent. Both of those decisions immediately became the subjects of wild spin from both sides of the political aisle. For our purposes, however, the big news was that nearly all of the stories about life inside the campaign and then the White House that had been reported by major news sources—the Times, the Post, NPR, The New Yorker, and ABC, NBC, CBS, and PBS—and condemned by Trump and his allies with the now-tired epithet fake news, were corroborated in sworn testimony by members of the campaign staff and administration. The report does not mention those news outlets by name, but in case after case the stories they broke are confirmed in its pages. So much for “fake news.” Now, can we for crying out loud retire the term?