You must strive to know all things, both the unshakable heart of reality and the opinions of mortals which reveal their lack of understanding. You should get to know their opinions all the same, for only then can you make sense of the impressions and attitudes which human beings take to be the truth.
—THE GODDESS IN PARMENIDES’ POEM ON NATURE
The time is out of joint. Shortwave radios once had the names of capital cities etched into their dials. Today, as we turn the knob and tune in to the world, from Damascus to Brussels to Moscow to Washington, the news is almost unrelievedly grim. The problems and differences that confront us may not yet be as catastrophic as those our parents and grandparents lived through, but they can feel more insidious and intractable.
Intolerance and illiberalism are on the rise almost everywhere. Lies go unchecked. Free speech is denied and state repression is returning in countries that only recently seemed to be on the path of openness. In the Middle East and Africa, and in the streets and suburbs of European cities, the murderous idiocy of religiously inspired nihilism can prove more persuasive than the milquetoast promises of secular democracy. We hear politicians talk. Children drown, starve, are blown to smithereens. The politicians go on talking. At home, boundaries—of political responsibility, mutual respect, basic civility—that seemed relatively secure only a decade ago are broken by the week. Often it feels as if there’s a nihilistic spirit at work here too, a politics with no positive agenda of its own that seeks only to divide. A hectic rages in our blood.
These disheartening trends have any number of causes. In this book I have argued that the way our public language has changed is an important contributing and exacerbating factor. We’ve traced how a series of developments in politics, media, and technology have combined with advances in our understanding of the levers of linguistic persuasion to boost the immediate impact of political language at the price of depth and comprehensibility. And we’ve explored how an unresolved battle between two post-Enlightenment instincts—naïve and overbearing rationalism and the contrary tendency to overemphasize identity and community, which I called authenticism—has distorted how we think about the language of the public realm.
In the face of this array of negative forces, I pointed to two beacons of hope. The first was the ancient notion that human beings are born with a faculty of practical wisdom, or prudence, which should enable us to discriminate between valid and dubious public language. The second was the prospect of a rhetoric that might one day achieve a new balance of argument, character, and empathy. I used the phrase critical persuasion to describe it—“critical” in that it would consciously address, and submit itself to, its audience’s prudential scrutiny. It would seek to be reasonable rather than rigidly rationalist and, in its proportionate response to the legitimate demands of emotion and identity, would strive for actual truthfulness rather than rhetorical “authenticity.”
But how to get there from here? In chapter 7, we heard George Orwell hoping it was possible to “bring about some improvement by starting at the verbal end.” What could that mean for us?
Language and Trust
On the face of it, the crisis in our public language is a crisis of trust in public words and the people who say them. Trust is the foundation of all human relationships, and most of us know what it feels like to lose someone’s trust, or our trust in others. We also know how hard trust, once lost, is to regain.
But there’s more here than meets the eye. First, this falling off in trust in public language is only relative. As I noted in chapter 1, people never trusted politicians that much. When the Palace of Westminster burned down in 1834, the writer Thomas Carlyle was one of several observers to note the cheers and applause from the large crowd of onlookers—“There go their hacts [acts]!” In 1944, just after the success of D-Day, and at a time when the UK had a government of national unity led by Winston Churchill, Gallup conducted a poll in which they asked respondents whether they thought that British politicians were out for themselves, for their party, or for their country: 35 percent said they were out for themselves, 22 percent that they put their party first, and only 36 percent that they cared most about their country.1 When it comes to distrust, there is little new under the sun. Nor is every kind of distrust harmful. It’s hard to imagine a serviceable form of human prudence that didn’t have healthy skepticism at its core.
Some have cast doubt, moreover, on whether the recent “crisis of trust” is as dire as is often claimed. In the philosopher Onora O’Neill’s 2002 BBC Reith Lectures on the subject of trust, she pointed out that in their daily actions and choices, people routinely demonstrated practical trust in the same institutions and professions that they told the pollsters they distrusted. They might claim to have less faith in the medical profession, for instance, but that didn’t stop them going to the doctor. “We may not have evidence for a crisis of trust,” O’Neill concluded, “but we have massive evidence of a culture of suspicion.”2 Although in her view certain institutional and cultural practices actively encouraged or spread distrust, the public had so far proved largely immune to it, and the talk of a “crisis” was somewhat overdone.
Things have changed significantly in the years since Onora O’Neill gave her lectures. In many Western countries, levels of trust have fallen much further. When YouGov asked members of the British public that same Gallup question about self, party, and country in 2014, seventy years after the original poll, a mere 10 percent of respondents said they believed that politicians put their country first.3 More important, the distinction O’Neill drew between low levels of stated trust and high levels of functional trust no longer feels secure.
The public are voting with their feet. Their distrust of traditional politicians has caused many to support antipoliticians and radical alternatives. In the UK in 2016, it drove much of the vote to leave the EU. Instead of merely claiming not to trust the mainstream media, a growing number of people no longer consume it at all. A significant minority of parents, as we saw, have rejected the settled advice of that distrusted medical establishment and have refused to have their children vaccinated.
The “culture of suspicion” diagnosed by O’Neill has spread from opinion polls to voting, political activism, and civil unrest, and to private choices about everything from privacy to food safety to financial services. No doubt most people still end up functionally trusting most public services and institutions most of the time, but it is hard to deny that distrust—and the fury and sense of betrayal associated with it—is having a tangible and growing impact on our world. The word crisis does not seem too strong a description.
Some of the causes are so deep-seated that it might take decades, or longer, before our public language returns to full equilibrium and utility. But what, if anything, could the various players we have encountered in this book do now to stop further erosion, and perhaps even to begin the task of restoration?
* * *
Let’s begin with the professional politicians. The first point is the most obvious. If you say one thing and do another, the public will lose its trust in you. People may let an antipolitician get away with murder—in the United States, UK, Italy, and elsewhere, election campaigns have become like celebrity pro-am golf tournaments where the amateurs are almost encouraged to miss shots and goof around—but not you. When it comes to the biggest decisions, above all the question of war, we regard deceit, even recklessness with the facts, as a hanging offense.
Don’t try to fool the public about who you are. If you look like a senator and talk like a senator, only the loopiest antics—Senator Cruz, please step forward—will convince them that you are not a member of that hated elite. And if voters are bound to view you as a professional politician, common sense suggests that you should think carefully before pouring buckets of ordure over your colleagues and yourself. Judges and doctors and generals don’t do it. Indeed, they take care—especially when a scandal raises wider questions about competence or ethics—to talk up their calling. All are trusted more than you are. Modern politics is like the last scene of Reservoir Dogs, with everyone aiming a gun at everyone else. Recognize that you can’t shoot without getting shot yourself, and leave your more suicidal peers to finish one another off.
Treat the public like grown-ups. Share some of your actual thinking about policy, including the painful and finely balanced trade-offs you face, with the people you want to vote for you. There’s no need to talk down to them; most of the citizens you serve are not experts in economics or planning or public health, but that doesn’t mean that they’re stupid or incapable of understanding evidence or argument. If you can understand it, they probably can too.
Almost all modern public policy decisions are finely balanced. The evidence is uncertain, there are arguments and risks on both sides, and the decision makers must weigh probabilities rather than certainties. Admit it. Take the public into your confidence. They’re unlikely to trust you if you’re not prepared to trust them. And admit mistakes, clearly and quickly.
Don’t brush reality under the carpet. If you’re of the Left, and discover that income inequality is falling rather than rising (as it has done in the UK after the crash), or if it turns out that inequalities between generations may be more significant than those between classes, don’t deny the facts for ideological convenience. Get to the bottom of them—then go on to point out why they nonetheless raise issues of social justice or may lead to future problems.
Distilling complex public policy into plain language is difficult—but it must be done. In large measure, modern government is communication, yet the communications departments in most ministries and arms of government are full of frazzled time-servers. Clear them out and find a few actual writers. Throw in some graphic artists, videographers, and multimedia producers to boot. And while you’re at it, insist that the army of technocrats on whom you rely also take some lessons in lucid, unpatronizing expression. Then, whether they like it or not, get them out in front of the mikes and the cameras. Give your focus groups and your A/B testing platform a break from the search for the best phrases for party political attack; ask them instead to find the clearest way of laying out public policy choices.
Democratic politics is intrinsically and necessarily adversarial, and party (and sometimes personal) political advantage will be front and center in much of what you do. But consider what the evolutionary biologists call reciprocal altruism when it comes to public policy discussion. If you use every trick in the book to stop your political opponents from getting a fair hearing for their ideas, you can hardly object when they do the same to you. Your bold new ideas about the environment, or how to address the pension burden, will never get a proper airing unless there’s a political and media climate that provides space for serious discussion. There are risks in trying to break the vicious circle, not least that your own colleagues—who are themselves quite inured to it—will accuse you of naïveté or cowardice in the face of the enemy. Voters, on the other hand, might actually find it refreshing. And if you make the leap, you might be lucky enough to find one or two brave souls on the opposing side of the House prepared to take a chance on that old-fashioned thing called statesmanship.
This is not a plea for compromise as such. The two sides in even a constructive and courteous debate may end as far apart as they began, and the public may be faced with a stark alternative as a result. That’s as it should be: optimal public policy choices do not always sit in the middle of any given political battle. It’s rather that no matter how deep the political divisions, it’s always better to unearth the fundamentals of the argument and to expose them to the public. As we’ve seen repeatedly in this book, awkward policy areas that are ignored or reduced to the status of props in the pantomime of party politics seldom solve themselves. Instead they return to haunt the politicians who tried to bury them. Immigration, inequality, the aspirations and concerns of ethnic, cultural, and national minorities are all pressing current examples of this.
Spin has always been a part of politics and probably always will be. Have a care all the same. Machiavellian news management can still be effective in controlled societies where even outright lies may never catch up with you—thus the apparently never-ending political success story that is Vladimir Putin—but in the 360-degree digitally connected West, deniability is not what it used to be. As successive political leaders on both sides of the Atlantic have learned to their cost, what your attack dogs say “unattributably” to further your cause with words and in ways to which you yourself would never publicly stoop always ends up being tracked back to you. Your people leave your fingerprints wherever they go, and their character—their cruelty, their intimidation, their hypocrisy—soon becomes continuous with your own. Spin worked best when it was nameless and when almost everyone inside politics and the media colluded in it. Once it was given a handle, and the media started to report on it as a story in itself, its best days were over.
Don’t always listen to communications advice from your closest political allies. You may well agree with their verdict on the moral character of the gang that faces you across the aisle, but think carefully before you give your own side the red meat it craves. In the run-up to the 2015 British general election, several friendly commentators urged Ed Miliband to look and sound more “angry” about the Tories and their policies.4 No doubt anger was what Labour’s heartland supporters wanted, but was it really the most effective emotion with which to sway the unaffiliated, undecided voters on whom election victory actually depended? You need of course to do enough to keep your own troops motivated and united, but they are not your principal audience—not, that is, if you want to get into power.
Aristotle was right: amplification is a necessary implement in the politician’s rhetorical toolbox. Life is short: you need to focus your audience’s attention on your key message and to accentuate the contrast between your view of the world and that of your opponent. Conditional clauses and qualifying adjectives and adverbs are all very well in legal documents and policy discussions behind closed doors. Out in the open, the public wants clarity and crispness, and the news media crave short, punchy headlines. So there will be occasions when amplifying and simplifying your point—your diagnosis, your attack, your promise—makes good political sense.
But constant exaggeration is dangerous. Initially, it can seem to win you a kind of respect. Adult policy debate can sound persnickety or just plain boring, and politicians who break out of the charmed circle and offer simple, vivid judgments and single-sentence solutions can initially sound adventurous, honest, inspiring. But for how long?
Exaggeration is a drug. It delivers an instant high but can have deleterious long-term effects. Every word you say can and will be taken down and used in evidence against you, and one day you may come to rue that sweeping generalization or vicious put-down. Indeed, exaggeration can become your trademark so completely that the media will always expect it of you and, unless you say something essentially unreasonable, will not report what you say at all. Before you know it, you will find yourself playing a stock role in a stale political soap opera, and your ability to be heard on matters of substance will be gone forever.
Nor will the language itself emerge unscathed. When Margaret Thatcher died, several leading lights of the Left said that she had “wrecked” Britain. Not “damaged,” not “took in the wrong direction,” not “implemented divisive economic policies,” but “destroyed,” “wrecked,” “ruined.” I once walked into a hut in the highlands of Ethiopia during that country’s civil war. It was full of women who had been extensively burned with phosphorus bombs dropped by the regime of Mengistu Haile Mariam. There was no pain relief for the women, nor any medicines as far as I could see, nor any prospect of medical attention. If the UK is a wrecked land, what words are left for these women and their plight, or for countries like Syria and Libya and Somalia, where wrecked means bombed and burned-out cities, slaughtered children, lawlessness, despair?
Given the character of contemporary politics and media, it requires almost superhuman self-control not to give in to indiscriminate exaggeration—especially if your opponents have already abandoned all restraint—but it’s still the wisest course. Exaggeration wins fewer elections than its devotees imagine and, even when it does, things usually unravel rapidly. Let’s see if that proves true of the UK’s headstrong 2016 Brexit vote. For political parties, there’s a further risk: once begun, an internal competition for who can sound the most radical or the most ideologically pure can quickly become unstoppable. Parties that succumb to this temptation—the modern Republican Party is a splendid current example—lose control not just of collective discipline but of any coherent sense of their own identity.
So learn the right rather than the wrong lessons from the antipoliticians. Steer clear of their bogus simplicity and instead acknowledge the complexity of real-world policy. “There must be times,” opined the Daily Telegraph in early 2014, “when David Cameron envies Nigel Farage. The simplicity of the UKIP leader’s message obviates any need for subtlety or nuance. His position is easy to articulate: he wants Britain to leave the EU—no ifs nor buts.”5
In the end, David Cameron lost his job and his political career in the face of Nigel Farage’s supposed “simplicity.” Yet the truth is that Britain’s exit will be anything but simple—if the country still wants access to European markets and continued influence in European affairs—and the Leave camp’s wild promises on immigration, tax, and unfettered sovereignty may all have to give.
Throughout his career, Boris Johnson’s quasi-antipolitical persona—he presents as a postmodernist Bertie Wooster, flamboyant and pawkily self-aware—has allowed him to revel in gaffes and political gyrations that would have flattened a more straitlaced colleague. The Brexit referendum posed a problem for him, however. Without strong political convictions on the matter, he could have joined either camp, and argued for a time that the UK should vote to leave and negotiate to stay (he’s fond of saying that his policy on cake is pro having it and pro eating it too). Eventually he summoned the media to a flash-mob press conference. “Let me tell you where I’ve got to … which is, um, I am, um, I’ve made up my mind,” he told them. Out it was.6
Deep social and political forces were at work in the referendum, but Johnson’s arrival energized the Leave campaign and helped them to victory. Millions of disadvantaged Britons were persuaded to cast a protest vote against the country’s elites by a man whose own résumé (Eton, Oxford, The Spectator, The Daily Telegraph, MP for Henley, then Uxbridge) spoke of nothing but power and privilege. But, having never thought he would actually win, Boris looked more perplexed than pleased by the result, and it was clear that there had never been a plan for what to do next. Within days, his lack of consistency and scruple had scuppered his hopes of becoming prime minister. Eccentricity and “character” are fun. They can even look like trustworthiness for a time. Real leadership calls for something more: substance.
But countercultural zaniness is not restricted to British politics. Here is the veteran Italian satirist Dario Fo eulogizing the comedian and latter-day party leader Beppe Grillo as a descendant of the medieval giullari, wandering entertainers who juggled with “words, irony and sarcasm” as well as with clubs: “He is from the tradition of the wise storyteller, one who knows how to use surreal fantasy, who can turn situations around, who has the right word for the right moment, who can transfix people when he speaks, even in the rain and the snow.”7
Putting aside the ominous history of magicians and mountebanks in modern Italian politics, the business of leading a party, let alone running a country, requires rather more than that admirable gift for “surreal fantasy.” In the 2013 general election, Grillo’s Five Star Movement (or M5S) won a quarter of the national vote—and then promptly began to descend into infighting and factionalism that left “the wise storyteller” telling the media he was feeling “pretty tired,”8 and being accused by some of his closest colleagues of remoteness and autocracy. Grillo and M5S may still be an attractive home for antiestablishment voters (their anti-EU stance helped them perform well in the 2014 European elections and in more recent regional and mayoral elections), but the thought of them ever actually getting into power is scary.
No, the lesson to learn from the antipoliticians concerns ethos. They look and sound like human beings. They lack the polish and control of the conventional politician. Their anger and impatience are not a carefully focus-grouped and calibrated rhetorical gambit but something they palpably feel. They make mistakes, change policies without warning and sometimes for no apparent reason, say things that would be deeply offensive if uttered by a mainstream political leader. And yet—for at least as long as the antipolitician is an outsider—the public is disposed to forgive and forget. Their arguments (logos) may be simplistic, but at least they’re not automata. That can be enough, even for sophisticated voters, to bridge the gap of persuasion.
It’s something that precious few politicians from the established parties can pull off. Most have been schooled never to depart from their talking points, never to concede error, never to lose it. For them, media interviews are a stylized game: any difficult question is greeted with a nonanswer or an answer to an entirely different question, the one that the politician has been coached to give. The effect is evasive, brittle, alienating. Perversely, the result of the heroic effort put into not making mistakes is that mistakes are the only thing the media end up pursuing.
I spent an hour with Hillary Clinton when she was secretary of state. In private conversation, she came across as exceptionally intelligent, thoughtful, open-minded, self-deprecating, human, mischievous. My colleague at The New York Times Magazine, Mark Leibovich, had a similar experience when he was interviewing her off the record. Indeed, the encounter was going so well, and she was speaking with such personality and eloquence, that he suggested that they go on the record. At once, he says, the armored visor came down, and she shifted to the tried-and-tested defensive boilerplate of the stump speech and the official press release. Her fault? Our fault? Those questions don’t help much. Collectively, we’ve managed to get to a place where it is almost impossible for the public to get a sense of what leading public figures are like underneath the embattled public persona. Between us, we need to figure out a way of demilitarizing.
Finally, pathos. It is easy for politicians to convince themselves that they truly understand the public; that the sum of audience information to which they are exposed—quantitative and qualitative data, taken with their own inevitably rather random interactions with voters—adds up to a complete picture of the public mood; and that the models and segmentations that the marketing specialists construct for them are firm enough to bear the load of everything they want to build on them—the policies and the political tactics, the keywords and taglines, the stories and the narrative shapes.
In truth, audiences are like the sea, infinitely diverse and changeable, and this morning’s conditions are a very imperfect guide to this afternoon’s, let alone tomorrow’s. Great rhetors are like great sailors, their skill lying less in the way they turn the wheel than in their ability to read the sea ahead and to respond to it fluently and intuitively. It’s not that data and instruments are useless—sensible sailors consult satellite radar images and the GPS as well as their instincts—but that they are a complement to talent and the lessons learned from long experience rather than a substitute for them.
We live in the age of data science, and some straightforward human behaviors—patterns of buying consumer goods or browsing through online content—can be predicted across a given population with striking statistical success, as Elmer Wheeler foretold. But the higher-order questions with which the public language of policy and politics necessarily concerns itself involve matters of identity and morality, of a sense of individual and collective self, and an ever-changing and ever-disputed picture of what the good life consists of. None of these will be amenable anytime soon to tracking pixels or algorithmic optimization, or even to the most inspired polling guru.
The relationship of any politician with the public is ultimately a strictly human affair. It’s you and them. That well-heeled band of experts no doubt have their uses, though it’s important always to keep in mind that the segments and types they trade in—that “new” generation you must win over, the soccer mom and white van man—are abstracted versions of reality rather than reality itself. And once you’re up there on the podium or staring down the barrel of the camera, they’ll be nowhere to be seen. The only empathy your audience will be able to judge is yours.
Human beings are by nature social animals; we have an astonishing capacity to tell whether someone’s openness to us is genuine or sham. The ability to truly listen is every bit as important to the rhetor as any other gift in the speaking department—it is in fact a part of the same gift. Without it, pathos and ethos inevitably clash or drift apart, and logos, the argument you wanted to make, the reason you stood up in the first place, falls on deaf ears. So listen.
* * *
Politicians are not solely responsible for the crisis in public language, nor are journalists and editors. But that doesn’t mean that the media are innocent bystanders. So what can my trade do to respond to the issues I’ve raised in this book? What steps could we take as an industry, an academy, or even just as individuals to stop the rot?
First let’s reject perspectivism, the notion that everything is a point of view, that “truth” is a meaningless concept: those who say that generally do so because reality doesn’t suit them. There are such things as facts, and it is still the job of journalists to report them. But this does not mean that we should be naïve in our realism. We can recognize that conscious bias is a commonplace in journalism, and that even journalists who strive to be impartial can be in thrall to unconscious narratives and prejudices. We can acknowledge the undertow of political and social power structures and accept, to dilute Marshall McLuhan, that the medium always influences the message and that changes in the form, length, velocity, and interactive potential of media all affect the meaning they convey. That is part of the burden of this book.
By all means be a critical realist, then, recognizing that the way human beings perceive, make sense of, and express reality is always mediated and subject to distortion. But accept too that there is a difference between the reasonable observation that, given the way history is recorded, we may never fully understand the rise of Al Qaeda and the attack on the World Trade Center, and the crackpot suggestion that Jerusalem and Washington were really responsible for 9/11.
Nothing that has happened to our world politically or technologically has made the need to uncover the truth less pressing. If anything, the world has become harder to understand, and the tools and techniques available to the world’s many liars are more formidable. So we should ignore calls for journalism to become less hostile or adversarial, if that means any softening of skepticism or any unwillingness to pursue a story to its conclusion. Interviews with politicians should be courteous, but they should also be tough and, if the interviewee refuses to answer the question or obfuscates in some other way, toughness is more important than courtesy.
But don’t restrict your toolbox to the instruments of inquisitorial torture. Allow space not just for policy debate but also for policy explanation, and keep your finger off the scales until you get to the opinion pages. Perhaps a heavy editorial hand once sold newspapers or reassured doubtful readers. Today it may put many potential users off, including those younger customers you (and your advertisers) are so desperate to attract.
Give the politicians the space to set out their stall in their own words. Avoid the temptation to drop the initial political statement or policy announcement after its first few outings in your eagerness to move on to reaction and argument, forcing new readers or viewers to infer what the first speaker must have said from the angry response to it. It is a civic duty of serious journalism to allow politicians to be read or heard by the public at reasonable length in their own voice, and to debate with each other in paragraphs rather than ten-second prerecorded bites. Then hear from everyone else—your own experts, the academics and pundits, members of the public—and let battle commence.
Our world is deafened by bogus revelation, venom, and speculation dressed up as proven scientific fact, but true investigative journalism, grounded in evidence and presented cogently on its merits, can still make the whole room fall silent. The public need for it is greater than ever and at every level—from parish council to city hall, up to governments and multinational institutions—and yet the supply is faltering. That’s because investigative journalism breaks most of the rules of modern media economics. It is expensive and time-consuming, has a high failure rate, and often involves the kind of intricate detail that contemporary readers are said to have no time for. Do it anyway.
Great investigations have a restorative power, not only over the institutions and injustices they expose but also over trust in journalism itself. And for brave news organizations, investigations offer another potential benefit: in a desert of undifferentiated and listicated journalistic packaged goods, they can present a valuable point of distinction, a parcel of high ground that can be seen for miles around.
This is something of a golden age when it comes to the journalism of analysis and contextualization. Backgrounders are not a new thing: mid-twentieth-century newspaper readers followed battlefronts and moon shots and round-the-world solo sailors with the aid of maps and diagrams. As we saw, brief expositions, or “bexbos” as we called them—in other words, secondary video or studio packages that aimed to put the initial piece of reportage into context—arrived at the BBC in the 1980s. In 2012, The New York Times’s feature Snow Fall, the story of a deadly avalanche that engulfed a group of skiers in the Cascades, showed how journalists could weave words, still images, video, and animated graphics and maps into a single narrative whole that would tell the story better than any one medium could ever do.
But analytical journalism can go beyond the essentially descriptive and contextual and, especially in the field of public policy, burrow deep into the fundamentals of a story. Is Obamacare working or failing in practice? Do migrant workers help or hinder a given economy? The answers to these and similar questions typically come not from a Deep Throat in the shadows of an underground parking garage, but from the careful study of often publicly available data. We are still in the foothills of analytical journalism; greater public access to data and advances in machine learning and other forms of artificial intelligence should soon enable it to go farther and deeper than it does today. This is one aspect of contemporary journalism that doesn’t need to be changed—it needs to be reinforced. And cultural pessimists in the media please note: many of those citizens whom you think of as craving nothing but digital prolefeed are eager to make sense of a complex world and are lapping it up.
You probably have your own views about whether good reporting and political partisanship can ever coexist. Many reporters and editors argue that their political view of the world gives their journalism a passion and an explanatory cogency unavailable to the cold-eyed impartialist. To me, real news journalism always strives for objectivity and political impartiality—everything else is special pleading. Call it opinion and we can all sit back and agree or disagree with it. Just don’t claim that it’s news: jumbling the worlds of what is and what should be is as incoherent and misleading as confusing astronomy with astrology.
But even politically committed journalists should keep a proper professional and social distance from the people on whom they report. Trying to have it both ways—bosom pal one minute, seeker after the truth the next—is impossible and often leads to the kind of collusion and trading of stories and people that gives journalism a bad name. Occasions like the White House Correspondents’ Dinner replace the proper relationship between politicians and the press with a kind of mutual masturbation: jokey, false, politics as celebrity comedy, with bouncers at the door to stop any ordinary voters from getting in. Our democratic leaders know that their own modes of public discourse are tainted, so they’re eager to borrow those of journalism, the entertainment industry, and digital culture. If you get too close to them, within a heartbeat they’ll be trying to sound and look just like you.
And beware another threat. The balance of power between the press and advertisers has shifted in favor of the advertiser, and there’s good evidence that many old and new media organizations are bowing to pressure to soften their reporting so as not to offend commercial partners and thereby lose revenue. Allowing your reporting to descend into self-censorship or outright commercial marketing is as great a betrayal of journalism as any—and more pernicious than most because it can be so hard for readers to spot. There is a simple fact to be faced here: whatever they say in their annual corporate social responsibility review, few companies are in favor of “transparency” when it comes to themselves. They may try to bury bad news by hiding it and obstructing legitimate journalistic inquiry or, if that fails, by using threats and commercial leverage.
Provided they are clearly labeled to distinguish them from newsroom output, advertorial and its digital cousin “branded content” are fine, but police that border carefully. The public language of politics has become a self-interested form of marketing-speak. Don’t let the same thing happen to the language of journalism.
The greatest threats are also the most fundamental. If the cardinal rule for politicians is not to say one thing and do another, that for journalists is not to lie. Very few professional reporters or editors deliberately perpetrate categorical untruths in their work, but many have become habituated to practices which, day by day, generate a legion of little lies: the twisted or “improved” quote; the omission of facts or context that might spoil a given story; the use of the question mark not to ask a question but to present a wild claim or speculative smear as if it were a matter of legitimate debate; the out-of-context addition of photos or other images from a different time and place to suggest an attitude of guilt, stupidity, or inappropriate smugness in relation to the present story. This is mendacity as subliminal habit. Each little lie may seem trivial; but they add up.
Perhaps the most pernicious moral risk facing the modern journalist is the sin the medieval theologians called accidie. It’s the least discussed of the seven deadly sins—”sloth” is how it is usually rendered in English—but what it really means is going through the motions, losing a grip on the real meaning of words or actions. In journalistic practice, accidie can lead reporters to twist reality beyond recognition until it vaguely resembles one of their limited repertoire of routine narratives, and to exaggerate and demonize less out of malice than because that too has become standard operating procedure, what the story “needs,” and definitely what their editors and—who knows?—perhaps even their readers have come to expect.
Have the digital insurgents managed to branch away from these old and deeply rutted paths? Politicians would like to think so, and they sometimes go through phases of ostentatiously offering interviews to BuzzFeed and the Huffington Post rather than the Wall Street Journal and the BBC to underline the point. And it’s perfectly true that great leaps have been made in multimedia, user experience, audience development, syndication. Journalism has never been more effectively packaged or efficiently distributed.
But more often than not the content that pops out at the end of this shiny digital tube bears an uncanny resemblance to the endlessly repeated and reused stories that people have been reading in the tabloid press for more than a century. Although there have been advances in analytical reporting, there has been surprisingly little innovation in story shape or in the narrative tricks and tropes of traditional journalism.
The result is a special case within the wider crisis of public language: that of a tribe whose discourse no longer has the breadth or the adaptability to reflect reality, but whose befuddlement is such that, even if they are aware of the dilemma, they are more likely to blame reality than themselves. Perhaps this is the reason that the beast so often appears feral. It knows no better, is too set in its ways, too invested in the belief that anger and bile always get the biggest audience, in the end too frightened to try anything different. The important question about much old-fashioned journalism is not whether it can survive as a profession but whether it deserves to—and whether anyone would miss it if it disappeared.
Language and Institutions
None of this will be solved, or even ameliorated, without meaningful progress on the economics of media. Silicon Valley engineers taught us to believe that news is atomic, in other words that consumers are chiefly interested in catching up with headlines and summaries of individual stories, that they don’t really care who provides these units of news, and therefore that nothing is lost if they are aggregated from many different journalistic sources by an algorithm (Google News) or some combination of algorithm and human editor (Huffington Post). Third-party aggregation, the argument went, might even prove superior, because it could offer individual users a wider choice of sources and, by tracking their consumption, predict and prioritize which stories they are most likely to find “relevant.”
To the person with a hammer, everything looks like a nail. It’s easy to see why computer scientists who were adept in the parsing, organization, and distribution of information, but had little expertise or interest in content as such, should have thought like this. Nor, particularly at the level of headlines and home screens, is the idea entirely wrongheaded. If someone never clicks on a sports story, over time it probably makes sense to drop sport down the list of stories on his or her home screen, even if you are a news provider with an aspiration to be a “journal of record.” It is nonetheless a woefully impoverished view of how real human beings interact with news and other forms of journalism.
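To make that engineering logic concrete, here is a minimal sketch, in Python, of the kind of click-driven personalization described above. The class, the function names, and the simple smoothing scheme are illustrative assumptions of mine, not a description of how Google News, the Huffington Post, or any real aggregator actually ranks stories.

    # A hypothetical sketch of click-based personalization: topics a reader
    # habitually ignores (sport, say) drift down the home screen over time.
    # The names and the weighting scheme are illustrative only.
    from collections import defaultdict

    class ClickBasedRanker:
        def __init__(self):
            self.impressions = defaultdict(int)  # times a topic was shown
            self.clicks = defaultdict(int)       # times the reader opened a story on it

        def record(self, topic, clicked):
            """Log one impression, and a click if the reader opened the story."""
            self.impressions[topic] += 1
            if clicked:
                self.clicks[topic] += 1

        def score(self, topic):
            """Smoothed click-through rate; unseen topics start from a neutral prior."""
            return (self.clicks[topic] + 1) / (self.impressions[topic] + 2)

        def rank(self, stories):
            """Order stories so that habitually ignored topics sink down the list."""
            return sorted(stories, key=lambda s: self.score(s["topic"]), reverse=True)

    # A reader who never opens sports stories sees them slip down the home screen.
    ranker = ClickBasedRanker()
    for _ in range(20):
        ranker.record("sport", clicked=False)
        ranker.record("politics", clicked=True)

    home_screen = ranker.rank([
        {"headline": "Cup final preview", "topic": "sport"},
        {"headline": "Budget vote tonight", "topic": "politics"},
    ])
    print([s["headline"] for s in home_screen])  # the politics story comes first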
A great newspaper, news program, or digital news site does not churn out news like a pile of individual bricks, which can be laid by any passing stranger in combination with bricks from any number of other brickworks and turned into whatever shape house that stranger wants to build. It has a signature, a point of editorial view. It reaches out to a prospective audience with the offer of a relationship that transcends the mechanical transmission of new facts—a relationship that is cultural, political, emotional, communal.
Most “stories” are not reports about one-shot news events but are installments in long-running, and often slowly developing, political and social transitions and conflicts. They relate to what has gone before and what is to come, and the reader or viewer gets both informational value and a kind of comfort from coming to understand the approach of a given reporter, or columnist, or news brand. Consistency matters. Provenance matters. Trust—earned the hard way, through diligence, professionalism, and high standards over years and decades—matters most of all.
A brick is a brick. Journalism is a complex cultural artifact. Indeed, in its deep and synchronous connectedness with politics, society, and the wider culture, it is probably more complex than most other forms of literature or, dare I say, art. Journalistic organizations are not look-alike factories producing the same undifferentiated commodity. They are cultural institutions.
Alas, across the developed world, the majority of them are failing cultural institutions. The slow-motion collapse of their business models, and the defeatism and resentment that go with it, are no doubt partly responsible—as Tony Blair claimed in his “feral beast” speech—for the ethical and behavioral weaknesses in contemporary journalism we discussed a few pages ago. But it now looks as if causality runs the other way as well: that, once robbed of its transient distributive advantages, bad, unambitious journalism produces a bad business.
The easy profits that media companies once enjoyed bred a widespread complacency about quality. In their different ways, newspapers and television companies both enjoyed privileged access to users and were able to charge advertisers handsomely for the right to put their messages in front of them. High-margin advertising became their main source of revenue. Economically, advertisers were their real customers, readers and viewers a means to an end. In many newsrooms, the result was an instrumentality that could sometimes border on contempt.
Today across the West, that model is unraveling, rapidly in the case of physical newspapers, more slowly but still relentlessly when it comes to broadcast TV. And in digital, it essentially doesn’t work. Despite vast headline audiences, Web advertising is a problematic revenue stream for almost everyone other than global platforms like Facebook and Google. A vicious circle has set in: publishers respond to low advertising rates by overloading their pages with too many ads; readers retaliate by turning away or installing an ad blocker. On smartphones, the problem is even more fundamental—there is no adjacent “white space” to sell.
A few publishers, like The New York Times, are reinventing digital advertising. Rather than relying on the principles of adjacency and stolen attention, we’re working with commercial partners to develop advertising messages which—while clearly labeled to distinguish them from our own journalism—are compelling enough to command interest and consumption on their own in our main content feed.
It’s a demanding model, however, which requires far more brand equity, investment, and creativity than most publishers can muster, whether they are legacy players or digital newbies. And even for the lucky minority, advertising on its own will not be enough to pay for news. Membership, freemium models, e-commerce, and events will not be enough either. There’s nothing else for it: if high-quality journalism is to survive, the public will have to pay for it.
At The Times, we have the largest and most rapidly growing digital pay model for news in the world. We let more than one hundred million people sample our journalism for free every month, but we still believe that every story and summary and video we create should be worth paying for.
It’s a high bar. No one needs to pay for scandal, paparazzi shots, celebrity news, listicles, or hate-filled prejudice and slander on the Internet. You can eat as much as you want of that free of charge. A pay model for news works only if you offer journalism that is genuinely distinctive and that delivers real utility and value. The real reason most Western newspapers have failed to get digital pay models to work is not some deficiency on the part of their readers. Their journalism simply isn’t compelling enough to sell.
I’ve made the case for serious, ambitious, well-funded journalism for civic reasons. If you are a publisher of digital news, new or old, you should embrace this agenda for reasons of survival. The same goes for your counterparts in TV and radio, where radical disruption is only just around the corner. If enough players act now, there could actually be more first-class journalism in the future than there was in the days of easy money.
But many legacy news organizations are too set in their ways to change and will probably go to their graves blaming everyone else for their sad demise. In the meantime, particularly in the UK and Europe, they will likely place more faith in lobbying for regulatory easement than in fundamental business reinvention. Where they can, they will use their political muscle to attempt to gain local protection from Silicon Valley and to disembowel or destroy the public broadcasters.
Given the level of commercial disinvestment in journalism, we might expect political support for the BBC, PBS, NPR, and other public broadcasters to be increasing. The opposite is the case: across Europe, in Australia, Canada, Japan, and elsewhere, governments are laying siege to the only institutions that can guarantee universal access to at least some high-quality serious journalism during this long and difficult digital transition. Their civil servants—tax-supported themselves—are still wedded to 1980s free-market theories about the coming age of media choice that have turned out not to be true. Their political leaders are in hock to commercial media owners, who have a selfish interest in the destruction of the public broadcasters.
Few modern politicians have the courage to acknowledge it, but the BBC and its sister broadcasters around the world are far more than state-owned purveyors of information, education, and entertainment. Byzantine and fallible, chaotic, often maddening, they are full of creativity and public-spiritedness—not obsolescent throwbacks but bulwarks of modern civilization.
Scrutinize them. Reform them. Hold them to account. But recognize what is at stake. If wrecked or hollowed out, they will be impossible to rebuild. The commercial interests that are lobbying for their marginalization or abolition are themselves likely to fail in any case. And once the public broadcasters have been neutered, and subscription is the main way of funding journalism, what will happen to those citizens unable to afford journalism of real quality?
* * *
For good or ill, institutions—not just those involved in broadcasting and media but also institutions across public life—are critical to the future of our public language. Indeed, institutions are systems of public language themselves. They originate and then preserve the conventions under which a given community addresses issues, reaches decisions, defines and polices the boundaries of the sayable. When the way they use language becomes decadent, the damage is felt everywhere.
If things are to improve, our institutions must change fundamentally. First, they must accept that their favorite language—the contemporary jargon of “accountability” and “openness”—is a busted flush.
During the global financial crisis of 2008, the systems of governance, accountability, and compliance that were supposed to ensure proper oversight of individual banks and financial institutions, and of the financial system as a whole, were shown to be a farce. In the aftermath, instead of an honest acknowledgment of how far the interlocking safeguards of corporate governance, financial regulation, central bank oversight, and the law had failed, the authorities simply pulled the same levers harder. If five thousand pages of banking regulations didn’t work, why not try ten thousand? When regulation is treated with weariness and contempt by those to whom it is meant to apply, and is incomprehensible to the rest of us, why should the public place an ounce of trust in any of it? This too is accidie, the sin of pretending that empty words are really full of meaning.
The culture of compliance is a false god, a failed rationalist attempt to turn the quintessentially human qualities of honesty, integrity, and trust into a regulatory algorithm. Abandon it. Start from scratch. Fit your rules around the basic anthropological reality that trust is central to all our affairs and that trust is a subjective business. Shared values and peer pressure to do the right thing are more likely to prove effective before the fact than officious attempts to codify good behavior—attempts that in themselves do nothing to change hearts or improve organizational culture, and always seem to generate perverse incentives and outcomes.
What goes for financial regulation goes for lawmaking more widely. In The Rule of Nobody, the American lawyer and writer Philip K. Howard chronicles the vast waste and paralysis associated with an overcomplex, contradictory, and obsolescent legal code. To the wider economic and social cost, let’s add public incomprehension and alienation. Law is a primordial and paradigmatic form of public language—Moses descending from Mount Sinai with his tablets. Turn it into a cacophony of technocratic babble and don’t be surprised if the tribes of Israel grow restive.
Institutions must decide what they stand for. If you stand for scientific objectivity, don’t squander it by lending your authority to political advocacy. If you run a university and claim to stand for intellectual and creative freedom, get off your backside and defend those freedoms. Extremism, including both Islamophobia and anti-Semitism (often skulking behind the word anti-Zionism), is rising on many Western campuses, while the leaders of universities and other institutions sit in a funk of liberal cognitive dissonance, or hide behind spurious claims about public order and responsibility. Who ever said it was going to be easy? High principles usually require risks and sacrifices.
You should of course seek to understand and empathize with everyone. I hope you will combine that empathy with clarity and courage about the sovereignty of free speech and the right of everyone to hold and express divergent views. But whatever you decide, it’s time to get off the fence. Don’t, by your example, teach a generation of young people how to equivocate and cower.
When inexperienced pilots find themselves in a spiral dive, their instinct is to tug the stick back and raise the nose. In a normal dive, that restores the airplane to level flight, but in a spiral dive, it tightens the corkscrew and seals their fate. To return to level flight, you first have to straighten the wings and only then raise the nose. But with the ground hurtling toward you, the false survival instinct to pull back can be ungovernable. Many of today’s institutions are in the grip of exactly this psychology. Take a deep breath. Look calmly at the instruments. Straighten the wings.
Teaching the Oily Art
And what of the public themselves, that audience which is no longer just an audience and for which we have no entirely satisfactory word in the English language? Media executives oscillate between affectless terms like user and consumer and customer, while politicians typically talk about voters or the electorate. All these betray an underlying instrumentality: they are definitions of our listeners based on what we want to get out of them. The word citizenry would perhaps fit the bill if one didn’t feel the need to don a tricorn hat and brandish a flintlock as one said it. So let’s stick with the public, which at least has the advantage of directing our attention to the space that this group of people occupies when they step out of their private lives and congregate, and where they listen to, and sometimes speak, the language that is the subject of this book. What benefits would a healthy, high-functioning public language deliver to them? And what first steps could they and we take now to establish the conditions for it?
We can agree that broad public deliberation is central to the idea of democracy—the people weighing up the issues and deciding which proposal to back or which party and which leader should govern. But what does deliberation involve? In the English-speaking world, the simplest and most influential model is trial by jury. The jury hears all the evidence and arguments and then goes away to consider its verdict. Consideration in this case means discussion and debate among the individual members of the jury and an attempt to reach unanimity.
When we think of idealized popular political deliberation, it is tempting to think of the jury room writ large, of a dialectical process to which every single citizen should in principle contribute, leading to a decision in which all—even the dissenters—play a part. We know of course that the issues are more complex and the jury vastly more diffuse, but we can still think the more the merrier: the more engagement, the more argument, the more personal commitment the better.
But is that realistic? Being in a jury puts every individual juror on the spot. Isn’t the truth that, in the absence of that kind of specific public duty, most people prefer not to advance their own opinions or critique those of others? Only a small percentage of those who read a given online news story share it with their friends, and only a small percentage of them add a comment. In those countries where membership in a political party is voluntary and doesn’t offer any social or career advantage, most people prefer not to get involved at all. We may want to encourage and applaud the activists and cheerleaders, the bloggers and controversialists, but the legitimacy of democracy has always depended less on them than on the 90 percent or more of the population who take part in none of those things, who watch and listen and who, if they discuss politics at all, do so in a purely private setting.
The Athenians understood this. The demos was sovereign—the public was boss, there was no question about that. But they exercised their sovereignty in practice by doing little more than turning up, listening, and making a collective decision. The juries that reached a verdict in trials were very large and their deliberation did not involve asking questions of the witnesses or the rival rhetors, or even conferring with one another. All citizens were expected to use their own practical wisdom independently to reach a conclusion and cast their lot (vote) accordingly. Justice depended not on unanimity or the ability of the more outspoken jurors to persuade their peers but simply on the aggregation of the mass of individual opinions.
Public deliberation can mean modern citizen-juries, panels of voters debating the goals and trade-offs the politicians are grappling with, but it certainly needn’t. And it is unrealistic to imagine that more than a minority of voters would ever want to devote the necessary time to such a process, even if technology made it straightforward.
Contemporary representative democracy does not depend, any more than did Athenian direct democracy, on every citizen becoming actively involved in policy debate or routine political decision making. It depends on a public that is willing and able to absorb the facts and listen to the arguments and, on the basis of that, to decide every few years who should govern on their behalf.
Perhaps that sounds too modest, too inert. In a democracy it is everything. In fact, more than political parties, more than leaders, it is democracy. But it is the argument of this book that the way today’s political leaders and media speak to the public is making this essential democratic duty harder to discharge. Some citizens are consciously or unconsciously opting out of their constitutional role altogether, while those who do take the trouble to participate often do so on the basis of a distorted view of reality and of the choices in front of them. If we are to do anything to address this in the short term, it is the politicians and the media who will have to bear much of the burden. But is there anything that the citizens can do to better prepare themselves to be a good sovereign?
Rhetoric has always been controversial—Plato would gladly have strangled it at birth. But as we’ve discovered repeatedly in this book, ignoring it or pretending that it’s possible to abolish it only makes things worse. Better to listen to the words of Parmenides’ Goddess quoted at the start of this chapter. At least in my reading of this frankly baffling9 passage, the Goddess acknowledges the distinction between true understanding and opinions but argues that we must pay attention both to the real heart of any question and to the often erroneous opinions of other human beings, because they too are intrinsically important. Rhetoric is the language in which those opinions are conceived and shared.
What the Goddess implies in her injunction is that opinion and the rhetoric of opinion will always be with us. There is no magic wand or program of beautification that can transport us from our world to one in which the only words that are ever uttered express perfect truth, or perfect authenticity, or perfect anything else. That is not our nature as human beings and it cannot be the nature of our language either.
So let’s put public language at the heart of the teaching of civics. Constitutional history, the structure of the different arms of government, how a bill becomes law, the way our courts work—these should all have their place in the curriculum, but none of them is as important as the mastery of public language. Few of us will ever be directly involved in the legislative process. Nor is detailed knowledge of the workings of the House of Commons or the US Senate likely to help wavering voters make up their minds. But they will encounter rhetoric everywhere—every time they read or look at the news or hear a speech, open up an app or look at an advertisement. The dream of rhetoric as the art of reasonable, critical persuasion depends more than anything else on the emergence of a critical audience.
We need to teach our children how to parse every kind of public language, from marketing-speak to the loftiest political utterances on TV and radio, the Web and social media. Young people should learn the history of political rhetoric and advertising, explore case studies, create their own public language in the form of text, picture, and video.
The media, and especially mission-driven media institutions like the BBC and The New York Times, have an important part to play, as do all organizations—museums, think tanks, foundations—devoted to advancing the public’s understanding of science and other policy areas. We all have a duty not just to red-flag the tendentious and the suspect but also to help our audiences build their own mental model in each major policy area—economic, geopolitical, social, scientific—into which that day’s statistics or political claims can be placed in a context of proportion and probability. They need to learn too how to challenge each model and adapt it in the light of changing circumstances.
This is not the way rhetoric is generally taught today. The humanities as a whole stand at low tide, judged less economically valuable, less worthy of research grants than the sciences, an indulgence for privileged kids or those who don’t know what to do with themselves. And even within the humanities, in most schools and universities rhetoric is ignored. If Cicero were alive today, he’d probably become an economist or a computer scientist. The last subject he’d choose to major in would be rhetoric.
But if anything can hold our brittle public realm together, it is more likely to be the right kind of rhetoric than a clever new piece of code. Let’s remember that rhetoric, like the other humanities, like all great art, wrestles with the most important question that confronts any human society: how are we to live with one another? Let’s teach our children rhetoric.
The Trump Test
Imagine a test. It may be invidious to name it after one individual—Donald Trump is, after all, a symptom rather than the cause of the disease—but at least the title should make my intention clear. The Trump Test is a measure of the health of a public language. To pass it, the language must enable ordinary citizens to distinguish at once between matters of fact and those of opinion, between grown-up political discourse and outright nonsense.
At present, not just in the United States but also in Britain and some other Western countries, our public language is manifestly failing the test. I’ve proposed some steps we could take to stop the decline, but I’d be the first to warn against relying solely on them. Even if they were widely embraced, their effect would be modest. And too many of the actors are trapped in the downward spiral themselves for even that to be certain.
No, if our rhetoric is ever to return to health, we must look not just to near-term changes in behavior but beyond that to the fundamental social and cultural forces that shape our language. When will their balance change and start to favor regeneration rather than disintegration? Are there any early signs of that shift beginning to happen?
* * *
The seeds of renewal of a public language germinate where the cultural pessimists least expect them: out of the mouths of immigrants and refugees; in the border towns and on the jagged edges of our societies, where people have less to lose and more to say because they have more to be angry about; in forms and contexts that are apparently removed from the supposedly serious business of politics and journalism.
The critical reception to Salman Rushdie’s Midnight’s Children (published in 1981) confirmed a growing sense that creative momentum in English literature was moving from its heartlands in Britain and America toward former colonies and countries where English was one language among many, and from the white heterosexual majority population to ethnic and sexual minorities. Immigrants and those whose biographies included long stretches of life in different cultures or marginalized communities turned up with increasing frequency on prize short lists.
All of this unsettled some conservatives, who worried that the focus on minority literature was driven by political correctness, and that this kind of cultural globalization and relativization risked turning English itself into a mongrel tongue. We could debate the first assertion. History suggests that the second is wholly fallacious and that, on the contrary, exposure to different cultures thickens the plot, introduces new vocabularies and new perspectives, and challenges the status quo in unsettling but ultimately fruitful ways.
There are other fresh shoots. Satire has been enjoying a comeback on television and the Web in both Britain and America. Satirists have always been public language’s street sweepers, brushing away bogus rhetoric in all its forms—the false, the fawning, the idiotic. Although satirical magazines and Web sites like Private Eye, The Onion, and Charlie Hebdo play their part, this is especially true of today’s TV satirists, Chris Morris and Armando Iannucci in the UK, Jon Stewart and John Oliver in the US among them. Programs like The Day Today, Have I Got News for You, The Thick of It, The Daily Show, and Last Week Tonight often do a better job of deconstructing the language of politicians, and helping viewers make sense of what is really going on, than the majority of straight news sources. Indeed, many people now rely on them not just for laughs but also for the most trustworthy commentary on current events.
People sometimes talk about satire as if it were just another expression of the wider cynicism and negativity that they detect in both media and politics, but it isn’t. The best satire is a fusion of anger and creativity. It’s a purgative, and its purpose, like great journalism and great aspirational politics, is not to hurt but to cure.
Anger also powers the language of hip-hop. While mainstream white rock and pop have rarely strayed far from the solipsistic world of personal feelings, hip-hop is almost always socially situated and politically aware, and often conscious of itself as rhetoric. “My words are weapons,” rapped Eminem in “Words Are Weapons.” “I use ’em to crush my opponents / My words are weapons / I never show no emotion.”10 This is scarcely news—Public Enemy was announcing that Elvis was racist and US history nothing but four centuries of rednecks back in 1989 11—but hip-hop has gravitated to the main stage in the decades since then without losing any of its indignation or linguistic inventiveness.
In hip-hop, the personal is political and the political personal. Beyoncé’s 2016 album Lemonade is a cycle of songs about betrayal, fury, and redemption that places her own emotional life artfully within the wider struggle of black women for respect and love: the mothers of three young black men killed in law enforcement incidents appear in the video that accompanies the album. In response to one of those deaths, that of Michael Brown, who was shot and killed by a policeman in Ferguson, Missouri, the hip-hop artist Killer Mike wrote this on his Instagram account:
No matter how u felt about black people look at this Mother and look at this father and tell me as a human being how u cannot feel empathy for them … These are not THOTS, niggas/niggers, hoes, Ballers, Divas … They are humans that produced a child and loved that child and that child was slaughtered like Game and left face down as public spectacle while his blood drained down the street … 12
This is heartfelt and powerful prose. But compare it to the coiled outrage of the artist’s song “Pressure”:
Liberation costs more than a damn dollar
It costs what Christ gave
King gave
X gave
A billion dollars don’t make you an ex-slave
Nigga With an Attitude since fifth grade
I never behave
Rather be a dead man than a live slave 13
Lin-Manuel Miranda’s 2015 hip-hop musical Hamilton tells the story of the American Founding Father Alexander Hamilton with the sophistication of a political science treatise and the musical and verbal wit of a Mozart/Da Ponte opera:
How does a bastard, orphan, son of a whore
And a Scotsman, dropped in the middle of a forgotten spot
In the Caribbean by Providence, impoverished, in squalor
Grow up to be a hero and a scholar?14
Hamilton deals with the ironies and disappointments of democratic politics not with weary cynicism but with fascinated zest. The Thomas Jefferson who jokes about his slaves is played, like many other cast members, by a black actor. George III begins as an absurd caricature but is then allowed his own mordant critique of the brave new political world that Hamilton and his friends are building. Hamilton’s idealism and sense of honor lead to a senseless death at the hands of Aaron Burr, but by then they have defined, not what American political culture was or is, but what it might aspire to. With its sprung rhythm and switchback language, Hamilton hints at what a new political rhetoric might sound like, one that acknowledges the disputatiousness and cynicism that are endemic to democracy but never accepts that they are the whole of the story.
* * *
If we look hard, we can also see some promising new buds within political discourse itself. The language of fairness is one of them.
On the face of it, fairness is bitterly disputed in modern political discourse. Often rival definitions sit on both sides of a given argument. Can it possibly be fair if women and minorities earn less than men? But if affirmative action (or positive discrimination, as it is often called in the UK) involves hiring and promoting people on the basis of their gender or skin color rather than purely on their professional ability, can it be fair?
The disputes that arise from these questions can leave one wondering whether fairness has any objective meaning at all, or whether it is one of those words that can be twisted to fit any side of any argument. But in modern pluralist societies, almost everyone agrees (at least in principle) that fair treatment is both a universal right and a moral duty, so any case that convincingly evokes fairness is likely to carry force. A battle to define what is fair, or which of two rival perspectives on fairness should prevail, is therefore a battle of substance—and, as we shall shortly see, sometimes there are clear-cut winners and losers.
Fairness has been a long time coming—politicians have been arguing about it at least since King John’s barons forced him to sign Magna Carta in 1215, though the decisive advances in the theory and language of social justice were made from the late seventeenth century on. By 1948, when the United Nations adopted its Universal Declaration of Human Rights after the horrors of the Second World War, the argument for a global framework of fairness for all was carried nem con. Forty-eight countries voted in favor of the declaration and none against. The eight countries that abstained (the Soviet bloc, Yugoslavia, South Africa, and Saudi Arabia among them) had no intention of upholding the rights contained in the declaration, but it is telling that even at that time none of them thought it politic to vote against.
In the decades that followed, they and plenty of the countries that had voted in favor would betray the declaration in practice. The rights to equal treatment under the law, to freedom of expression and of association, to education, and to a basic standard of living—all would be routinely flouted around the world. Western countries too would fail to live up to the declaration, with flagrant abuses at home and abroad; even today, no one can look at the lives of the poorest members of our societies and believe that they enjoy equality before the law or that higher education is “equally accessible to all.” The world’s constitutions and political leaders talk the talk about equality and fairness but frequently treat them with contempt in practice. No one is free from hypocrisy.
But this should not blind us to the progress that has been made. Certain universal values have been publicly asserted, and only the most benighted and deranged political forces dare publicly to deny them. The innocent still get murdered, but almost everyone acknowledges that it is a crime, and the regimes that are responsible know they risk economic, diplomatic, and even military retaliation, not to mention being put on trial in the International Criminal Court. The right to asylum, which is guaranteed in the declaration, is profoundly awkward for the many countries that regard it as immigration by another name and that would prefer to treat the taking in of refugees as something voluntary rather than the moral and legal duty it is. But though it is routinely honored in the breach, at least the right cannot be denied in principle. As a result, it reveals the selfishness and cowardice of those who flout it for all to see.
Public words about justice and humanity are far from a complete solution, but they count for something. So too, despite the piousness and self-regard that inevitably attend them, do the rock concerts and celebrity endorsements intended to ameliorate poverty and oppression. They too have helped generalize the sense that, even though our societies fall woefully shy of it, there is a universal set of minimum standards that should apply to the way human beings treat one another everywhere. Words and music are the softest kind of soft power, but like water eroding stone, over time they can wear away adamantine opposition.
Playing the fairness card doesn’t necessarily make issues easier to resolve. Is it fair to deny a woman the right to have an abortion, or fair to her fetus to allow the abortion to go ahead? When fairness can be invoked by either side of a dispute, each with plausibility to its own supporters at least, it can postpone resolution indefinitely. But even here there can sometimes be unexpected breakthroughs.
Take same-sex marriage. Opponents of allowing gay people to marry each other initially based their objections largely on their religion, believing first that homosexuality was sinful and second that marriage was intended by God to be the union of a man and a woman, not just to consummate their love for each other but also to produce children. But even in as religious a country as the United States, it is widely accepted that matters of faith are a private affair. The advocates of reform, moreover, took care to make their case not on the basis that those opposed to homosexuality were wrong or bigoted, but rather that marriage is a civil as well as a religious institution and that, in the civil realm, the exclusion of gays raised a simple matter of fairness: if one pair of consenting adults is allowed to get married, why not another? Western societies had long ago decided to make divorce legal, even though many of their citizens considered it to be wrong. These societies don’t force anyone to get divorced or even to condone divorce in principle. But they argue that Citizen A doesn’t have the right to prevent Citizens B and C from getting divorced if they so choose.
The effect was to drive a rhetorical wedge into the opposing camp. Sticking to religious and moral principles effectively meant retreating from the policy debate to the comfort of the pulpit, allowing the faithful to remain true to their convictions but ceding most of the active political ground to the reformers. The alternative was to confront the reformers on their own terms, but that meant abandoning religio-moral arguments for less certain sociological ones—which largely seemed to boil down to the proposition that heterosexual marriage is an ancient institution that you interfere with at your peril. Few of the vast uncommitted majority found that line of reasoning compelling. Many other “time-honored institutions,” like the historic privileges and power that men enjoyed over women, had already been successfully challenged and the result had not been Sodom and Gomorrah but social progress.
The decision to promote “same-sex marriage” rather than “gay marriage” was also an astute move by the proponents of reform. “Same-sex” points the listener’s mind toward matters of gender fairness and broader tolerance rather than gayness as such. Pragmatic, but also justified: the case was always equality before the law rather than a plea for any one sexuality.
For a long while it looked as if same-sex marriage was going to be another of those interminable values debates that never get resolved. But at some point the opposing party simply ran out of words. And so in the United States and a growing number of other countries, what had looked like a lengthy struggle gave way to something approaching a walkover.
In the Catholic Church, Pope Francis had opposed same-sex marriage when he was a cardinal in Argentina, claiming that the true inspirer of the proposed reform was “the father of lies.” Since then, though, Francis has used the language of fairness to signal if not a shift in his position on the specific issue, then at least a different approach both to the broader topic of homosexuality and to the role of the pope. Asked in July 2013 about gays inside the Vatican, he replied, “If someone is gay and he searches for the Lord and has good will, who am I to judge?” “Who am I to judge?” manages to crowd humility, respect, orthodoxy (popes are indeed not meant to judge), and a characteristic touch of mischief into a handful of words. It’s a sentence that contains more meaning—and more controversy—than all but a handful of the encyclicals that have been handed down by popes over the past two thousand years.
Pope Francis has used the language of fairness and respect in other contexts as well, in particular in relation to the environment, global income disparity, and the refugee crisis. It is not so much that he has abandoned the traditional authority of papal discourse, but rather that he has found a new way of expressing that authority through the language of fairness.
The progress to which these examples point is tentative and partial. The legalization of same-sex marriage doesn’t mean that hostility to gay people is disappearing; as the history of racism in Western countries demonstrates, hatred and bigotry can survive long after overtly prejudiced language is moved to the margins of public discourse. Moreover, sometimes brave talk can be an excuse for the avoidance of tough practical decisions.
Still, the emergence of a powerful and widely accepted moral language gives the lie to some of our darkest fears about our public discourse. Though it is often spoken by the weak and dispossessed, there is something unstoppable about the language of fairness. The barriers it faces remain formidable, but we know that in the end the sea can wear down even the stoutest coastal defenses.
* * *
None of these proposals or examples guarantees that our public language will pass the Trump Test anytime soon. The forces of political fragmentation and digital disruption are still playing out. Many of the players are trapped in habits and responses that they will find hard to break, even if they want to. Perhaps, to put it simply, too much has been said in this frantically prolix world of ours, too many hateful, mad, duplicitous words—and what is needed now is a period of forgetting, or a kind of general amnesty, before we can even hope for a recovery.
But let’s not despair. Public language has come back to life before, as it did in England in the century after the Civil War, sometimes even as the last rites were being read over it. Revival depends not on the victory of one ideology over another, nor on any deliberate call for reform, but on a turning of the tide of culture and society. We’re commonsensical creatures and we know that our life together depends on our being able to resolve our differences, at least most of the time. Sooner or later a new language of reasonable persuasion should emerge. We just don’t know when.
So what can you do in this long uncertain interim? Open your ears. Use your own good judgment. Think, speak, laugh. Cut through the noise.