A FREE SOCIETY is a moral achievement.
Over the past fifty years in the West this truth has been forgotten, ignored, or denied. That is why today liberal democracy is at risk.
Societal freedom cannot be sustained by market economics and liberal democratic politics alone. It needs a third element: morality, a concern for the welfare of others, an active commitment to justice and compassion, a willingness to ask not just what is good for me but what is good for “all of us together.” It is about “Us,” not “Me”; about “We,” not “I.”
If we focus on the “I” and lose the “We,” if we act on self-interest without a commitment to the common good, if we focus on self-esteem and lose our care for others, we will lose much else. Nations will cease to have societies and instead have identity groups. We will lose our feeling of collective responsibility and find in its place a culture of competitive victimhood. In an age of unprecedented possibilities, people will feel vulnerable and alone.
The market will be merciless. Politics will be deceiving, divisive, confrontational, and extreme. People will feel anxious, uncertain, fearful, aggressive, unstable, unrooted, and unloved. They will focus on promoting themselves instead of the one thing that will give them lasting happiness: making life better for others. People will be, by historic standards, financially rich but emotionally poor. Freedom itself will be at risk from the far right and the far left, the far right dreaming of a golden age that never was, the far left dreaming of a utopia that will never be.
Liberal democracy is at risk in Britain, Europe, and the United States. So is everything that these democracies represent in terms of freedom, dignity, compassion, and rights. The most technologically advanced societies the world has ever known have forgotten just this: we are not machines, we are people, and people survive by caring for one another, not only by competing with one another. Market economics and liberal politics will fail if they are not undergirded by a moral sense that puts our shared humanity first. Economic inequalities will grow. Politics will continue to disappoint our expectations. There will be a rising tide of anger and resentment, and that, historically, is a danger signal for the future of freedom.
I believe that we are undergoing the cultural equivalent of climate change, and only when we realize this will we understand the strange things that have been happening in the twenty-first century in the realms of politics and economics, the deterioration of public standards of truth and civil debate, and the threat to freedom of speech at British and American universities. It also underlies more personal phenomena like loneliness, depression, and drug abuse. All these things are related. If we see this, we will already have taken the first step to a solution.
WARNINGS of the threat to liberal democracy are today being sounded by political leaders. On November 8, 2019, the thirtieth anniversary of the fall of the Berlin Wall, German chancellor Angela Merkel warned that “the values on which Europe is founded—freedom, democracy, equality, rule of law, human rights—they are anything but self-evident.” Days earlier the French president Emmanuel Macron had declared that Europeans were experiencing “the brain death of NATO.” Europe, he said, stands on “the edge of a precipice.” Speaking in London on November 13, 2019, Hillary Clinton described the women members of Parliament who were leaving politics because of the abuse and threats they had received from extremists. Britain, she warned, may be “on the path to authoritarianism, that is the path to fascism.”1 Lending weight to such concerns, a combative Vladimir Putin had declared in June 2019 that “Liberalism is obsolete.”2 These are not normal times.
Thirty years ago, with the collapse of communism and end of the Cold War, the West seemed part of another narrative altogether. It was called “the end of history,” and it seemed that the free market and liberal democracy would gradually and painlessly conquer the world. People everywhere wanted the wealth the market created and the freedom liberalism bestowed. For a while that seemed plausible, but today, to its adversaries, the West looks jaded, exhausted, divided, and weak.
Few people in recent years can have escaped the feeling that strange and unprecedented things are happening. The world has not been proceeding calmly along its accustomed course. The international political arena has not recovered equilibrium since September 11, 2001. The global economy has not reconfigured itself since the crash of 2007–8. The rising tide of drug abuse in the United States and the United Kingdom suggests that not all is well in many people’s lives. The tenor of debate, whether in politics or academia, has become angrier and more vituperative. Some deep and destabilizing transformation is taking place in the twenty-first century, but it is hard to say what. In an age of information overload, when so much of the news comes to us in such small, disconnected slices, we live in a world of dry sound bites, which increases our sense of not knowing where we are. This can lead to feelings of powerlessness, anxiety, and fear, and a desperate desire to find people who will resolve the dissonance for us.
ONE of the most important symptoms of this culture shift is the changing face of politics. Since the Brexit referendum of 2016, British politics has, for much of the time, been reduced to fiasco and farce by the “Yes–No,” “Hard–Soft,” “Deal–No Deal” drama of Britain’s withdrawal from the European Union. For much of that time the government failed to present a united front, while the main opposition party showed itself unwilling or unable to confront the well-documented presence of antisemitism in its ranks. Both of these phenomena marked new lows in post–Second World War British political history.
Elsewhere in Europe, during the same period, there have been riots in France, Germany, Italy, Spain, and Sweden. A seasoned observer of French politics, John Lichfield, said of the Gilets jaunes (“yellow vests”) riots in Paris in 2018–19 that he had never seen “such wanton destruction… such random, hysterical hatred, directed not just towards the riot police but at shrines to the French Republic itself, such as the Arc de Triomphe.” The battle, he said, “went beyond violent protest, beyond rioting, to the point of insurrection, even civil war.”3
In January 2019, thirty writers, historians, and Nobel laureates, among them Simon Schama, Ian McEwan, and Salman Rushdie, warned in a manifesto that “Europe as an idea is falling apart before our eyes.” They spoke of “the populist forces washing over the continent.” The European ideal, they said, “remains the one force today virtuous enough to ward off the new signs of totalitarianism that drag in their wake the old miseries of the dark ages.”4
In the United States, the 2016 presidential election was one of the most divisive on record. According to a Reuters/Ipsos survey, 15 percent of Americans had stopped talking to a relative or close friend as a result of the election. There was demonization on both sides. It was a clear case of the kind of politics the late Bernard Lewis once encapsulated as: “I’m right. You’re wrong. Go to Hell.” Three years later, Peggy Noonan, one of the most eloquent voices in American politics, wrote in the Wall Street Journal that “people are proud of their bitterness now.” The polarization feeds on itself, becomes ever more acute: “America isn’t making fewer of the lonely, angry and unaffiliated, it’s making more every day.”5
One term in particular has surfaced in many descriptions of the new politics: namely, populism. This is not an easy term to define and is sometimes used very loosely as an insult rather than as a precise description. In general, though, the term is used to describe a form of politics that occurs when people see unacceptable gaps opening up in wealth and opportunity, when they sense assaults on their values either from an avant-garde or from outsiders, when they feel that the establishment elites are working against them, not for them, and that the government is not addressing their problems. This leads to a call for strong leaders, and to relative indifference to the democratic process. A 2017 study of the rise of populism in the major developed countries showed that votes for populist parties were at their highest level since the 1930s, with a massive increase since 2013.6
Throughout the West there has been a loss of trust in public institutions and leaders, a rise of extremism in politics, and a notable failure of governments to address fundamental problems such as climate change and the discontents of the global economy. A new phenomenon has emerged: “identity politics,” that is, political campaigning focused not on the nation as a whole but on a series of self-identifying minorities. This in turn has provoked the counter-politics of populism on behalf of a beleaguered and enraged native-born population who see themselves sidelined by the elites and passed over in favor of the minorities.
Meanwhile, the very principles of political discourse have been damaged to the point where there has been a serious breakdown in trust. The manipulative use of social media; the distortions that have gone by the names “post-truth,” “alternative facts,” and “fake news”; and the mining of personal data that should never have been available for such purposes have led to widespread cynicism concerning the political process. The sheer number of books with titles like The Strange Death of Europe, How Democracies Die, The Retreat of Western Liberalism, The Suicide of the West, and the like suggests that an unusually large number of analysts have concluded that liberty itself, as we have known it in recent centuries, is at risk.
“Demoralised, decadent, deflating, demographically challenged, divided, disintegrating, dysfunctional, declining” is how Bill Emmott describes the state of the West today, as seen through many Western eyes as well as those of its detractors.7 Or as the famous Jewish saying puts it, “Start worrying. Details to follow.”
A SECOND set of phenomena relates to personal happiness—or lack of it. Living standards for most of us in the West have reached levels our ancestors could not have contemplated. We have access to goods, delivered to our door, from almost every place on Earth. We can choose more widely, travel more extensively, and enjoy more personal freedom than any previous generation. The internet and social media have brought the world to us and us to the world. We hold more computing power in our hands than was to be found in entire scientific departments fifty years ago. Never have we had wider access to knowledge. Never have we had more immediate contact with people throughout the world. This is in many respects the world of which our ancestors could only dream. Poverty, hunger, illiteracy, premature death—all of these things have been addressed with monumental success. Life expectancy has grown by two to three years every decade for the past century. On the face of it, we could not be in a better place.
Yet there are signs that this is far from the case. For example, more than 70,200 Americans died from a drug overdose in 2017—double the figure of a decade earlier and triple that of twenty years before. This is truly an area in which America leads the world. Death rates from drug overdose are almost four times higher than in seventeen other wealthy nations, and for the first time in recent history, life expectancy in the United States is actually falling. Alcoholism is killing more people, and at younger ages. Suicide rates are up 33 percent in less than twenty years.
In Britain, a 2018 report revealed that the number of people aged fifty and above who had received hospital treatment for drug abuse had more than quintupled in a single decade, rising from 1,380 to 7,800. The good news is that there has been a 6 percent fall in the number of younger people seeking treatment. However, adult drug abuse still affects the young, because of the numbers of children being recruited by drug gangs to distribute cocaine and heroin.8
Drug abuse is often related to the wider phenomenon of depression, and rates of depression among American teenagers are also rising rapidly. In a recent survey by the Pew Research Center, 70 percent of young Americans aged between thirteen and seventeen said that anxiety and depression are serious problems among their peers; it tops what they see as their generation’s concerns.9 In 2017, 13 percent of American teenagers said they had experienced at least one major depressive episode in the past year, up from 8 percent in 2007—a 59 percent increase in a single decade.10 Meanwhile, a 2018 report by the Children’s Society in Britain produced the shocking statistic that 20 percent of fourteen-year-old girls in Britain had deliberately self-harmed in the previous year.11
In 2018, Jean Twenge, one of the participants in my BBC radio series on morality, wrote a definitive study of iGen, her term for young people born in 1995 or later. In it she documents the dramatic rise in suicides, attempted suicides, and depressive illnesses among American teenagers, along with an equally dramatic fall in their self-reported accounts of life satisfaction. They are, it seems, a very anxious generation—iGen’ers, she says, “are scared, maybe even terrified.” They are “both the physically safest generation and the most mentally fragile.”12
This is more than a conundrum. It raises a fundamental question about where we are going in the market-economic, liberal democratic West. We may have won the battle for life and liberty, but the pursuit of happiness still eludes us. We keep chasing it, but it keeps running faster than we can.
A THIRD dimension of our contemporary unease has to do with the economics of inequality. The most conspicuous example is the ever-widening disparity between chief executive pay and the rest. One example: in 2018 the chief executive of Disney, Bob Iger, received a total payment for the year of $65.6 million, provoking outrage from Abigail Disney, granddaughter of Roy Disney and grandniece of Walt Disney. She called it “naked indecency.” It represented 1,424 times the median pay of a Disney worker. It is time, she said, “to call out the men and women who lead us… about how low we are prepared to let hard-working people sink while top management takes home ever-more outrageous sums of money.” Expecting corporate boards to do so is, she said, unreasonable because “they are almost universally made up of CEOs, former CEOs, and people who long to be CEOs.”13
This tendency, highlighted at the time of the financial crash in 2007–8, has been growing for a long time. In America in 1965 the ratio of chief executive to workers’ pay was 20:1. Today it is 312:1.14 There might be less raising of eyebrows if the chief executives were entrepreneurs, creating their own business, taking their own risks, investing their own personal savings. But they’re not. They are risking their shareholders’ money and their employees’ future. It is hard to avoid the conclusion that a small elite of executives, board members, and major shareholders has allowed this to happen at the cost of a more equitable distribution of the company’s success.
This, though, is only part of a much wider problem: the disconnection between economics and society that has grown as manufacturing and trade have become globalized. In a bounded economy, economic growth tends to benefit the nation as a whole, even though the rewards are not distributed equally. That is not the case when production can be outsourced to low-wage economies in some other part of the world, such as Southeast Asia. This tends to concentrate economic activity in the West in urban trading centers, leaving vast swathes of the country—former mining and manufacturing areas, for example—depressed and deprived, with high rates of unemployment, drug taking, and crime; low social capital; poor schools; and few chances for the children growing up there.
Even California, in the 1960s the epitome of the American dream, is suffering. Today it faces a massive crisis of homelessness: though it contains an eighth of the national population, it has a quarter of the nation’s homeless. It is economically deeply stratified, between the super-rich of Silicon Valley and the entertainment industry; a middle class of the state bureaucracy, academics, and people in the media; and at the bottom what Gerard Baker of The Times calls the “modern serfs,” who have “few assets, no stake in their economy, and thanks to prohibitive housing costs, limited mobility.”15 Utopia has become dystopia.
Former International Monetary Fund economist Raghuram Rajan, in his book The Third Pillar (2019), argues that the human webs of connection—the relations, values, and norms that bind us to one another—are being torn apart by technological innovation. The result, as we are seeing, is social unrest, violence, and populism. Rajan argues that markets must reestablish their connection with the web of human relations and become socioeconomics, that is, concerned not only with profits but also with social impact.
The widespread use of artificial intelligence will have a major impact on employment. Current estimates suggest that between 20 percent and 50 percent of jobs will be affected. We do not know whether an equal number of new jobs will be created, or whether there will be a rise in unemployment, or some other adjustment, such as a reduction in working hours. But the economies of the West are on the cusp of massive, technology-driven change. It would be foolish to suppose that economic growth can be pursued indefinitely as an abstract exercise in profit maximization without regard to the impact on human beings and the communities in which they live.
A FOURTH phenomenon is the assault on free speech taking place on university campuses in Britain and America, giving rise to new phenomena, like safe spaces, trigger warnings, micro-aggressions, and no-platforming, all designed to limit or ban the expression of sentiments that might offend some students, even if their banning offends others. To an ever greater extent, mob rule is taking the place of what was once the sacred mission of the university: namely, the collaborative pursuit of truth. The idea that certain views, and people holding them, might be banned merely because they might upset someone, which is what is happening in many academic circles today, is astonishing. It is the new intolerance.
During 2019, I had the shock of seeing one of the participants in my BBC Radio 4 series on morality, Jordan Peterson, a University of Toronto psychologist, have the offer of a visiting fellowship at Cambridge University’s Faculty of Divinity rescinded on the grounds that a photograph had been taken of him alongside an individual wearing a T-shirt with an offensive message. There was no suggestion that Peterson had even registered the offensive message, let alone endorsed it: the photograph was taken after a lecture, hundreds of people had paid to have their picture taken with him, and he had only a few seconds with each. What, I wondered, was the role in a faculty of divinity of ideas like faith, truth, justice, generosity, and forgiveness? Had they heard of the saying, “Judge not, that ye be not judged”? Would Abraham, Moses, Amos, or Jeremiah, each of whom challenged the received wisdom of their day, have found a platform under such ill-conceived censoriousness?
This is only one example of a much wider problem. A 2017 report by Spiked magazine found that, of 115 universities and student unions in Britain, 63.5 percent were “severely” restrictive of free speech, with more than 30 percent somewhat restrictive. Leading human rights and free speech advocate Peter Tatchell, commenting on the report, said: “Universities used to be bastions of free speech and open debate. As this report shows, they are increasingly hedging free speech with all kinds of qualifications, making it no longer free.”16
A November 2019 report by Policy Exchange in Britain showed that a plurality of students supported banning Jordan Peterson (41 percent) and the feminist Germaine Greer (44 percent), whose presence at a university had been protested against on the grounds that she was “transphobic.” Some 40 percent of students said they felt uncomfortable expressing aloud views that conflicted with those of their fellow students. The report warned that “instead of being places of robust debate and free discovery,” Britain’s universities were “being stifled by a culture of conformity.”17
Although the universities and students are at the opposite end of the political spectrum from Vladimir Putin, they seem to be veering close to his view that liberalism—of which free speech is an essential component—is obsolete. Only someone lacking historical knowledge of what happened in French and German universities in the 1920s and 1930s could fail to see that this is the first step down a very dangerous path indeed.
Campus witch-hunting is itself only one of a cluster of new phenomena that are having a corrosive effect on tolerance and truth. We have seen the return of public shaming and vigilante justice via social media campaigns. There is post-truth, the term that came to prominence during the 2016 American presidential election, signaling that veracity is taking second place to the mass manipulation of emotion. There is the loss of civility in public discourse. Social media have given everyone a voice, and often it is a shrill one. All these things undermine the sense of belonging together as a single community that reasons respectfully together.
THESE and the other phenomena I discuss in the book are not unrelated. They are the multiple consequences of a single underlying shift in the ethos of the West. Climate change has many causes and symptoms: greenhouse gases, toxic emissions, the loss of tropical rainforests, rising sea levels, the melting of ice caps and glaciers, the proliferation of extreme weather conditions, the extinction of species of plant and animal life, and the threat to many more. Different though these are, they are all part of a single phenomenon: global warming.
Likewise, divisive politics, inequitable economics, the loss of openness in universities, and the growth of depression and drug abuse are the result of what I call cultural climate change. They are the long-term consequences of the unprecedented experiment embarked on throughout the West a half-century ago: the move from “We” to “I.”
All countries and cultures have three basic institutions. There is the economy, which is about the creation and distribution of wealth. There is the state, which is about the legitimization and distribution of power. And there is the moral system, which is the voice of society within the self; the “We” within the “I”; the common good that limits and directs our pursuit of private gain. It is the voice that says No to the individual “Me” for the sake of the collective “Us.” Some call it conscience. Freud called it the superego. Others speak of it as custom and tradition. Yet others call it natural law. Many people in the West speak of it as the will and word of God.
Whatever its source, morality is what allows us to get on with one another, without endless recourse to economics or politics. There are times when we seek to get other people to do something we want or need them to do. We can pay them to do so: that is economics. We can force them to do so: that is politics. Or we can persuade them to do so because they and we are part of the same framework of virtues and values, rules and responsibilities, codes and customs, conventions and constraints: that is morality.
Morality is what broadens our perspective beyond the self and its desires. It places us in the midst of a collective social order. Morality has always been about the first-person plural, about “We.” “Society,” said Lord Devlin, “means a community of ideas; without shared ideas on politics, morals, and ethics, no society can exist.”18 Society is constituted by a shared morality. Although Nietzsche challenged this view as early as the 1880s, it remained the prevailing public opinion until the 1960s. To be a member of society was to be socialized, to internalize the norms of those around you, to act for the good of others, not just yourself. The assumption was that you must be part of something larger than yourself before you can be yourself.
Morality achieves something almost miraculous, and fundamental to human achievement and liberty. It creates trust. It means that to the extent that we belong to the same moral community, we can work together without constantly being on guard against violence, betrayal, exploitation, or deception. The stronger the bonds of community, the more powerful the force of trust, and the more we can achieve together.
Friedrich Hayek put it well. We get along with one another, he said, because “most of the time, members of our civilization conform to unconscious patterns of conduct.”19 Without these habits of heart and deed, there would be severe limits on what we could do together. Freedom, he said, has never worked “without deeply ingrained moral beliefs, and coercion can be reduced to a minimum only where individuals can be expected as a rule to conform voluntarily to certain principles.”
Morality is essential to freedom. That is what John Locke meant when he contrasted liberty, the freedom to do what we ought, with license, the freedom to do what we want. It is what Adam Smith signaled when, before he wrote The Wealth of Nations, he wrote The Theory of Moral Sentiments. It is what George Washington meant when he said, “Human rights can only be assured among a virtuous people.” And Benjamin Franklin when he said, “Only a virtuous people are capable of freedom.” Or Thomas Jefferson when he said, “A nation as a society forms a moral person, and every member of it is personally responsible for his society.” Lose morality, and eventually you will lose liberty.
That was the received wisdom for centuries. How did it change? It began with relatively abstract ideas. There was a long period of reflection on the nature of the individual and the self, starting with the Reformation, continuing through the Enlightenment, and culminating in the nineteenth-century radicalism of Kierkegaard and Nietzsche. Between the 1930s and 1960s came the existentialists in France and the emotivists in Britain and America, who argued that there was no such thing as an objective moral order: there are only private choices based on subjective emotions. But these new ideas did not dislodge the assumption that society was built on the foundation of a shared morality.
Starting in the 1960s, that changed. First came the liberal revolution: it is not the task of law to enforce a shared morality. Morality gave way to autonomy, with the sole proviso that we did not do harm to others. Then, in the 1980s, came the economic revolution: states should interfere minimally with markets. Then, in the 1990s, and gathering pace ever since, came the technological revolution: the internet, tablets, smartphones, and their impact on the global economy and the way we communicate with one another. Social media in particular have changed the nature of interpersonal encounter.
Each of these developments has tended to place not society but the self at the heart of the moral life. It is not that people became immoral or amoral. That is palpably not so. We care about others. We volunteer. We give to charity. We have compassion. We have a moral sense. But our moral vocabulary switched to a host of new concepts and ideas: autonomy, authenticity, individualism, self-actualization, self-expression, self-esteem.
A Google Ngram search (measuring the frequency with which a word occurs in printed texts over a given historical period) reveals that words that used to be commonplace have become rarer since the 1960s, particularly respect, authority, duty, ought, conscience, and honor; though the biggest fall was between 1960 and 2000. Since then, some have recovered their salience. Other subtle shifts have taken place, however: regret has tended to displace remorse, and shame has become more common than guilt. One of the most striking findings is that, while talk of responsibilities has remained more or less stable, since 1960 there has been a sharp rise in the use of the word “rights.” We may still moralize, but we are reluctant to express guilt, remorse, or responsibility.
I BELIEVE that underlying much of what has happened has been the misapplication to morality of the economic principle of outsourcing. The idea goes back to Adam Smith’s division of labor and David Ricardo’s theory of comparative advantage that says, even if you are better than me at everything, still we both gain if you do what you’re best at and I do what I’m best at and we trade. The question is: Are there limits? Are there things we can’t or shouldn’t outsource?
One example occurred in the years leading up to the financial crash of 2007–8. The banks began to outsource risk, lending far beyond their capacities in the belief that either property prices would go on rising forever or, more significantly, that if they crashed, it would be someone else’s problem, not theirs. The crash proved that in a highly interconnected financial system you can’t outsource risk on that scale. Not only does it fail to protect you from risk; it also prevents you from knowing what is happening until it is too late and disaster has become inescapable.
Another recent example, almost unnoticed, is the outsourcing of memory. Smartphones and tablets have developed ever larger memories, while ours and those of our children have become smaller and smaller. Why bother to remember anything if you can look it up in a microsecond on Google or Wikipedia? But this confuses history and memory, which are not the same thing at all. History is an answer to the question “What happened?” Memory is an answer to the question “Who am I?” History is about facts, memory is about identity. History is about something that happened to someone else, not me. Memory is my story, the past that made me who I am, of whose legacy I am the guardian for the sake of generations yet to come. Without memory, there is no identity, and without identity, we are mere dust on the surface of infinity.
Something similar happened to morality. When I went as an undergraduate to Cambridge University in the late 1960s, the philosophy course was called Moral Sciences, meaning that just like the natural sciences, morality was objective, real, part of the external world. I soon discovered, though, that almost no one believed this anymore. Morality was held to be no more than the expression of emotion, or subjective feeling, or private intuition, or autonomous choice. It is whatever I choose it to be. To me, this seemed less like civilization than the breakdown of a civilization.
The result was that, in effect, morality was split in two and outsourced to other institutions. There are moral choices, and there are the consequences of those choices. The choices themselves were handed over to the market: the market gives us choices, and morality becomes merely a set of choices in which right and wrong have no meaning beyond the satisfaction or frustration of desire. The result is that we find it increasingly hard to understand why there might be things we want to do, can afford to do, and have a legal right to do, that nonetheless we should not do because they are unjust or dishonorable or disloyal or demeaning: in a word, unethical. Ethics is reduced to economics.
As for the consequences of our choices, these have been outsourced to the state. Bad choices lead to bad outcomes: failed relationships, neglected children, depressive illness, wasted lives. But the government would deal with them. Marriage was no longer needed as a sacred bond between husband and wife, and the state would take responsibility for any negative consequences. Welfare was outsourced to government agencies, so there was less need for local community volunteering. As for conscience, which once played so large a part in the moral life, that could be outsourced to regulatory bodies. So, having reduced moral choice to economics, we transferred the consequences of our choices to politics.
All of this was done with the highest of intentions, but it overlooked one of the most important lessons to have emerged from the wars of religion in the sixteenth and seventeenth centuries and the new birth of freedom that followed. A free society is a moral achievement, and it is made by us and our habits of thought, speech, and deed. Morality cannot be outsourced because it depends on each of us. Without self-restraint, without the capacity to defer the gratification of instinct, and without the habits of heart and deed that we call virtues, we will eventually lose our freedom.
THE LONG experiment that began in the 1960s had many causes. There was the exhaustion brought about by two world wars. In Britain there was the threefold promise of the welfare state: the National Health Service, retirement provision, and social care. There was the emergence of a distinct youth culture. There was the birth control pill and the sexual revolution. It was an extraordinary coming together of many factors that led people to believe that we were entering an endless summer of experiment and fun with no bill to pay for our transgressions. No one who lived through the sixties will ever forget them.
But now our children and grandchildren are paying the price of abandoning a shared moral code: divided societies, dysfunctional politics, high rates of drug abuse and suicide, increasingly unequal economies, a loss of respect for truth and the protocols of reasoning together, and the many other incivilities of contemporary life.
When morality is outsourced to either the market or the state, society has no substance, only systems. And systems are not enough. The market and the state are about wealth and power, and they are hugely beneficial to the wealthy and the powerful, but not always to the poor and the powerless. The rich and strong will use their power to exploit the rest: financially, politically, and, as we have learned since the rise of the #MeToo movement, sexually as well. Thucydides tells us that the Athenians told the Melians: “The strong do what they want, while the weak suffer what they must.” The same, it often seems, is true today.
When there is no shared morality, there is no society. Instead, there are subgroups, and hence identity politics. In the absence of shared ideals, many conclude that the best way of campaigning is to damage your opponent by ad hominem attacks. The result is division, cynicism, and a breakdown of trust. The world is divided into the people like us and the people not like us, and what is lost is the notion of the common good. When the “I” takes precedence over the “We,” the result is weakened relationships, marriages, families, communities, neighborhoods, congregations, charities, regions, and entire societies.
What has become chillingly clear is the insight Émile Durkheim articulated in the 1890s, that in a society in which there was anomie—the absence of a shared moral code—there would be a rise in the rate of suicides. We cannot live without a structure, whether consciously learned or unconsciously absorbed, to guide us through what is otherwise unstructured chaos. This has surely been a factor in the upsurge of depression, stress-related syndromes, drug and alcohol abuse, and attempted and actual suicides especially among young people, teenage girls most of all.
The reason we cannot outsource morality to the market or the state is that they operate on completely different principles. The simplest way of seeing this is by a thought experiment. Imagine you have $1,000 and you decide to share it with nine others; you are left with a tenth of what you had at the beginning. Imagine you have total power, say a 100 percent share in a company, and then decide to share 90 percent of it among nine others; you have a tenth of the power you had before. Wealth and power operate by division. The more we share, the less we have.
Imagine now that you have a certain measure of influence, or friendship, or knowledge, or love, and you decide to share that with nine others: you do not have less. You may have more. That is because these are social goods: goods that exist by sharing. These are goods that have a moral or spiritual dimension, and they have this rare quality that the more we share, the more we have.
That is why the market and the state, the fields of economics and politics, are arenas of competition, while morality is the arena of cooperation. A society with only competition and very limited cooperation will be abrasive and ruthless, with glittering prizes for the winners and no consolation for the losers. It will be a low-trust environment in which lawyers play a large role and mutual confidence a very limited one.
A society with a strong, shared moral code is a high-trust place, where the winners set an example of caring for the losers—indeed, where they do not speak of them as losers but as fellow citizens. High-trust societies are those in which the “We” resonates more loudly than the “I,” where CEOs care for the team not just for themselves, where politicians act for the good of all, especially the marginal and disadvantaged, and where people in distress find comfort in community rather than being left to suffer on their own. We need to recover the sense of “all of us together.”
THIS IS not a work of cultural pessimism. I am hopeful for the future. Two of the participants in my BBC radio series on morality, David Brooks and Jean Twenge, have noted that Generation Z (Gen-Z for short), those born in or after 1995, are more moral and altruistic than the generations that preceded them, Generation X and the Millennials. During the recording of our morality programs I had the lively experience of sitting with teenagers from four British secondary schools, discussing the moral challenges of our time; although the other participants included some of the world’s leading experts in their fields, the teenagers emerged as the stars of the show. They were committed, insightful, thoughtful, and wise. Our Google Ngram searches showed that moral language has been used increasingly since the turn of the millennium. My own experience of lecturing in Britain and America these past few years has convinced me that there is a genuine interest in recovering a moral framework to guide us through some of the formidable challenges facing us, from climate change to artificial intelligence to mass immigration to economic inequality. The G7 has signed up to an engagement with impact economics, an approach to business that quantifies social impact as well as profit. (I say something about this in the last chapter.) These are positive signs.
There are those who believe that the loss of a shared moral framework is irreparable. Some have even spoken of a descent into the dark ages.20 Ludwig Wittgenstein said that trying to salvage damaged traditions by willful effort is like trying with one’s bare hands to repair a broken spider’s web. I do not share these views.
Hope is to be found in a remarkable passage in Steven Pinker’s The Language Instinct. He tells the story of linguists who studied pidgin English, originally used by slaves. (Prince Philip, for example, was delighted to discover on a visit to Papua New Guinea that he was referred to as fella belong Mrs. Queen.) A pidgin has words but no grammar, vocabulary but no syntax. What the linguists discovered, to their amazement, is that the children of pidgin speakers had created their own new language, called a creole, which is pidgin plus grammar. Their parents had been robbed of a language, but they, without even knowing what they were doing, had simply invented one.21
There exists, within nature and humanity, an astonishing range of powers to heal what has been harmed and mend what has been broken. These powers are embedded within life itself, with its creativity and capacity for self-renewal. That is the empirical basis of hope. Nature favors species able to recover, and history favors cultures that can do so.
Once, when undergoing a medical checkup, a doctor put me on a treadmill. “What are you measuring?” I asked him. “How fast I can go, or how long?”
“Neither,” he replied. “What I want to measure is, when you get off the machine, how long it takes your pulse to return to normal.” I realized that health is not a matter of never being ill. It is the ability to recover.
Recovering liberal democratic freedom will involve emphasizing responsibilities as well as rights; shared rules, not just individual choices; caring for others as well as for ourselves; and making space not just for self-interest but also for the common good. Morality is an essential feature of our human environment, as important as the market and the state but outsourceable to neither. Morality humanizes the competition for wealth and power. It is the redemption of our solitude.
When we move from the politics of “Me” to the politics of “Us,” we rediscover those life-transforming, counterintuitive truths: that a nation is strong when it cares for the weak, that it becomes rich when it cares for the poor, that it becomes invulnerable when it cares about the vulnerable. If we care for the future of democracy, we must recover that sense of shared morality that binds us to one another in a bond of mutual compassion and care. There is no liberty without morality, no freedom without responsibility, no viable “I” without the sustaining “We.”