Tell people there’s an invisible man in the sky who created the universe, and the vast majority will believe you. Tell them the paint is wet, and they have to touch it to be sure.
—George Carlin
This is the chapter most of you have been waiting for. I know this from conversations and correspondence. As soon as I mention the topic of rationality, people ask me why humanity appears to be losing its mind.
At the time of this writing, a glorious milestone in the history of rationality is coming into view: vaccines likely to end a deadly plague are being administered less than a year after the plague emerged. Yet in that same year, the Covid-19 pandemic set off a carnival of cockamamie conspiracy theories: that the disease was a bioweapon engineered in a Chinese lab, a hoax spread by the Democratic Party to sabotage Donald Trump’s chances of reelection, a subterfuge by Bill Gates to implant trackable microchips in people’s bodies, a plot by a cabal of global elites to control the world economy, a symptom of the rollout of fifth-generation mobile data networks, and a means for Anthony Fauci (director of the National Institute of Allergy and Infectious Diseases) to earn windfall profits from a vaccine.1 Shortly before the announcements of the vaccines, a third of Americans said they would reject them, part of an anti-vax movement that opposes the most benevolent invention in the history of our species.2 Covid quackery has been endorsed by celebrities, politicians, and, disturbingly, the most powerful person on earth at the time of the pandemic, US president Donald Trump.
Trump himself, who was consistently supported by around 40 percent of the American public, raised further doubts about our collective capacity for reason throughout his presidency. He predicted in February 2020 that Covid-19 would disappear “like a miracle,” and endorsed quack cures like malaria drugs, bleach injections, and light probes. He disdained basic public health measures like masks and distancing, even after he himself was stricken, inspiring millions of Americans to flout the measures and amplifying the toll of death and financial hardship.3 It was all part of a larger rejection of the norms of reason and science. Trump told around thirty thousand lies during his term, had a press secretary who touted “alternative facts,” claimed that climate change was a Chinese hoax, and suppressed knowledge from scientists in federal agencies overseeing public health and environmental protection.4 He repeatedly publicized QAnon, the millions-strong conspiracy cult that credits him with combating a cabal of Satan-worshiping pedophiles embedded in the American “deep state.” And he refused to acknowledge his defeat in the 2020 election, fighting crackbrained legal battles to overturn the results, led by lawyers who cited yet another conspiracy, this one by Cuba, Venezuela, and several governors and officials of his own party.
Covid quackery, climate denial, and conspiracy theories are symptoms of what some are calling an “epistemological crisis” and a “post-truth era.”5 Another symptom is fake news. In the second decade of the twenty-first century, social media have become sluices for torrents of tall tales like these:6
Pope Francis Shocks World, Endorses Donald Trump for President
Yoko Ono: “I Had an Affair with Hillary Clinton in the 1970s”
Democrats Vote to Enhance Med Care for Illegals Now, Vote Down Vets Waiting 10 Years for Same Service
Trump to Ban All TV Shows That Promote Gay Activity
Woman Sues Samsung for $1.8M After Cell Phone Gets Stuck in Her Vagina
Lottery Winner Arrested for Dumping $200,000 of Manure on Ex-Boss’s Lawn
Also rampant are beliefs in ghouls, black magic, and other superstitions. As I mentioned in the first chapter, three quarters of Americans hold at least one paranormal belief. Here are some figures from the first decade of our century:7
Possession by the devil, 42 percent
Extrasensory perception, 41 percent
Ghosts and spirits, 32 percent
Astrology, 25 percent
Witches, 21 percent
Communicating with the dead, 29 percent
Reincarnation, 24 percent
Spiritual energy in mountains, trees, and crystals, 26 percent
Evil eye, curses, spells, 16 percent
Consulted a fortune-teller or psychic, 15 percent
Just as disturbing to someone like me who likes to plot human progress is that these beliefs show few signs of decreasing over the decades, and that younger generations are no more skeptical than their elders (with astrology they are more credulous).8
Also popular are a miscellany of canards that the historian of science Michael Shermer calls “weird beliefs.”9 Many people endorse conspiracy theories like Holocaust denial, Kennedy assassination plots, and the 9/11 “Truther” theory that the twin towers were felled by a controlled demolition to justify the American invasion of Iraq. Various seers, cults, and ideologies have convinced their followers that the end of the world is nigh; they disagree on when, but are quick to postdate their predictions when they are unpleasantly surprised to find themselves living another day. And a quarter to a third of Americans believe we have been visited by extraterrestrials, either the contemporary ones that mutilate cattle and impregnate women to breed alien–human hybrids, or the ancient ones who built the pyramids and Easter Island statues.
How can we explain this pandemic of poppycock? As with Charlie Brown in the Peanuts strip in which Lucy gets buried in snow while insisting that it rises from the ground, it can make your stomach hurt, especially when Lucy appears to represent a large portion of our compatriots.
Let’s begin by setting aside three popular explanations, not because they are wrong but because they are too glib to be satisfying. The first of these, I must admit, is the inventory of logical and statistical fallacies explained in the preceding chapters. To be sure, many superstitions originate in overinterpreting coincidences, failing to calibrate evidence against priors, overgeneralizing from anecdotes, and leaping from correlation to causation. A prime example is the misconception that vaccines cause autism, reinforced by the observation that autistic symptoms appear, coincidentally, around the age at which children are first inoculated. And all of them represent failures of critical thinking and of the grounding of belief in evidence; that’s what entitles us to say they’re false in the first place. Yet nothing from the cognitive psychology lab could have predicted QAnon, nor are its adherents likely to be disabused by a tutorial in logic or probability.
A second unpromising lead is to blame today’s irrationality on the current scapegoat for everything, social media. Conspiracy theories and viral falsehoods are probably as old as language.10 What are the accounts of miracles in scriptures, after all, but fake news about paranormal phenomena? For centuries Jews have been accused of conspiring to poison wells, sacrifice Christian children, control the world economy, and foment communist uprisings. At many times in history, other races, minorities, and guilds have also been credited with nefarious plots and targeted with violence.11 The political scientists Joseph Uscinski and Joseph Parent tracked the popularity of conspiracy theories in letters to the editor of major American newspapers from 1890 to 2010 and found no change over that period; nor did the numbers rise in the subsequent decade.12 As for fake news, before it was disseminated on Twitter and Facebook, outlandish episodes which happened to a friend of a friend were circulated as urban legends (the Hippie Babysitter, the Kentucky Fried Rat, Halloween Sadists) or emblazoned on the covers of supermarket tabloids (Baby Born Talking: Describes Heaven; Dick Cheney Is a Robot; Surgeons Transplant Young Boy’s Head onto His Sister’s Body).13 Social media may indeed be accelerating their spread, but the appetite for florid fantasies lies deep in human nature: people, not algorithms, compose these stories, and it’s people they appeal to. And for all the panic that fake news has sown, its political impact is slight: it titillates a faction of partisans rather than swaying a mass of undecideds.14
Finally, we must go beyond offhand excuses that just attribute one irrationality to another. It’s never a good explanation to say that people embrace some false belief because it gives them comfort or helps them make sense of the world, because that only raises the question of why people should get comfort and closure from beliefs that could not possibly do them any good. Reality is a powerful selection pressure. A hominid that soothed itself by believing that a lion was a turtle or that eating sand would nourish its body would be outreproduced by its reality-based rivals.
Nor will it do to write off humans as hopelessly irrational. Just as our foraging ancestors lived by their wits in unforgiving ecosystems, today’s conspiracy theorists and miracle-believers pass the demanding tests of their own worlds: they hold down jobs, bring up kids, and keep a roof over their heads and food in the fridge. For that matter, a favorite riposte by Trump’s defenders to the charge that he was cognitively impaired was “If he’s so dumb, how did he get to be president?” And unless you believe that scientists and philosophers are a superior breed of human, you have to acknowledge that most members of our species have the capacity to discover and accept the canons of rationality. To understand popular delusions and the madness of crowds, we have to examine cognitive faculties that work well in some environments and for some purposes but that go awry when applied at scale, in novel circumstances, or in the service of other goals.
Rationality is disinterested. It is the same for everyone everywhere, with a direction and momentum of its own. For that reason rationality can be a nuisance, an impediment, an affront. In Rebecca Newberger Goldstein’s novel 36 Arguments for the Existence of God: A Work of Fiction, an eminent literary scholar explains to a graduate student why he abhors deductive thinking:15
It is a form of torture for the imaginatively gifted, the very totalitarianism of thought, one line being made to march strictly in step behind the other, all leading inexorably to a single undeviating conclusion. A proof out of Euclid recalls to my mind nothing so much as the troops goose-stepping before the Supreme Dictator. I have always delighted in my mind’s refusal to follow a single line of any mathematical explanation offered to me. Why should these exacting sciences exact anything from me? Or as Dostoevsky’s Underground Man shrewdly argues, “Good God, what do I care about the laws of nature and arithmetic if, for one reason or another, I don’t like these laws, including the ‘two times two is four’?” Dostoevsky spurned the hegemaniacal logic and I can do no less.
The obvious reason that people avoid getting onto a train of reasoning is that they don’t like where it takes them. It may terminate in a conclusion that is not in their interest, such as an allocation of money, power, or prestige that is objectively fair but benefits someone else. As Upton Sinclair pointed out, “It is difficult to get a man to understand something, when his salary depends upon his not understanding it.”16
The time-honored method to head off a line of reasoning before it arrives at an unwanted destination is to derail the reasoner by brute force. But there are less crude methods that exploit the inevitable uncertainties surrounding any issue and steer the argument in a favored direction with sophistry, spin-doctoring, and the other arts of persuasion. Both members of an apartment-hunting couple, for example, may emphasize the reasons why the flat that just happens to be closer to where he or she works is objectively better for the two of them, such as its space or affordability. It’s the stuff of everyday arguments.
The mustering of rhetorical resources to drive an argument toward a favored conclusion is called motivated reasoning.17 The motive may be to end at a congenial conclusion, but it may also be to flaunt the arguer’s wisdom, knowledge, or virtue. We all know the barroom blowhard, the debating champ, the legal eagle, the mansplainer, the competitive distance urinator, the intellectual pugilist who would rather be right than get it right.18
Many of the biases that populate the lists of cognitive infirmities are tactics of motivated reasoning. In chapter 1 we saw confirmation bias, such as in the selection task, where people who are asked to turn over the cards that test an “If P then Q” rule choose the P card, which can confirm it, but not the not-Q card, which can falsify it.19 They turn out to be more logical when they want the rule to be false. When the rule says that if someone has their emotional profile, that person is in danger of dying young, they correctly test the rule (and at the same time reassure themselves) by homing in on the people who have their profile and on the people who lived to a ripe old age.20
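To see why the not-Q card matters, here is a minimal sketch in code (my own illustration, not anything from the studies): for each visible card face, it asks whether any hidden face could make the rule “If P then Q” false. Only the P card and the not-Q card can.

```python
# Minimal sketch: which cards in the selection task are worth turning over?
# Each card shows one face; the hidden face could be either value.
RULE = lambda p, q: (not p) or q   # "If P then Q" as material implication

visible_faces = {
    "P card":     [(True, q) for q in (True, False)],   # hidden side is the Q value
    "not-P card": [(False, q) for q in (True, False)],
    "Q card":     [(p, True) for p in (True, False)],   # hidden side is the P value
    "not-Q card": [(p, False) for p in (True, False)],
}

for card, possibilities in visible_faces.items():
    can_falsify = any(not RULE(p, q) for p, q in possibilities)
    print(f"{card}: {'worth turning over' if can_falsify else 'uninformative'}")
# Only the P card and the not-Q card can reveal a violation of the rule.
```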
We are also motivated to regulate our information diet. In biased assimilation (or selective exposure), people seek out arguments that ratify their beliefs and shield themselves from those that might disconfirm them.21 (Who among us doesn’t take pleasure in reading editorials that are politically congenial, and get irritated by those from the other side?) Our self-protection continues with the arguments that do reach us. In biased evaluation, we deploy our ingenuity to upvote the arguments that support our position and pick nits in the ones that refute it. And there are the classic informal fallacies we saw in chapter 3: ad hominem, authority, bandwagon, genetic, affective, straw man, and so on. We are even biased about our biases. The psychologist Emily Pronin has found that, as in the mythical town where all the children are above average, a large majority of Americans consider themselves less susceptible to cognitive biases than the average American, and virtually none consider themselves more biased.22
So much of our reasoning seems tailored to winning arguments that some cognitive scientists, like Hugo Mercier and Dan Sperber, believe it is the adaptive function of reasoning.23 We evolved not as intuitive scientists but as intuitive lawyers. While people often try to get away with lame arguments for their own positions, they are quick to spot fallacies in other people’s arguments. Fortunately, this hypocrisy can be mobilized to make us more rational collectively than any of us is individually. The wisecrack circulated among veterans of committees that the IQ of a group is equal to the lowest IQ of any member of the group divided by the size of the group turns out to be exactly wrong.24 When people evaluate an idea in small groups with the right chemistry, which is that they don’t agree on everything but have a common interest in finding the truth, they catch on to each other’s fallacies and blind spots, and usually the truth wins. When individuals are given the Wason selection task, for example, only one in ten picks the right cards, but when they are put in groups, around seven in ten get it right. All it takes is for one member to see the correct answer, and almost always that person persuades the others.
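The arithmetic behind “all it takes is one member” is simple. Here is a back-of-the-envelope sketch (the group sizes are hypothetical; only the one-in-ten solo success rate comes from the studies described above) of how quickly the odds of a group containing at least one solver grow:

```python
# Back-of-the-envelope sketch: probability that a group of n independent
# members contains at least one person who solves the selection task,
# given a one-in-ten individual success rate.
p_solo = 0.10
for n in (3, 5, 8, 12):
    p_at_least_one = 1 - (1 - p_solo) ** n
    print(f"group of {n}: {p_at_least_one:.0%} chance of containing a solver")
```

If that one solver can then persuade the rest, the group’s success rate tracks this curve rather than the dismal individual rate.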
People’s desire to get their way or act as know-it-alls can explain only part of our public irrationality. You can appreciate another part by considering this problem in evidence-based policy. Do gun-control measures decrease crime, because fewer criminals can obtain guns, or increase it, because law-abiding citizens can no longer protect themselves?
Here are data from a hypothetical study that divided cities into those that adopted a ban on concealed handguns (first row) and those that did not (second row).25 Laid out in each column are the number of those cities that saw their crime rates improve (left column) or worsen (right column). From these data, would you conclude that gun control is effective at reducing crime?
                 | Crime rate decreased | Crime rate increased
Gun control      |         223          |          75
No gun control   |         107          |          21
In fact, the data (which are fake) suggest that gun control increases crime. It’s easy to get it wrong, because the large number of cities with gun control in which the crime rate declined, 223, pops out. But that could just mean that crime decreased in the whole country, policy or no policy, and that more cities tried gun control than didn’t, a trend in political fashions. We need to look at the ratios. In cities with gun control, it’s around three to one (223 versus 75); in cities without, it is around five to one (107 versus 21). On average, the data say, a city was better off without gun control than with it.
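For readers who want to check the arithmetic, here is a minimal sketch using the numbers from the hypothetical table above:

```python
# Ratios of improving to worsening cities in the (fake) gun-control data.
gun_control    = {"decreased": 223, "increased": 75}
no_gun_control = {"decreased": 107, "increased": 21}

ratio_with    = gun_control["decreased"] / gun_control["increased"]        # about 3 to 1
ratio_without = no_gun_control["decreased"] / no_gun_control["increased"]  # about 5 to 1

print(f"With gun control:    {ratio_with:.1f} cities improved for every one that worsened")
print(f"Without gun control: {ratio_without:.1f} cities improved for every one that worsened")
# The improvement ratio is higher without the policy, which is why the
# correct (if counterintuitive) reading of these fake data is that gun
# control was associated with more crime, not less.
```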
As in the Cognitive Reflection Test (chapter 1), getting to the answer requires a bit of numeracy: the ability to set aside first impressions and do the math. People who are so-so in numeracy tend to be distracted by the big number and conclude that gun control works. But the real point of this illustration, devised by the legal scholar Dan Kahan and his collaborators, is what happened with the numerate respondents. The numerate Republicans tended to get the answer right, the numerate Democrats to get it wrong. The reason is that Democrats start out believing that gun control is effective and are all too quick to accept data showing they were right all along. Republicans can’t stomach the idea and scrutinize the data with a gimlet eye, which, if numerate, spots the real pattern.
Republicans might attribute their success to being more objective than the bleeding-heart libs, but of course the researchers ran a condition in which the knee-jerk wrong answer was congenial to Republicans. They simply flipped the column labels, so that the data now suggested that gun control works: it stanched a fivefold increase in crime, holding it to just a threefold increase. This time the numerate Republicans earned the dunce caps while the Democrats were the Einsteins. In a control condition, the team picked an issue that triggered neither Democrats nor Republicans: whether a skin cream was effective at treating a rash. With neither faction having a dog in the fight, the numerate Republicans and numerate Democrats performed the same. A recent meta-analysis of fifty studies by the psychologist Peter Ditto and his colleagues confirms the pattern. In study after study, liberals and conservatives accept or reject the same scientific conclusion depending on whether or not it supports their talking points, and they endorse or oppose the same policy depending on whether it was proposed by a Democratic or a Republican politician.26
Politically motivated numeracy and other forms of biased evaluation show that people reason their way into or out of a conclusion even when it offers them no personal advantage. It’s enough that the conclusion enhances the correctness or nobility of their political, religious, ethnic, or cultural tribe. It’s called, obviously enough, the myside bias, and it commandeers every kind of reasoning, even logic.27 Recall that the validity of a syllogism depends on its form, not its content, but that people let their knowledge seep in and judge an argument valid if it ends in a conclusion they know is true or want to be true. The same thing happens when the conclusion is politically congenial:
If college admissions are fair, then affirmative action laws are no longer necessary.
College admissions are not fair.
Therefore, affirmative action laws are necessary.
If less severe punishments deter people from committing crime, then capital punishment should not be used.
Less severe punishments do not deter people from committing crime.
Therefore, capital punishment should be used.
When people are asked to verify the logic of these arguments, both of which commit the formal fallacy of denying the antecedent, liberals mistakenly ratify the first and correctly nix the second; conservatives do the opposite.28
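A minimal truth-table check (code of my own, not part of the study) shows why denying the antecedent is invalid in both cases: the premises “If P then Q” and “not P” can both be true while the conclusion “not Q” is false.

```python
# Enumerate all truth assignments and look for a counterexample to
# denying the antecedent: premises true, conclusion false.
from itertools import product

for P, Q in product([True, False], repeat=2):
    premises_hold = ((not P) or Q) and (not P)   # "If P then Q" and "not P"
    if premises_hold and Q:                      # conclusion "not Q" is false here
        print(f"Counterexample: P={P}, Q={Q} -> premises true, conclusion false")
# The single counterexample (P false, Q true) is enough to make the form
# invalid, whatever political content gets poured into it.
```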
In Duck Soup, Chico Marx famously asked, “Who ya gonna believe, me or your own eyes?” When people are in the throes of the myside bias, the answer may not be their own eyes. In an update of a classic study showing that football fans always see more infractions by the opposing team, Kahan and collaborators showed a video of a protest in front of a building.29 When the title labeled it a protest against abortion at a health clinic, conservatives saw a peaceful demonstration, while liberals saw the protesters block the entrance and intimidate the people trying to enter. When it was labeled a protest against the exclusion of gay people at a military recruiting center, it was the conservatives who saw pitchforks and torches and the liberals who saw Mahatma Gandhi.
One magazine reported the gun-control study under the headline The Most Depressing Discovery about the Brain, Ever. Certainly there are reasons to be depressed. One is that opinions that go against the scientific consensus, like creationism and the denial of human-made climate change, may not be symptoms of innumeracy or scientific illiteracy. Kahan has found that most believers and deniers are equally clueless about the scientific facts (many believers in climate change, for example, think that it has something to do with toxic waste dumps and the ozone hole). What predicts their belief is their politics: the farther to the right, the more denial.30
Another cause for gloom is that for all the talk of a replicability crisis, the myside bias is only too replicable. In The Bias That Divides Us, the psychologist Keith Stanovich finds it in every race, gender, cognitive style, education level, and IQ quantile, even among people who are too clever to fall for other cognitive biases like base-rate neglect and the gambler’s fallacy.31 The myside bias is not an across-the-board personality trait, but is set off by whichever hot button is connected to the reasoner’s identity. Stanovich relates it to our political moment. We are not, he suggests, living in a “post-truth” society. The problem is that we are living in a myside society. The sides are the left and the right, and both sides believe in the truth but have incommensurable ideas of what the truth is. The bias has invaded more and more of our deliberations. The spectacle of face masks during a respiratory pandemic turning into political symbols is just the most recent symptom of the polarization.
We’ve long known that humans are keen to divide themselves into competitive teams, but it’s not clear why it’s now the left–right split that is pulling each side’s rationality in different directions rather than the customary fault lines of religion, race, and class. The right–left axis aligns with several moral and ideological dimensions: hierarchical versus egalitarian, libertarian versus communitarian, throne-and-altar versus Enlightenment, tribal versus cosmopolitan, tragic versus utopian visions, honor versus dignity cultures, binding versus individualizing moralities.32 But recent flip-flops in which side supports which cause, such as immigration, trade, and sympathy for Russia, suggest that the political sides have become sociocultural tribes rather than coherent ideologies.
In a recent diagnosis, a team of social scientists concluded that the sides are less like literal tribes, which are held together by kinship, than religious sects, which are held together by faith in their moral superiority and contempt for opposing sects.33 The rise of political sectarianism in the United States is commonly blamed (like everything else) on social media, but its roots lie deeper. They include the fractionation and polarization of broadcast media, with partisan talk radio and cable news displacing national networks; gerrymandering and other geographic distortions of political representation, which incentivize politicians to cater to cliques rather than coalitions; the reliance of politicians and think tanks on ideologically committed donors; the self-segregation of educated liberal professionals into urban enclaves; and the decline of class-crossing civil-society organizations like churches, service clubs, and volunteer groups.34
Could the myside bias possibly be rational? There is a Bayesian argument that one ought to weigh new evidence against the totality of one’s prior beliefs rather than taking every new study at face value. If liberalism has proven itself to be correct, then a study that appears to support a conservative position should not be allowed to overturn one’s beliefs. Not surprisingly, this was the response of several liberal academics to Ditto’s meta-analysis suggesting that political bias is bipartisan.35 Nothing guarantees that the favorite positions of the left and right at any historical moment will be aligned with the truth 50–50. Even if both sides interpret reality through their own beliefs, the side whose beliefs are warranted will be acting rationally. Maybe, they continue, the well-documented left-wing lopsidedness of academia is not an irrational bias but an accurate calibration of their Bayesian priors to the fact that the left is always correct.
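The Bayesian point can be made concrete with a minimal numerical sketch (the numbers are illustrative, mine rather than anyone’s actual priors): if you start out 95 percent confident in your side’s position, a single study that favors the other side by three-to-one odds barely dents that confidence.

```python
# Bayes's rule in odds form: posterior odds = prior odds * likelihood ratio.
def update(prior, likelihood_ratio):
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

prior = 0.95              # strong prior that one's own side is right
contrary_study = 1 / 3    # a study favoring the other side by 3-to-1 odds
posterior = update(prior, contrary_study)
print(f"prior {prior:.2f} -> posterior {posterior:.2f}")   # roughly 0.95 -> 0.86
```

On this logic, discounting a contrary study looks like calibration rather than bias, which is exactly the move Stanovich questions below.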
The response from conservatives is (quoting Hamlet), “Lay not that flattering unction to your soul.”36 Though it may be true that left-wing positions are vindicated more often than right-wing ones (especially if, for whatever reason, the left is more congenial to science than the right), in the absence of disinterested benchmarks neither side is in a position to say. Certainly history has no shortage of examples of both sides getting it wrong, including some real doozies.37 Stanovich notes that the problem in justifying motivated reasoning with Bayesian priors is that the prior often reflects what the reasoner wants to be true rather than what he or she has grounds for believing is true.
There is a different and more perverse rationality to the myside bias, coming not from Bayes’s rule but from game theory. Kahan calls it expressive rationality: reasoning that is driven by the goal of being valued by one’s peer group rather than attaining the most accurate understanding of the world. People express opinions that advertise where their heart lies. As far as the fate of the expresser in a social milieu is concerned, flaunting those loyalty badges is anything but irrational. Voicing a local heresy, such as rejecting gun control in a Democratic social circle or advocating it in a Republican one, can mark you as a traitor, a quisling, someone who “doesn’t get it,” and condemn you to social death. Indeed, the best identity-signaling beliefs are often the most outlandish ones. Any fair-weather friend can say the world is round, but only a blood brother would say the world is flat, willingly incurring ridicule by outsiders.38
Unfortunately, what’s rational for each of us seeking acceptance in a clique is not so rational for all of us in a democracy seeking the best understanding of the world. Our problem is that we are trapped in a Tragedy of the Rationality Commons.39
The humor in the Peanuts strip in which Lucy gets buried in snow while insisting that it rises from the ground exposes a limitation on any explanation of human irrationality that invokes the ulterior motives in motivated reasoning. No matter how effectively a false belief flaunts the believer’s mental prowess or loyalty to the tribe, it’s still false, and should be punished by the cold, hard facts of the world. As the novelist Philip K. Dick wrote, reality is that which, when you stop believing in it, doesn’t go away. Why doesn’t reality push back and inhibit people from believing absurdities or from rewarding those who assert and share them?
The answer is that it depends what you mean by “believe.” Mercier notes that holders of weird beliefs often don’t have the courage of their convictions.40 Though millions of people endorsed the rumor that Hillary Clinton ran a child sex trafficking ring out of the basement of the Comet Ping Pong pizzeria in Washington (the Pizzagate conspiracy theory, a predecessor of QAnon), virtually none took steps commensurate with such an atrocity, such as calling the police. The righteous response of one of them was to leave a one-star review on Google. (“The pizza was incredibly undercooked. Suspicious professionally dressed men by the bar area that looked like regulars kept staring at my son and other kids in the place.”) It’s hardly the response most of us would have if we literally thought that children were being raped in the basement. At least Edgar Welch, the man who burst into the pizzeria with his gun blazing in a heroic attempt to rescue the children, took his beliefs seriously. The millions of others must have believed the rumor in a very different sense of “believe.”
Mercier also points out that impassioned believers in vast nefarious conspiracies, like the 9/11 Truthers and the chemtrail theorists (who hold that the water-vapor contrails left by jetliners are chemicals dispensed in a secret government program to drug the population), publish their manifestos and hold their meetings in the open, despite their belief in a brutally effective plot by an omnipotent regime to suppress brave truth-tellers like them. It’s not the strategy you see from dissidents in undeniably repressive regimes like North Korea or Saudi Arabia. Mercier, invoking a distinction made by Sperber, proposes that conspiracy theories and other weird beliefs are reflective, the result of conscious cogitation and theorizing, rather than intuitive, the convictions we feel in our bones.41 It’s a powerful distinction, though I draw it a bit differently, closer to the contrast that the social psychologist Robert Abelson (and the comedian George Carlin) drew between distal and testable beliefs.42
People divide their worlds into two zones. One consists of the physical objects around them, the other people they deal with face to face, the memory of their interactions, and the rules and norms that regulate their lives. People have mostly accurate beliefs about this zone, and they reason rationally within it. Within this zone, they believe there’s a real world and that beliefs about it are true or false. They have no choice: that’s the only way to keep gas in the car, money in the bank, and the kids clothed and fed. Call it the reality mindset.
The other zone is the world beyond immediate experience: the distant past, the unknowable future, faraway peoples and places, remote corridors of power, the microscopic, the cosmic, the counterfactual, the metaphysical. People may entertain notions about what happens in these zones, but they have no way of finding out, and anyway it makes no discernible difference to their lives. Beliefs in these zones are narratives, which may be entertaining or inspiring or morally edifying. Whether they are literally “true” or “false” is the wrong question. The function of these beliefs is to construct a social reality that binds the tribe or sect and gives it a moral purpose. Call it the mythology mindset.
Bertrand Russell famously said, “It is undesirable to believe a proposition when there is no ground whatsoever for supposing it is true.” The key to understanding rampant irrationality is to recognize that Russell’s statement is not a truism but a revolutionary manifesto. For most of human history and prehistory, there were no grounds for supposing that propositions about remote worlds were true. But beliefs about them could be empowering or inspirational, and that made them desirable enough.
Russell’s maxim is the luxury of a technologically advanced society with science, history, journalism, and their infrastructure of truth-seeking, including archival records, digital datasets, high-tech instruments, and communities of editing, fact-checking, and peer review. We children of the Enlightenment embrace the radical creed of universal realism: we hold that all our beliefs should fall within the reality mindset. We care about whether our creation story, our founding legends, our theories of invisible nutrients and germs and forces, our conceptions of the powerful, our suspicions about our enemies, are true or false. That’s because we have the tools to get answers to these questions, or at least to assign them warranted degrees of credence. And we have a technocratic state that should, in theory, put these beliefs into practice.
But as desirable as that creed is, it is not the natural human way of believing. In granting an imperialistic mandate to the reality mindset to conquer the universe of belief and push mythology to the margins, we are the weird ones—or, as evolutionary social scientists like to say, the WEIRD ones: Western, Educated, Industrialized, Rich, Democratic.43 At least, the highly educated among us are, in our best moments. The human mind is adapted to understanding remote spheres of existence through a mythology mindset. It’s not because we descended from Pleistocene hunter-gatherers specifically, but because we descended from people who could not or did not sign on to the Enlightenment ideal of universal realism. Submitting all of one’s beliefs to the trials of reason and evidence is an unnatural skill, like literacy and numeracy, and must be instilled and cultivated.
And for all the conquests of the reality mindset, the mythology mindset still occupies swaths of territory in the landscape of mainstream belief. The obvious example is religion. More than two billion people believe that if one doesn’t accept Jesus as one’s savior one will be damned to eternal torment in hell. Fortunately, they don’t take the next logical step and try to convert people to Christianity at swordpoint for their own good, or torture heretics who might lure others into damnation. Yet in past centuries, when Christian belief fell into the reality zone, many Crusaders, Inquisitors, conquistadors, and soldiers in the Wars of Religion did exactly that. Like the Comet Ping Pong redeemer, they treated their beliefs as literally true. For that matter, though many people profess to believe in an afterlife, they seem to be in no hurry to leave this vale of tears for eternal bliss in paradise.
Thankfully, Western religious belief is safely parked in the mythology zone, where many people are protective of its sovereignty. In the mid-aughts, the “New Atheists,” Sam Harris, Daniel Dennett, Christopher Hitchens, and Richard Dawkins, became targets of vituperation not just from Bible-thumping evangelists but also from mainstream intellectuals. These faitheists (as the biologist Jerry Coyne called them), or believers in belief (Dennett’s term), did not counter that God in fact exists.44 They implied that it is inappropriate, or uncouth, or just not done, to consider God’s existence a matter of truth or falsity. Belief in God is an idea that falls outside the sphere of testable reality.
Another zone of mainstream unreality is the national myth. Most countries enshrine a founding narrative as part of their collective consciousness. At one time these were epics of heroes and gods, like the Iliad, the Aeneid, Arthurian legends, and Wagnerian operas. More recently they have been wars of independence or anticolonial struggles. Common themes include the nation’s ancient essence defined by a language, culture, and homeland; an extended slumber and glorious awakening; a long history of victimization and oppression; and a generation of superhuman liberators and founders. Guardians of the mythical heritage don’t feel a need to get to the bottom of what actually transpired, and may resent the historians who place it in the reality zone and unearth its shallow history, constructed identity, reciprocal provocations with the neighbors, and founding fathers’ feet of clay.
Still another zone of not-quite-true-not-quite-false belief is historical fiction and fictionalized history. It seems pedantic to point out that Henry V did not deliver the stirring words on Saint Crispin’s Day that Shakespeare attributed to him. Yet the play purports to be an account of real events rather than a figment of the playwright’s imagination, and we would not enjoy it in the same way otherwise. The same is true of fictionalized histories of more recent wars and struggles, which are, in effect, fake news set in the recent past. When the events come too close to the present or the fictionalization rewrites important facts, historians can sound an alarm, as when Oliver Stone brought to life an assassination conspiracy theory in the 1991 movie JFK. In 2020, the columnist Simon Jenkins objected to the television series The Crown, a dramatized history of Queen Elizabeth and her family which took liberties with many of the depicted events: “When you turn on your television tonight, imagine seeing the news acted rather than read. . . . Afterwards the BBC flashes up a statement saying all this was ‘based on true events,’ and hoping we enjoyed it.” 45 Yet his was a voice crying out in the wilderness. Most critics and viewers had no problem with the sumptuously filmed falsehoods, and Netflix refused to post a warning that some of the scenes were fictitious (though they did post a trigger warning about bulimia).46
The boundary between the reality and the mythology zones can vary with the times and the culture. Since the Enlightenment, the tides in the modern West have eroded the mythology zone, a historical shift that the sociologist Max Weber called “the disenchantment of the world.” But there are always skirmishes at the borders. The brazen lies and conspiracies of Trumpian post-truth can be seen as an attempt to claim political discourse for the land of mythology rather than the land of reality. Like the plots of legends, scripture, and drama, they are a kind of theater; whether they are provably true or false is beside the point.
Once we appreciate that humans can hold beliefs they don’t treat as factually true, we can begin to make sense of the rationality paradox—how a rational animal can embrace so much claptrap. It’s not that the conspiracy theorists, fake-news sharers, and consumers of pseudoscience always construe their myths as mythological. Sometimes their beliefs cross the line into reality with tragic results, as with Pizzagate, the anti-vaccine movement, and the Heaven’s Gate cult, whose thirty-nine devotees committed suicide in 1997 in preparation for their souls to be whisked away by a spaceship following the Hale-Bopp comet. But predispositions in human nature can combine with mythological truthiness to make weird beliefs easy to swallow. Let’s look at three genres.
Pseudoscience, paranormal woo-woo, and medical quackery engage some of our deepest cognitive intuitions.47 We are intuitive dualists, sensing that minds can exist apart from bodies.48 It comes naturally to us, and not just because we can’t see the neural networks which underlie the beliefs and desires of ourselves and others. Many of our experiences really do suggest that the mind is not tethered to the body, including dreams, trances, out-of-body experiences, and death. It’s not a leap for people to conclude that minds can commune with reality and with each other without needing a physical medium. And so we have telepathy, clairvoyance, souls, ghosts, reincarnation, and messages from the great beyond.
We are also intuitive essentialists, sensing that living things contain invisible substances that give them their form and powers.49 These intuitions inspire people to probe living things for their seeds, drugs, and poisons. But the same mindset also leads people to embrace homeopathy, herbal remedies, and purging and bloodletting, and to reject perceived foreign adulterants such as vaccines and genetically modified foods.
And we are intuitive teleologists.50 Just as our own plans and artifacts are designed with a purpose, so, we are apt to think, is the complexity of the living and nonliving world. Thus we are receptive to creationism, astrology, synchronicity, and the mystical belief that everything happens for a reason.
A scientific education is supposed to stifle these primitive intuitions, but for several reasons its reach is limited. One is that beliefs that are sacred to a religious or cultural faction, like creationism, the soul, and a divine purpose, are not easily surrendered, and they may be guarded within people’s mythology zone. Another is that even among the highly educated, scientific understanding is shallow. Few people can explain why the sky is blue or why the seasons change, let alone population genetics or viral immunology. Instead, educated people trust the university-based scientific establishment: its consensus is good enough for them.51
Unfortunately, for many people the boundary between the scientific establishment and the pseudoscientific fringe is obscure. The closest that most people come to science in their own lives is their doctor, and many doctors are more folk healers than experts in randomized clinical trials. Indeed, some of the celebrity doctors who appear on daytime talk shows are charlatans who exuberantly shill new-age flimflam. Mainstream television documentaries and news shows may also blur the lines and credulously dramatize fringe claims like ancient astronauts and crime-fighting psychics.52
For that matter, bona fide science communicators must shoulder some of the blame for failing to equip people with the deep understanding that would make pseudoscience incredible on the face of it. Science is often presented in schools and museums as just another form of occult magic, with exotic creatures and colorful chemicals and eye-popping illusions. Foundational principles, such as that the universe has no goals related to human concerns, that all physical interactions are governed by a few fundamental forces, that living bodies are intricate molecular machines, and that the mind is the information-processing activity of the brain, are never articulated, perhaps because they would seem to insult religious and moral sensibilities. We should not be surprised that what people take away from science education is a syncretic mishmash, where gravity and electromagnetism coexist with psi, qi, karma, and crystal healing.
To understand viral humbug such as urban legends, tabloid headlines, and fake news, we have to remember that it is fantastically entertaining. It plays out themes of sex, violence, revenge, danger, fame, magic, and taboo that have always titillated patrons of the arts, high and low. A fake headline like FBI Agent Suspected in Hillary Email Leaks Found Dead in Apparent Murder-Suicide would be an excellent plot in a suspense thriller. A recent quantitative analysis of the content of fake news concluded that “the same features that make urban legends, fiction, and in fact any narrative, culturally attractive also operate for online misinformation.”53
Often the entertainment spills into genres of comedy, including slapstick, satire, and farce: Morgue Employee Cremated by Mistake While Taking a Nap; Donald Trump Ends School Shootings by Banning Schools; Bigfoot Keeps Lumberjack as Love Slave. QAnon falls into still another genre of entertainment, the multiplatform alternate-reality game.54 Adherents parse cryptic clues periodically dropped by Q (the hypothetical government whistleblower), crowdsource their hypotheses, and gain internet fame by sharing their discoveries.
It’s no surprise that people seek out all manner of entertainment. What shocks us is that each of these works of art makes a factual claim. Yet our queasiness about blurring fact and fiction is not a universal human reaction, particularly when it pertains to zones that are remote from immediate experience, like faraway places and the lives of the rich and powerful. Just as religious and national myths become entrenched in the mainstream when they are felt to provide moral uplift, fake news may go viral when its spreaders think a higher value is at stake, like reinforcing solidarity within their own side and reminding comrades about the perfidiousness of the other one. Sometimes the moral is not even a coherent political strategy but a sense of moral superiority: the impression that rival social classes, and powerful institutions from which the sharers feel alienated, are decadent and corrupt.
Conspiracy theories, for their part, flourish because humans have always been vulnerable to real conspiracies.55 Foraging people can’t be too careful. The deadliest form of warfare among tribal peoples is not the pitched battle but the stealthy ambush and the predawn raid.56 The anthropologist Napoleon Chagnon writes that the Amazonian Yanomamö have the word nomohori, “dastardly trick,” for acts of treachery such as inviting neighbors to a feast and then massacring them on cue. Plots by enemy coalitions are unlike other hazards such as predators and lightning bolts because they deploy their ingenuity to penetrate the targets’ defenses and cover their own tracks. The only safeguard against this cloak-and-dagger subterfuge is to outthink them preemptively, which can lead to convoluted trains of conjecture and a refusal to take obvious facts at face value. In signal detection terms, the cost of missing a real conspiracy is higher than that of false-alarming to a suspected one. This calls for setting our bias toward the trigger-happy rather than the gun-shy end of the scale, adapting us to try to get wind of possible conspiracies even on tenuous evidence.57
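The signal-detection logic can be sketched with a toy calculation (the costs are illustrative numbers of my own): when a miss is far costlier than a false alarm, the threshold of evidence needed to sound the alarm drops very low.

```python
# Toy signal-detection sketch: sound the alarm whenever doing so has lower
# expected cost than staying quiet.
cost_miss = 100.0        # failing to detect a real plot
cost_false_alarm = 1.0   # crying wolf about a plot that isn't there

def should_sound_alarm(p_plot):
    # Alarm is worthwhile when p * cost_miss > (1 - p) * cost_false_alarm.
    return p_plot * cost_miss > (1 - p_plot) * cost_false_alarm

for p in (0.005, 0.01, 0.05, 0.25):
    verdict = "sound the alarm" if should_sound_alarm(p) else "stay quiet"
    print(f"P(conspiracy) = {p:.3f}: {verdict}")
# With these costs the break-even point is about 1 percent: even tenuous
# evidence of a plot is worth acting on, the "trigger-happy" setting.
```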
Even today, conspiracies small and large really do exist. A group of employees may meet behind the back of an unpopular colleague to recommend that he be let go; a government or insurgency may plan a clandestine coup or invasion or sabotage. Conspiracy theories, like urban legends and fake news, find their way into rumors, and rumors are the stuff of conversation. Studies of rumors show that they tend to convey threats and dangers, and that they confer an aura of expertise on the spreader. And perhaps surprisingly, when they circulate among people with a vested interest in their content, such as within workplaces, they are usually correct.58
In everyday life, then, there are incentives for being a sentinel who warns people of hidden threats, or a relay who disseminates their warnings. The problem is that social and mass media allow rumors to spread through networks of people who have no stake in their truth. They consume the rumors for entertainment and affirmation rather than self-protection, and they lack the interest and means to follow them up. For the same reasons, originators and spreaders suffer no reputational damage for being wrong. Without these veracity checks, social media rumors, unlike workplace rumors, are incorrect more often than correct. Mercier suggests that the best way to inhibit the spread of dubious news is to pressure the spreaders to act on it: to call the police, rather than leaving a one-star review.
The remaining key to understanding the appeal of weird beliefs is to put the beliefs themselves under the microscope. Evolution works not just on bodies and brains but on ideas. A meme, as Richard Dawkins defined it when he coined the word, is not a captioned photograph circulated on the internet but an idea that has been shaped by generations of sharing to become highly shareable.59 Examples include earworms that people can’t stop humming or stories they feel compelled to pass along. Just as organisms evolve adaptations that protect them from being eaten, ideas may evolve adaptations that protect them from being refuted. The intellectual ecosystem is filled with these invasive ideas.60 “God works in mysterious ways.” “Denial is a defense mechanism of the ego.” “Psychic powers are inhibited by skeptical probing.” “If you fail to denounce this person as a racist, that shows you are a racist.” “Everyone is always selfish, because helping other people feels good.” And, of course, “The lack of evidence for this conspiracy shows what a diabolical conspiracy it is.” Conspiracy theories, by their very nature, are adapted to be spread.
To understand is not to forgive. We can see why humans steer their reasoning toward conclusions that work to the advantage of themselves or their sects, and why they distinguish a reality in which ideas are true or false from a mythology in which ideas are entertaining or inspirational, without conceding that these are good things. They are not good things. Reality is that which, when you apply motivated or myside or mythological reasoning to it, does not go away. False beliefs about vaccines, public health measures, and climate change threaten the well-being of billions. Conspiracy theories incite terrorism, pogroms, wars, and genocide. A corrosion of standards of truth undermines democracy and clears the ground for tyranny.
But for all the vulnerabilities of human reason, our picture of the future need not be a bot tweeting fake news forever. The arc of knowledge is a long one, and it bends toward rationality. We should not lose sight of how much rationality is out there. Few people in developed countries today believe in werewolves, animal sacrifice, bloodletting, miasmas, the divine right of leaders, or omens in eclipses and comets, though all were mainstream in centuries past. None of Trump’s thirty thousand falsehoods involved occult or paranormal forces, and each of these forces is rejected by a majority of Americans.61 Though a few scientific issues become religious or political bloody shirts, most do not: there are factions that distrust vaccines, but not antibiotics; climate change, but not coastal erosion.62 Despite their partisan biases, most people are pretty good at judging the veracity of headlines, and when they are presented with clear and trustworthy corrections of a false claim, they change their minds, whether it was politically congenial or not.63
We also have a beachhead of rationality in the cognitive style called Active Open-Mindedness, especially the subtype called Openness to Evidence.64 This is Russell’s credo that beliefs should be based on good grounds. It is a rejection of motivated reasoning; a commitment to placing all beliefs within the reality zone; an endorsement of the statement attributed to John Maynard Keynes, “When the facts change, I change my mind. What do you do, sir?”65 The psychologist Gordon Pennycook and his colleagues measured the attitude by having people fill out a questionnaire with items like these, where the parenthesized response ups the openness score:66
People should always take into consideration evidence that goes against their beliefs. (agree)
Certain beliefs are just too important to abandon no matter how good a case can be made against them. (disagree)
Beliefs should always be revised in response to new information or evidence. (agree)
No one can talk me out of something I know is right. (disagree)
I believe that loyalty to one’s ideals and principles is more important than “open-mindedness.” (disagree)
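To make the keying explicit, here is a minimal scoring sketch (a simplified agree/disagree version of my own, not the researchers’ actual scale or response format): agreement with the first and third items and disagreement with the others each add a point to the openness score.

```python
# Simplified Openness to Evidence scoring: one point per keyed response.
items = {
    "Consider evidence against your beliefs": "agree",
    "Some beliefs are too important to abandon": "disagree",
    "Revise beliefs in light of new evidence": "agree",
    "No one can talk me out of what I know is right": "disagree",
    "Loyalty to ideals beats open-mindedness": "disagree",
}

def openness_score(responses):
    # responses maps each (abbreviated) item to "agree" or "disagree".
    return sum(responses[item] == keyed for item, keyed in items.items())

# A respondent who agrees with everything scores only 2 of 5.
print(openness_score({item: "agree" for item in items}))
```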
In a sample of American internet users, about a fifth of the respondents say they are impervious to evidence, but a majority at least aspire to being open to it. The people who are open to evidence are resistant to weird beliefs. They reject conspiracy theories, witchcraft, astrology, telepathy, omens, and the Loch Ness monster, together with a personal God, creationism, a young earth, a vaccine–autism link, and a denial of anthropogenic climate change.67 They are more trusting in government and science. And they tend to hold more liberal political positions, such as on abortion, same-sex marriage, capital punishment, and war aversion, generally in the same directions that the world as a whole has been trending.68 (The authors caution, though, that the correlations with conservatism are complicated.)
Openness to Evidence correlates with cognitive reflection (the ability to think twice and not fall for trick questions, which we met in chapter 1) and with a resistance to many of the cognitive illusions, biases, and fallacies we saw in chapters 3–9.69 This cluster of good cognitive habits, which Stanovich calls the Rationality Quotient (a play on the intelligence quotient or IQ), correlates with raw intelligence, though imperfectly: smart people can be closed-minded and impulsive, and duller ones open and reflective. Together with resisting weird beliefs, reflective people are better at spotting fake news and at rejecting pseudo-profound bullshit like “Hidden meaning transforms unparalleled abstract beauty.”70
If we could put something in the drinking water that would make everyone more open and reflective, the irrationality crisis would vanish. Failing that, let’s consider a broad set of policies and norms that might strengthen the cognitive immune systems in ourselves and our culture.71
Most sweeping would be a valorization of the norm of rationality itself. Now, we can no more impose values from the top down than we can dictate any cultural change that depends on millions of individual choices, like tattooing or slang. But norms can change over time, like the decline of ethnic slurs, littering, and wife jokes, when reflexes of tacit approval and disapproval proliferate through social networks. And so we can each do our part in smiling or frowning on rational and irrational habits. It would be nice to see people earn brownie points for acknowledging uncertainty in their beliefs, questioning the dogmas of their political sect, and changing their minds when the facts change, rather than for being steadfast warriors for the dogmas of their clique. Conversely, it could be a mortifying faux pas to overinterpret anecdotes, confuse correlation with causation, or commit an informal fallacy like guilt by association or the argument from authority. The “Rationality Community” identifies itself by these norms, but they should be the mores of the whole society rather than the hobby of a club of enthusiasts.72
Though it’s hard to steer the aircraft carrier that constitutes an entire society, particular institutions may have pressure points that savvy leaders and activists could prod. Legislatures are largely populated by lawyers, whose professional goal is victory rather than truth. Recently some scientists have begun to infiltrate the chambers, and they could try to spread the value of evidence-based problem solving among their colleagues. Advocates of any policy would be well advised not to brand it with sectarian symbolism; some climate experts, for example, lamented Al Gore becoming the face of climate change activism in the early 2000s, because that pigeonholed it as a left-wing cause, giving the right an excuse to oppose it.
Among politicians, both of the major American parties indulge in industrial-strength myside bias, but the blame is not symmetrical. Even before the Trumpian takeover, thoughtful Republican stalwarts had disparaged their own organization as “the party of stupid” for its anti-intellectualism and hostility to science.73 Since then, many others have been horrified by their party’s acquiescence to Trump’s maniacal lying and trolling: his game plan, in the admiring words of onetime strategist Steve Bannon, to “flood the zone with shit.”74 With Trump’s defeat, rational heads on the right should seek to restore American politics to a system with two parties that differ over policy rather than over the existence of facts and truth.
We are not helpless against the onslaught of “post-truth” disinformation. Though lying is as old as language, so are defenses against being lied to; as Mercier points out, without those defenses language could never have evolved.75 Societies, too, protect themselves against being flooded with shit: barefaced liars are held responsible with legal and reputational sanctions. These safeguards are belatedly being deployed. In a single week in early 2021, the companies that made the voting machines and software named in Trump’s conspiracy theory sued members of his legal team for defamation; Trump was banned from Twitter for violating its policy against inciting violence; a mendacious senator who pushed the stolen-election conspiracy theory in Congress lost a major book contract; and the editor of Forbes magazine announced, “Let it be known to the business world: Hire any of Trump’s fellow fabulists, and Forbes will assume that everything your company or firm talks about is a lie.”76
Since no one can know everything, and most people know almost nothing, rationality consists of outsourcing knowledge to institutions that specialize in creating and sharing it, primarily academia, public and private research units, and the press.77 That trust is a precious resource which should not be squandered. Though confidence in science has remained steady for decades, confidence in universities is sinking.78 A major reason for the mistrust is the universities’ suffocating left-wing monoculture, with its punishment of students and professors who question dogmas on gender, race, culture, genetics, colonialism, and sexual identity and orientation. Universities have turned themselves into laughingstocks for their assaults on common sense (as when a professor was recently suspended for mentioning the Chinese pause word nei ge because it reminded some students of the racial slur).79 On several occasions correspondents have asked me why they should trust the scientific consensus on climate change, since it comes out of institutions that brook no dissent. That is why universities have a responsibility to secure the credibility of science and scholarship by committing themselves to viewpoint diversity, free inquiry, critical thinking, and active open-mindedness.80
The press, perennially tied with Congress as the least trusted American institution, also has a special role to play in the infrastructure of rationality.81 Like universities, news and opinion sites ought to be paragons of viewpoint diversity and critical thinking. And as I argued in chapter 4, they should also become more numerate and data-savvy, mindful of the statistical illusions instilled by sensationalist anecdote chasing. To their credit, journalists have become more mindful of the way they can be played by disingenuous politicians and contribute to post-truth miasmas, and have begun to implement countermeasures like fact-checking, labeling false claims and not repeating them, stating facts affirmatively rather than negatively, correcting errors openly and swiftly, and avoiding a false balance between experts and cranks.82
Educational institutions, from elementary schools to universities, could make statistical and critical thinking a greater part of their curricula. Just as literacy and numeracy are given pride of place in schooling because they are a prerequisite to everything else, the tools of logic, probability, and causal inference run through every kind of human knowledge. Rationality should be the fourth R, together with reading, writing, and arithmetic. To be sure, mere instruction in probability fails to provide lifetime immunity to statistical fallacies. Students forget it as soon as the exam is over and they sell their textbooks, and even when they remember the material, almost no one makes the leap from abstract principles to everyday pitfalls.83 But well-designed courses and video games—ones that single out cognitive biases (the gambler’s fallacy, sunk costs, confirmation bias, and so on), challenge students to spot them in lifelike settings, reframe problems in mind-friendly formats, and provide them with immediate feedback on their errors—really can train them to avoid the fallacies outside the classroom.84
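Purely as an illustrative sketch of the kind of feedback exercise such a course or game might build on (not a description of any specific program cited above; the function name and parameters are hypothetical), one can let students test the gambler's fallacy directly by simulation:

```python
import random

def heads_after_tail_streak(streak_len=5, trials=1_000_000):
    """Estimate P(heads | the previous streak_len flips were all tails)."""
    flips = [random.random() < 0.5 for _ in range(trials)]  # True = heads
    hits = total = 0
    for i in range(streak_len, trials):
        if not any(flips[i - streak_len:i]):  # preceding run was all tails
            total += 1
            hits += flips[i]                  # True counts as 1
    return hits / total if total else float("nan")

if __name__ == "__main__":
    # The gambler's fallacy predicts a value well above 0.5 ("heads is due");
    # the simulation hovers near 0.5, because independent flips have no memory.
    print(round(heads_after_tail_streak(), 3))
```

Watching the estimate settle near 0.5, rather than being told that it must, is the sort of immediate, concrete feedback the paragraph above describes.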
Rationality is a public good, and a public good sets the stage for a tragedy of the commons. In the Tragedy of the Rationality Commons, motivated reasoning on behalf of oneself and one's side amounts to free riding on our collective understanding.85 Each of us has a motive to prefer our truth, but together we're better off with the truth.
Tragedies of the commons can be mitigated with informal norms in which members of a community police the grazing lands or fishing grounds by recognizing good citizens and stigmatizing exploiters.86 The suggestions I have made so far can, at best, fortify individual reasoners and inculcate the norm that sound reasoning is a virtue. But the commons must also be protected with incentives: payoffs that make it in each reasoner's interest to endorse the ideas with the greatest warrant. Obviously we can't implement a fallacy tax, but particular commons can agree on rules that jigger the incentives toward truth.
I’ve mentioned that successful institutions of rationality never depend on the brilliance of an individual, since not even the most rational among us is bias-free. Instead they have channels of feedback and knowledge aggregation that make the whole smarter than any of its parts.87 These include peer review in academia, testability in science, fact-checking and editing in journalism, checks and balances in governance, and adversarial proceedings in the judicial system.
The new media of every era open up a Wild West of apocrypha and intellectual property theft until truth-serving countermeasures are put into place.88 That’s what happened with books and then newspapers in the past, and it’s happening with digital media today. The media can become either crucibles of knowledge or cesspools of malarkey, depending on their incentive structure. The dream at the dawn of the internet age that giving everyone a platform would birth a new Enlightenment seems cringeworthy today, now that we are living with bots, trolls, flame wars, fake news, Twitter shaming mobs, and online harassment. As long as the currency in a digital platform consists of likes, shares, clicks, and eyeballs, we have no reason to think it will nurture rationality or truth. Wikipedia, in contrast, though not infallible, has become an astonishingly accurate resource despite being free and decentralized. That is because it implements intensive error correction and quality control, supported by “pillars” that are designed to marginalize myside biases.89 These include verifiability, a neutral point of view, respect and civility, and a mission to provide objective knowledge. As the site proclaims, “Wikipedia is not a soapbox, an advertising platform, a vanity press, [or] an experiment in anarchy or democracy.”90
At the time of this writing, those gargantuan experiments in anarchy and democracy, the social media platforms, have begun to wake up to the Tragedy of the Rationality Commons, having been roused by two alarms that went off in 2020: misinformation about the Covid pandemic, and threats to the integrity of the American presidential election. The platforms have tuned their algorithms to stop rewarding dangerous falsehoods, inserted warning labels and fact-checking links, and damped down the runaway dynamics that can viralize toxic content and send people down extremist rabbit holes. It's too early to say which of these measures will work and which will not.91 Clearly, the efforts should be redoubled, with an eye to revamping the perverse incentive structure that rewards notoriety but not truth.
But since social media probably get too much of the blame for partisan irrationality, tweaking their algorithms won't be enough to repair it. We should be creative in changing the rules in other arenas so that disinterested truth is given an edge over myside bias. In opinion journalism, pundits could be judged by the accuracy of their forecasts rather than their ability to sow fear and loathing or to fire up a faction.92 In policy, medicine, policing, and other specialties, evidence-based evaluation should be a mainstream, not a niche, practice.93 And in governance, elections, which can bring out the worst in reasoning, could be supplemented with deliberative democracy, such as panels of citizens tasked with recommending a policy.94 This mechanism puts to use the discovery that in groups of cooperative but intellectually diverse reasoners, the truth usually wins.95
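The arithmetic behind that last claim can be made concrete with a Condorcet-style toy model (my framing for illustration, not the mechanism studied in the cited research, which concerns deliberation rather than mere voting): if each member of a group independently gets the answer right slightly more often than chance, a simple majority is right far more often than any individual. A minimal sketch under those assumptions:

```python
import random

def majority_correct(p_individual=0.6, group_size=25, trials=10_000):
    """Estimate how often a simple majority of independent reasoners,
    each correct with probability p_individual, lands on the true answer."""
    wins = 0
    for _ in range(trials):
        correct_votes = sum(random.random() < p_individual for _ in range(group_size))
        if correct_votes > group_size / 2:
            wins += 1
    return wins / trials

if __name__ == "__main__":
    # Individually mediocre (60 percent) reasoners, collectively reliable:
    for n in (1, 5, 25, 101):
        print(n, round(majority_correct(group_size=n), 3))
```

Real citizen panels layer argument and persuasion on top of this simple aggregation effect, which is precisely what the research on cooperative, diverse groups emphasizes.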
Human reasoning has its fallacies, biases, and indulgence in mythology. But the ultimate explanation for the paradox of how our species could be both so rational and so irrational is not some bug in our cognitive software. It lies in the duality of self and other: our powers of reason are guided by our motives and limited by our points of view. We saw in chapter 2 that the core of morality is impartiality: the reconciliation of our own selfish interests with others’. So, too, is impartiality the core of rationality: a reconciliation of our biased and incomplete notions into an understanding of reality that transcends any one of us. Rationality, then, is not just a cognitive virtue but a moral one.