1

The Origin of the Specious

This is a book about truth—or, more specifically, about things that aren’t the truth.

Unfortunately, this means that, before we get any further into the book, we need to have a bit of a think about what “truth” actually is. And, more important, what it isn’t.

The thing is, this all gets messy remarkably quickly, because of the sheer variety of ways there are of being wrong. This might come as a surprise to some people. A lot of us assume that there’s simply true and false, and moreover that they’re easy to tell apart. Unfortunately, it’s not quite that simple. Throughout history, those who’ve pondered the nature of truth and its opposites have realized one central principle over and over again: while there’s an extremely limited number of ways of being right, there’s an almost infinite number of ways of being wrong.

“Truth had ever one father, but lies are a thousand men’s bastards, and are begotten everywhere,”4 the Elizabethan writer Thomas Dekker bemoaned in 1606. Or as the sixteenth-century philosopher Michel de Montaigne put it in his essay “Of Liars”: “If falsehood had, like truth, but one face only, we should be upon better terms...but the reverse of truth has a hundred thousand forms, and a field indefinite, without bound or limit.”5

This book is an attempt to catalogue just a few of those hundred thousand forms.

Ours is far from the first period in history to have become obsessed with truth and the lack of it. Indeed, there’s a whole couple of centuries that, in Europe, are sometimes known as the “Age of Dissimulation” because lying was so prevalent—the continent was being torn apart by religious strife from the 1500s onward, and everybody had to wear a mask of deception just to survive. Machiavelli, a man so connected with the art of political deception that we still (rather unfairly) use his name to describe it, wrote, in 1521, that “for a long time I have not said what I believed, nor do I even believe what I say, and if indeed I do happen to tell the truth, I hide it among so many lies that it is hard to find.”6 Let’s be honest—we’ve all had days at work like that.

[Image: posthumous portrait of Niccolò Machiavelli, standing]

Niccolò Machiavelli: he knew.

So concerned with falsehood were people throughout history that they came up with a remarkable variety of ways to identify liars. The Vedas of ancient India proposed a method based on body language, saying that a liar “does not answer questions, or they are evasive answers; he speaks nonsense, rubs the great toe along the ground, and shivers; his face is discolored; he rubs the roots of the hair with his fingers; and he tries by every means to leave the house.”7 Also in India, a few centuries later, was a weight-based method: the accused liar would be put on a set of scales with a perfect counterbalance. They would then get off, the scales would be given a short speech exhorting them to reveal the truth, and the person would get back on. If they were lighter than before, they were not guilty; if they were the same weight or heavier, they were guilty.8

(Interestingly, this implies a completely different relationship between weight and truth from many occult trials in Europe: in India, lightness was associated with innocence, whereas in Europe appearing unexpectedly buoyant could be enough to condemn someone accused of witchcraft. As such, the Indian approach is a judicial process that makes the rare argument for the benefits of weeing yourself while in court.)

Of course, other cultures preferred simpler, more direct methods of identifying liars, such as red-hot pokers or boiling water. It is unclear if these were any more effective.

For a long time, people have devoted considerable effort to trying to classify the different types of falsehood. It was kind of the theological equivalent of writing a BuzzFeed list. As early as AD 395, Saint Augustine came roaring out of the gate by identifying eight types of lie, in descending order of badness: lies in religious teaching; lies that harm others and help no one; lies that harm others and help someone; lies told for the pleasure of lying; lies told to “please others in smooth discourse”; lies that harm no one and that help someone materially; lies that harm no one and that help someone spiritually; and lies that harm no one and that protect someone from “bodily defilement.” (I think, by the last one, he means “cockblocking,” but I’m not 100 percent sure.)

These days, of course, we classify lies differently. But, even then, there are subtleties that you might not be aware of. Everybody’s heard of white lies—harmless social fictions intended to let us all get along without killing each other—but did you know there are other colors of lie? “Yellow lies” are those told out of embarrassment, shame or cowardice, to cover up a failing: “My laptop crashed and deleted that report I said I’d definitely have finished by today.” “Blue lies” are the opposite: lies downplaying your achievements, told out of modesty (“oh, the report’s nothing special; Cathy wrote most of it, really”). “Red lies” might be the most interesting of all—they’re lies that are told without any intent to deceive. The speaker knows they are lying, the speaker’s audience knows that they’re lying and the speaker knows that the audience knows. The point, here, isn’t to mislead anybody—it’s to signal something to the audience that can’t be spoken out loud (whether that’s basically “fuck you” or the more benign “shall we all just pretend that didn’t happen”). Imagine a couple denying to their neighbors that they had a huge row last night when they know everybody could hear it, and you’re in the right territory.

It’s often said that a lie can travel halfway around the world while the truth is still getting its boots on. (The question of exactly who said this is a thornier matter. It’s often attributed to Mark Twain, or to Winston Churchill, or to Thomas Jefferson or to any number of the other usual suspects for quote attribution. These attributions are, of course, all lies. The earliest formulation of it may in fact have been from the iconic Irish satirist Jonathan Swift, who wrote in 1710 that “Falsehood flies, and the Truth comes limping after it.”)

[Image: portrait of Jonathan Swift writing in his journal]

Jonathan Swift, pondering some bullshit.

Whoever said it, it’s certainly true that bullshit can move with remarkable and terrifying speed, as you’ll know if you’ve ever tried to debunk rumors on the internet—that is, in fact, my day job, so, believe me, I get it.

But, in reality, the reason that untruth so often has the advantage over truth has less to do with fact and fiction’s relative speed, or even with truth’s impractical footwear choices, and more to do with the sheer scale and variety of falsehoods on offer. For every lie that travels halfway around the world, there may well be thousands that never make it out of the front door. But the sheer number of possible lies out there—unconstrained as they are by the need to match up to reality—provides a huge Darwinian testing ground to find the most compelling and long-lasting among them—those zombie untruths that will keep coming back again and again. It’s like those species of fish that lay two million eggs, just so that two of their offspring will survive.

The truth, by contrast...well, it’s kind of boring. It just sort of sits there, a small gray blob of indeterminate size, familiar yet inscrutable. In addition to being slightly dull, it’s also remarkably frustrating; as anybody whose job it is to try to pin down small fragments of truth will testify, it has a nasty habit of slipping out from under your grasp just when you think you’ve got hold of it.

There are, of course, certain things that are simply, incontestably true: fire is hotter than ice; the speed of light in a vacuum is a constant; the best song ever recorded is “Dancing on My Own” by Robyn. But once you go beyond these immutable laws of nature, everything gets murky alarmingly quickly. You find yourself saying things like “The best available evidence suggests...” and “Yes, but what about the big picture?” rather a lot. Anyone who has spent time in pursuit of accuracy and evidence understands how every new fragment of knowledge has a tendency only to raise ten more questions; every time you think you’re approaching enlightenment, reality recedes further toward the horizon, while you’re left drowning in a sea of caveats. Truth, by this measure, is not so much a thing; it’s more of a long, irritating journey toward a destination you’ll never reach.

The myriad untruths our world offers us, meanwhile, are seductive, adaptable and—if we’re honest—often tremendous fun.

It’s that sheer variety of untruth that this book will look at, because lies are in fact only one manifestation of the “hundred thousand forms” that the reverse of truth can take.

For example, there’s spin, the art of political deception. The cunning thing about spin is that it doesn’t even necessarily need to lie in order to be dishonest. While many politicians do lie (shock news, I know), the real peak of the spinner’s craft is managing to suggest something wholly untrue by saying only things that are true—building a house of nonsense from honest bricks. Then there’s delusion, the consistent ability people have to be wrong while convincing themselves that they’re right, from the ways we overestimate our own qualities to the ways we can succumb to mass hysteria and mob rule. And then, perhaps the most widespread and damaging of all, there’s bullshit.

We have the philosopher Harry G. Frankfurt to thank for our understanding of bullshit—he was the first to devote serious time to analyzing this complex subject, in his seminal work On Bullshit. (Yes, Harry Frankfurt is clearly having a great time being a philosopher.)

Frankfurt’s key insight is that—despite what you might think—lying and bullshitting aren’t actually the same thing. He writes: “It is impossible for someone to lie unless he thinks he knows the truth. Producing bullshit requires no such conviction.”

In other words, a liar cares deeply about truth for the same reason that a sailor cares deeply about icebergs. They need to know exactly where the truth resides so that they can take precise and deliberate actions to avoid it. For the bullshitter, by contrast, truth is irrelevant; they are happy to take it or leave it. When bullshitting, a little accidental accuracy may be best regarded as an optional extra; if the bullshit world you are creating sometimes overlaps with the real world, it does you little damage and may even be a helpful bonus. For a liar, on the other hand, the careless admission of an inconvenient fact may prove fatal.

Bullshit operates on dream logic, merrily plowing through inconsistencies because, well, it makes sense at the time. Frankfurt notes that this “indifference to how things really are” is, to his mind, “of the essence of bullshit.”

Their effect on the world, as a result, is profoundly different. Lying is a scalpel; bullshit is a bulldozer. If you’ve looked around at the world in recent times and wondered how these lying liars can get away with their lies so brazenly, and why people won’t call out their lies for being lies, well...there’s your answer. You were accusing them of the wrong thing. Lying—a tricky, detail-focused and analytical profession—is not necessarily our main problem. Our main problem is bullshit.

And then, beyond all of these varieties of untruth, there’s just plain old being wrong.

As I’ve mentioned, my day job is at a fact-checking organization, and as such we come into pretty regular contact with the whole pantheon of ways that people can be wrong. So much so that, last year, we invented a sort of thought experiment to try to get people to ponder all the different types of errors that might be found in the wild. The idea is to strip away all the confusing, messy stuff that surrounds most things in the world and take each story down to a single, simple factual claim from a single source—one where you can’t rely on any other evidence beyond that claim to back up or disprove it. We call it “the clock game,” and here it is:

You are startled awake by the insistent ringing of a phone. You open your eyes. You are in an unfamiliar room, dimly lit by a faint light seeping around what you assume is the bathroom door. From the universal design cues that say “sort of but not really like a home,” you sense that you’re in a hotel room of some kind. You aren’t sure where you are or how you got here—but, from the foggy state of your brain, you begin to realize that you’re extremely jet-lagged.

You have no idea how long you’ve been asleep.

You look around the room for some kind of clue. There is no clock visible, and blackout curtains cover the windows, offering no hint of whether it’s day or night outside. The bedside phone is still ringing, far too loudly for your comfort. Fumblingly, you pick it up.

“Hey, you made it!” says a slightly too cheery voice at the other end. The voice has an indeterminate accent you can’t quite place.

“Buh?” you reply. “Who is this?”

“It’s Barry!” says the voice. “Glad to finally connect with you!” You’re not sure you know a Barry, but you decide to go with it.

“I, er...” you start, before realizing you don’t have anywhere else to go with that sentence. “Uh...what time is it?” you settle on, limply.

“Wait a minute,” says this person who claims to be Barry, “let me go and look at the clock.”

You hear the noise of a phone being put down and steps receding into the distance. A period of time passes, which could be a few seconds or could be several minutes, you’re not sure. The steps return.

“It’s five o’clock, mate,” says the self-professed Barry.

“Okay,” you say.

The point of the game is this: Can you list all the different ways in which your belief about the time could, in this moment, be wrong? Spoiler alert: there are probably more ways than you think! To date, we’ve got somewhere over twenty, and, even then, we’ve almost certainly missed a few.

Go on, take a moment and see how many you come up with. Imagine some easy-listening music is playing at this point.

[“TAKE FIVE” BY DAVE BRUBECK PLAYS WHILE YOU THINK DEEPLY ABOUT CLOCKS AND ALSO POSSIBLY WONDER IF THE AUTHOR HAS GONE MAD.]

Okay, are you back? Good! Let’s take the obvious ones first. Barry’s clock might be wrong: it might run too fast or too slow, or it might have stopped completely; or it might run at the perfect speed but have been set to the wrong time in the first place. It might be a really hard-to-read clock, one of those over-fancy designer jobs made out of reclaimed driftwood and glass globes, which looks very nice on the wall but isn’t very useful for telling any time other than splinter-past-bauble. It might not be a clock at all. Maybe it’s just a painting of a clock. Maybe Barry doesn’t have a clock and just got someone to write the time down on a piece of paper earlier that day.

Maybe you and Barry are in different time zones—so, while he was perfectly correct about the time, it isn’t true for you. Maybe he rounded it to the nearest hour, for convenience, but that’s not actually very useful for you, because you wanted to know if it was closer to half past. Maybe it was five o’clock when he looked at the clock, but by the time he got back to the phone, it wasn’t anymore.

Maybe Barry was deliberately lying to you, for whichever of the many nefarious purposes that Barries have. Maybe he wasn’t lying but was bullshitting, because he can’t read the time but didn’t want to admit that. Maybe he thinks he can read the time but actually doesn’t know how clocks work. Maybe he meant to say “nine o’clock” but misspoke.

Or maybe he did say “nine o’clock,” and you just misheard him. Maybe you’re the one who doesn’t actually understand how time works, and right now you’re thinking, Ah, five o’clock, so it’s almost midnight. Maybe you assumed that he wouldn’t factor in the time it took to get back to the phone, so you reckon it’s actually something like five past five now, but in fact he’d already done that and so you’ve overcorrected.

Maybe, in your slightly paranoid state, you assume that Barry is lying to you—so now the one thing you think you know for sure is that it definitely isn’t five o’clock. But you’re wrong. Barry is a good man, and your friend, and he would never lie to you. It really is five o’clock, and your lack of trust has led you astray.

Maybe you and Barry don’t even use the same time system. Maybe he’s a NASA engineer working on a Mars project, and his clock is set to the Martian day, which is about forty minutes longer than Earth’s.

Maybe “It’s five o’clock, mate” wasn’t even an attempt to tell you the time, but a code-word check-in for the secret agency you both work for, which you have forgotten all about due to traumatic amnesia.

Maybe time, that mysterious river down which we all must be carried, cannot truly be measured by humans, and all our efforts to do so are but crude approximations.

Or maybe...maybe he just meant a.m. when you assumed he meant p.m.

Now, this may all strike you as, frankly, nonsense—but, in fact, every one of those ways you could be wrong about the time matches up to a real-life example of how bad information gets out into the world. Yes, even the stupid ones, like Barry working on Martian time or him trying to give you a superspy code word.

Some of the real-world equivalents are pretty obvious; rounding too far, not adjusting for errors (like the time between clock and phone) or not realizing that your source is simply unreliable (like the clock being slow) are all common problems, especially when you’re dealing with facts based on data. Trying to tell the time from a stopped clock or a piece of paper matches up with the human habit of being extremely certain about things when it should be clear that we don’t actually have any useful information to go on. Barry’s Martian clock is a surprisingly common one—people just don’t realize that they’re using completely different definitions of the same basic concept (remember that Christopher Columbus only “discovered” America because he had the wrong idea about how far away Asia was, which was because he’d worked out the circumference of the earth using a source that he assumed was given in Roman miles but was actually talking about Arabic miles, which are a totally different length).

Weirdly, when researching this book, I discovered that we weren’t the first to hit on this kind of thought experiment. In 1936, Vilhjalmur Stefansson, a man with a somewhat checkered history as an intrepid Arctic explorer, took a bit of a career turn and wrote a book called Adventures in Error—the quote at the beginning of this book comes from it. In it, he gives a very similar example, only he uses a cow instead of a clock:

Take an example: A man comes from out-of-doors with the report that there is a red cow in the front yard...we are confronted with numerous other sources of error. The observer may have confused the sex of the animal. Perhaps it was an ox. Or if not the sex, the age may have been misjudged, and it may have been a heifer. The man may have been color-blind, and the cow (wholly apart from the philosophical aspect) may not have been red. And even if it was a red cow, the dog may have seen her the instant our observer turned his back, and by the time he told us she was in the front yard, she may in reality have been vanishing in a cloud of dust down the road.9

I hope that all this waffle about cows and clocks has convinced you that, if it sometimes seems like we’re drowning in a sea of falsehood, there’s a very good reason for that: it just has a natural advantage over truth, because there can be so much more of it. But that’s not the only advantage it has. There are lots of things about our brains and our societies that allow falsehood to flourish.

For many centuries, we’ve believed that lying was a uniquely human trait, our original sin. But it turns out that humans aren’t the only creatures that lie. For starters, the lives of many animals and plants are founded on deception—think of the opossum pretending to be dead, or the cuckoo sitting parasitically in another bird’s nest or the orchid that looks like a sexy lady bee to fool horny male bees into pollinating it. But, you might reasonably say, that’s not lying exactly—it’s just the involuntary end product of many generations of an evolutionary arms race. Which is fair enough, but there’s plenty of evidence that some of the smarter animals are perfectly capable of intentional and deliberate deception.

To take one particularly memorable example: in his work “Can Animals Lie?,” the semiotician Thomas A. Sebeok mentions a “handsome tiger” living in the Zurich zoo who had learned to deliberately lure visitors toward the bars of his cage by means of “a certain sequence of interesting activities.”10 When the enthralled tourist got close enough, the tiger would—and there’s no way to say this delicately—drench them with a powerful stream of piss. The tiger was apparently so pleased with this trick that the zoo management eventually had to put up a sign warning visitors that the tiger was not to be trusted.

That pissy tiger is far from alone. A dolphin at a research facility in Mississippi who had been trained to help clear rubbish from its pool by being given rewards of fish learned to hide pieces of garbage under a rock, which it would then bring to the surface to scam fish on demand.11 Chimpanzees have been recorded performing a wide range of deceptions. They grin involuntarily when they’re nervous—one chimp, being threatened by a rival behind it, was seen physically pushing its lips back down over its teeth before turning around and bluffing that it wasn’t scared. Another young male, the least dominant of his group, was seen trying to surreptitiously seduce a lady chimp that the dominant males wouldn’t normally let him approach. When one of the older males interrupted him, he covered up his erection with his hands, like a character in a 1970s British sex comedy.12

Trickery is built into much of the natural world, so maybe we shouldn’t be too hard on ourselves for telling the occasional fib.

It’s not simply that deception is natural—it’s that it appears to have evolved. One scientific study showed that across all the primates there was a close correlation between the size of the neocortex (the part of the mammalian brain that deals with complex tasks, like language) and the frequency of deception in those species.13 In other words, a bigger brain equals more lying. The challenges of living in complicated social groups—including the need to sometimes deceive your peers—may well have driven the increasing complexity and size of our brains.

That link between cognitive power and deception is replicated as we grow up. Children generally start telling their first lies at about the age of two and a half, not long after they’ve started talking. The first lies are simple “wish fulfillment” lies: “I would like to not be the person who ate the cookies.”14 But, as kids’ mental abilities develop, as they get a theory of mind and begin to understand the complex nature of their interactions with others, their skill at lying marches in lockstep.

How deeply embedded is falsehood in our daily lives? Possibly more than you think. Psychological studies suggest that within the first ten minutes of conversation when you meet someone new you will, on average, have told three lies.15 Other studies suggest that, on average, each of us lies at least once every single day—although those studies are based on asking people to report how often they lie and so are vulnerable to the possibility that the participants are...lying about that.

That’s not the only potential issue with asking people how often they lie. In writing this book, one of my original plans was to keep a “lie diary”—to spend several weeks diligently noting and recording every single time I uttered a falsehood. It was going to be an attempt to gain an insight into just how much untruth permeates our lives, even (or especially) for those of us who believe ourselves to be fundamentally honest people. I was excited by this prospect, although also nervous; exactly how many friendships, I wondered, would be destroyed forever by the publication of this book?

In the end, I needn’t have worried. Not because it turned out that I am a beacon of purity and truth (I mean, obviously I am), but because every attempt I made to try to record my lies ground to a crashing halt after about a day.

Quite simply, I wasn’t able to spot when I was bullshitting.

The thing is, I know for a fact that I did tell lies during this time. None of them were especially heinous; I was not doing any massive crimes during the writing of this book. Broadly, they fell into three categories: lies about what I’d already done, lies about what I was able to do in the near future and lies about my social life.

The first category mostly consisted of texts and emails to my publisher and agent insisting that the book was going really well and I’d got a lot written. (Sorry.) The second was primarily to colleagues, asserting confidently that I would get around to that thing I’d promised them next and that I’d definitely have something for them by tomorrow. (Sorry again.) The third was that broad category of little white lies that keep society from falling into a death spiral of mutual recrimination: fabricated excuses for not being able to attend a party; transparently false claims about having only just seen a text; hollow assurances that, yes, you are unquestionably being the reasonable one in this argument and the other person sounds like a complete asshole who definitely doesn’t maybe have a point.

(That last category would probably have been considerably larger if it wasn’t for the fact that I was, well, trying to write a book at the time and so spent many months turning down invitations to go to the pub for perfectly genuine reasons—namely, that I had to concentrate on the important business of staring blankly at a screen and not writing anything. Pro tip, introverts: an imminent book deadline is an excellent and entirely genuine excuse for getting out of social engagements!)

Most of the time I was well aware that these were lies at the time I told them, with the occasional exception of promises that I’d get something done (which were sometimes based on the pure and simple delusion that I could work solidly for every one of the thirty-six hours that I understand there to be in a day). And yet something happened in my brain during the act of telling them—a switch would flip, and I would temporarily blank out that I was dishing up small servings of horseshit. It’s something I’d never really noticed until I set myself the task of noting down all my little white lies: I simply wasn’t able to recognize them in the moment. It was like my brain had a self-defense mechanism against telling on itself.

I have no idea if anybody else’s brain works like this. It’s entirely possible that I might have just inadvertently discovered I’m a psychopath. But, at a guess, I’d say the chances are pretty good that this happens to quite a lot of people.

Liars lie; bullshitters bullshit. That much is easy. But what’s really interesting isn’t why people say things that aren’t true—that’s always going to happen. No, the really interesting question is why some lies stick around—why, despite all our professed reverence for truth and all the structures we’ve set up as a society to identify and root out falsehood, some untruths become widely believed. In other words, how do bullshitters get away with it?

The reason is that, in addition to their numerical advantage over truth, there are some structural reasons that mean falsehoods have the upper hand. Throughout this book, we’ll keep encountering the seven main ways that untruths spread and take hold.

The Effort Barrier

You get an effort barrier when the relative difficulty of checking the truth of something outweighs its apparent importance. The key thing about this is that it works at both ends of the scale: it applies to things that would be relatively easy to check but are so trivial that nobody bothers, and to things that are clearly pretty important but are also really hard to check. The reason that sixteenth-century explorers could get away with claiming that a race of twelve-foot-tall giants lived in Patagonia is the same as the reason you can usually get away with upgrading your AP math grade from a B to an A on your CV. Yes, someone could check, but are they really going to bother?

This is something that seasoned bullshitters understand instinctively. It’s simply inefficient to craft untruths that are built to withstand far more scrutiny than they’ll ever receive. A talented liar pitches their falsehoods, both big and small, just on the far side of the effort barrier.

Information Vacuums

We often like to think of Truth and Lies as being in some kind of eternal battle. But one effect of the effort barrier is that, quite a lot of the time, Truth never even shows up to the fight. There are an awful lot of things in the world that we just don’t really know anything about. And, in the absence of information, we tend to lower our guard whenever something that claims to be information shows up—even if there’s no good reason to believe it.

This all ties in with the cognitive bias known as “anchoring”: our brain’s tendency to latch on to the first piece of information we get about any subject and give it far more weight than anything else. When there isn’t good information on something, crappy information will always flood in to fill the void—and, a lot of the time, it refuses to budge, even when better information finally shows up.

The Bullshit Feedback Loop

None of us can work out the entire world by ourselves. All of us have to rely on others for our information. This is a good thing—together, we can find out much more about the world than we ever could alone—but it does come with some downsides. And one major downside is the bullshit feedback loop. You get one of these when a dodgy piece of information is repeated, but, rather than the repetition being seen for what it is (just somebody copying someone else, adding no extra level of verification to the claim), it instead gets treated as confirmation that the original dodgy info was accurate. If this goes on too long, the problem expands. It’s no longer merely that the claim gets repeated; eventually, it becomes so established that people start adjusting what they say to accommodate the dodgy facts—everybody knows it’s true, so even if you’re staring directly at evidence proving it to be false, it’s probably because there’s something wrong with your eyes.

So person A tells person B something wrong, and then tells person C too. Person C is skeptical, but then person B tells it to them as well, and person C interprets that as a second source and is now convinced. Person C runs to tell person D the exciting news, whereupon person D tells person A, who takes it as evidence that they were right all along. Meanwhile, people E, F, G, H and I have also heard the same thing from multiple people, and it’s become accepted as common knowledge. At this point, person J tentatively asks, “Are we actually sure about that?” and is promptly burned as a heretic by the rest of the alphabet.

Or, to take a familiar example, it’s the thing where a newspaper copies a fact from Wikipedia and then gets cited on Wikipedia as evidence that the fact was correct.
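For readers who like their epistemology executable, the alphabet story above is mechanical enough to sketch as a toy simulation. Everything in it (the names, the naive “two sources and it’s confirmed” rule) is my own illustrative invention, not a model drawn from any real research on rumor spread:

```python
# Toy model of the bullshit feedback loop: one unverified claim bounces
# around a small group, and every retelling gets mistaken for independent
# corroboration. Names and the "two sources" rule are invented for
# illustration only.

def count_apparent_sources(retellings):
    """Map each listener to how many distinct people they heard the claim from.

    retellings: list of (speaker, listener) pairs, in the order they happen.
    Distinct speakers get counted as "independent" sources, even though
    every single retelling here traces back to person A.
    """
    heard_from = {}
    for speaker, listener in retellings:
        heard_from.setdefault(listener, set()).add(speaker)
    return {person: len(sources) for person, sources in heard_from.items()}

# A tells B and C; B repeats it to C; C, now apparently double-sourced,
# tells D; D tells A, who takes that as proof they were right all along.
chain = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "A")]
counts = count_apparent_sources(chain)

# C has heard the claim from two different mouths (A and B), so by the
# naive rule C is "convinced" -- despite there being exactly one origin.
assert counts["C"] == 2
```

The newspaper-and-Wikipedia loop has the same shape: the newspaper hears the claim from Wikipedia, and Wikipedia then counts the newspaper’s article as a second, confirming source.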

Wanting It to Be True

There are a whole host of things our brains do that make us particularly bad at sniffing out the difference between truth and not-truth. These have a bunch of technical names you’ve probably heard of—things like “motivated reasoning” and “confirmation bias”—but they all basically come down to the fact that, when we want to believe something, working out whether or not it’s actually true comes pretty far down our brain’s list of priorities. It doesn’t really matter whether it’s something that supports our political stance, something that matches up with our prejudices, or just basic wish fulfillment of the “maybe I have won the lottery in Spain, despite never entering” type: we’ll cheerfully come up with spurious reasons to assign even the most ridiculous claim credibility, cherry-picking only the evidence that supports it, while blithely ignoring the vast mountain of evidence that says it’s crap.

The Ego Trap

Even when falsehoods do get unmasked, there’s something that often stands in the way of the truth spreading as easily as the lies that got their boots on first: simply, we really do not like to admit that we’re wrong. Our brains don’t like doing it, and there are a whole host of cognitive biases that push us away from even acknowledging that we might have fucked up. And, if we do come to realize that we’ve been taken in by something false, there are a multitude of social pressures that make us want to cover it up. Once bullshit has us in its grasp, we can be rather unwilling to break free.

Just Not Caring

Even when there is an opportunity to push back against untruth, we don’t always take it. We might think it’s just not important whether something’s true or false (especially if we like the lie). But, equally, we might think that pushing back would be ineffective and so not bother. We might find lying so widespread that we get overwhelmed by the scale of it all and just give up. Equally, we might think that...well, if everybody’s doing it, I should get in on the game too.

All of these are understandable, but bad.

Lack of Imagination

Perhaps one of the strongest advantages that untruth has is, quite simply, that we don’t understand all the myriad and surprising ways it can manifest. This makes sense—after all, we have to live our lives on the assumption that most of the stuff we’re told is true; otherwise we’d descend into a spiral of gibbering paranoia. But this can lead to us radically underestimating the likelihood that something might not be true. We assume that if we read something in the news, then it’s probably true. We think that if someone seems trustworthy, then they aren’t trying to scam us. We believe that if lots of eyewitnesses said they saw something, then there must have been something there. None of those assumptions is as reliable as we might think.

Fundamentally, we just haven’t been paying enough attention to the business of falsehood. We haven’t studied it and we don’t talk about it, with the result that we don’t always recognize it when we see it.

Hopefully, by the end of this book, that won’t be a problem anymore.