Fourteen
The Suspicious Society
When regard for truth has broken down or even slightly weakened, all things will remain doubtful.
-SAINT AUGUSTINE






At the outset of this book we wondered whether more lies than ever are being told. In a sense, the answer to that question is beside the point. Because if we feel more lies are being told—and we obviously do—the effect is the same regardless of whether that feeling is valid: a rising level of wariness. In an era as lie-tolerant as ours, suspicion is inevitable. As bad as deception itself is the sense that we’re being deceived so routinely. From potential mates to prospective employees or even our neighbors, we feel less and less sure whom exactly we’re dealing with, or how much of what they tell us to believe.
In the suspicious society, “Google” has become a verb. Not only employers and journalists but suitors routinely Google each other by entering names in Internet search engines to find out what legal problems they might have had, how often they’ve been married, or if they’re at all who they said they were. Background-checking services abound, available for hire to investigate suitors, babysitters, roommates, employees, and business associates.
Earlier we discussed the well-established psychological principle that most human beings operate on the basis of a “truth bias”—that they assume whatever someone tells them is more likely to be true than false. As deception of all kinds becomes commonplace, the truth bias could give way to a lie bias. In that condition we’ll question the veracity of anything we’re told. Some already do. Before learning that half the subjects they’d watched on videotapes were telling the truth, a group of police officers studied by Paul Ekman tended to think all of them were lying. Sustaining that level of suspicion is taxing. Research on mental processes confirms that it takes far less effort to believe than to disbelieve (which is a primary reason for the truth bias). Being on guard lest someone succeed in telling us a lie is emotionally, spiritually, and physically draining. Moreover, suspicion not only doesn’t enhance our ability to detect lies, it can even make us worse natural lie detectors. What we’re left with is weariness born of wariness.
As an experiment, Rhodes College psychology professor Chris Wetzel warned members of his class “Detecting Con Artists and Impostors” that he’d lie to them once per lecture. Any student who regurgitated one of his lies on an exam would be penalized. Within weeks Wetzel had to end the experiment. “It became too disruptive,” he explained. “You almost have to become a paranoid to question everything and see what’s going on, and most of us are not willing to pay that price. It would almost drive you nuts to be that vigilant for the truth.”
On a small scale, this experiment replicated what is happening in society as a whole. Wetzel’s experience suggested why, even though we’re deceived so often, most of us aren’t very good at detecting deceivers. We are predisposed to believe what others tell us. If we weren’t, the stability of individuals and society alike would collapse. Giving others the benefit of the ethical doubt makes a civil society possible, even though this means overlooking occasions when we think we might have been deceived. The truth bias is also the basis for personal relationships of all kinds. To live on the basis of a lie bias, in a state of perpetual suspicion, would virtually eliminate any prospect of human intimacy.
When lying becomes too prevalent, and liars too skilled, even those who tell the truth are subject to the assumption that they aren’t. An old joke: Two Russians meet at a railway station in Moscow. One asks the other where he’s going. “To Minsk,” the other responds. “You are such a liar!” says the first. “You say you’re going to Minsk because you want me to believe you’re going to Pinsk. But I know for a fact that you are going to Minsk. So why are you lying to me?”
Not long after 9/11, an e-mail from Afghan American writer Mir Tamim Ansary raced around cyberspace. This eloquent appeal, in which Ansary pleaded with the U.S. government not to bomb a country that had little left worth bombing, was eventually expanded into a book (West of Kabul, East of New York). Before then, however, Ansary’s plea had been forwarded to so many inboxes that lots of recipients assumed it was a hoax.
The real danger is not that we won’t develop the necessary skepticism about lies and apocrypha but that, once we do, we will discount legitimate information. This is the inevitable impact of promiscuous lying. A man I know whom I’ll call Tom lies on a regular basis. Tom’s lies—about where he’s been, what he’s done, whom he knows—are so offhand that anyone who doesn’t know him well assumes he’s a credible human being. Only those who must work with Tom realize how routinely he deceives them on matters large and small. Tom’s lies are so frequent that his coworkers doubt even his most casual remarks. If Tom says he’s going to lunch, they wonder where he’s really going. If he says he met with the bursar, colleagues wonder whom Tom actually met with. If he says he isn’t feeling well, they figure Tom’s trying to get out of doing something.
This replicates on a small scale what happens more broadly when totalitarian regimes try to brainwash their populace into believing lies. Instead, recipients of these lies begin to question everything they’re told. In time they come to assume that nothing their government tells them can be believed—even that which can be proven. Ultimately members of such societies don’t lose just a capacity to assess the credibility of official pronouncements, they lose interest. This was the fate of those who suffered decades of oppression by Joseph Stalin, Mao Zedong, and Saddam Hussein. “There always comes a point beyond which lying becomes counterproductive,” concluded political philosopher Hannah Arendt. “This point is reached when the audience to which the lies are addressed is forced to disregard altogether the distinguishing line between truth and falsehood in order to be able to survive.”
We haven’t reached that point yet. We’re still more determined to uncover lies than resigned to being duped. A staple article in supermarket tabloids is “How to Tell if Someone Is Lying.” A genre of popular books includes Stan Walters’s The Truth About Lying: How to Spot a Lie and Protect Yourself from Deception, and Never Be Lied To Again: How to Get the Truth in 5 Minutes or Less in Any Conversation or Situation by David Lieberman. Lieberman’s book is filled with creepy tips on how to spot the many dissemblers its author assumes we confront every day. “Using a blend of hypnosis and a system I have developed called Trance-Scripts,” he writes, “you’ll be able to give commands directly to people’s unconscious minds—all in conversation and without their awareness. Through this process you can persuade others to tell the truth.”
For the less literate, a multitude of gadgets promise to alert them when someone is lying. These gadgets go by names such as Truster, Handy Truster, and the Truth Phone. Most are based on the shaky premise that stress can be detected in a liar’s voice, especially on the telephone. Alternatively, a software program purports to spot lies in e-mail. The more such devices are in use, the more suspicious we become—not just that we’re being lied to but that our voices and even our e-mails are being scrutinized for falsehoods.
Lie-detecting devices are both a measure of how suspicious we’ve become and a source of suspicion themselves. Being on high alert doesn’t help much in detecting lies, but does make us more wary of liars and truth tellers alike. Suspicion simply begets more suspicion, not enhanced detection of deception. In one of Bella DePaulo’s studies, personnel officers who were warned that some subjects might try to deceive them in a simulated job interview were no better able to identify deceivers than those who weren’t warned. The warned interviewers were less confident about their judgments, however, and more inclined to suspect every subject was being dishonest. Subjects in turn felt less comfortable being interviewed by wary interviewers. “Increased suspiciousness,” concluded DePaulo, “in and of itself, served only to destroy the confidence of both perceiver and perceived in their own interpersonal skills, and to erode their trust in each other. The effects on the persons who were suspected of deceit are especially noteworthy, because those persons had no direct way of knowing that they were the objects of suspicion.”
The question Bella DePaulo says she’s asked most often is “Where’s the nose?” What clues will reveal when someone is lying as surely as Pinocchio’s lengthening nose? She responds that none are that foolproof. Her survey of 120 studies of human lie detection found that most showed subjects spotted lies at little better than a chance rate. What this means is that anyone we meet could lie to us at any time about anything at all and we would have no reliable way to expose his or her deceptions. DePaulo calls this “Pinocchio’s Revenge.” Geppetto’s boy retaliated for the anguish of his big-nose lie detector by making it nearly impossible for the rest of us to detect lies.
Some beg to differ. Those with a professional need to unmask liars typically have great confidence in their ability to spot cues of dishonesty. According to the owner of a California polygraph service, liars always look uncomfortable, don’t rest their hands on the arms of a chair, twist their feet, fidget, twitch, and let their eyes rove. A Chicago company that trains interviewers advises them that liars are more likely to slouch, turn away, avoid eye contact, make erratic changes of posture, or engage in grooming gestures when answering key questions. Variations on this theme are commonplace among polygraph operators, customs inspectors, and police officers everywhere. Nearly all their assumptions are little better than folklore. Research on interrogators of many kinds has determined that while most have definite opinions about tip-offs that lies are being told, these opinions are usually wrong. Study after study has shown that rules of thumb even professional lie catchers use to flush their prey are, to say the least, unreliable. Shifty eyes, cleared throats, changes of posture, twisted feet, delay in answering, hand over mouth—none of these commonly used lie-detection cues has any proven validity. As we’ve already noted, the most popular cue of all—unsteady eye contact—is worse than useless as evidence that lies are being told.
Among the many subjects he’s studied who have a professional interest in uncovering deception, Paul Ekman has found no correlation whatsoever between confidence in their ability to spot lies and an actual ability to do so. Even agents from the Drug Enforcement Administration, the Bureau of Alcohol, Tobacco, and Firearms, the FBI, and the CIA, as well as police officers, customs inspectors, military officers, forensic psychiatrists, trial lawyers, and courtroom judges, have performed little better than anyone else in studies of lie detection conducted by Ekman and others. A German psychologist who expected police officer subjects to be superior detectors of lies found that they identified truthful statements made on videotape at a rate somewhat better than chance (58 percent), but did far worse than chance when it came to spotting dishonest videotaped statements (31 percent). In both cases these officers had been quite confident of their ability to distinguish between liars and truth tellers.
Unwarranted faith in their ability to spot liars is a key reason that so many of those with a professional interest in doing this are so bad at it. Any success they enjoy in unmasking liars is usually in spite of their invalid detection strategies, not because of them. Those strategies may be more impediment than help. A study in Britain found that those who judge themselves more “intuitive” detect lies of others at a lower rate (59 percent) than those who consider themselves more cerebral (69 percent). The psychologist who conducted this study thought it suggested that intuitive people rely more heavily on invalid body cues than those who simply listen carefully to what they’re being told.
This isn’t to say that there are no successful intuitive human lie detectors or useful ways to spot liars. Ekman has found that U.S. Secret Service agents are better-than-average lie detectors, perhaps because they’re so attuned to recognizing anomalous behavior. He has also found specific individuals who have a flair for spotting liars. (Ekman calls them his “Diogenes Sample.”) Just as there are natural performers who lie well, the psychologist has concluded, there seem to be natural-born lie catchers. This ability is unrelated to age, gender, job experience, or any other discernible factor. Ekman’s eleven-year-old daughter proved unusually adept at detecting lies, nearly as good as the best Secret Service agent. The ability to spot lies seems to be a gift, much like the ability to hit a baseball or paint a picture.
Nonetheless, over three decades’ time Paul Ekman has identified what he considers reliable evidence that lies are being told: fleeting, involuntary facial movements at variance with words being spoken. These “micro-expressions” could be little more than a forced smile, knitted eyebrows, or wrinkled forehead. When making presentations Ekman sometimes shows slow-motion video of a momentary snarl on the face of cool Kato Kaelin testifying dishonestly at O.J. Simpson’s trial about not having a book contract; the British spy Kim Philby smirking briefly while denying that he was engaged in espionage; or Margaret Thatcher’s fluttering eyelids as the Tory prime minister said she hadn’t authorized the sinking of an Argentine cruiser during the Falklands War, a cruiser that a British ship was about to torpedo. (Thatcher later admitted her deception.) Since micro-expressions such as these usually last for less than a second, they are discernible only in slow-motion video. Spotting them takes an hour of analysis for each minute of tape, on average. Such a complex, time-consuming approach is not practical for most professionals, let alone a wife wondering if her husband is actually working late as he says he is. Although Ekman offers workshops for professionals in everyday applications of micro-expression analysis, the usefulness of this method for the average person is nearly nil.
It has been established that some easily spotted cues are more indicative of deceptive behavior than others. They include an elevated blink rate, dilated pupils, higher voice pitch, and artificial smiling. Wiping your mouth with your hand, straightening your desk as you speak, and preening your hair may indicate lying—but only in certain women. Some liars get tongue-tied and stumble over their words, but so do some truth tellers. (This cue is a tip-off only if you know how articulate the speaker is ordinarily.) And, yes, your nose does get bigger when you lie. Researchers at Chicago’s Smell and Taste Foundation and at the University of Illinois have found that nasal tissues become engorged with blood when we dissemble, making our noses swell. This is usually not visible to others, however, though it is felt by dissemblers, who may touch their nose in response.
In the end there is no surefire, clearly discernible tip-off to all lying in every person. Cues vary from person to person, and occasion to occasion. Small, everyday lies (“I’ve got a call on the other line”) are nearly impossible to spot because they’re told so routinely. Big ones with a lot of high-voltage emotional content are more likely to produce cues. Accomplished deceivers, however, are very good at repressing lie cues and at mimicking someone who is telling the truth, even when subjected to vigorous lie detection.
As long as some human beings have told lies, others have yearned for a reliable way to detect them. The methods they’ve developed have usually employed state-of-the-art technology. At one time this meant the thumbscrew, the garrote, and the rack. These lie-detection methods were not 100 percent dependable, however, because those being interrogated by such means were likely to say anything, true or false, to relieve their pain. In ancient India suspected liars were forced to chew dry rice. The amount that stuck to the roof of their mouth was then measured as a gauge of their honesty (the more rice, the more honest). This is not as wacky as it sounds: a dry mouth can be symptomatic of anxiety (presumably about being caught telling a lie). That’s why the Bedouin custom of having suspected liars lick a hot iron, on the theory that if their tongue stuck to the iron they were lying, also had a certain logic. An ability to touch hot irons without blistering—a medieval European means of identifying liars—was logic-free, however. So was dunking possible liars in water to see if they’d sink or float, a no-win situation if ever there was one (in dunking theory, liars’ bodies floated, only to be hauled from the water and hanged; truth tellers sank, and drowned).
Modern lie-detection tools haven’t progressed all that far from chewing dry rice and licking hot irons. This includes the polygraph. In the decades since its invention in 1921, no one who has paid serious attention to the so-called lie detector has taken it seriously as a dependable tool for spotting lies. Two comprehensive studies commissioned by the federal government concluded that this contraption was fundamentally unreliable. A historian of the polygraph, Geoffrey Bunn of Toronto’s York University, calls it an “entertainment device.” After much research, University of Minnesota psychologist David Lykken concluded that the polygraph’s effectiveness in catching lies is little better than chance.
All lie detectors do is measure autonomic nervous arousal. But truth tellers can be nervous, and liars calm (accounting for lots of false positives and negatives in polygraph results). A former American policeman says he can teach subjects how to fool a polygraph. This takes about ten minutes. So why are lie detectors still used so extensively (a million times a year in the United States alone, by one estimate)? After reviewing hundreds of studies for the New England Journal of Medicine, Los Angeles physician Robert Steinbrook concluded that “the polygraph appeals to an often simplistic desire for certainty in the face of complexity, and a misplaced faith in the power of a machine.”
There’s more to the story, however. Even though polygraphs are unreliable tools for assessing lies told by job applicants or government leakers, in specific criminal cases they have sometimes proved effective. This has less to do with the machine itself than the way it’s used. Clever polygraph operators can get anxious suspects to confess wrongdoing by making them afraid that their lies are about to be exposed. This process is sometimes enhanced by lying to subjects about how foolproof lie detectors are. One hapless suspect confessed to arson after police in Doylestown, Pennsylvania, put a kitchen colander on his head with battery jumper cables leading from the colander to a photocopier that churned out paper reading “HE’S LYING.”
Despite being discredited by scientific research, voice-stress analyzers are still in common use as lie-detection tools. More recent methods employ infrared cameras that measure facial heat thought to be associated with lying, and software programs that accelerate analysis of micro-expressions. Those working with different kinds of brain scanners say they can identify neural activity associated with lying. Perhaps this points the direction toward lie detectors of the future. For now, however, lie detection by brain scan is unwieldy, unproven, and prohibitively expensive.
With luck we will never develop a lie detector that is 100 percent dependable. Not that this will stop us from trying. In the suspicious society there will always be a market for tools that promise to reveal deception, and the demand for them is more telling than the products themselves: a gauge of how deep our atmosphere of wariness runs.
When one or a handful of those in any profession are exposed as dishonest, all others in that profession get a black eye. In the wake of the Enron-Arthur Andersen debacle, perceived ethics of all accountants plummeted in public opinion polls. (Business executives didn’t fare too well to start with.) The integrity of clergymen in general suffered as a result of the many Catholic priests who were exposed as pedophiles and prevaricators. This syndrome could have something to do with how badly psychologists as a group fare in surveys of perceived honesty. One such survey, using sixty undergraduates as subjects, found that psychologists ranked eighth out of twelve professions on the perceived-honesty scale, below teachers and doctors (but ahead of politicians, at least). B. L. Kintz, the psychologist who conducted this study, seemed nonplussed by its results. When it came to attitudes toward lying, Kintz observed, “for some reason they [the subjects] believe that psychologists are not too unfavorably inclined.” Kintz wondered why this might be. Here’s one possibility: academic psychologists routinely enlist students in studies under false pretenses. Kintz’s report of his own study, “Eye Contact While Lying During an Interview,” has this to say about its methodology (in which an unknowing student subject was paired with a knowing confederate): “After introducing the two students, the experimenter explained the nature of the experiment. This explanation, however, was the beginning of a cover story designed so that the subject would not be aware that the emphasis of the experiment was on lying.” At best such subjects are later debriefed about having taken part in a psychological experiment based on false premises. Tens (if not hundreds) of thousands of deceived student-subjects like them leave college with a circumspect view of psychologists’ ethics. And they tell their friends.
Deception of any kind can have unexpected consequences. Dissembling in one area ricochets into others. Scientists who fake findings don’t just debase the currency of research in their own field but weaken confidence in physicians who base treatment decisions on that research. Similarly, physicians who deceive patients about their condition and are known to do so create problems for more candid colleagues, whose patients may wonder if they’re being deceived too. After all the revelations about reporters making things up, the credibility of journalists in general was damaged. Two-thirds of those polled in a survey by the Media Studies Center thought journalists often or sometimes made things up, and three-quarters thought they often or sometimes plagiarized other people’s work. When duped too often, readers wonder how to distinguish between fiction and nonfiction. Finally they stop trying, but do remain suspicious. This is the price every nonfiction writer pays for the few who make things up: a climate of wariness in which all must work. At least wariness is better than cynicism, though. What most horrified journalists about the Jayson Blair scandal was how few of the many subjects he misrepresented felt compelled to complain. Gerald Boyd, the Times’s managing editor during this episode, noted that he used to hear from readers if his newspaper got so much as a middle initial wrong. The fact that hardly any subject or reader bothered to question Blair’s far more egregious errors told him something about the degraded credibility of journalists as a group.
In the backwash of writer-embellishers, those who try to maintain standards of veracity now feel the need to say: This is true; I didn’t make it up. Trust me. I’m not a fantasist. I do not invent facts and call them the truth. In an age of digital enhancement, photographer Elliott Erwitt thought he had to assure those who bought a collection of his photos that all were printed as taken. In an age of creative nonfiction, I felt the need to assure readers of one of my own nonfiction books that “everyone written about in this book is 100 percent real. There are no fictional characters, or composites.”
Deceivers and their apologists rarely consider the broad implications of dishonest behavior. It may be morally ambiguous to tell small, benign lies. This is the parking ticket of ethical crimes. Each such case is of little consequence in itself. When enough of us peddle fantasies as reality, however, society as a whole begins to lose its grounding in reality. Casual duplicity picks at the threads of our social fabric. The sum of lying, big and small, is a culture in which credibility is on the run.
Those who lie and are known to have lied—even on minor matters, even without malicious intent—have trouble getting others to believe their true statements. A single lie unmasked undermines every honest statement. It’s hard to have confidence in those who we know tell lies, even if we condone this practice in principle. Friedrich Nietzsche himself, the eloquent defender of artful liars, once commented to a friend, “Not that you lied to me, but that I no longer believe you, has shaken me.”
Once revealed, lies cast a shadow far back into the past and forward into the future. We understand why someone might lie to us, but trust them less once they do. The key question becomes: So what else are you lying about? Bill Clinton’s credibility never recovered fully from all his fibbing, about Monica Lewinsky in particular. This was not because we didn’t understand, and even forgive, Clinton’s lies, but because they raised questions in our minds about how many others he might have gotten away with.
Reporters dated the unraveling of Al Gore’s 2000 campaign for president to the false statement he made during a debate with George W. Bush about accompanying the head of the Federal Emergency Management Agency to a disaster site in Texas. Republicans pointed out to reporters that this was simply untrue (it was an underling he’d been with). The press then resurrected Gore’s many other fibs and stretchers, leaving him open to the charge of being a “serial fabricator.” Gore protested that most of these gaffes were petty, which was accurate, but also beside the point.
In a sense there is no such thing as a petty lie. Any lie of any size once revealed puts all other statements from the same source in play as possible lies. The core consequence of casual lying is shattered credibility. Credibility is like pottery. Once broken it can be glued back together, but is never quite as strong. A car dealer who said he had disciplined and even fired employees for lying about themselves added that he might retain one who’d had time to establish a solid track record. On the other hand, said the business owner, “I think they’d be very hard to promote because you could never totally trust them again.”
Given the opportunity, those willing to make small things up about themselves are likely to make up bigger things about other topics. Oliver North didn’t just lie about trading arms for hostages during the Iran-Contra scandal but misrepresented his service record. Mike Barnicle’s columns in the Boston Globe that included fabrications and plagiarisms followed lies he told about writing speeches for Robert Kennedy and being a screenwriter on the movie The Candidate. Before making up a Washington Post article that won a Pulitzer Prize, Janet Cooke claimed falsely on her job application that she had a degree from Vassar. Analyst Jack Grubman, who manipulated accounts at Salomon Smith Barney, also said he had graduated from MIT when he hadn’t. Henry Reid, who claimed unearned graduate and undergraduate degrees, later sold parts of donated cadavers when he became the director of UCLA’s body donor program. Lyndon Johnson didn’t just puff up his own military experience and that of his grandfather but lied when he said North Vietnam had attacked an American ship in the Gulf of Tonkin. Once it became clear how routinely he deceived others about matters large and small, LBJ’s reputation was irreparably damaged.
When a person we thought was honest turns out to be dishonest, it doesn’t shake just our confidence in that person but our confidence in general. If we discover that someone we trust has told us a lie, we’re forced to reexamine everyone we trust. When institutions we thought were trustworthy—churches, accounting firms, research laboratories—turn out to be unworthy of trust, we wonder if any institutions can be trusted. After a country’s leader, or a clergyman, or a college professor gets caught manipulating the truth, we question whether anyone’s honesty can be counted on. This is the way in which post-truthful behavior by specific individuals picks away at our social contract as a whole.
Even the most ardent postmodern relativist wouldn’t argue that truth telling has no place in human discourse. No society could function on that basis. Civilization would crumble if we assumed others were as likely to lie as to tell the truth. Our social contract cannot survive lying so routine that citizens consider it normal. One sign of a healthy democracy is its citizens’ capacity for outrage when they are deceived. Since Watergate, and Enron, and the cover-up of child-abusing priests, too many of us have resigned ourselves to lying being the norm among public figures. In the process we have become enablers of the post-truth society.
Political columnist David Broder once observed that the post-Watergate climate of duplicity was a product of reporters and voters who accepted deception as a way of life. This cynicism is worse than outrage. Strachan Donnelly, president of the Hastings Center in Briarcliff Manor, New York, believes it has led to a vicious cycle. Politicians deceive, their constituents become cynical, expectations for political conduct decline, which in turn makes it easier for politicians to keep deceiving. “We’re getting what we’re asking for, in a way,” concluded Donnelly.
That is the social cost of constant deception: suspicion followed by resignation. It is not just society that pays for deceptive behavior, however. Individuals do as well. Lying seldom has a positive impact on either the lied to or the liar.