3

THE PRECISE PROCESS WE ARE talking about when we say, “believe,” and where we think it happens—the brain? the heart? the stomach?—are poorly understood. DiFalco believed Franklin when he said he had done the killings, while Walt Weiford and Sergeant Alkire did not. We tend to treat believability as if it were synonymous with truthfulness or akin to solving a mathematical equation, but the relationship between what is believable and what is true and, further, what makes a story believable to one listener but not to another turn out to be some of the murkiest parts of human cognition.

The twelve jurors who voted to convict Jacob Beard were given the following instructions: “After making your assessment concerning the credibility of a witness, you may decide to believe all of that witness’s testimony, only a portion of it, or none of it.…In making your assessment, you should carefully scrutinize all of the testimony given, the circumstances under which each witness has testified, and every matter in evidence which tends to show whether a witness, in your opinion, is or is not worthy of belief.”

The instructions also asked them to consider, when determining believability, “each witness’s intelligence, motive to falsify, state of mind, and appearance and manner while on the witness stand.” Yet much of this—“every matter in evidence,” a witness’s “appearance and manner”—leaves great room for subjectivity. Judge Lobban several times directed the jury to use “common sense” and to view the evidence “in the light of your own observations and experience in the ordinary affairs of life,” as if this would manifest twelve identical mental compasses.

For centuries we believed that humans are generally rational beings, applying rational thought and usually achieving sound judgment, except under circumstances where our feelings get in the way. But in the 1980s, just after Vicki and Nancy were killed, a sea change began to sweep through the scholarly community. What if it wasn’t that our feelings were the source of our errors in logic, experts began to ask, but rather that our “machinery of cognition” contained errors in its very design?

Scholars who studied the processes of mind relevant to civil and criminal trial proceedings took up this idea with gusto. Researchers Nancy Pennington and Reid Hastie at the University of Colorado applied it to studying the workings of the minds of judges and juries and published a series of studies in the late 1980s and early 1990s that advanced a theory called the Story Model. Their theory holds that cognitively, instead of taking in each piece of information one at a time and judging it on its logical merits, humans judge legal evidence en masse, forming a story out of it. We then match the story we have built to the relevant legal term—guilty or not, murder in the first or second degree, and so forth.

Researchers found that the stories that judges and jury members construct influence their assessments of how credible a given witness is or the importance of a given piece of evidence. They also found that we tend to fill in any gaps in evidence with inferred causes and associations, consistent with the story we’ve built, and omit pieces of information that are unrelated or contradictory to our story. That is, if you’ve already started to tell yourself that the defendant is a good man wrongly accused of murdering his daughter, you will be more likely to disbelieve the ex-wife who takes the stand to testify to his violent temper and find reasons to discount the bloody footprint that matches his shoe.

The justice process is a war between competing stories and a quest to win the imaginations of the people who matter at every stage—investigators, prosecutors, judges, and juries. But what factors determine why these people choose one story over another?

Most experts in the field have reached a consensus that it is the “ease of story construction”—that is, the easier it is for the parties that matter to form a story out of the events in the first place, the more likely they are to believe that story.

According to the Story Model, two characteristics determine how “believable” an average person will find a particular story: coverage, or the extent to which the story accounts for evidence presented at trial, and coherence, or the story’s wholeness, lack of contradictions, and “plausibility”—“the extent to which the story is consistent with knowledge of real or imagined events in the real world.”

The more familiar the story is, then, the easier the narrative connection forged between teller and listener. At its most basic, many experts say, our brains work by slotting our experiences into molds of classic stories. Each time we hear a new story, we figure out what it means by trying to match it to one of our stored narratives. “We are always looking for the closest possible matches,” write Roger Schank and Robert Abelson, professors at Northwestern and Yale, in their essay “Knowledge and Memory: The Real Story.” “We are looking to say, in effect, ‘Well, something like that happened to me, too,’ or, ‘I had an idea about something like that myself.’”

In terms of plausibility, it’s not hard to see things from Alkire or Weiford’s perspective. A mentally ill white supremacist serial killer roaming the nation is unfamiliar enough, let alone one who happens to be passing through one of the most rural counties in the United States, miles from any interstate, at the exact same time that thousands of other outsiders also happen to be passing through it. In America, where misogyny and violence against women are rampant, in a county where alcohol use is high, a story of local men fueled by alcohol killing women for no reason may feel strangely and deeply familiar.

It is also widely known that judges’ and jury members’ perceptions of witness credibility are seriously influenced by seemingly irrelevant factors and that prepackaged stories from the world affect these judgments too. Jurors tend to see experts offering scientific testimony as more credible if they are attractive and confident. Rape victims who speak loudly or with anger or who do not break down in tears or who once went on a date with their rapist are less likely to be believed, studies show, because these actions deviate from the preconceived stories of rape we know. “When creating their stories jurors rely on their mental scripts that include stereotypes and regular arrangements of events,” writes scholar Katharina Kluwe of Loyola University Chicago. “Accordingly, they use their existing knowledge and beliefs to fill in missing information, to sort out contradictory evidence, and to determine the believability of a story.” Our courtrooms, then, are where some of our most toxic stereotypes and flattest truths are made and reinforced.

This view seems deathly dark unless you think of our brains less as maliciously negligent and more as simply inclined toward rest and relaxation. This is essentially what Daniel Kahneman argues in his acclaimed work Thinking, Fast and Slow, in which he writes that we all have two “systems” working in tandem to conserve our mental resources and function efficiently: System 1 is fast, instinctive, automatic, subconscious, and constantly busy generating impressions, intuitions, intentions, and feelings; System 2 is logical, effortful, comprehensive, and slow. System 1 requires little of us and is sufficient for most of our everyday functioning, but it’s not equipped to handle complex processes like long division, deciding which house to buy, or holding two conflicting ideas in the mind at once. Most of our mental questions are ones we do not even know we are asking, and they are answered by System 1, but when there is a question for which System 1 has no reply, System 2 is pushed into action.

We don’t like this idea that System 1—a force outside our conscious control—exerts such enormous influence over our mental life, but according to Kahneman, it is so. “You believe you know what goes on in your mind, which often consists of one conscious thought leading in an orderly way to another,” he writes, “but that is not the only way the mind works, nor indeed is that the typical way. Most impressions and thoughts arise in your conscious experience without your knowing how they got there.…You know far less about yourself than you feel you do.”

System 1 is continuously taking in information from the world and spinning a story from it. Your most intimate friend says a single word into the telephone, and you know she is angry with you; a tall man on the subway platform is shouting curses, and you sense a threat and move away. System 1 also identifies incongruity and reacts with surprise; Kahneman offers the example of an upper-class British man’s voice saying, “I have a large tattoo on my back.” Studies found that participants’ brains registered a response in just two hundred milliseconds: something is off; something does not make sense; people with moneyed British accents cannot also have large tattoos down their backs.

“If endorsed by System 2, impressions and intuitions [generated by System 1] turn into beliefs, and impulses turn into voluntary actions,” writes Kahneman. “When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine—usually.”

There is just one problem: System 1 is a blunt tool—it has so much work to do that it cannot afford to wade into the fine print every time—so it operates from the perspective of categories—the most typical case, the most plausible meaning—and thus produces the best story possible out of the information available. It tends to minimize ambiguity, suppress doubt, and exaggerate coherence—the degree to which the elements of the story go together, cause each other, and add up to a meaning or message—in order to tell stories that reinforce our existing judgments and beliefs. We are evolutionarily programmed to make links between our perceptions and their most likely meanings. “Coherence means that you’re going to adopt one interpretation in general. Ambiguity tends to be suppressed. Other things that don’t fit fall by the wayside,” Kahneman instructs. “We see a world that is vastly more coherent than the world actually is.”

Can such errors in thinking and judgment be overcome? Kahneman says, essentially, no.