“As I was leaving the tier that night, I was told that I didn’t see shit.
And me being the person that I am—I try to be friends with
everybody—I said, ‘See what? I didn’t see nothing.’ ”
—U.S. Army Specialist Jeremy Sivits on Abu Ghraib1
IN JANUARY 2001, Walt Pavlo pleaded guilty to wire fraud and money laundering and was sentenced to two years in prison. It might as well have been twenty, for he had lost everything: his job, his wife, his family, and, most of all, his sense of himself. The little boy who’d been brought up to know the difference between right and wrong now had to wrestle with the incontrovertible fact that good Walt had done bad things. Who was he now? A bad-guy fraudster who’d burned through shareholder cash partying like a rock star in the Cayman Islands? Or a fundamentally good guy who had, somehow, lost himself?
Pavlo doesn’t think many people with a conscience get away with white-collar crime because, he says, you can’t bear the reality of what you’ve done. To this day, you can read the anguish on his face as he struggles to reclaim the good man he always thought he was. He’s working again, for a small recycling firm that’s glad to have someone more qualified than it could otherwise afford. But the battle still goes on inside Pavlo’s head. He’s neat, well dressed, attentive, punctual—and these things matter to him, disproportionately: every detail signals whether he’s still on track. He has to convince a tough jury: himself. But he still isn’t sure of the verdict.
When he got out of prison, Pavlo contacted Frank Abagnale, the famous fraudster whose life story was the subject of Spielberg’s film Catch Me If You Can. Abagnale’s advice was to start talking to people about his experience. If you stay at it, he said, you can be successful, but you have to have the right message. You have to bear the consequences of what you’ve done, and you have to go it alone. Pavlo took this advice and started teaching business-school students how easy it is to go wrong. Punctilious about not profiting from his crime, he would give up to eight talks a day for a single fee.
One day, he was giving a talk in South Dakota.
“It was very cold outside. I was talking to a bunch of sophomores and I smelled smoke. But I kept on with my talk. Ten minutes later, smoke started rolling in through the air vents. We evacuated the building. About ten minutes later, it turned out it was no big deal—just someone burning pizza boxes. But when we all came back in, I asked, ‘How many of you smelled smoke and said nothing?’ Everyone giggled and raised their hand. ‘How come no one said anything?’ Some said they didn’t want to look stupid; others said they weren’t sure it was smoke. But that is how it is in business. You smell the smoke. You know there’s something wrong and you think: maybe it’s just me, maybe I’m wrong—and you don’t do anything. By the time there’s a fire in the building, it’s too late.”
Although he didn’t know it, Pavlo had witnessed an accidental reenactment of a famous experiment conducted in 1968 by two young psychologists, Bibb Latané and John Darley, in which they placed one, two, or three volunteers in a room and asked them to fill in a questionnaire. As they did so, the room slowly began to fill with smoke. The two psychologists wanted to know under which conditions the volunteers would be most likely to do something about the smoke: Was a person more likely to respond to an emergency alone or in company?
Their results were shocking. One person alone would, within two minutes, do something about the smoke: look for its source, check its temperature, go and get help. But when there were two people in the room, only one out of ten reported the smoke. The rest stayed there, doggedly filling out their questionnaires while coughing and rubbing their eyes. And when there were three people in the room—which, with three potential reporters instead of one, should in theory have made a report all the more likely—only one in twenty-four people reported the smoke within the first four minutes, even though, by then, they could scarcely see.2
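The arithmetic behind that expectation is simple: if each witness, acting alone, would report the smoke with some probability, then a group of three independent witnesses should almost always contain at least one person who acts. Here is a minimal sketch of that baseline, using an illustrative solo response rate rather than the study’s published figures:

```python
# Naive baseline for group response, assuming witnesses act independently.
# p_solo is an illustrative assumption, not a figure quoted from the 1968 study.

def prob_at_least_one_reports(p_solo: float, n_witnesses: int) -> float:
    """Chance that at least one of n independent witnesses reports the smoke."""
    return 1 - (1 - p_solo) ** n_witnesses

if __name__ == "__main__":
    p_solo = 0.75  # illustrative: suppose three out of four lone subjects would report
    for n in (1, 2, 3):
        print(f"{n} witness(es): {prob_at_least_one_reports(p_solo, n):.0%} expected chance of a report")
    # Under this assumption, three independent witnesses should produce a report
    # about 98 percent of the time -- which is what makes the observed passivity so striking.
```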
Darley and Latané coined the phrase “bystander effect” to describe what they’d discovered. Their interest in the phenomenon had been sparked by the murder of a young New Yorker, Kitty Genovese, who was stabbed to death in the middle of a street in New York City in 1964. At the time, it was thought that as many as thirty-eight people had witnessed the attack, which had lasted more than thirty minutes, yet not one had so much as phoned the police. Preachers, news commentators, and politicians instantly pontificated about callous New Yorkers and inner-city anomie, but Darley and Latané, who were living in New York themselves at the time, were skeptical. Was it just New Yorkers who were so callous, or would anyone respond passively to an emergency? Was any of us really so superior?
The very first experiment isolated volunteers in booths and led them to think they were overhearing someone having an epileptic seizure. When the volunteers thought they alone knew what was happening, 85 percent reported the incident. But when they believed that others elsewhere also knew about the seizure, only a third did anything about it. The experiment indicated that the larger the number of people who witness an emergency, the less likely it is that any of them will intervene.3 Collectively, we become blind to events that, alone, we see readily.
In that first experiment, the witnesses knew that others shared their knowledge, but they were isolated from one another because Latané and Darley wanted to rule out conformity as an explanation for the behavior and to focus on individual decisions. By the time they ran the smoke-filled-room experiment, they knew that bystander behavior might be exacerbated by conformity but was not determined by it.
Their results prompted a flurry of variations, in which men and women, black and white, young and old witnessed all kinds of emergencies: robberies, faints, asthma attacks, screams, falls, crashes, and electric shocks.4 They all confirmed the original thesis: the more people who witness an event, the less likely it is that any of them will respond to it. Even just thinking about other people reduces levels of altruism.
News headlines keep confirming their findings. Cell phones should have made reporting incidents faster and easier, but they haven’t changed behavior. In October 2009, a fifteen-year-old girl in Richmond, California, was raped by ten high school students, an event witnessed by at least twenty people.
There is no reason to conclude that the witnesses were exceptionally bad people. That is what Latané and Darley’s work had proved: we are all likely to behave this way. When we are in groups, we see bad things happening but act as though we are blind to them. Just as we all think we would defy Milgram’s experimenter, so we do not believe that we would be passive bystanders. But the evidence is against us.
Internet chat rooms have shown just how far bystander behavior can scale. In 1998, Larry Froistad, a twenty-nine-year-old computer programmer, confessed to two hundred people in a chat room for recovering alcoholics that he had set his house on fire to murder his five-year-old daughter.
“When she was asleep, I got wickedly drunk, set the house on fire, went to bed, listened to her scream twice, climbed out the window and set about putting on a show of shock and surprise.”5
Only three of the two hundred members of the chat room reported him. Froistad was tried, convicted, and sentenced to forty years in prison. Later studies of the behavior of more than four hundred chat groups only reinforced the bystander thesis: more people, less response. But one salient detail did emerge: you were more likely to get help if you asked a particular person for it by name.
The bystander effect demonstrates the tremendous tension between our social selves and our individual selves. Left on our own, we mostly do the right thing. But in a group, our moral selves and our social selves come into conflict, which is painful. The nonintervening subjects of Darley and Latané’s experiments had not, they said, decided not to intervene. Rather, they were frozen in a state of indecision and conflict about whether and how to intervene. Looking for a way out of that discomfort, they chose the easier path, a kind of moral shortcut.
One of the initial explanations for bystander behavior was ambiguity: it can be hard to know quite what’s happening or what response is most appropriate. But perhaps we are also ambivalent about our own role within the event. We hope that the situation isn’t so dire that it needs our intervention. Perhaps it will pass, or we’ve simply misinterpreted it. Responding could provoke conflict, and we don’t like conflict. Conformity comes into this, too: I sure would look like an idiot if I rushed to help and it turned out to be nothing. Our fear of embarrassment is the tip of the iceberg that is the ancient fear of exclusion, and it turns out to be astonishingly potent. We are more likely to intervene when we are the sole witness; once there are other witnesses, we become anxious about doing the right thing (whatever that is), about being seen and being judged by the group.
A great deal of bystander behavior is trivial, or at least feels that way. You see a young employee being mocked by her peers for her dress sense. It doesn’t matter to you, but being trivialized is professionally damaging for her—yet it’s the rare boss or colleague who doles out clothing advice. Our most frequent exposure to bystander apathy occurs at work, where we see (or think we see) colleagues indulging in abusive, unsafe, or illegal activity. We don’t want to be the ones to complain and, anyway, we might be wrong. Where this concerns clothing, it may not matter much. But where it involves vulnerable, less powerful people, it can be highly dangerous. This emerges powerfully in a study of nurses who regularly observed a colleague, Annie, cruelly abusing patients.
“She had a really negative effect on patients, especially the elderly ones or those who were long-stay,” one of the nurses, Sean, recalled. “Making a big deal about getting them a bed pan or helping them to the bathroom during the night meant a lot of them would stop taking anything to drink from late afternoon to ensure they didn’t need the bathroom during the night … and yet dehydration is a big problem with elderly patients. And we had a number of patients who suffered a long, painful night because she had been really abrupt with them when they’d first asked for something for the pain, and they were too scared to ask again.”6
Annie was the talk of the hospital; everyone—even those on other wards—knew she was a terrible nurse. But no one did anything. Some said she was intimidating: “When drunks were admitted to the ward, she never had any trouble from them. If even a big, drunk aggressive guy gets the vibe that he should behave himself or else, you can imagine how easily the rest of us were intimidated by her.” They were afraid that if they did say anything, and nothing happened, working with her would only get worse. Others blamed themselves, hoping they could learn how better to work with Annie.
“At first, I thought of this as my problem,” Sean admitted. “I was pretty new and I’m not the most assertive of people.”
Some nurses made excuses for her, explaining what a hard life she had had, while others tried to focus on her redeeming qualities: she was good in a crisis (when caring takes a back seat). All the nurses tried to avoid working with her, and they all talked about her. As with all workplace gossip, the fact that everyone knew what was going on was part of the problem.
“Knowing that colleagues were concerned and keeping an eye on her made me feel a bit better about it for quite a while,” recalled another nurse, Mandy. “It was only gradually that it dawned on me that nothing had actually changed: She was no better, in fact she was probably getting worse.” Responsibility for Annie was being diffused: since everyone knew about her, in theory everyone was responsible for her—which meant that they all wanted someone else to act. But by remaining passive, the nurses actually reduced the likelihood that anybody would bring about a positive solution.
Diffusion of responsibility—the rule of nobody—is a common feature of many large organizations, where almost nothing is done alone. Even one of the most famous whistleblowers of all time—Daniel Ellsberg—succumbed to it for a while.
Ellsberg started working in 1964 for U.S. Assistant Secretary of Defense for International Security Affairs John T. McNaughton “on secret plans to escalate the war in Vietnam, although both of us personally regarded these as wrongheaded and dangerous. Unfortunately, by decisions of President Johnson and Secretary [of Defense] McNamara, these plans were carried out in the spring of 1965.”7
Despite his belief that the war was wrong, Ellsberg continued to work for the government. In 1967, he was assigned to work on the top-secret McNamara study of U.S. decision making in Vietnam—what came to be known as the Pentagon Papers, a seven-thousand-page analysis of twenty-three years of American policy in the region. Ellsberg kept hoping—quite reasonably—that those with access to the papers would read them; he personally urged Henry Kissinger to do so. He also kept hoping that someone else—any of the “scores of officials, perhaps a hundred” who had access to the papers—would give them to the Senate. Still, no one did. It was not until 1971 that Ellsberg finally took action himself, risking life imprisonment by leaking them to the press.
That Ellsberg waited in no way undermines his courage; rather it illustrates how even a tremendously brave man could think, and hope, that someone else—with more authority, more power, more protection—would act. In Ellsberg’s case, as in the more mundane case of the nurses, there were so many people who could act—or could be expected to act—that no one felt uniquely responsible for doing so.
Sometimes diffusion of responsibility can be no more than a fancy phrase for passing the buck. We take our cues from those above us—especially in hierarchical organizations—and if they do nothing, what are we supposed to do? We can always use our salary, our career, to justify turning a blind eye to what we know is wrong.
When she worked at Enron as a contract specialist, Lynn Brewer completed a legal brief concerning a large gas contract that she knew to be “a quarter-billion-dollar scam.” Enron simply did not own the gas it had used as collateral; the company was in default of the agreement the day it entered into the deal. But no one else seemed worried by it. Brewer had the nerve to track down the detail—something no one expected her to do—but she had absorbed enough of the company’s competitive ethos to know not to include details of the fraud in her brief. She agonized at length, knowing that ignoring the scam and helping to cover it up was wrong, but she was loath to jeopardize her salary, her career, and the excitement of a new job. So she drafted the brief as though there were nothing wrong with it, and then, in a wonderful piece of displacement activity, deemed others to be the bystanders.
“Bound by attorney-client privilege and a legal code of ethics, going outside the company and blowing the whistle was not an option. It would cost us not only our jobs but our careers … Why hadn’t the banks caught the minor missing detail? Or had they too been wooed by the promises of Enron’s Yellow Brick Road the same way I had?”8
The “Yellow Brick Road” reference is germane. In 1997, Enron’s mining and metals team had staged a holiday skit of The Wizard of Oz. Sherron Watkins played the Wicked Witch, and Jere Overdyke, in a scarlet satin pimp suit, played the Wizard as a send-up of the CFO, Andy Fastow.9 Everyone could read the subtext: just as the wizard is a fraud, so the company’s financial wizardry was nothing more than smoke and mirrors. In the audience, everyone laughed knowingly—every one of them a bystander. They may have felt themselves merely passive witnesses to a crime too big for them to stop—but the reality was that, with every crooked contract, the Enron employees were validating their crooked corporation. The crucial point about bystanders is that they have potential influence, and their choice not to use it means that they aren’t neutral but run the risk of morphing gently into perpetrators. A neutral stance isn’t possible. After all, you can’t bring down the sixth-largest corporation in America without a great deal of help.
That everyone was well paid, that they all liked each other and had so much in common, that many were exhausted by long hours and global travel, all helped to secure their collusion. But what moral clarity they might have had was also severely clouded by the intense ambiguity of their environment. Orientation videos and TV ads for Enron specifically celebrated the “defiant and the visionary.” CEO Jeff Skilling and chairman Ken Lay deliberately set out to build a vast company that operated with the freedom and latitude of entrepreneurial start-ups. They actively scorned rules and regulations and encouraged competitive risk-taking. At the same time, they were spending a fortune lobbying Washington to get any inconvenient regulation abolished. It could be very hard, at any given time, to know whether a law was to be taken seriously or treated merely as an obstacle to be circumvented. The cliché “don’t ask for permission, ask for forgiveness” was understood as license to do anything that generated revenue.
Such shifting sands make it intensely difficult to keep hold of any clear sense of moral norms. If everyone is doing crooked deals, and being rewarded for it, then what is normal? If everyone is taking moral shortcuts, where can you see signs for the main road? How can you maintain a sense of yourself as a good person if the definition of good keeps changing?
“As a child,” says Walt Pavlo, “when you did things wrong, your parents said so. I know right from wrong. But as an adult when I was hiding money, I knew I was doing wrong but I was being rewarded and promoted! It’s extremely confusing. You rationalize: it can’t be that wrong if nobody’s stopping me. So I’m going to call it okay because it fits my life model that good things happen to good people. Good things are happening and therefore I must be good.”
Of course, Pavlo had a boss. Didn’t the boss start to wonder how such huge debts managed to disappear? No. In a classic example of “don’t ask, don’t tell,” his boss—who was, anyway, under a lot of pressure and exhausted—never asked: What are you doing and how are you doing it?
“If he had asked, that would have relieved me of so much pressure. It would have been a relief if he’d said anything!”
But instead the boss chose to be a bystander, too.
Bystander behavior need not always involve a crime. In companies, it is more often implicated in the failure to respond to strategic business threats. Classically, this is taught as the blindness of buggy-whip manufacturers to the all-too-obvious rise of the automobile, or as the myopia of U.S. television manufacturers to the emerging excellence of Japanese electronics. The cases are legion, but teaching the stories of these industrial train wrecks never seems to prevent them.
In 1999, the advent of the Internet and the rise of digital music had all of the record labels in conniptions. Why were kids stealing their music? How could they be stopped? The music business, with its own history of corruption, was nonetheless determined to stop kids from ripping off its profits. Years before the iPod had even launched, kids were ripping CDs and sharing them—via e-mail, Napster, Kazaa, or any number of file-sharing sites. Nobody wanted to pay for music anymore; anything you wanted could be found for free somewhere online. The record companies dabbled in a few Internet initiatives but were too corrupt to make good partners for anyone. And they invested heavily in lobbying, hoping that their trade association, the Recording Industry Association of America, would buy enough friends in Washington to stop the future from happening. As profits melted before their eyes, torn between denial and aggression, they presented a disunited, incoherent front to a consumer base years ahead of them. Whatever they tried to do, it was always too late.
If you weren’t in the music business, this was easy to see. And it didn’t take a gigantic leap of the imagination to see that what was happening in music was just a glimpse of the future for the movie industry. But it hadn’t happened to them yet. Broadband penetration in the late 1990s was still under 50 percent in the United States. Even though, theoretically, you could download a movie, in practice nobody had the patience to do it over phone lines. But the music business presaged the disaster lying in wait for the film business.
At any rate, that’s what it looked like to me when I approached the studios with a proposal that was, in essence, a precursor of iTunes. Wouldn’t it make sense to seize the initiative and learn about online distribution and consumption now, rather than wait to have it owned by someone else? At the time, I ran an online media business called iCAST. Well connected and extravagantly well funded, we had a lot of meetings in Hollywood; even if people didn’t like or understand all of our ideas, they wanted our money.
I remember in particular one meeting with an army of executives from Disney, many of whom are still in place today. We laid out the threat, as we saw it, and proposed a number of ways we could work together to seize the initiative. I didn’t recognize it at the time, but in this meeting, we witnessed the full panoply of bystander behavior. They all saw what was happening in the music business. They understood that the same crisis awaited their business. But no one was willing to take the risk of intervening. The market, they said, was ambiguous; no one knew for certain how digital entertainment would develop. And whose responsibility was it to take the first step?
Corporate politics meant that doing nothing would always be safer than recommending a bold course of action. Besides, the executives asked, if the strategic risk were real, wouldn’t some of the other studios be doing something about it? If they weren’t, why should Disney? Nobody in the room—though all were proudly senior—felt quite senior enough to make a decision. I remember sitting there thinking: in the 1980s, the TV networks had been dubbed the “three blind mice” for their failure to take the advent of cable seriously. Now the film studios were producing the remake.
You only have to mention the words willful blindness to hear the same story about a different industry: how beverage companies ignored the advent of vitamin drinks, how packaged-goods businesses don’t think it matters whether products are environmentally sound, how pharmaceutical companies pay no attention to off-label prescribing, and how gun manufacturers still pretend that a secondary market (selling to kids and criminals) has nothing to do with them. The knowledge is there, spoken or unspoken, but the executives do nothing. In these cases, it is their livelihoods, not their lives, that are threatened—but their pattern of behavior is just the same as that of Kitty Genovese’s neighbors.
These are classic business stories, because it is so human and so common for innovation to fail not through lack of ideas but through lack of courage. Business leaders always claim that innovation is what they want, but they’re often paralyzed by the hope and assumption that someone else, somewhere, will take the risk. They may see a looming crisis but, like the participants in Latané and Darley’s experiments, they would rather carry on earnestly completing their questionnaires than get up and acknowledge that the room is full of smoke and they can no longer see. Just like the witnesses to a crime, once someone does seize the initiative, bystanders are left feeling anxious and uncomfortable, dimly aware that they missed their moment.
Nobody really knows whether bystander behaviors are innate or learned, but they certainly start very young. Most children witness bullying at school, by teachers or peers or both, and feel uncomfortably grateful not to be the victim. They may even learn, as my son did, the invaluable life lesson of how to stay off a bully’s radar screen. But a salient characteristic of all bullying is that it craves an audience, which most kids, acting as bystanders, provide.
In recent years, the U.S. Department of Justice and the police have become very interested in bullying, calling it the most underreported safety problem on American school campuses.10 Two-thirds of school shooters who survived the incident and could be questioned had previously been bullied; that on-campus shootings occur at all may, in some part, be attributed to the rage and frustration felt by the victim, for whom all bystanders are colluders.
Bystanders play a big part in bullying. Despite many campaigns, and the requirement that all British schools have an anti-bullying policy, a national survey of bullying in 2006 found that 69 percent of schoolchildren said they had been bullied and 87 percent of parents said their children had been bullied, yet 83 percent of teachers said they had not seen bullying at their school.11
Bullying is, by its nature, hard to measure, but it is ubiquitous, with Canadian rates at least as high as the UK’s. Although Canada has laws against bullying, they don’t seem to prevent stunts like “Kick a Ginger Day,” which inspired over 4,700 Canadians to support the idea of tormenting red-headed people.
In many instances, bystanders act as “reinforcers,” either by providing an encouraging audience or by protecting the bully through their failure to intervene. Both types of bystander validate the bully by their very presence. Only 10 to 20 percent of witnesses ever provide any real help. Perhaps most disappointing of all is the police observation that bullying is seldom reported because children follow adult examples.
“It is at school that children learn to be bystanders,” says Ervin Staub, professor of psychology at the University of Massachusetts, Amherst. Staub has devoted his life to the study of good and evil. He is himself a Holocaust survivor and his work on genocide and mass violence has led him to work in Rwanda, Burundi, and the Congo, as well as in New Orleans and Los Angeles. His interest in bullying derives from his observation that all mass violence requires, and is also inflamed by, bystanders. It’s for that reason that he conducted a lengthy study of bullying in schools, which is where, he believes, the behavior starts.
“Sadly, teachers don’t intervene often,” says Staub. “It turns out that some teachers think kids should take care of their own business. So they don’t do anything. But this is problematic because as adults, we should provide kids with guidance. When we are passive, that sends a message that there’s no need to act. I very much believe that bystanders give perpetrators and bullies a message that what they are doing is accepted. It tends to make them believe that they are supported.”
In western Massachusetts, Staub has begun to develop a curriculum to teach kids how better to respond when they see bullying. Its goal, he says, is to create a sympathetic but unaccepting attitude to bullies.
“We try to help the students understand the impact on kids who are bullied; kids don’t process that necessarily. And we train them to engage. They don’t have to do it alone, just turn to a friend and say, ‘Hey, we need to do something here, we have to stop this.’ We also try to get them to turn towards and support the victim because the passivity of bystanders makes victims feel there is no sympathy for them. You know, it often takes very, very little to stop a bully; people just don’t realize the power that they have.”
So far, Staub has run his program only once, but early indications are that students emerge more confident about the ways in which they can intervene, and more willing to do so. He’ll be running it again because he wants those kids to understand the power they have. Staub is dedicated to understanding the processes of mass violence in order to learn how to prevent it.
“I have done work in Rwanda for many years now and one of the things we do there is try to help people understand the influences that lead to violence, what are the social conditions and psychological changes that people undergo. By providing this kind of information, I believe it is less likely that people close their eyes. They know what to look for. Things that normally go under your radar, when you know what you are looking for, those things are likely to connect, to click in. You think: I have to pay attention to this, this matters.
“One of my strong positions is that in case of mass violence, we must act early. It is easier to act earlier—before ideologies and positions have developed and intensified. It’s just the same in individual situations, even if you’re not thinking about ideology, just thinking of a bully. Once they’re a little way down the road, they’re in a position where if they stop in response to someone, they lose face. So it’s easier to intervene before they have too much to lose.
“The same is true in other group situations. We tend to think: this is a small thing, it’s not big enough to worry about. But by the time it is big enough to worry about, it is too late. Mass violence evolves progressively, always with small steps like excluding people, creating offices that serve discrimination. When no one does anything about those changes, that sends a sign, too—a sign that they can go farther. I’ll give you an example: Goebbels. After the Évian Conference in 1938, when the community of nations gathered in France to talk about taking in Jewish refugees from Germany and no one wanted to take them in, Goebbels wrote in his diary, ‘They would like to do what we are doing but they don’t have the courage.’ He took the message that if, at the first step, no one would stop him, then he could keep going.”
Today, more than forty years after he collaborated on the bystander experiments, John Darley works in the leafy enclave of Princeton University. He has since moved on to important work exploring how large organizations, and the people in them, become corrupt. In many ways, he has not strayed so very far from his original, groundbreaking work. The greatest evil, he argues, always requires large numbers of participants who contribute by their failure to intervene.
Darley was born into psychology; he remembers, as a teenager, his father helping Leon Festinger with logistics when Festinger infiltrated Marian Keech’s flying-saucer cult, the fieldwork that gave rise to the theory of cognitive dissonance. For someone who has spent his life delving into such dark aspects of human behavior, Darley is a delightfully affable and mellow character, open, accessible, and still eager to explore human behavior and the social and corporate structures that reinforce it. He’s particularly intrigued by the way morality impinges on thinking only intermittently.
“The moral mind is not a default mind,” he says. “In very competitive environments, where you’re under a lot of stress, a lot of cognitive load, you won’t necessarily even see that there is a moral consideration at all. Most corruption, I think, starts with an intuitive act, not a deliberate one. Is this a System One/System Two error, I wonder.”
Darley is referring to a body of work by the psychologist and Nobel laureate Daniel Kahneman, who argues that we think in two modes. System One is intuitive, associative, very fast, and born of habit. It is, in essence, shortcut thinking—and much of the time, it is good enough. System Two is more deliberative, analytical, and slow, and it requires much more effort; it’s what we use if we want to solve a math problem correctly, but one of its other purposes is to monitor System One for errors.12
“Many decisions made intuitively are reasonable,” Darley continued. “But the problem is that sometimes they are not. And the reasoning system monitoring—System Two—is intermittent and lax. That’s the problem: no alarm goes off. So you sell mortgages to people who can’t afford them but you’re in a competitive environment, under a lot of pressure to perform, so you don’t monitor the morality of your actions. And of course, as soon as that mortgage is securitized, you’re not responsible for it, so you don’t care if the owner defaults. Responsibility has been diffused. From your perspective, System One has worked just fine. And of course the most bonused are the most blind because for them to look carefully is, quite literally, too costly. I think when you look at how all of that plays out, you see that in conditions of high stress—from competitive environments, compensation structures, or just company politics—the high stress tends to distance moral reflection.”
Darley has written extensively about the degree to which corruption on a wide scale, and evil on a wide scale, require large numbers of participants and bystanders. Corporate meltdowns like those at Enron, MCI, or the banks require the work of thousands of people, all failing to see the moral implications of their work. And you can’t have historic catastrophes, like the Holocaust, unless millions of individuals have taken, in effect, the same moral shortcut.
“Under conditions of high stress,” Darley continues to gnaw on the problem, “you may still have inklings, suspicions. But you may inhabit an environment that valorizes blindness, so you don’t look. Who or what is it you are blind to? In the end, I think it’s you. You become blind … to yourself … to your better self.”
In 1938, the residents of Mauthausen, Austria, were feeling optimistic. Their stone quarry was once again being worked, as massive rebuilding in Germany and Austria brought new business to the town. In nearby Linz, Hitler had great plans for his hometown: a new bridge, town hall, art museum, party headquarters, and two monuments, one commemorating the Anschluss and one commemorating the composer Anton Bruckner. Mauthausen’s economy was looking up, too, because the building of a new concentration camp there created jobs and new business for suppliers and craftsmen.
When prisoners began to arrive in August of that year, the SS tried to keep the population at a distance. But the camp was in full view, and ordinary life had to continue. Inevitably, that meant that the brutal treatment of prisoners was witnessed by many bystanders. One was a farmer, Eleanore Gusenbauer, who didn’t like what she saw. So she wrote to complain:
In the Concentration Camp Mauthausen at the work site in Vienna Ditch inmates are being shot repeatedly; those badly struck live for yet some time, and so remain lying next to the dead for hours and even half a day long.
My property lies upon an elevation next to the Vienna Ditch and one is often an unwilling witness to such outrages.
I am anyway sickly and such a sight makes such a demand on my nerves that in the long run I cannot bear this.
I request that it be arranged that such inhuman deeds be discontinued, or else be done where one does not have to see it.13