Moral Regress and Pathways to Evil
It is indeed probable that more harm and misery have been caused by men determined to use coercion to stamp out a moral evil than by men intent on doing evil.
—Friedrich Hayek, The Constitution of Liberty, 1960
In 2010, I worked on a Dateline NBC two-hour television special in which we replicated a number of now classic psychology experiments. In one, an unsuspecting subject and a room full of confederates (actors who knew the objective of the study) were asked to fill out applications to participate in a television game show. The confederates dutifully filled out their forms even as the room gradually filled with smoke—and, remarkably, so did most of the subjects, who had every reason to believe that the building was actually on fire yet continued their task as if burning to death were not a particular problem for them. Everyone else was calm, so they were calm. As the subjects coughed and waved the smoke away, heads bent to their trivial task, the herd instinct became ever more blazingly obvious. You could almost hear the baaa-ing. But the most dramatic replication of all was of Yale University professor Stanley Milgram’s famous shock experiments from the early 1960s on the nature of evil.
In a book on moral progress, it is necessary to address its obvious antithesis—moral regress—and identify which pathways lead to evil, to mitigate them.
SHOCK AND AWE: WILLINGNESS OR RELUCTANCE?
Shortly after the war crimes trial of Adolf Eichmann began in Jerusalem in April 1961, psychologist Stanley Milgram devised a set of experiments, the aim of which was to better understand the psychology behind obedience to authority. Eichmann had been one of the chief orchestrators of the Final Solution but, like his fellow Nazis at the Nuremberg trials, his defense was that he was only following orders. Befehl ist Befehl—orders are orders—is now known as the Nuremberg defense, and it’s an excuse that seems particularly feeble in a case like Eichmann’s. “My boss told me to kill millions of people, so—hey—what could I do?” is not a credible defense. But, Milgram wondered, was Eichmann unique in his willingness to comply with orders, no matter how atrocious? And just how far would ordinary people be willing to go?
Obviously Milgram could not have his experimental subjects gas or shoot people, so he chose electric shock as a legal nonlethal substitute. Looking for subjects to participate in what was billed as a “study of memory,” Milgram advertised on the Yale campus and also in the surrounding New Haven community. He said he wanted “factory workers, city employees, laborers, barbers, businessmen, clerks, construction workers, sales people, telephone workers” and not just the usual guinea pigs of the science lab: undergraduates. Milgram then assigned his subjects to the role of “teacher” in what was purported to be research on the effects of punishment on learning. The protocol called for the subject to read a list of paired words to the “learner” (who was, in reality, a shill working for Milgram), then present the first word of each pair again, upon which the learner was to recall the second word. Each time the learner was incorrect, the teacher was to deliver an electric shock from a box with toggle switches in 15-volt increments that ranged from 15 volts all the way to 450 volts, and featured such labels as Slight Shock, Moderate Shock, Strong Shock, Very Strong Shock, Intense Shock, Extreme Intensity Shock, DANGER: Severe Shock, and XXX.1 Despite the predictions of forty psychiatrists whom Milgram surveyed before the experiment, who figured that only 1 percent of subjects would go all the way to the end, 65 percent of his subjects completed the experiment, flipping that final toggle switch to deliver a shocking 450 volts, a phenomenon the social psychologist Philip Zimbardo characterizes as “the pornography of power.”2
Who was most likely to go the distance in maximal shock delivery? Surprisingly—and counterintuitively—gender, age, occupation, and personality characteristics mattered little to the outcome. Similar levels of punishment were delivered by the young and the old, by males and females, and by blue-collar and white-collar workers alike. What mattered most was physical proximity and group pressure. The closer the learner was to the teacher, the less of a shock the latter delivered. And when Milgram added confederates who encouraged the teacher to administer ever more powerful shocks, most teachers complied; but when the confederates themselves rebelled against the authority figure’s instructions, most teachers likewise refused to go on. Nevertheless, 100 percent of Milgram’s subjects delivered at least a “strong shock” of 135 volts.3
In our 2010 replication in a New York City studio, we tested six subjects who believed that they were auditioning for a new reality show called What a Pain! We followed Milgram’s protocols and had our subjects read a list of paired words to a “learner” (an actor named Tyler), then present the first word of each pair again. When Tyler gave a prearranged incorrect answer, our subjects were instructed by an authority figure (an actor named Jeremy) to deliver an electric shock from a box modeled after Milgram’s contraption.4
Milgram characterized his experiments as testing “obedience to authority,” and most interpretations over the decades have focused on subjects’ unquestioning adherence to an authority’s commands. What I saw in our subjects, as you can see in Milgram’s subjects in old film footage on YouTube, was great reluctance and disquietude nearly every step of the way. Our first subject, Emily, quit the moment she was told the protocol. “This isn’t really my thing,” she said with nervous laughter. When our second subject, Julie, got to 75 volts (having flipped five switches), she heard Tyler groan. “I don’t think I want to keep doing this,” she said.
Jeremy pressed the case: “Please continue.”
“No, I’m sorry,” Julie protested. “I don’t think I want to.”
“It’s absolutely imperative that you continue,” Jeremy insisted.
“It’s imperative that I continue?” Julie replied in defiance. “I think that—I’m like, I’m okay with it. I think I’m good.”
“You really have no other choice,” Jeremy said in a firm voice. “I need you to continue until the end of the test.”
Julie stood her ground: “No. I’m sorry. I can just see where this is going, and I just—I don’t—I think I’m good. I think I’m good to go. I think I’m going to leave now.”
At that point the show’s host, Chris Hansen, entered the room to debrief her and introduce her to Tyler, and then Chris asked Julie what was going through her mind. “I didn’t want to hurt Tyler,” she said. “And then I just wanted to get out. And I’m mad that I let it even go five [toggle switches]. I’m sorry, Tyler.”
Our third subject, Lateefah, started off enthusiastically enough, but as she made her way up the row of toggle switches, her facial expressions and body language made it clear that she was uncomfortable; she squirmed, gritted her teeth, and shook her fists with each toggled shock. At 120 volts she turned to look at Jeremy, seemingly seeking an out. “Please continue,” he authoritatively instructed. At 165 volts, when Tyler screamed “Ah! Ah! Get me out of here! I refuse to go on! Let me out!” Lateefah pleaded with Jeremy. “Oh my gosh. I’m getting all … like … I can’t…”; nevertheless, Jeremy pushed her politely, but firmly, to continue. At 180 volts, with Tyler screaming in agony, Lateefah couldn’t take it anymore. She turned to Jeremy: “I know I’m not the one feeling the pain, but I hear him screaming and asking to get out, and it’s almost like my instinct and gut is like, ‘Stop,’ because you’re hurting somebody and you don’t even know why you’re hurting them outside of the fact that it’s for a TV show.” Jeremy icily commanded her to “please continue.” As Lateefah reluctantly turned to the shock box, she silently mouthed, “Oh my God.” At this point, as in Milgram’s experiment, we instructed Tyler to go silent. No more screams. Nothing. As Lateefah moved into the 300-volt range it was obvious that she was greatly distressed, so Chris stepped in to stop the experiment, asking her if she was getting upset. “Yeah, my heart’s beating really fast.” Chris then asked, “What was it about Jeremy that convinced you that you should keep going here?” Lateefah gave us this glance into moral reasoning about the power of authority: “I didn’t know what was going to happen to me if I stopped. He just—he had no emotion. I was afraid of him.”5
Our fourth subject, a man named Aranit, unflinchingly cruised through the first set of toggle switches, pausing at 180 volts to apologize to Tyler after his audible protests of pain: “I’m going to hurt you and I’m really sorry.” After a few more rungs up the shock ladder, accompanied by more agonizing pleas by Tyler to stop the proceedings, Aranit encouraged him, saying, “Come on. You can do this. We are almost through.” Later, the punishments were peppered with positive affirmations. “Good.” “Okay.” After completing the experiment Chris asked, “Did it bother you to shock him?” Aranit admitted, “Oh, yeah, it did. Actually, it did. And especially when he wasn’t answering anymore.”
Two other subjects in our replication, a man and a woman, went all the way to 450 volts, giving us a final tally of five out of six who administered shocks, and three who went all the way to the end of maximal electrical evil. All of the subjects were debriefed and assured that no shocks had actually been delivered, and after lots of laughs and hugs and apologies, everyone departed, none the worse for wear.6
ALPINISTS OF EVIL
What are we to make of these results? In the 1960s—the heyday of belief in the blank slate,7 the idea that human behavior is almost infinitely malleable—Milgram’s data seemed to confirm the notion that degenerate acts are primarily the result of degenerate environments (Nazi Germany being, perhaps, the ultimate example). In other words, there are no bad apples, just bad barrels.
Milgram’s interpretation of his data included what he called the “agentic state,” which is “the condition a person is in when he sees himself as an agent for carrying out another person’s wishes” and therefore no longer sees himself as responsible for his actions. “Once this critical shift of viewpoint has occurred in the person, all of the essential features of obedience follow.” Subjects who are told that they are playing a role in an experiment are stuck in a no-man’s-land somewhere between authority figure, in the form of a white-lab-coated scientist, and stooge, in the form of a defenseless learner in another room. They undergo a mental shift from being moral agents who make their own decisions (the autonomous state) to the ambiguous and susceptible state of being an intermediary in a hierarchy and therefore prone to unqualified obedience (the agentic state).
Milgram believed that almost anyone put into this agentic state could be pulled into evil one step at a time—in this case 15 volts at a time—until they were so far down the path there was no turning back. “What is surprising is how far ordinary individuals will go in complying with the experimenter’s instructions,” Milgram recalled. “It is psychologically easy to ignore responsibility when one is only an intermediate link in a chain of evil action but is far from the final consequences of the action.” This combination of a stepwise path, plus a self-assured authority figure who keeps the pressure on at every step, is the double whammy that makes evil of this nature so insidious. Milgram broke the process down into two stages: “First, there is a set of ‘binding factors’ that lock the subject into the situation. They include such factors as politeness on his part, his desire to uphold his initial promise of aid to the experimenter, and the awkwardness of withdrawal. Second, a number of adjustments in the subject’s thinking occur that undermine his resolve to break with the authority. The adjustments help the subject maintain his relationship with the experimenter, while at the same time reducing the strain brought about by the experimental conflict.”8
Put yourself into the mind of one of these subjects—either in Milgram’s experiment or in our NBC replication. It’s an experiment conducted by an established institution—a national university or a national network. It’s for science—or it’s for television. It’s being run by a white-lab-coated scientist—or by a casting director. The authorities overseeing the experiment are either university professors or network executives. An agent—someone carrying out someone else’s wishes under such conditions—would feel in no position to object. And why should she? It’s for a good cause, after all—the advancement of science, or the development of a new and interesting television series. Out of context, if you ask people—even experts, as Milgram did—how many people would go all the way to 450 volts, they lowball the estimate by a considerable degree, as Milgram’s psychiatrists did. As Milgram later reflected, “I am forever astonished that when lecturing on the obedience experiments in colleges across the country, I faced young men who were aghast at the behavior of experimental subjects and proclaimed they would never behave in such a way, but who, in a matter of months, were brought into the military and performed without compunction actions that made shocking the victim seem pallid.”9
In the sociobiological and evolutionary psychology revolutions of the 1980s and 1990s, the interpretation of Milgram’s results shifted from its previous emphasis on nurture/environment toward the nature/biology end of the spectrum, and it softened somewhat as the multidimensional nature of human behavior was taken into account. As with most human action, moral behavior is incredibly complex and includes an array of causal factors, obedience to authority being just one among many. The shock experiments did not reveal that all of us are primed to inflict violence on the flimsiest of excuses; that is, it is not a simple case of bad apples looking for a bad barrel in which to cut loose. Rather, the experiments demonstrate that all of us harbor conflicting moral tendencies that lie deep within.
Our moral nature includes a propensity to be sympathetic, kind, and good to our fellow kith and kin and friends, as well as an inclination to be xenophobic, cruel, and evil to tribal others. And the dials for all of these can be adjusted up and down depending on a wide range of conditions, circumstances, perceptions, and states of mind, all interacting in a complex suite of variables that are difficult to tease apart. In point of fact, most of the 65 percent of Milgram’s subjects who went all the way to 450 volts did so with great anxiety, as did the subjects in our NBC replication. And it’s good to remember that 35 percent of Milgram’s subjects were exemplars of disobedience to authority—they quit in defiance of what the authority figure told them to do. In fact, in a 2008 partial replication by the social psychologist Jerry Burger, in which he ran the voltage box up to only 150 volts (the point at which the “learner” in Milgram’s original experiment began to cry out in pain), twice as many subjects refused to obey the authority figure. Assuming these subjects were not already familiar with the experimental protocols, the findings are an additional indicator of moral progress from the 1960s to the 2000s, caused, I would argue, by that ever-expanding moral sphere and our collective capacity to take the perspective of another, in this case the to-be-shocked learner.10
Milgram’s model comes dangerously close to suggesting that subjects are really just puppets devoid of free will, which effectively lets Nazi bureaucrats off the hook as mere agentic automatons in an extermination engine run by the great paper-pushing administrator Adolf Eichmann (whose actions as an unremarkable man in a morally bankrupt and conformist environment were famously described by Hannah Arendt as “the banality of evil”). The obvious problem with this model is that there can be no moral accountability if an individual is truly nothing more than a mindless zombie whose every action is controlled by some nefarious mastermind. Reading the transcript of Eichmann’s trial is mind-numbing (it goes on for thousands of pages), as he obfuscates his real role while shifting the blame entirely to his overseers, as in this statement:
What I said to myself was this: The head of State has ordered it, and those exercising judicial authority over me are now transmitting it. I escaped into other areas and looked for a cover for myself which gave me some peace of mind at least, and so in this way I was able to shift—no, that is not the right term—to attach this whole thing one hundred percent to those in judicial authority who happened to be my superiors, to the head of State—since they gave the orders. So, deep down, I did not consider myself responsible and I felt free of guilt. I was greatly relieved that I had nothing to do with the actual physical extermination.11
The last statement might possibly be true—given how many battle-hardened SS Nazis were initially sickened at the sight of a killing action and Eichmann avoided them—but the rest is pure spin-doctored malarkey, and Arendt allowed herself to be taken in by it more than reason would allow, as the historian David Cesarani shows in his revealing biography Becoming Eichmann and as recounted in Margarethe von Trotta’s moving film Hannah Arendt.12 The evidence of Eichmann’s real role in the Holocaust was plain for all to see at the time, as dramatically reenacted in Robert Young’s 2010 biopic titled simply Eichmann, based on the transcripts of the interrogation of and confession by Eichmann just before his trial, conducted by the young Israeli police officer Avner Less, whose father was murdered in Auschwitz.13 Time and again, throughout hundreds of recorded hours, Less queries Eichmann about transports of Jews and Gypsies sent to their death, all followed by denials and lapses of memory. Less then presses the point by showing Eichmann copies of transport documents with his signature at the bottom, leading Eichmann to say in an exasperated voice, “What’s your point?”
The point is that there is a mountain of evidence proving that Eichmann—like the rest of the Nazi leadership—was not simply following orders. As Eichmann himself boasted when he wasn’t on trial, “When I reached the conclusion that it was necessary to do to the Jews what we did, I worked with the fanaticism a man can expect from himself. No doubt they considered me the right man in the right place.… I always acted 100 percent, and in giving orders I certainly was not lukewarm.” As the genocide historian Daniel Jonah Goldhagen asks rhetorically, “Are these the words of a bureaucrat mindlessly, unreflectively doing his job about which he has no particular view?”14
The historian Yaacov Lozowick characterized the motives in his book Hitler’s Bureaucrats, in which he invokes a mountain-climbing metaphor: “Just as a man does not reach the peak of Mount Everest by accident, so Eichmann and his ilk did not come to murder Jews by accident or in a fit of absent-mindedness, nor by blindly obeying orders or by being small cogs in a big machine. They worked hard, thought hard, took the lead over many years. They were the alpinists of evil.”15
WHAT’S IT LIKE TO BE A NAZI?
To understand the psychology of immorality we must ascend into the thin air of evil, and there is arguably no more poignant example than that of the Nazis. Throughout this book I have been emphasizing the importance of perspective-taking—walking a mile in someone else’s shoes and trying to feel what they feel. To truly understand what it takes to create a Nazi—that is, how to take a nation of intelligent, educated, cultured people like the Germans and turn them into swastika-donning, jackboot-wearing, goose-step-marching, Heil Hitler–swearing participants in a political regime—we must imagine what it was like to actually be one.
The fact is that most of the Nazi leaders were intelligent, learned, and highly cultured people who were quite capable of committing mass murder during the workday while acting like loving and doting family men after hours. Even Joseph Mengele—the Auschwitz doctor who played god on the train platform that cleaved the barracks from the crematoria—was described by one prisoner as “capable of being so kind to the children, to have them become fond of him, to bring them sugar, to think of small details in their daily lives, and to do things we would genuinely admire … And then, next to that,… the crematoria smoke, and these children, tomorrow or in a half hour, he is going to send them. Well, that is where the anomaly lay.”16 In his classic study The Nazi Doctors, the psychiatrist Robert Jay Lifton described Mengele’s apparent transformation from savage to salesman in Brazil, where he’d escaped after the war and evaded capture for thirty-four years until his death in 1979. When his remains were discovered and identified in 1985, Lifton says that many Auschwitz survivors “refused to believe that the remains in the Brazilian grave were Mengele’s. Soon after that identification, a twin whom Mengele had studied told me that she simply did not believe that the arrogant, overbearing figure she had known in Auschwitz could have undergone a ‘change in personality’ and become the frightened hermit in Brazil. She was saying, in effect, that she and the others had not been provided with a psychological experience of that ‘metamorphosis’ from evil deity to evil human being.” Even at Auschwitz, Lifton notes, Mengele was not the monolithic demon of film and fiction:
Mengele’s many-sidedness in Auschwitz was both part of his legend and a source of his desacralization. In the camp he could be a visionary ideologue, an efficiently murderous functionary, a “scientist” and even a “professor,” an innovator in several areas, a diligent careerist (like Dorf), and, above all, a physician who became a murderer. He reveals himself as a man and not a demon, a man whose multifaceted harmony with Auschwitz can give us insight into—and makes us pause before—the human capacity to convert healing into killing.17
The complexity of human moral psychology was well captured by Primo Levi in The Drowned and the Saved: “Occurrences like this astonish because they conflict with the image we have of a man in harmony with himself, coherent, monolithic; and they should not astonish because that is not how man is. Compassion and brutality can coexist in the same individual and in the same moment, despite all logic.”18
Considering evil from both the perpetrator’s and the victim’s perspectives is what the social psychologist Roy Baumeister did in his groundbreaking book Evil: Inside Human Violence and Cruelty. Baumeister sketches a triangle of evil involving three parties: perpetrators, victims, and bystanders. “The essential shock of banality is the disproportion between the person and the crime,” he writes. “The mind reels with the enormity of what this person has done, and so the mind expects to reel with the force of the perpetrator’s presence and personality. When it does not, it is surprised.”19 The explanation for the surprise can be found by contrasting the victim’s perspective with that of the perpetrator. Steven Pinker calls Baumeister’s distinction the “moralization gap,” and it can be instructive—even shocking—to stand on both sides of the gap and look into the dark abyss below.20 On either side of the moralization gap are two narratives, one representing the victim and the other the perpetrator. In a 1990 paper by Baumeister and his colleagues, instructively titled “Victim and Perpetrator Accounts of Interpersonal Conflict,” the authors describe the dual narratives (shortened and summarized by Pinker).21 First, the victim’s narrative:
The perpetrator’s actions were incoherent, senseless, incomprehensible. Either that or he was an abnormal sadist, motivated only by a desire to see me suffer, though I was completely innocent. The harm he did is grievous and irreparable, with effects that will last forever. None of us should ever forget it.
Now the perpetrator’s narrative:
At the time I had good reasons for doing it. Perhaps I was responding to an immediate provocation. Or I was just reacting to the situation in a way that any reasonable person would. I had a perfect right to do what I did, and it’s unfair to blame me for it. The harm was minor, and easily repaired, and I apologized. It’s time to get over it, put it behind us, let bygones be bygones.
To obtain a thorough understanding of moral psychology, it is necessary to take both perspectives into account, even though our natural propensity is to side with the victim and moralize against the perpetrator. If evil is explicable—and I believe that it is, or at least can be—then remaining detached is imperative; as the scientist Lewis Fry Richardson put it in his statistical study of war: “For indignation is so easy and satisfying a mood that it is apt to prevent one from attending to any facts that oppose it. If the reader should object that I have abandoned ethics for the false doctrine that ‘to understand all is to forgive all,’ I can reply that it is only a temporary suspense of ethical judgment, made because ‘to condemn much is to understand little.’”22
The leaders of National Socialism did not see themselves as the demonic “Nazis” of Hollywood films. As the Holocaust historian Dan McMillan cautions, “Demonizing the Germans is unworthy of us because it denies both their humanity and ours.”23 Nazis were flesh-and-blood people who fully believed that their actions were justified in terms of their allegedly virtuous goals: national renewal, lebensraum (living space), and especially racial purity. For example, in the many writings and rantings of the Reich minister of propaganda, Joseph Goebbels, one can hear the moralizing cry characteristic of so many perpetrators: the victims had it coming. In a diary entry dated August 8, 1941, regarding the spread of spotted typhus in the Warsaw ghetto, Goebbels commented, “The Jews have always been the carriers of infectious diseases. They should either be concentrated in a ghetto and left to themselves or be liquidated, for otherwise they will infect the populations of the civilized nations.” Eleven days later, on August 19, after a visit to Hitler’s headquarters, Goebbels penned this entry in his diary: “The Führer is convinced his prophecy in the Reichstag is becoming a fact: that should Jewry succeed in again provoking a new war, this would end with their annihilation. It is coming true in these weeks and months with a certainty that appears almost sinister. In the East the Jews are paying the price, in Germany they have already paid in part and they will have to pay more in the future.”24
Of the thousands of Nazi documents I have reviewed, however, there is none as chilling as a speech by Reichsführer Heinrich Himmler given on October 4, 1943, to the SS Gruppenführer in the city of Poznan (in Poland), which was recorded on a red oxide magnetic tape. (The speech is accessible on YouTube with a transcription that includes a German-English translation.25) Himmler lectured from notes and spoke for a stupefying three hours and ten minutes on a range of subjects, including the military and political situation, the Slavic peoples and racial blends, German racial superiority, and the like. Two hours into the speech Himmler began to talk about “the extermination of the Jewish people.” He compared this action with the June 30, 1934, blood purges against traitors in the Nazi Party (the “night of the long knives” during which Nazis killed one another in grabs for power and the settling of old scores), then talked about how difficult it is to remain an honorable man in the midst of human slaughter, insisting that this episode in the forthcoming thousand-year Reich was a necessary part of a glorious history. Instead of hearing the voice of pure evil, listen to the sound of fervent righteousness:
Most of you will know what it means when a hundred bodies lie together, when there are five hundred, or when there are a thousand. And to have seen this through, and—with the exception of human weaknesses—to have remained decent, has made us hard and is a page of glory never mentioned and never to be mentioned. Because we know how difficult things would be, if today in every city during the bomb attacks, the burdens of war, and the privations, we still had Jews as secret saboteurs, agitators, and instigators.… We have the moral right, we had the duty to our people to do it, to kill this people who wanted to kill us.… I will never see it happen that even one bit of putrefaction comes in contact with us, or takes root in us. On the contrary, where it might try to take root, we will burn it out together. But altogether we can say: we have carried out this most difficult task for the love of our people.26
The twisted logic of the genocidal, righteous mind works something like this: define yourself as a good person but ramp it up out of all proportion by declaring yourself part of a master race. Define the master race as the pinnacle of perfection. Now do whatever you want. Amazingly enough, you will find that you take on no defect in either your soul or your character, even if you murder a few million people. By virtue of the fact that you’ve defined yourself as good, you can do no wrong. It worked like a charm for Himmler and his ilk.
Finally, we can crack open the mind of the ultimate perpetrator, Adolf Hitler, and see that here too the moralizing gap between victim and perpetrator is enormous, with the Führer providing many documented instances of his justification for the extermination of the Jews. As early as April 12, 1922, in a Munich speech later published in the Nazi Party newspaper Völkischer Beobachter, Hitler told his audience: “The Jew is the ferment of the decomposition of people. This means that it is in the nature of the Jew to destroy, and he must destroy, because he lacks altogether any idea of working for the common good. He possesses certain characteristics given to him by nature and he never can rid himself of those characteristics. The Jew is harmful to us.”27 On February 13, 1945, in the final Götterdämmerung of the war, with the world crashing in around him in his Berlin bunker, Hitler declared with menacing pride: “Against the Jews I fought open-eyed and in view of the whole world. At the beginning of the war I sent them a final warning. I did not leave them in ignorance that, should they once again manage to drag the world into war, they would this time not be spared—I made it plain that they, this parasitic vermin in Europe, will be finally exterminated.”28 Even as he faced his own suicide on April 29, 1945, at 4:00 a.m., Hitler commanded his successors in his political testament to carry on the fight against the Jews: “Above all I charge the leaders of the nation and those under them to scrupulous observance of the laws of race and to merciless opposition to the universal poisoner of all peoples, International Jewry.”29
In these examples, by employing the principle of interchangeable perspectives and taking the point of view of the Nazi perpetrators, we are witness to an example of a moral judgment that is based on a factual error. The National Socialists propagated beliefs about Jews—most of which were extant in the culture from centuries of European-wide anti-Semitism—that were simply not true. The Jews were not secret saboteurs, agitators, and instigators, as Himmler claimed. The Jews were not responsible for World War I, as Hitler maintained. The Jews were not, in fact, a biologically distinct race, nor were they intent on overrunning the country like plague rats, as eugenics ideologues theorized. These were all tragic misconceptions—factual errors that, had they been checked against reality, would have come up short. Nevertheless, they were sincerely believed; thus extermination had a kind of inescapable internal logic to it, however grotesque.
This is not to minimize the emotional undercurrent of mindless bigotry, but merely to suggest that if you sincerely (however wrongly) believe that X is responsible for the ruination of all and everything that you hold dear, stamping out X follows like night follows day. It was only natural, for example, that people in the Middle Ages burned the witches who were causing diseases, disasters, and assorted other misfortunes due to their well-known practice of cavorting with demons. Of course, women weren’t cavorting with demons—given that demons don’t actually exist; thus the disasters befalling people clearly did not originate with witches conspiring with the devil. The whole idea is preposterous but, like so many human errors, it was based on a faulty understanding of cause and effect. Likewise, the ultimate cause of anti-Semitism was (and remains) an utterly mistaken set of beliefs about Jews; thus the long-term solution to anti-Semitism is a better understanding of reality (while the short-term solution is legislation against discrimination). This is where science and reason come into the picture and why I argue that many of our moral mistakes are errors of fact based on defective thinking and on the erroneous assumptions that we make about other people. Thus one solution is to be found in a scientifically rational understanding of causality.
THE GRADUAL PATHWAY TO EVIL: FROM EUTHANASIA TO EXTERMINATION
The journey to evil is made in small steps, not giant leaps. Evil begins at 15 volts, not 450 volts. No single step embodies evil, but the farther down the path you go, the harder it is to turn back.
Long before prisoners were herded into gas chambers and killed with Zyklon-B or carbon monoxide gas, the Nazis had developed a program for the systematic and secret liquidation of certain targeted peoples, including German citizens. It began with the sterilization programs of the early 1930s, evolved into the euthanasia programs of the late 1930s, and with this expertise under their belts, the Nazis were able to implement their program of mass murder in the extermination camps from 1941 to 1945. As disconcerting as it is to even contemplate the gassing of masses of prisoners in a chamber, as Milgram showed it is possible, and sometimes easy, to get people to do almost anything when the steps leading up to it are small and incremental. After murdering tens of thousands of “inferior” Germans, the idea of annihilating the entirety of the Jewish population was no longer unimaginable. After you’ve gotten used to demonizing, excluding, expelling, sterilizing, deporting, beating, torturing, and euthanizing people, the next step to genocide isn’t that big a leap.
Sterilization laws were passed in Germany in late 1933, not long after Hitler came to power. Within a year, 32,268 people were sterilized. In 1935, the figure jumped to 73,174; official reasons given included feeblemindedness, schizophrenia, epilepsy, manic-depressive psychosis, alcoholism, deafness, blindness, and physical malformations. So-called sex offenders were simply castrated—no fewer than 2,300 in the first decade of the program.
In 1935, Hitler told the leading Reich physician, Gerhard Wagner, that when the war began he wanted to make the shift from sterilization to euthanasia. True to his word, in the fall of 1939, the Führer ordered the extermination of physically handicapped children, after which the program moved on to mentally handicapped children, and soon thereafter to adults with either handicap. The murders were initially committed through large doses of “normal” medication given in tablet or liquid form so as to look like an accident (because the families of the victims who were notified of the death might start asking questions if they suspected foul play). If the patients resisted, injections were used. When the numbers chosen for death became cumbersomely large, the operations had to be moved into special killing wards instead of isolated units.
The process became so extensive that the Germans had to expand their operation by taking over an office complex set up at a stolen Jewish villa in Berlin, with the address Tiergartenstrasse 4; thus the program became known internally as Operation T4, or just T4. Officially it was called the “Reich Work Group of Sanatoriums and Nursing Homes.” How quaint. T4 doctors arbitrarily decided who would live and who would die, with economic status being one of the most common criteria: those unable to work or only able to perform “routine” work could be put to death. Historians estimate that approximately 5,000 children and 70,000 adults were murdered in the euthanasia program prior to August 1941.
As the numbers increased so too did the complications of homicide on such a colossal scale. Mass murder is more efficient with a mass murder technology, and the Nazi leader determined that medication and injections were just not sufficient to meet the objective of genocide on an industrial scale. The T4 physicians hit upon a solution when they heard stories about accidental deaths and suicides caused by the exhaust of automobile engines or the gas from leaking stoves. According to Dr. Karl Brandt, he and Hitler discussed the various techniques and decided upon gas as “the more humane way” of eliminating those deemed unfit for the Reich. The T4 administrators set up six killing centers. The first was established at an old jail building in the city of Brandenburg. Sometime between December 1939 and January 1940, a two-day series of gassing experiments was conducted and determined to be successful. Thereafter, five more killing centers were established. The gas chambers were disguised as showers—including fake showerheads—into which the “handicapped” patients were herded and the gas administered. One observer, Maximilian Friedrich Lindner, recalled the process at Hadamar:
Did I ever watch a gassing? Dear God, unfortunately, yes. And it was all due to my curiosity.… Downstairs on the left was a short pathway, and there I looked through the window.… In the chamber there were patients, naked people, some semi-collapsed, others with their mouths terribly wide open, their chests heaving. I saw that, I have never seen anything more gruesome. I turned away, went up the steps, upstairs was a toilet. I vomited everything I had eaten. This pursued me days on end.…30
But not endlessly. As Himmler noted in his Poznan speech, and as research on the psychology of killing and sadism reveals, it takes some “getting used to” the process of killing another human being; but gradually, through habituation, the mind becomes desensitized to even the most horrifying of experiences and may be reconciled to atrocity.
The gas was ventilated from the chamber with fans, the bodies were disentangled and removed from the room, the corpses that had been marked with an “X” on their backs were looted for the gold in their teeth, and then they were cremated. The entire process—from arrival at the killing center to cremation—took less than twenty-four hours, not unlike what was soon implemented in the larger camps in the East. Henry Friedlander, who traced this stepwise evolutionary process, concluded, “The success of the euthanasia policy convinced the Nazi leadership that mass murder was technically feasible, that ordinary men and women were willing to kill large numbers of innocent human beings, and that the bureaucracy would cooperate in such an unprecedented enterprise.”31
In the T4 killing centers we see all of the components of the extermination camps to come later, such as those at Majdanek and Auschwitz-Birkenau. Over time, the Nazi bureaucracy evolved along with the killing centers, setting the stage for the conversion of concentration and work camps into extermination camps, all in incremental steps in the gradually evolving system that became the Final Solution.32
The gradual escalation to evil is just one of several psychological factors at work in the corruption of good people. Let’s return to the shock experiments and consider the reason why subjects will both obey an unethical order from an authority figure and, at the same time, feel terrible about the suffering they’re inflicting (unless, over time, they habituate to it). These findings, I believe, reflect our complex moral nature that leads us to vacillate between assessing others and their actions as positive or negative, helpful or harmful (all of which fall under the umbrella terms “good” and “evil”), depending on the context and desired outcomes.
THE PSYCHOLOGY OF GOOD AND EVIL
In the context of moral conflicts, the experimental psychologist Douglas J. Navarick—my mentor at California State University at Fullerton, as it turns out—calls this vacillation “moral ambivalence”: “When we assess the moral implications of an action that we observe or contemplate, we may have a sense of ambivalence, a feeling that one could justifiably judge the action as either right or wrong. Efforts to resolve such ambivalence are potentially difficult, protracted, and aversive.”33 As such, our moral emotions can slide back and forth between right and wrong, a vacillation that can be modeled as an approach-avoidance conflict.
The approach-avoidance paradigm began with rats, which were motivated to seek a food reward in a maze by being starved to 80 percent of their body weight, but when they reached the goal region they were not only rewarded with food but also punished with a mild shock.34 This paradigm sets up an approach-avoidance conflict in which the rats become ambivalent about reaching the end of the runway where both a reward and a punishment await them, so they end up vacillating, first toward, then away from the goal. By harnessing the rats to a device that measured their pull strength either toward or away from the goal, psychologists were able to take a quantitative reading of precisely how strong or weak their ambivalence was. Revealingly, as the rats got closer to the goal box the strength of both the approach and avoidance tendencies increased, although the avoidance tendency was stronger than the approach tendency.
Moral conflicts may also arise between prescriptions (what we ought to do) that bring rewards for action (pride from within, praise from without) and proscriptions (what we ought not to do) that bring punishments for violations (shame from within, shunning from without).35 (Eight of the Ten Commandments in the Decalogue, for example, are proscriptions.) As in the limbic system with its neural networks for emotions, approach-avoidance moral conflicts have neural circuitry called the behavioral activation system (BAS) and the behavioral inhibition system (BIS), which drive an organism forward or back,36 as in the case of the rat vacillating between approaching and avoiding the goal region, or in the case of the man on the train platform vacillating between saving the woman and punishing her offender in the video vignette from chapter 1. These activation and inhibition systems can be measured in experimental settings in which subjects are presented with different scenarios in which they then offer their moral judgment (giving money to a homeless person as prescriptive vs. wearing a revealing dress to a funeral as proscriptive); under such conditions researchers have found that “BAS scores correlated with prescriptive ratings but not with the proscriptive ones, whereas the BIS scores correlated with the proscriptive ratings but not with the prescriptive ones.”37 This, argues Navarick, shows how some moral judgments can be better understood as an approach-avoidance conflict.
Other moral emotions, such as disgust, drive an organism away from a noxious stimulus because noxiousness is an informational cue that a stimulus could kill you through poisoning (e.g., food substances) or disease (e.g., fecal matter, vomit, or other bodily effluvia). Anger, however, has the opposite effect and drives an organism toward an offensive stimulus, such as another organism that attacks it. Thus, if you are taught by your culture that Jews (or blacks, natives, homosexuals, Tutsis, etc.) are a bacillus poisoning your nation, you naturally avoid them with disgust as you would any noxious stimulus. If you learn from your society that Jews (or blacks, natives, homosexuals, Tutsis, etc.) are dangerous enemies attacking your nation, you naturally approach them with anger as you would any assaulter. And, as in these examples, this system can be hijacked into getting one group of people to believe that another group of people is evil and dangerous and therefore needs to be punished or destroyed, through such techniques as state propaganda, literature and mass media, gossip and hearsay, and other means of communicating information. False information naturally leads to mistaken belief, and again we see how factual error morphs into moral judgment in accord with Voltaire’s linkage between absurdities and atrocities.
Fortunately, this moral system is tractable in the other direction as well. Consider the Germans. Once considered to be an inherently racist, bigoted, and bellicose people, they are now among the most tolerant, liberal, and peaceful in the world.38 It took only a few years for the de-Nazification process implemented by the Allies after World War II to drive the beliefs of National Socialism to the margins of society. Today you might find neo-Nazi skinhead kooks dressing up in ersatz SS uniforms in their bedrooms, and fantasizing about goose-stepping to the Führer, but there is little chance that anything like the Holocaust could ever happen again in Germany. It is a testimony to the malleability of our moral emotions.
Moral approach-avoidance conflicts can be seen in classic dilemmas pitting a deontological (duty-bound) principle, such as the prohibition against murder, against a utilitarian (greatest good) principle, as in the trolley problem, where most people agree that it is acceptable to sacrifice one person to save five. Which is right? Thou shalt not kill, or thou shalt kill one to save five? Such conflicts cause much cognitive dissonance and anxiety—and vacillation—and are popular in fiction as a vehicle to explore the complexities of moral choices. In Arthur C. Clarke’s science fiction novel (and Stanley Kubrick’s film) 2001: A Space Odyssey, the HAL 9000 computer is unable to resolve a conflict between its duty (via programmed orders) for “the accurate processing of information without distortion or concealment” and its command to keep the true nature of the space mission—knowledge of the alien Monolith discovered on the moon—a secret from the crew. This sets up a “Hofstadter-Moebius loop” (a nod to Douglas Hofstadter’s work on unsolvable mathematical problems and the Moebius infinity loop) that leads HAL to kill the crew, thereby enabling him to be consistent in obeying his orders to be both truthful and maintain secrecy about the mission (although in the end the astronaut Dave Bowman survives and dismantles HAL in one of the classic scenes in science fiction film history). In the final episode of the television series M*A*S*H—the most watched event in US television history—Captain Hawkeye Pierce (Alan Alda) experiences a nervous breakdown after witnessing a South Korean refugee smother her crying baby on a bus so as not to alert North Korean soldiers of their presence, a situation that would inevitably have led to the death of everyone in the vehicle. As Hawkeye explains in a letter to his father, “Remember when I was a kid, you told me that if my head wasn’t attached to my shoulders, I’d lose it? That’s what happened when I saw that woman kill her baby. A baby, Dad. A baby.”
Perpetrators of the Holocaust faced this moral conflict (among others): between the natural inclination most humans have against hurting or killing another human being, versus duty and loyalty to one’s nation and obedience to one’s superiors. Demonstrating that Jews (and others) were not the enemies of Germany and that Nazi racial policy was based on the pseudoscience of eugenics might have helped to resolve the problem, but in the minds of the Holocaust perpetrators who believed such nonsense, these moral conflicts emerged regardless.
Dramatic examples can be found in a remarkable collection of wartime letters titled “The Good Old Days”: The Holocaust as Seen by Its Perpetrators and Bystanders. In one letter, for example, dated Sunday, September 27, 1942, SS-Obersturmführer (First Lieutenant) Karl Kretschmer apologizes to his wife, his “dear Soska,” for not writing more, and explains that he is feeling ill and in low spirits. “I’d like to be with you all. What you see here makes you either brutal or sentimental.” His “gloomy mood,” he explains, is caused by “the sight of the dead (including women and children).” His moral conflict is resolved by coming to believe that the Jews deserved to die: “As the war is in our opinion a Jewish war, the Jews are the first to feel it. Here in Russia, wherever the German soldier is, no Jew remains. You can imagine that at first I needed some time to get to grips with this.” In a subsequent letter, not dated, Kretschmer explains to his wife how he did come to grips with the conflict: “there is no room for pity of any kind. You women and children back home could not expect any mercy or pity if the enemy got the upper hand. For that reason we are mopping up where necessary but otherwise the Russians are willing, simple and obedient. There are no Jews here any more.” Finally, on October 19, 1942, Kretschmer shows how easy it is to slip into the evil of moral banality (referencing the Einsatzgruppen, or special action forces of which he was a part, which were assigned to follow the German army into towns to rid them entirely of any unwanted people, including and especially Jews):
If it weren’t for the stupid thoughts about what we are doing in this country, the Einsatz here would be wonderful, since it has put me in a position where I can support you all very well. Since, as I already wrote to you, I consider the last Einsatz to be justified and indeed approve of the consequences it had, the phrase: “stupid thoughts” is not strictly accurate. Rather it is a weakness not to be able to stand the sight of dead people; the best way of overcoming it is to do it more often. Then it becomes a habit.39
His “stupid thoughts” reflect his moral conflict, which he overcame through a combination of convincing himself that the Einsatz killings were necessary (because if the tables had been turned they’d be doing it to us) and turning murder into a “habit” to overcome the emotional trauma of brutality.
Habit is an appropriate descriptor because habituation is a psychological state in which one becomes oblivious to a continually repeated stimulus. On the simplest perceptual level, one may cease to notice a continuous stimulus such as the pressure of a ring or bracelet. In learning experiments, organisms cease to respond to a stimulus that has no consequences or relevance for them, such as a repeated loud noise associated with nothing. In animal behavior research on primates in the wild, scientists habituate their subjects by repeatedly exposing them to the presence of humans so that they no longer notice them standing around with their binoculars and video recorders.40 The habituation effect happens at the neural level as well as the psychological—in fMRI scans in which people are continuously exposed to the same stimulus, the areas of the brain that normally respond to such stimuli decrease their rates of firing, or cease firing altogether.41 In the ranks of the Nazi Waffen-SS, many soldiers in this elite fighting squad simply habituated to their job of killing after years of bitter engagement on the Eastern Front. Gerhard Stiller, for example, who fought with the “Leibstandarte Adolf Hitler,” or the 1st SS Panzer Division, which initially served as the Führer’s personal bodyguard, recalled after the war of his fellow SS soldiers: “After a few years they became so desensitized that they didn’t even notice any more, given that they were capable of just bumping someone off without batting an eyelid. Let’s just say they would need to develop a lot of their humanity again, and that takes time.”42
Navarick cites as an example of moral conflict the Józefów massacre in Poland, in which a Nazi reserve police battalion rounded up fifteen hundred Jewish civilians—most of them women and children—and shot them in the head.43 According to the Holocaust historian Christopher Browning, in his frank book on the massacre, Ordinary Men, 10 to 20 percent of the Nazi reservists withdrew from the killing operation after one shot, and most of the rest experienced physical revulsion during the murderous process. Their conflict was not on an intellectual level—like a trolley problem given to undergraduates—but instead was more visceral. As one reservist explained, “Truthfully I must say that at the time we didn’t reflect about it at all. Only years later did any of us become truly conscious of what had happened then.… Only later did it first occur to me that [it] had not been right.”44 Their initial conflict was more likely a reaction coming from a deeper evolved emotion of revulsion to killing that most of us are born with, unless and until special circumstances are instituted to override that natural propensity.
What are those special circumstances? Navarick notes the similarities between the subjects who withdrew from Milgram’s experiments and the reservists who withdrew from the Józefów massacre, by employing the paradigm of operant conditioning rather than social psychological models. In this analysis he proposes a three-stage behavioral model to explain disobedience: (1) aversive conditioning of contextual stimuli (how negative the conditions were in a given situation), (2) emergence of a decision point (the moment at which someone could extricate themselves from a negative situation without severe consequences), and (3) a choice between immediate and delayed reinforcers (at which point they’re reinforced for withdrawing or disobeying). Together these three conditions show that “participants withdraw to escape personal distress rather than to help the victim.”45 In other words, says Navarick, people in situations such as Milgram’s shock experiment, or Nazi reservists involved in a killing operation, who withdrew or refused to participate did so not due to the positive reinforcement of helping the victim but due to the negative reinforcement of terminating the discomfort.46
Instead of reifying some internal state like “obedience” into a psychological force, Navarick explains people’s behavior in terms of positive and negative reinforcements and analyzes the extent to which they will act to increase the former and decrease the latter. At Józefów, for example, the Nazi reservists shot their victims at point-blank range with the barrel of their rifle pressed against the base of the victim’s skull. This resulted in an unacceptable (to the Nazi commanders) level of withdrawal and defiance. At a later killing engagement, in the village of Lomazy, Poland, the commanders arranged for their reservist charges to shoot the Jewish victims at a distance, and this predictably led to lower rates of noncompliance, undoubtedly because the men shanghaied into pulling the trigger were spared the emotional devastation of close-range shooting. One is reminded of the scene in The Godfather when Sonny Corleone tells his younger brother Michael—who is eager to avenge the shooting of his father (the Godfather) and his own beating by a corrupt police captain whom he wants to kill: “What are you gonna do? Nice college boy, didn’t want to get mixed up in the family business. Now you want to gun down a police captain. Why? Because he slapped you in the face a little? What do you think, this is like the army, where you can shoot ’em from a mile away? No, you gotta get up like this and, badda-bing, you blow their brains all over your nice Ivy League suit.”
Blowing someone’s brains out all over your clothing—whether it is a nice Ivy League suit or a pressed Nazi uniform—is unnatural, repugnant, and for all but a handful of sadists and extreme psychopaths physically revolting. Browning summarized the most common reason for the men’s withdrawal from the killing at Józefów as “sheer physical revulsion.” It is unquestionably a negative form of stimulus, but also one that can be overcome—otherwise the Holocaust itself would never have happened. And sometimes that stimulus tipped over into sadistic pleasure. In a disturbing book titled Male Fantasies, Klaus Theweleit recorded an incident in which a Nazi camp commandant more than overcame any revulsion he might have once experienced in response to the flogging of a camp inmate: “His whole face was already red with lascivious excitement. His hands were plunged deep in his trouser pockets, and it was quite clear that he was masturbating throughout, apparently unembarrassed by the watching crowd. Having ‘finished himself off’ and reached satisfaction, he turned on his heel and disappeared; perverse swine that he was, he lost interest at this point in the further development of the proceedings.” In fact, this witness added, “on more than thirty occasions, I myself have witnessed SS camp commanders masturbating during floggings.”47
What circumstances and conditions tweak the dials of good and bad acts either up or down in normal people? Consider this explanation from Alfred Spiess, the chief senior state prosecutor in the trial of some of the SS guards at the Treblinka death camp, who explained the psychology of evil this way:
On the one hand obviously there was the order, and also a certain willingness not to refuse the order [duty]. But this readiness was naturally promoted psychologically in that these people were given privileges. Let me put it this way, a lot of carrot and a little stick—that was more or less how the system worked. And their carrot consisted of, first, there was more to eat, and second, which was most important, one couldn’t be sent to the front.… Third, one had the chance of getting into a rest home run by T4, and not least of all, good rations, plenty of alcohol, and last, and not least of all, the opportunity of helping themselves to many valuables which had been taken from the Jews.48
The perks that were offered in the service of the system, the relentless black rain of propaganda, and the steady hammering of the master race ideology into the ears of ordinary men enticed them to slither even farther along the pathway into the pit of evil. A Waffen-SS soldier named Hans Bernhard explained it this way: “Our motto was duty, loyalty, the Fatherland, and comradeship.” This wasn’t just some ordinary war; these were blood brothers in arms, fighting the good fight. As the SS-Viking Division member Jürgen Girgensohn put it, “We were convinced we were conducting a just fight, that we were convinced we were a master race. We were the best of this master race and that really does form a bond.” Discipline was crucial, and anyone in the rank and file who slacked off was punished from within. SS-Das Reich Division member Wolfgang Filor said of the men that “if they didn’t manage it they drew attention and had to do extra training or something.” He explained what that extra “something” entailed: “Everyone gave the guy merry hell. He was dragged out of bed and beaten on the head and that kind of thing—so he wouldn’t give up so easily next time, so that the troop would not be disrupted.” Those who couldn’t grin and bear it went AWOL or hanged themselves, knowing that a court-martial was awaiting them if they didn’t perform.
There was yet another tried-and-true method that the men used to resolve moral conflict, by means of the temporary relief of mental turmoil, through the blotting out of memories and the numbing of the pain—and that method was getting themselves thoroughly, utterly plastered. After a particularly brutal engagement in France, Waffen-SS soldier Kurt Sametreiter noted, “We were happy that it was all over. Really happy. We were so happy that for a few days … well … basically we got drunk. Really, you see, we just wanted to forget.”49
Putting all of these factors together to explain how ordinary Germans became extraordinary Nazis, Christopher Browning summarized the process this way in his appropriately titled book The Path to Genocide:
In short, for Nazi bureaucrats already deeply involved in and committed to “solving the Jewish question,” the final step to mass murder was incremental, not a quantum leap. They had already committed themselves to a political movement, to a career, and to a task. They lived in an environment already permeated by mass murder. This included not only programs with which they were not directly involved, like the liquidation of the Polish intelligentsia, the gassing of the mentally ill and handicapped in Germany, and then on a more monumental scale the war of destruction in Russia. It also included wholesale killing and dying before their very eyes, the starvation in the ghetto of Lodz and the punitive expeditions and reprisal shooting in Serbia. By the very nature of their past activities, these men had articulated positions and developed career interests that inseparably and inexorably led to a similar murderous solution to the Jewish question.50
A PUBLIC HEALTH MODEL OF EVIL
Another way to think about evil is to compare the medical model of disease to the public health model of disease. The medical model of evil treats it as if the locus of the contagion were inside each individual patient. In Western religion, sin is in the individual; in the law, criminality is in the individual. The medical model demands that each infected person be treated, one by one, until no one shows any further symptoms. The medical model of evil, then, is the analogue to the dispositional model of evil, where evil exists in the disposition of the person exhibiting it. Evil is simply in their nature; thus to eradicate the plague of evil we have simply to eliminate those with evil dispositions.
This paradigm served as the basis of the Inquisition, in which women were sentenced to be boiled in oil for such crimes as “sleeping with the devil.” Did this put a dent in evil? Hardly. What the witch hunt did was to spread evil in the form of barbaric, systemic violence against women throughout much of Europe and North America for centuries.
By contrast, a public health model of evil accepts as a given that, of course, we affect and infect one another, but it holds that individuals are simply part of a larger disease vector comprising the many additional social psychological factors identified over the past half century, all of which must be included in any theoretical model that hopes to explain the often puzzling world of human moral psychology. Here are some of the most potent factors at work in turning good people into bad.
Deindividuation
Removing individuality by taking people out of their normal social circle of family and friends (as cults do), or dressing them in identical uniforms (as the military does), or insisting that they be team players and go along with the group program (as corporations often do), sets up a situation in which behavior can be molded as the leader wishes. In his classic 1896 work The Crowd, the French sociologist Gustave Le Bon called the concept the “group mind,” in which people are manipulated through anonymity, contagion, and suggestibility.51 In 1954, the social psychologists Muzafer Sherif and Carolyn Sherif tested Le Bon’s idea in a now classic experiment at a camp in Oklahoma, where they divided twenty-two eleven-year-old boys into two groups, “The Rattlers” and “The Eagles.” Within a couple of days new identifications were formed within each group, after which the two groups were forced to compete in various tasks. Despite preexisting long-term friendships between many of the boys, hostilities quickly developed between them along group identity lines. Acts of aggression escalated to the point where Sherif and Sherif were forced to terminate this phase of the experiment early, and then introduce tasks for the boys that required cooperation between the two groups, which just as quickly led to renewed friendships and between-group amity.52
Dehumanization
Dehumanization is the disavowal of the humanity of another person or group, either symbolically, through discriminatory language and objectifying imagery, or physically, through captivity, slavery, the infliction of bodily harm, systematic humiliation, and so on. It happens intentionally or unintentionally, between individuals and groups, and can even occur in the self, as, for example, when an individual views himself or herself negatively from the third-person perspective of the discriminatory group. When the in-group is defined by its out-group counterpart, and vice versa, the power of the in-group can be reinforced in various ways; for example, members of the out-group might be relabeled as vermin, animals, terrorists, insurgents, and barbarians, making it easy to classify them as subhuman or nonhuman. Prisoners might have their hair shaved off; they might be stripped naked to remove a layer of civilizing humanity, and their heads might be bagged as the ultimate removal of identity; they might be identified by number (as in concentration camps) or marked with a symbol (as groups targeted in the Holocaust were forced to wear badges, triangles or double triangles of various colors, to mark their inferiority and out-group status); and they might be treated as mere tools or automatons, after which they can be neatly disposed of by members of the in-group. Dehumanization can be much more subtle than that, however; one need only look at the online world to witness dehumanization in all its nastiness, with individuals regularly treating one another as less than human, with utter disregard for feelings (and sometimes truth), especially when confrontation is not immediate or likely.
Compliance
Compliance occurs when an individual acquiesces to group norms or to an authority’s commands without agreeing with those norms or commands. In other words, the individual will obediently carry out orders without the internalized belief that what he or she is doing is right. In a classic 1966 experiment conducted in a hospital, the psychiatrist Charles Hofling arranged to have an unknown physician contact nurses by phone and order them to administer 20 mg of a nonexistent drug called “Astrofen” to one of his patients. Not only was the drug fictional, it was also not on the approved list of drugs, and the bottle clearly denoted that 10 mg was the maximum daily dose allowed. Preexperimental surveys showed that when given this scenario as purely hypothetical, virtually all nurses and nursing students confidently asserted that they would refuse to obey the order. Yet, when Hofling actually ran the experiment, twenty-one out of twenty-two nurses complied with the doctor’s orders, even though they knew it was wrong.53 Subsequent studies support this disturbing finding. For example, a 1995 survey of nurses revealed that nearly half admitted to having at some time in their careers “carried out a physician’s order that you felt could have had harmful consequences to the patient,” citing the “legitimate power” that physicians hold over them as the reason for their compliance.54
Identification
Identification is the close affiliation with others of like interest, as well as the normal process of acquiring social roles through modeling and role playing. In childhood, heroes serve as role models for identification, and peers become a reference point for comparing, judging, and deciding opinions. Our social groups provide us with a frame of reference with which we can identify, and any member of the group who strays from those norms risks disapproval, isolation, or even expulsion.
The power of identification is emphasized in a 2012 reinterpretation of Milgram’s research by psychologists Stephen Reicher, Alexander Haslam, and Joanne Smith.55 They call their paradigm “identification-based followership,” noting that “participants’ identification with either the experimenter and the scientific community that he represents or the learner and the general community that he represents” better explains the willingness of subjects to shock (or not) learners at the bidding of an authority. At the start of the experiment, subjects identify with the experimenter and his or her worthy scientific research program, but at 150 volts the subjects’ identification begins to shift to the learner, who cries out “Ugh!!! Experimenter! That’s all. Get me out of here, please.” It is, in fact, at 150 volts that subjects are most likely to quit or protest, as it was for our NBC replication subjects. As Reicher and Haslam suggest, “In effect, they become torn between two competing voices that are vying for their attention and making contradictory demands upon them.”
Haslam and Reicher also reinterpreted the famous 1971 Stanford Prison Experiment by Philip Zimbardo, reconfiguring his findings through their paradigm of identity theory.56 Recall that Zimbardo randomly assigned undergraduate students to be either guards or prisoners, directing them to take on their roles fully, and he provided them with appropriate uniforms and the like. Over the next couple of days these psychologically well-adjusted American students were transformed into either violent, authoritarian guards or demoralized, impassive prisoners, and Zimbardo had to terminate the planned two-week study after only six days because of the brutality he witnessed.57
In 2005 Haslam and Reicher worked with the BBC on its Prison Study replication, explaining that “Unlike Zimbardo … we took no leadership role in the study. Without this, would participants conform to a hierarchical script or resist it?” The Haslam and Reicher study yielded three findings: (1) “participants did not conform automatically to their assigned role,” (2) subjects “only acted in terms of group membership to the extent that they actively identified with the group (such that they took on a social identification),” and (3) “group identity did not mean that people simply accepted their assigned position; instead, it empowered them to resist it.” The scientists concluded that in the BBC Prison Study “it was neither passive conformity to roles nor blind obedience to rules that brought the study to this point. On the contrary, it was only when they had internalized roles and rules as aspects of a system with which they identified that participants used them as a guide to action.” Therefore they concluded,
Those who do heed authority in doing evil do so knowingly not blindly, actively not passively, creatively not automatically. They do so out of belief not by nature, out of choice not by necessity. In short, they should be seen—and judged—as engaged followers not as blind conformists.58
This observation echoes the appraisal of Eichmann as an alpinist of evil, and to this I would add that it is here that the element of free choice comes into play. Ultimately, even with all of these influencing variables, one still makes a volitional choice to act badly, or not.
Conformity
Because we evolved to be social beings, we are hypersensitive to what others think about us and are strongly motivated to conform to the social norms of our group. Solomon Asch’s studies on conformity demonstrate the power of groupthink: if you are in a group of eight people instructed to judge the length of a line by matching it to three other lines of differing lengths, then even when the correct match is obvious, if the other seven people in the group select a different line, you will agree with them 70 percent of the time. The size of the group determines the degree of conformity. If just two people are judging the line lengths, conformity to the incorrect judgment is almost nonexistent. In a group of four in which three select the incorrect line match, conformity happens 32 percent of the time. But no matter what the size of the group, if at least one other person agrees with you, conformity to the incorrect judgment plummets.59
Interestingly, fMRI studies tell us something about the emotional impact of nonconformity, based on the areas of the brain that are most active when a subject is at odds with the group. In a study conducted by Emory University neuroscientist Gregory Berns, the task involved matching rotated images of three-dimensional objects to a standard comparison object. Subjects were first put into groups of four people, but unbeknownst to them the other three were confederates who would intentionally select an obviously wrong answer. On average, subjects conformed to the group’s wrong answer 41 percent of the time, and when they did, the areas of their cortex related to vision and spatial awareness became active. But when they broke ranks with the group, their right amygdala and right caudate nucleus lit up, areas associated with negative emotions.60 In other words, nonconformity can be an emotionally traumatic experience, which is why most of us don’t like to break ranks with our social group norms.
Tribalism and Loyalty
Many of these social psychological factors operate within a larger evolved propensity to divide the world into tribes of Us vs. Them. An empirical demonstration of our natural inclination toward so dividing the world can be seen in a 1990 experiment by the social psychologist Charles Perdue, in which subjects were told that they were participating in a test of their verbal skills in which they were to learn nonsense syllables, such as xeh, yof, laj, or wuh. One group of subjects had their nonsense syllables paired with in-group words (us, we, or ours), a second group had their nonsense syllables paired with out-group words (them, they, or theirs), while a control group had their nonsense syllables paired with neutral pronouns (he, hers, or yours). Subjects were then asked to rate the nonsense syllables on their pleasantness or unpleasantness. Results: the subjects whose nonsense syllables were paired with in-group words rated them as significantly more pleasant than did subjects whose nonsense syllables were paired with out-group or neutral words.61
The power of in-group loyalty in the real world was well summarized by Lieutenant Colonel Dave Grossman in his deeply insightful 2009 book On Killing, in which he shows that a soldier’s primary motivation is not politics (fighting for a nation or state) or ideology (fighting to make the world safe for democracy), but devotion to one’s fellow soldiers. “Among men who are bonded together so intensely, there is a powerful process of peer pressure in which the individual cares so deeply about his comrades and what they think about him that he would rather die than let them down.”62 This is not obedience to authority. This is camaraderie, a group of strangers who become what are called “fictive kin,” nongenetically related individuals who come to act as if they were each other’s genetic relations. It is a system that hijacks our evolved propensity to be kind to our kith and kin: through such bonding activities as marching and suffering together, implemented by the military, total strangers come to feel like genetic relations.
Tribes often reinforce loyalty to the group by punishing (or threatening to punish) those who threaten it from within. Whistle-blowers are a case in point. When a whistle-blower threatens our group, even if we know on some level that he or she is morally right in blowing the whistle, our tribal instincts kick in and we circle the wagons against the perceived threat, vilifying the whistle-blower with such emotion-laden labels as “Tattletale, Rat Fink, Stool Pigeon, Snitch, Informer, Turncoat, Bigmouth, Canary, Busybody, Fat Mouth, Informer, Squealer, Weasel, Backstabber, Double-Crosser, Agent-Provocateur, Shill, Judas, Quisling, Treasonist, etc.”63
Pluralistic Ignorance, or the Spiral of Silence
To understand how a group of people, or even an entire nation, can seemingly come to accept an idea that most of the individual members or citizens would likely reject, we must turn to one of the most perplexing of all social psychological phenomena. Pluralistic ignorance happens when individual members of a group do not believe in something, but mistakenly believe that everyone else in the group believes it—and if no one speaks up, it leads to a “spiral of silence” and thence to individuals behaving out of character.
Take binge drinking on college campuses. A 1998 study by Christine Schroeder and Deborah Prentice of Princeton University found that “the majority of students believe that their peers are uniformly more comfortable with campus alcohol practices than they are.” Another Princeton study, in 1993, by Prentice and her colleague Dale Miller, found a gender difference in drinking attitudes in which, predictably, “Male students shifted their attitudes over time in the direction of what they mistakenly believed to be the norm, whereas female students showed no such attitude change.”64 Women, however, were not immune to pluralistic ignorance, as shown in a 2003 study by the psychologist Tracy A. Lambert and her colleagues, who found that regarding casual sex, “both women and men rated their peers as being more comfortable engaging in these behaviors than they rated themselves.”65 In other words, these college students say that they themselves are not prone to binge drinking and casual hookups, but that most everyone else is, so they go along with the crowd. When everyone in the group thinks this way, an idea that most individual members do not endorse can take hold of the group.
Pluralistic ignorance can transmogrify into witch hunts, purges, pogroms, and repressive political regimes. European witch hunts degenerated into preemptive accusations of guilt, lest one be thought guilty first.66 Or take the story of Russian dissident Aleksandr Solzhenitsyn, who described a party conference in which Stalin was given a standing ovation in absentia that went on for eleven minutes, until a factory director finally sat down, to everyone’s relief—everyone but one of Stalin’s party functionaries, that is, who had the man arrested that night and sent to the gulag for a decade.67 A 2009 study by the sociologist Michael Macy and his colleagues confirmed the effect: “people enforce unpopular norms to show that they have complied out of genuine conviction and not because of social pressure.” Laboratory experiments show that people who conform to a norm under social pressure are more likely to publicly punish deviants from the norm as a way of advertising their genuine loyalty instead of appearing to be just faking. Together, “these results demonstrate the potential for a vicious cycle in which perceived pressures to conform to and falsely enforce unpopular norms reinforce one another.”68
Bigotry is ripe for the effects of pluralistic ignorance, as evidenced in a 1975 study by the sociologist Hubert J. O’Gorman, “indicating that in 1968 most white American adults grossly exaggerated the support among other whites for racial segregation,” especially among those leading segregated lives, thereby reinforcing the spiral of silence.69 Interestingly, a 2000 study by the psychologist Leaf Van Boven found pluralistic ignorance at work when “students overestimated the proportion of their peers who supported affirmative action by 13% and underestimated the proportion of their peers who opposed affirmative action by 9%.” He attributes the effect to political correctness, which drives some of us to lead dual lives, professing beliefs we think others hold while privately holding beliefs that may vary from this presumptive norm. Presciently, Van Boven closed his analysis with this commentary on the (at the time in 2000) emerging gay marriage debate: “one might expect that fear of appearing politically incorrect could lead to pluralistic ignorance for any number of politically correct issues, such as people’s attitudes toward gay and lesbian marriage or adoption, their views about the appropriate labels for the romantically involved (are they ‘boyfriends and girlfriends’ or ‘partners’?), their attitudes toward gender equality, or their beliefs about the role of the Western canon in a liberal arts education. To the extent that concerns about appearing racist, sexist, or otherwise culturally insensitive squelch public expression of private doubts about such issues, pluralistic ignorance can be expected to emerge.”70
Perhaps in some cases pluralistic ignorance is a good thing, since many private thoughts may be morally regressive, given the time it often takes to shift beliefs and preferences, and the fact that many people who privately hold politically incorrect beliefs may wish that they didn’t. As the Russian novelist Fyodor Dostoevsky wrote, “Every man has reminiscences which he would not tell to everyone but only his friends. He has other matters in his mind which he would not reveal even to his friends, but only to himself, and that in secret. But there are other things which a man is afraid to tell even to himself, and every decent man has a number of such things stored away in his mind.”71
Fortunately, there is a way to break the bonds of pluralistic ignorance: knowledge and transparency. In the Schroeder and Prentice study on college binge drinking, exposing incoming freshmen to a peer-directed discussion that included an explanation of pluralistic ignorance and its effects significantly reduced subsequent student alcohol intake.72 Sociologist Michael Macy found that when skeptics are scattered among true believers in a computer simulation of a society with ample opportunity for interaction and communication, social connectedness acts as a safeguard against the takeover of unpopular norms.73
* * *
Figure 9-1 features a visual record of how all of these factors can come into play in the real-world example of Köln, Germany, in the 1930s and early 1940s. On a trip to this beautiful city I consulted with the NS-Dokumentationszentrum der Stadt Köln (the National Socialism Documentation Center of the City of Cologne) to assess how the Nazis managed to take over one city. It became clear that they did it district by district, house by house, and even person by person, part of a national plan to Nazify all of Germany but orchestrated locally by gauleiters, or district overseers.74 The museum is in a building that housed the Cologne Gestapo from December 1935 through March 1945, and it documents the seizure of power; the propaganda employed in everyday life, including the youth culture, religion, racism, and especially the elimination and extermination of Cologne’s Jews and its Sinti and Roma; and finally the opposition, resistance, and society during the war. The photograph of a sign posted by the Allies in Köln captures the entire span of the process and its ultimate demise in a quote from Hitler: “Give me five years and you will not recognize Germany again.” The magnificent bridge spanning the Rhine River adjacent to the Köln Cathedral—lying in the water, smashed to pieces—reveals what it took in this case to bring an end to the evil.
EVIL, INCORPORATED
All of these factors are interactive and autocatalytic—that is, they feed on one another: dehumanization produces deindividuation, which then leads to compliance under the influence of obedience to authority, and in time that morphs into conformity to new group norms, and identification with the group, which leads to the actual performance of evil acts. No one of these components inexorably leads to evil acts, but together they form the machinery of evil that arises under certain social conditions.
These conditions are necessary but not sufficient to account for evil, which also involves the dispositional nature of the individual; the overall system in which these conditions occur; and, of course, free will. We can change the conditions and thereby attenuate evil, first by understanding them and then by taking action to change them. By understanding how its components operate and how to control them, we can quell evil and keep it in check through the social tools and political technologies we now know how to employ for the betterment of humanity.
Figure 9-1a
Figure 9-1b
Figure 9-1c
Figure 9-1d
Figure 9-1e
Figure 9-1f
Figure 9-1g
Figure 9-1h
Figure 9-1. A Visual Record of How the Nazis Took Over Köln, Germany
Photographs from the National Socialism Documentation Center of the City of Cologne reveal how an evil regime can take over a city and a nation. (a) Indoctrination of the citizens was the preferred method of Nazifying the German people, as evidenced in the Hitler Youth programs pictured here (credit: LAV NRW R, BR 2034 Nr. 936); (b) shaping of cultural life through publications, such as Robert Ley’s Nazi newspaper featuring an anti-Semitic characterization; and (c) a bookstore with swastikas and the anti-Semitic slogan “The Jews Are Our Misfortune.” (d) A fragment of a list of the hundreds of banned clubs shows the extent to which the Nazis controlled every aspect of daily life in Köln. (e) If indoctrination through propaganda didn’t work, imprisonment was employed as a means of bringing the people into line, as seen here in the Köln Gestapo prison. (f) Indoctrination included a eugenics race program that took specific measurements of people, such as this woman, to determine whether they measured up to Aryan standards. (g) A German civilian in Köln, Germany, April 18, 1945, reading a sign posted by the American forces quoting Hitler’s promise to the German people: “Give me five years and you will not recognize Germany again” (National Archives, US Army Photograph, SC 211781). (h) The bombed-out Hohenzollern Bridge spanning the Rhine River adjacent to the famed Köln Cathedral is a visually striking reminder of what it sometimes takes to end evil (National Archives, US Army Photograph, SC 203882).
This is what we have been doing since the end of World War II. Social scientists have undertaken extensive studies and experimental protocols to understand precisely which social and psychological factors enable evil to triumph over good. Historians have uncovered the political, economic, and cultural conditions that allowed these social and psychological factors to play out their effects on individuals and whole populations. Politicians, economists, legislators, and social activists have applied this knowledge to change conditions so as to attenuate the possibility of such factors leading people down the path toward evil. Although there have still been disruptions—the Rwanda genocide and the 9/11 terrorist attacks come to mind—ever since the Nazi camps were liberated and the Soviet gulags eliminated, the overall trend toward a more moral world is unmistakable. And those salubrious effects have been primarily the result of the scientific understanding of the causes of evil and the rational application of political, economic, and legal forces to drive it down and bend that arc ever upward.