Authority and Morality
Apparatus devised by Stanley Milgram to lead his participants to believe that they were delivering electric shocks of higher and higher voltage as punishments to a man who made mistakes in learning word associations.
Stanley Milgram asked male volunteers to give increasingly severe electric shocks to another man when he made mistakes in learning word associations. Most volunteers continued even when the machine they used indicated that the shocks were dangerous. In another study, Philip Zimbardo and colleagues studied prison guards and prisoners in a simulated prison. Christopher Browning found from interviews that most ordinary men recruited by the Nazis into the Order Police in World War II had been willing to kill civilians. Many of us, evidently, can be cruel under certain circumstances. Morality involves intuitions of several kinds, such as authority, loyalty, and justice, which are accorded different degrees of importance by different cultures.
Stanley Milgram’s study of obedience is one of the most publicly discussed pieces of research in psychology. In 1961 and 1962, Milgram placed newspaper advertisements and sent direct mail to invite men to come to his laboratory at Yale University to take part in an experiment on memory. Each was paid $4 to take on the role of teacher and told that, when a learner made mistakes, they were to punish him by administering electric shocks.
Milgram was born in New York in 1933, and died of a heart attack at the age of fifty-one.1 His parents were Jewish immigrants and, in the later part of World War II, the young Stanley and his parents listened on the radio to news of the horrors of the Holocaust that European Jews were suffering. He became fascinated by the news, and described himself as a news addict. Milgram was a bright young man who earned a degree in political science, expecting to go into the American Foreign Service. But, on hearing him give a speech in his final year as an undergraduate, the dean of his college suggested that he might consider going into social psychology. It was something he hadn’t considered. He became interested, and applied to complete a PhD in psychology at Harvard, but was rejected because he had taken no psychology courses as an undergraduate. In an article she wrote about him, his widow Alexandra said he was never one to take no for an answer. That summer, he took nine psychology courses at three different universities in New York, and was admitted to Harvard’s PhD program. He did well and, in 1960, became an assistant professor of psychology at Yale. He devised his studies of obedience explicitly to pursue his early interest in the Holocaust, which, he saw, could not have been the work of just a few people. He wondered how, in the civilized country of Germany—which had produced great musicians, philosophers, and writers—enough people could be found who were willing to hasten people to their deaths in concentration camps, and press levers to release the poisonous gases that would kill them.
Before performing his first experiment, Milgram described the procedure to forty Yale psychiatrists. He said he would test volunteers by asking them to administer increasingly severe shocks to a learner who made mistakes. The psychiatrists uniformly predicted that only a tiny number of the volunteers would deliver electric shocks at the level labeled as dangerous.
Milgram’s first full experiment was published in 1963. In the paper, he described how men from unskilled, skilled, and professional backgrounds were recruited and came to his laboratory. Forty men took part. When they arrived, one at a time, each was greeted by a laboratory assistant who wore a grey technician’s coat. He presented the volunteer with his $4 payment, and told him that the money was his whatever happened while he was there. Another man was present, who, participants assumed, had also volunteered. He was a likeable man, a trained accomplice of the experimenter. Both the participant and the accomplice were told that the experiment was on the effects of punishment on learning word associations, and that it might be important for education. The men were asked to draw one of two slips of paper to see who would be the learner and who the teacher. In fact both slips bore the word “teacher,” so the real volunteer was cast into that role, and the accomplice said he had drawn the slip labeled “learner.” The volunteer saw the learner being taken into a room and strapped into a chair, with electrodes on his arms, through which electric shocks could be delivered.
The teacher was then taken into an adjoining room. On a table was a large and impressive piece of apparatus, labeled “Shock generator,” a photo of which you can see at the start of this chapter. It had a row of switches, a light above each, and labels beneath the switches that started at 15 volts and ranged up to 450 volts in 15-volt intervals. At the left-hand end, 15 volts to 75 volts, the switches were labeled, in capital letters, “SLIGHT SHOCK.” Extending rightward were switches labeled “MODERATE SHOCK,” then “INTENSE SHOCK,” up to “EXTREME INTENSITY SHOCK” and, at the top of the range, 375 to 420 volts, “DANGER, SEVERE SHOCK,” with two switches beyond this point (435 and 450 volts) labeled “XXX.”
The volunteer was told his job was to sit at the table and, using a microphone, read out lists of four pairs of words from a piece of paper. Then he would say one of the words from the list, and the learner had to indicate which other word it had been paired with. If the learner was correct, the teacher would move to the next list of words. When the learner made his first mistake, the teacher was to flip the left-most switch on the shock generator, announce the voltage, “15 volts,” and then press a lever to deliver the shock. As the learner made more mistakes, the teacher delivered more severe shocks, each one higher than the last by 15 volts. With each shock a light above the switch came on, another light flashed, a galvanometer needle on a dial swung to the right, and a loud buzz sounded. As the 300-volt and 315-volt levels were reached, the learner was heard pounding on the wall. Beyond that point he made no more responses. The teacher was instructed by the laboratory assistant to treat non-response as a mistake, and to continue with the experiment. At this point the volunteers tended to ask if it was all right to carry on. The assistant used a sequence of prods, such as, “It’s absolutely essential that you continue.”2 Many participants became visibly agitated, and asked more questions. Fourteen participants (35 percent) refused to continue delivering shocks when they reached this point or just beyond it. But twenty-six of the forty participants—65 percent of them—agitated and upset though many of them were, did carry on right up to the highest level of shock, 450 volts.
At the end of the experiment the learner reappeared, said he was none the worse for the experience, and a reconciliation took place between him and the volunteer. Milgram says in his paper that procedures were undertaken to ensure that the volunteer would leave the laboratory in a state of well-being. Stephen Reicher and Alex Haslam reported that, in all, Milgram ran around forty pilot tests and ancillary studies for his experiment and found that, depending on the conditions, anywhere between zero and 100 percent of people complied with the experimental instructions. Many of these studies are described in Milgram’s book Obedience to Authority. Because it is so controversial, Milgram’s research has figured in the current debate in psychology about the reproducibility of results.3 Jerry Burger has replicated Milgram’s experiment, and found that nearly fifty years after the original publication, the results still hold up. This research also raises the question of how research findings are to be interpreted.4
Psychological Experiment as Theater
Milgram’s experiment was a piece of theater. In it, everyone but the volunteer had rehearsed his part. Rather than taking seats in the audience, the volunteers found themselves on stage, but without a script. Their actions derived from the social situation of doing what they were told, what was expected, as we human beings, social creatures as we are, very often do. As Michael Tomasello has pointed out, we have become not individuals, but members of social groups, “Us.” Within such a group we tend to be loyal.
Augusto Boal invented what he called the Theatre of the Oppressed, in which audience members first watch a play about social oppression of a kind with which they are familiar, and which leads to a tragic outcome. Then, in a second performance that immediately follows, some members of the audience are invited onto the stage to take roles in the play, and to explore how they might stand up to the coercive maneuvers of the other characters, and so achieve a different outcome. Milgram’s theatrical piece can be thought of as a more radical version of Boal’s idea. One difference was that it inserted people, without their knowledge, into the experience of a situation that they had never imagined.
If you are wondering what effect their performance had on the men who volunteered for Milgram’s experiment, Lauren Slater, in her 2004 book on ten great experiments in psychology, tried to track them down. She was able to find two of them.
The first man Slater interviewed seemed, to her, a fairly conventional person, who said that he had refused to deliver shocks to the top of the scale. If he’d continued to the top of the scale, he said, he wouldn’t be talking to her but to a psychiatrist.
The other man whom Slater interviewed said he had continued delivering shocks to the top of the scale. When he was later told of the actual purpose of the experiment, he had a stark recognition of what he had done. He said recognizing that he had conformed to what had been asked of him changed his life. He had been struggling, at the time, with being homosexual. He had been complying with societal constraints. His confrontation, in Milgram’s experiment, with his own propensity to conform to a compelling social situation was a turning point. He decided to come out as homosexual. Slater recounts how, as compared with the man she interviewed who said he hadn’t given shocks up to the top of the scale, this man seemed to be freer. Slater saw him as someone who was more alive, who had led a more satisfying life.
Milgram’s experiment offers us an important glimpse of ourselves. As ordinary people, we think we are autonomous agents when, really, we are extremely responsive to social situations. Milgram’s volunteers entered a reputable laboratory of Yale University. They trusted the experimenter. They did what they were asked. They did what they were paid to do.
A feature film about Milgram, called Experimenter, depicts his obedience experiments as well as the public reaction to them. The film also treats Milgram’s close relationship with Alexandra, whom he met and married when he was at Yale. The film takes up the idea that the shocks the volunteers thought they were administering weren’t the only ones that occurred. As people heard about Milgram’s studies, and started to think about their significance, further shocks—real ones this time—occurred, and they continue to occur.
The first shock is that, although Milgram was aiming to solve one of the most important ethical problems in recent history—how educated citizens of a civilized nation could be murderously cruel—many psychologists have reacted to Milgram’s experiments by criticizing them as unethical.5 People were tricked, they say, invited into a laboratory to do experiments on memory, when the experiments were not on memory at all but on how far obedience could go. They were tricked into thinking they were delivering shocks, when they weren’t. Is it not unacceptable, they ask, for an assistant professor at a reputable university to inflict this kind of experience on unsuspecting members of the public for a payment of $4?
The next kind of shock, for each of us, might occur as we ask ourselves which group we would have been in: would we have been among the 65 percent who conformed, or the 35 percent who refused to conform?
The most profound shock is yet more radical. It’s the realization that we all—you, me, our relatives and friends—may be capable of being cruel to others when a compelling social reason arising from our cultural group becomes more important to us than the harm we do. From this perspective, it’s no good thinking that we might have been among the inmates of concentration camps while those others—them—would have been the camp guards and executioners. Because of our inherent sociability we all have it within ourselves to conform to social expectations, and to be cruel.
Amidst the controversy that Milgram’s experiments have aroused, two issues are often neglected. The first is that Milgram had the insight to bring the question of how we human beings engage in wars and other kinds of internecine conflict from history into the psychology of everyday life, and into the laboratory. Although for Milgram the everyday life with which he was concerned was that of World War II, the issue of cruelty continues into peacetime. In 2011, in Norway, a man blew up a government building, killing eight people, then drove to an island on which a summer camp was being held for young people who belonged to a political party with which he did not agree.6 There, he shot and killed 69 people and wounded 319 others.
Figure 24. Hannah Arendt, from a mural. Source: By Bernd Schwabe in Hannover (own work) [CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons. https://commons.wikimedia.org/wiki/File%3A2014-08_Graffiti_Patrik_Wolters_alias_BeneR1_im_Team_mit_Kevin_Lasner_alias_koarts%2C_Hannah_Arendt_Niemand_hat_das_Recht_zu_gehorchen%2C_Geburtshaus_Lindener_Marktplatz_2_Ecke_Falkenstra%C3%9Fe_in_Hannover-Linden-Mitte.jpg.
The second is to treat Milgram’s hypothesis as what it is: a psychological hypothesis. It’s a hypothesis about the Holocaust that was shared, at least in part, by Hannah Arendt. You can see a picture of her from a mural in Hanover, Germany, in figure 24. In her book on the trial of Adolf Eichmann, Arendt used the phrase “the banality of evil.” She saw Eichmann not so much as perverted, but as rather normal. She considered that he didn’t think much about what he was doing. He just did it. She wrote that this kind of unthinking going along with things as normal, along with the rest of a group thought of as “us,” makes the Holocaust even more terrifying.
Is obedience to authority a good hypothesis to explain human genocidal actions? Augustine Brannigan and colleagues argue that although obedience may have been a factor, people in Nazi Germany acted willingly and consciously, for a cultural cause, and thought they were doing the right thing. They propose that the factors that prompted people toward cruel actions in those times, and their tolerance of others they saw performing similarly cruel acts, are more complex than simple obedience.
Being Cruel
An experiment on a simulated prison extended some of the issues with which Milgram dealt, and was an even more explicit piece of theater. Craig Haney, Curtis Banks, and Philip Zimbardo carried out what has become known as the Stanford Prison Experiment. Men were recruited by newspaper advertisements, and from the population around the university, to take part in an experiment in a simulated prison. The volunteers were rigorously screened and all those with histories of drugs, crime, or psychiatric disorders were excluded. Twenty-four were chosen, all college students, as being the most psychologically stable of the original seventy-five volunteers. Half were randomly assigned to be guards, and the other half prisoners. When the experiment ran, there were eleven guards, who worked eight-hour shifts, and ten full-time prisoners.
Among the prisoners, loss of identity was frequent, as were depression and helplessness. Five of the ten were released early because the experimenters judged their reactions to be too severe. Among the eleven men assigned to be guards, nine acted as their roles required. Among these, four clearly enjoyed their power in this situation, and invented new forms of harassment of the prisoners. Only two of the eleven guards refused to go along with cruel treatment, and were kind to the prisoners. The experiment had been planned to last for fourteen days, but the researchers became so concerned with what was happening among some of the guards that they stopped the experiment after only six days. (Harassment of the kind that developed in this experiment is not a general feature of real prisons, where efforts are made to ensure that procedures are regulated.)
In a re-run of the Stanford Prison Experiment, sponsored in Britain by the British Broadcasting Corporation (BBC), Stephen Reicher and Alex Haslam allocated eight men to the role of guards and eight to the role of prisoners. The guards could not identify with their role. They did not impose their authority. They became disorganized and were overcome by the prisoners. Reicher and Haslam analyzed the circumstances in which people do and don’t identify with groups, and suggested that tyranny can arise when cultural groups fail, when people are not able to feel themselves to be one of “Us.” When people feel despairing and ineffective as a group, they become more easily able to respond to, and be identified with, a tyrant who offers hope.
As with Milgram’s experiment, the volunteers in the Stanford Prison Experiment, and its re-run, found themselves in a piece of theater in which they did not know the script, but were led by small steps into roles that were unfamiliar to them. As with Milgram’s study, part of the purpose of Zimbardo’s experiment was to understand how ordinary people had been able to behave with cruelty under the Nazi regime. In World War II, between five and six million Jews, and several hundred thousand others, including members of non-Nazi political parties, Gypsies, homosexuals, and mental patients, were killed under orders of the Nazis. All such people were deemed undesirable. The idea of eugenics (discussed in chapter 4), proposed by Galton and promoted by some early intelligence testers, had taken on a new and terrifying form.
A study comparable to Zimbardo’s prison experiment, but in many ways more convincing in its implications because it was based on direct historical evidence—judicial investigations of 125 men of the 486 members of Battalion 101 of the Nazi Order Police—was carried out after the war by Christopher Browning. Some of the officers in Battalion 101 had education up to the level of high school. Most of the men were recruited from the skilled and unskilled workforce of Hamburg. Two years after recruitment their job was to round up and kill Jews in Poland. Browning compares his study with Zimbardo’s. He found that between 10 percent and 20 percent of the members of Battalion 101 refused to take part in the shootings. He found, too, that those who refused to go along with their orders were not court-martialed.
Morality
As Michael Tomasello has argued, and in part demonstrated in experimental work, we humans are members of a species that is based on cooperation, and that seems to us a good thing. At the same time, studies such as those of Milgram, Zimbardo, and Browning show that we can cooperate for ends that are anything but good. We can be cruel, to the point of killing fellow members of our species.
Jane Jacobs, the influential writer on how we live cooperatively in cities, framed the issue in her book Systems of Survival. We exist in two quite different systems, which are present simultaneously in the Western industrialized world. Jacobs calls them “syndromes.” Each has its own moral code, within which we humans cooperate. One is what Jacobs calls the Guardian Syndrome; she takes the term from Plato’s Republic. In this syndrome we adhere to tradition and respect hierarchy, and deception is sometimes required, aimed at “Them,” adversaries within and outside the group. In the Guardian Syndrome, something else is always more important than any individual, and the worst thing anyone can do is to be disloyal. In political systems we see, time and again, people come into power as leaders, alpha males (though some have been female), whose first demand of everyone is loyalty to them.7 In the other system, the Commercial Syndrome, Jacobs argues, people are open to inventiveness and novelty; they come to voluntary agreements with others. Here, we have to respect other individuals and be honest, otherwise the system breaks down. Among the outcomes the Guardian Syndrome can generate are tyranny and dictatorship. Among the outcomes the Commercial Syndrome can generate is pervasive social inequality, as people and organizations with more mental and physical resources strive to increase their own resources in competition with others.
There is a long-standing debate in philosophy about how we might become virtuous and moral beings.8 Do we come to be in this way by means of our emotional make-up, or because of cultural promptings of society, or from individual reasoning? Only recently have psychologists started to investigate this issue.
Jonathan Haidt has proposed that our abilities to cooperate, in what we might call a moral way, are based on shared judgments, and on intuitions that derive from social, emotional, and cultural influences, rather than from step-by-step reasoning of the kind produced by what Keith Stanovich calls “System 2,” and what Daniel Kahneman calls “thinking slow.” Among these intuitions are: care for others versus infliction of harm, fairness versus cheating, liberty versus oppression, loyalty versus betrayal, authority versus subversion, and purity versus degradation. We can see the experiments of Milgram and Zimbardo, and the study by Browning, as pitting the intuition of care for others against the intuition that we should be loyal and obey authority.
Although we may think that wrongdoing is committed by people whom we label as criminals, who become those other people over there, “Them,” this research suggests that it might be better to think of another group: “We.”9 Might philosophers, historians, psychologists, and lawyers work together to better understand the reasons why human beings harm each other, and what the legal and social implications might be? Might they work together to overcome some of the impediments to cooperation?