AFTERWORD TO THE PRINCETON CLASSICS EDITION

Rereading Evil in Modern Thought after more than a decade has been a source of pleasure. I remember conceptual struggles so intense, particularly while writing the long first chapter, that my brain seemed to writhe. Today, transitions with which I struggled then seem natural, arguments inescapable. Still I’m grateful for the chance to explore, in an afterword, aspects of the book that I would have written differently were I to write it today.

The first is easily remedied: I should have made clear that its subject was modern Western thought. A number of readers wrote to ask why I had neglected one thinker or another, to which the answer was usually easy: not wanting the book to take up a lifetime, I had to make strategic choices. Since my original goal was to present a new narrative of the classic texts of modern Western philosophy, I tended to err on the side of convention. It was more important to show that Kant and Hegel were centrally concerned with the problem of evil than that Feuerbach or Kierkegaard was. In order to shift philosophy’s focus from epistemological questions to urgent ones, it was more important to reread those philosophers who cannot be ignored in any serious study of the history of Western philosophy than to insist on adding others to the canon. That said, the echoes of Providence in Adam Smith’s invisible hand, as well as Darwin’s discussion of purposiveness, are so important that I wish I’d explored them. Anyone who has devoted time to understanding evil will know about the moment when one simply has to stop, however unfinished her reflections may be. As I wrote in the original introduction, my hope was not to exhaust paths of inquiry but to open them.

Nevertheless I should have been explicit about the fact that my own inquiry was restricted to Western thinkers, leaving out whole traditions where my competence was nil. As an otherwise generous reader wrote,

Few would now openly claim that a study of European philosophy suffices as a method of addressing the human condition, a phrase that itself sounds outdated. Neiman evades the problem by invoking a vaguely bounded ‘we,’ suggesting significance for ‘us’ while avoiding universalistic claims about human beings. The ‘we’ for whom only Europe is relevant, however, probably no longer exists.1

I stand chastened. Though the point would not have been foreign to me in 2001, when the book was finished, it’s striking that I felt no need to make it explicit. Might we take Perkins’s suggestion that those for whom only Europe is relevant no longer exist as a mark of a more robust universalism that is one of the positive consequences of globalization—perhaps even a sign of moral progress?

□ □ □

A second correction is easy to make. Since its publication, Eichmann in Jerusalem has provoked a flood of work insistent on undermining Arendt’s central theses—more work than has been occasioned by any other philosophical book of the twentieth century. The volume and particularly the passions Arendt aroused should themselves be subject to scrutiny, as I argued elsewhere, for most criticism failed to understand that the book was less a work of journalism or history than of philosophy.2 We are so accustomed to thinking that doing evil requires intending to do it that Arendt’s denial of Eichmann’s evil intentions is still commonly taken to be a way of excusing him, while her apparently gratuitous introduction of the role of the Jewish councils seemed to blame the victims themselves. Neither is the case. The Jewish councils—along with Germany’s so-called inner emigrants, whom she discusses in the same section—appear for the same reason Eichmann himself does: to show that not intention but judgment is the heart and soul of moral action. The road to hell was paved with all sorts of things: from the admirable motives of the heads of the Jewish councils, to the questionable ones of most of the inner emigrants, to the shoddy but undemonic intentions of the sort of bureaucrat Eichmann played at his trial. The destination is what matters; the pavement is secondary. The world must hold you responsible for what you do, since it’s what you do, not what you intend, that resounds in the world.

These radical claims still stand: the Holocaust—like most other examples of mass murder—could not have taken place without the participation of millions of people who were not particularly committed fascists, or even fascists at all, but were willing to follow whatever orders made for the least thinking and the most comfort. This is a lesson that bears repeating and reflection, for it remains vital to understanding every moral and political struggle we face today.

But it does not fit Adolf Eichmann. The meticulous historical research and the philosophical acumen of Bettina Stangneth’s Eichmann Before Jerusalem have shown that Arendt, like virtually everyone who observed the trial, was taken in by Eichmann’s performance on the stand. Stangneth shows that Eichmann was such a masterly liar that he had the presence of mind to prevent suspicions that might lead to resistance by having prisoners told to remember the numbers of the pegs on which they hung their clothing—moments before they entered the gas chambers. Far from being the dull and thoughtless careerist he impersonated at his trial, the SS officer was outraged, in private, by any suggestion that he was a mindless bureaucrat. On the contrary, he presented himself in Argentina as a powerful and hardened soldier committed to the destruction of the Jewish people; his only regret was his failure to organize the murder of all Europe’s Jews. His performance in Jerusalem, an imitation of the bureaucrats he despised, was designed to save his life—an entirely plausible goal given how few former Nazis were called to account for their crimes.

Nor was his thinking confined to adroit uses of instrumental rationality. Even more troubling is the fact that he actually thought about morality. In contrast to the trial, where he shocked the public by claiming to be a Kantian, texts written in Argentina make clear that he fully understood the core of Kant’s doctrine, a commitment to universalism, and rejected it. One text raises the question “What about morality?” and offers the answer: “There are a number of moralities: a Christian one, a morality of ethical values, a war morality, a struggle (Kampf) morality. Which one should it be?”3 As he later explains, he finds the idea that there might be a universal morality that makes claims on everyone to be as absurd as it is hypocritical. In fact, he is suspicious of philosophy itself, which he sees as an internationalist project.

For Eichmann and his fellows, genuine thinking is racial thinking. He held every people to be engaged in the struggle for world domination; he believed this to be a law of nature, in which the drive for self-preservation is stronger than any other force, especially “so-called moral drives.” Without a state or an army, Jews fought with the weapons they had, namely, intellectual ones. Thus they brought into the world “false and deceitful” doctrines of internationalism, beginning with the prophetic messages of the Old Testament, continuing through the French Revolution, which was driven by Freemasons, and culminating most dangerously in the Bolshevik message of the Jew Karl Marx. Eichmann admonished his listeners to understand the weapons of the enemy, which meant reading both Jewish literature and philosophy in general, both of which might on occasion be put into service against the enemy. Just how this might be done was demonstrated in his use of Plato: the fact that Socrates accepted Athens’s death sentence shows, on Eichmann’s reading, that even universalists know that morality must yield to state power. “Socratic wisdom,” wrote Eichmann, “bows to the law of the state. That is what the humanists teach us.”4 In this, of course, he means to show the weakness of humanism, which must yield to the inevitable laws of nature and power. Not for a moment does he take a universalistic position seriously. Kant, he often repeated in Argentina, was not a sufficiently German thinker—an honor Eichmann also extended to Nietzsche.

Now none of these discoveries show that Eichmann was a serious thinker or that he took philosophy seriously. By his own lights he could not, since he held that “philosophy is international.” In one of many felicitous phrases, Stangneth describes Eichmann’s relationship to books as that of a thief breaking into a house, looking as quickly as possible for whatever he can take from it. But the Argentinian papers show that he did take and use a variety of philosophical texts to construct a worldview of which he was proud.5 This hodgepodge of claims can hardly be counted as truly independent thinking. Behind the racist, vitalist doctrine that was standard Nazi fare is a standpoint that goes back to the Sophists: virtue is either a matter of helping your friends and hurting your enemies, or it’s a load of hypocritical rhetoric designed by one group to maintain its power over another by claiming to be acting for the common good. Readers of Plato will recognize these as the two positions set out in Book One of The Republic, against which Socrates works to defend a universal conception of virtue. As I have argued elsewhere, the contemporary version of these pre-Socratic standpoints was best stated by Carl Schmitt, who held that the only genuine political distinction is between friend and foe, and later by Michel Foucault, whose analyses of power were far richer than those of Socrates’s opponent, but whose insistence on the identity of justice and power led him to reject any notion of universal justice.6

Eichmann’s philosophical musings may be crude and superficial, but crude and superficial views of this kind are still with us, and for all their vulgar simplicity, Eichmann’s philosophical reflections are coherent. There are just two choices. Either you believe in universal moral categories that are valid for every human being on earth or you do not, and a great many consequences follow from that very simple decision. Eichmann in Argentina knew very well which side Kant was on. He may even have read the works of those Nazi philosophers who, unwilling to burn the treasures of German patrimony as easily as they’d burnt the works of Freud and Heine, drew careful distinctions between what they called the Jewish Kant, who emphasized universalism, and those parts of Kant that could be mined for quotes about duty and rule-following, then duly celebrated as German virtues. Thus Eichmann in Jerusalem hoped that calling himself a Kantian would endear him to his enemies (who, he assumed, were all Kantians, a reasonable assumption given his terms) or at least confuse them into pardoning him. Were there no copies of the original documents, it would beggar belief: Eichmann’s lawyer had to talk him out of using Kantian formulations and references in his final statement to the court.

After reading Stangneth we can no longer say, with Arendt, that Eichmann “merely, to put the matter colloquially, never realized what he was doing,”7 for there is incontrovertible evidence that he knew exactly what he was doing all along. His own words show him to be driven by the desire to shape history according to the vicious anti-Semitic, anti-communist program to which he, and everyone he chose to have around him, was deeply, clearly, and consciously committed. Add to his proud and unwavering commitment to one of the vilest ideologies the world has known his ability to deceive even those who knew him well—for which the words “criminal mastermind” seem accurate—and we have as perfect a specimen of a classical evildoer as has ever lived. The fact that, unlike many Nazis, he never enriched himself with the stolen goods to which he had access may count as mitigating evidence for some, but for many it will only darken the picture. A man who, in his own words, is such an idealist as to be immune to the temptations of ordinary forms of criminality may appear more terrifyingly evil than those with more human flaws. If, as Evil in Modern Thought argued, the Holocaust was carried out by a great number of people whose intentions varied from acceptable to abysmal, Eichmann’s belonged to the worst.

Historical evidence that was not available to Arendt has undermined her claim that “except for extraordinary diligence in looking out for his own advancement, [Eichmann] had no motives at all.” It does not undermine her core idea that evil intentions are not required for evil actions. Eichmann mimicked the thoughtless bureaucrats Germans now call desk-perpetrators, but they were there, in droves, to be mimicked—and without them, the intentions of men like Eichmann would seldom bear fruit. There are people who are driven by poisonous mixtures of murderous ideology and lust for violence, but their numbers pale next to those who aid and abet them with no intention other than the wish to get by with a minimum amount of trouble. This point is so important for understanding and preventing other crimes that it cannot be emphasized too often—even if the example on which Arendt based it turns out to be the wrong one.

□ □ □

A third aspect of the book that I would expand today is the discussion of terrorism. It was written in haste. I had promised my editor, Ian Malcolm, that nothing would prevent me from delivering the final version of the manuscript on which I’d been working for years—by October 1, 2001. The attacks on the World Trade Center and the Pentagon put the word “evil” in the spotlight and me in a quandary. Could I let the book go to press as it stood? Wisely, as with every other editorial suggestion he made, Ian told me not to change the book but to add a short chapter if I could. Though the chapter was written in the midst of the emotions that gripped most people, wherever they were, in those weeks, I stand by everything in it. In particular, the reluctance of many on the left to join George W. Bush in describing the attacks as evil was a moral and political mistake. The concept’s sudden resurrection by those who were responsible for other kinds of evils should not lead the rest of us to reject it. Doing so, I argued, would leave moral concepts—the most powerful ones in our language—in the hands of those least equipped to use them. Acknowledging distinctly different kinds of evil supported one of Evil in Modern Thought’s central claims: evil, like many other crucial concepts, is not the sort of thing that has an essence, and it’s useless to try to understand it by seeking necessary and sufficient conditions that would cover every instance of it. In less metaphysical terms: using the word “evil” for one kind of atrocity doesn’t preclude using it for another. What we understand to be evil has changed over time: think of earthquakes, torture, and slavery. At the beginning of the modern period the first was considered an evil; the latter two were not. (Sometimes we do make progress.) This isn’t to say that the word is up for grabs, but the only way to distinguish something that’s evil from something that’s awful is to look carefully at particulars.

I set out to look at particulars in chapter 11 of the next book I wrote, Moral Clarity: A Guide for Grownup Idealists. (The title itself was an attempt to wrest back moral language from those who had so grievously abused it.) As I was writing the book, the Bush administration was ripe for scrutiny. Today the consequences of its decisions are only too apparent: the attack on Iraq, begun on false pretexts and accompanied by torture, played a central role in creating terrorist bases where there were none before. Evidence of both the horror and the uselessness of torture continues to emerge as I write. Yet I was also interested in less dramatic, or less directly lethal, forms of evil, such as Bush’s remark “Lucky me, I won the trifecta.” Before he first made it in mid-September, as the ruins were still smoking in lower Manhattan, only fans of horse racing might have recognized the word: a trifecta is a bet that picks the first three finishers in a race, a long shot whose payoff is correspondingly large. “Hit the jackpot” would have roughly captured the same emotion. Here is one of thirteen speeches Bush made to largely Republican audiences from February to June 2002:

The recession—no question, I remember when I was campaigning, I said, would you ever deficit spend? And I said, yes, only if there were a time of war, or recession, or a national emergency. Never thought we’d get (laughter and applause). And so we have a temporary deficit in the budget, because we are at war, we’re recovering, our economy is recovering, and we’ve had a national emergency. Never did I dream we’d hit the trifecta. (Laughter)8

Were this speech, like the twelve others, not documented by the office of the White House Press Secretary, you might suspect it was invented by hostile critics—a charge made against Paul Krugman when he first reported it in the New York Times. It is indeed hard to swallow that a president of the United States could use fundraising events to make jokes about the same victims of terrorism he was invoking as excuses to go to war. Anyone still persuaded by the widespread myth that the Iraq war was driven by misguided idealism would do well to look these speeches up. The repeated jokes about the windfall provided by the terrorists left few visible traces, but Bush’s gloating about the opportunities he gained through others’ deaths and loss marked a low point in moral discourse. When the president of the United States cheapens life and death by jesting, moral language seems lost between coarseness and kitsch.

Though its foreign policy has been constrained by the wreckage the previous administration left behind, the Obama administration has abolished torture, and its repeated attempts to close Guantanamo have been thwarted only by the refusal of fifty U.S. states and 106 countries to accept the remaining prisoners. But those responsible for the war and its crimes remain not only at large but in demand, and despite all the available evidence, respected media still regularly deny that the Bush administration lied its way into war. Given the unprecedented Republican opposition to even the most uncontroversial of Obama’s proposals, it’s clear that a war crimes trial, or even a truth and reconciliation commission, was not a political option.

Here comparisons may be useful: it took over twenty years for West Germany to even begin the process of examining its Nazi past. (Communist East Germany was able to try and sentence large numbers of Nazis shortly after the war ended, but exploring the reasons for the differences between the two halves of postwar Germany would go beyond the scope of this text.)9 Perhaps a reckoning with American crimes will take place after Bush and Cheney and Rice and Rumsfeld are gone. Meanwhile, though they are hardly the only cause of the fundamentalist violence that appalls us in the vile ISIS videos, they surely fuel its fires. You don’t need orange jumpsuits that scream “Guantanamo” to make the connection.

In Moral Clarity I wrote:

Shortly after the first photos hit the daylight [Rumsfeld] declared, “What happened at Abu Ghraib was not right. But it’s not the same thing as cutting off someone’s head in front of a video camera.” True enough. It’s not the same thing. But only if you think that evil has an essence can you draw the conclusion Rumsfeld wanted us to draw, namely, that one of these actions is evil, and the other is simply—too bad. The differences are abundant. One of these evils is visible, and the other still is not. One of these evils is the product of ruthless individual will, while the other is produced by a large complex system that makes it easy for individuals to evade responsibility. One of them takes death as its clear-eyed goal, and the other as an unfortunate by-product. One of them reduces people to shivering pale heaps of terrified flesh, in tears on a screen; the other reduces them to naked faceless bodies. What they have in common is a willingness to use methods so loathsome they create enemies left and right. But the biggest difference is probably this one: since one of the perpetrators is so much more powerful than the other, its effects are likely to last longer. Abu Ghraib confirmed the world’s nightmares, creating hatred and suspicion of the U.S., the West in general, and anyone in the future who’s inclined to say a word, however sincere, about defending human rights against dictators. For the sake of argument, let’s suppose the intentions of the U.S. soldiers who tortured prisoners at Abu Ghraib were better than those of the groups who beheaded hostages. What good are those intentions if the consequences turn out to be worse?10

At the time “Isis” referred to nothing but an ancient Egyptian goddess, and no one had enough imagination to suppose that alienated youths would leave the relative comfort of the West to fight in a Syrian desert. Never have I more deeply wished my own fears to be unfounded. Writing today from Europe, amidst a new and vicious wave of fundamentalist terrorism, I am hardly inclined to excuse it—but it will not be uprooted without an understanding of the ways our own policies contribute to its growth.

□ □ □

A fourth aspect of the book would look different today. When writing it I never suspected that the Lisbon earthquake might have more than historical relevance. It was a point of departure for modernity, and for reframing our views about evil, but hardly a point of orientation. I argued that the distinction between natural and moral evils is not a natural one, but a way of making sense of and controlling what evil we can. But the distinction looks increasingly unnatural, and impossible to maintain, in the face of the disasters the past decade has witnessed. Most have been not only deadlier than the Lisbon earthquake, but, thanks to global media, more present.

If it weren’t tragic it might even be funny: the competition among fundamentalist religious leaders to decode the message they were sure was heaven-sent. Much like orthodox priests in 1755, they view natural catastrophes as the speech of God. I was particularly struck by reactions to the 2004 tsunami: while Islamist preachers saw it as punishment sent to sweep half-naked tourists from Thailand’s beaches, Buddhists were quick to point out that it occurred the day after Christmas, when unusual amounts of meat are consumed. For the Christian website raptureready.com, which searches the news for signs of disaster that correspond to St. John’s prophecies, the tsunami was one more sign that the apocalypse was nigh. Laughable as such responses may seem to soberer readers, they point to the deepest of needs: anything is easier to bear than meaningless suffering. Interpreting tragedies as God’s message is the shortest route to filling them with meaning.

The Washington Post found a full quarter of Americans surveyed to be convinced that Hurricane Katrina was “a deliberate act of God,” but they too were divided as to His intentions. Those who viewed the hurricane as punishment were quick to note that New Orleans had always been known for sin, gambling, and general wickedness—“the kind of behavior that ultimately brings the judgment of God.” A good half saw the storms not as punishment but as warning, reminding those left to repent for their sins before they were swept away in God’s final judgment. Some 14 percent thought the hurricane was a test of faith, meant to lead us through doubt and despair to an affirmation of the Lord, whatever His will; while the rest viewed the catastrophes as one of the ways of God that human beings cannot understand.

More appealing were political reactions to the disaster, which focused less on the storm itself than on the absence of decent human reactions to it. In 1755, the valiant Marquis of Pombal commandeered stores of grain to prevent famine, organized the disposal of corpses to prevent plague, gathered militia to prevent plundering, and did it all so efficiently that Lisbon was able to publish its weekly paper without missing an issue. Two hundred and fifty years later, the only thing working in New Orleans was the news. Many people were without food and water for days; faced with threats of looting and violence, hundreds of policemen simply left town, and corpses floated in the streets for weeks. The scale of disaster in New Orleans was larger than that of the Lisbon earthquake, but then so were the resources for coping with it. The differences in responses to the crises point to differences in societies that have, incredibly, grown more unjust over the course of time. Contemporary Portugal is part of Europe, and as such enjoys the fruits of social democracy of which Americans can only dream. In contemporary America, the consequences of decades of ignoring a system of social relations that ensures that poor people die more often, and more quickly—be it through vulnerability to disease or disaster—were brought home, for a moment. It is hard to acknowledge that eighteenth-century Portugal enjoyed a more competent and comprehensive social structure than the present-day United States, but for the majority of Americans, poverty remains a natural evil with which government should not interfere. What distinguishes them from the nineteenth-century English clerics who argued that providing relief to victims of the Irish famine would be flouting the will of God?

The multiplication of natural disasters has led some observers to fear that the end of the world is near, not because the messiah is coming, but because environmental doomsday is. Some years ago one of my children, then a teenager, surprised me with her exact knowledge of the shrinking polar icecap and the expanding ozone hole, and even more with her conviction that they made political action futile. I pointed out reasons to be hopeful about political change. My generation had seen both the civil rights movement and the women’s movement make changes in people’s lives that, while still insufficient, had been unimaginable forty years earlier. Why shouldn’t her generation take up environmental activism with similar success? “Ma,” she said witheringly, “We don’t have forty years.”

All serious climate science suggests she is right. Recent disasters have shown the distinction between natural and moral evils that followed the Lisbon earthquake to be no longer clear, and even less important. The extent of human power to alter nature has made the line between what is human and what is natural increasingly hard to draw. And our knowledge of how much evil can be done without intention makes the question of whether or not destruction and suffering were deliberate increasingly irrelevant.

The power to call up storms used to be the stuff of magic, the outer reaches of human fantasy. But population growth, technological capacity, and a political order that views the endless production of goods as the key to human happiness have combined to create forces of which earlier ages could only dream. Melting the Arctic? Bringing forth hurricanes? What boundaries remain?

Expressions like “nature’s fury” suggest that nature fights back, and some environmentalist rhetoric is disturbingly animistic. I’m enough of a product of the post-Lisbon era to hold that nature doesn’t act; we do. But nature is surely reacting, and far more dangerous than any extravagant rhetoric is the unwillingness of government and industry to make the changes needed to preserve the earth that we know. Climate change suggests that the difference between natural and moral evils no longer matters; what counts is taking moral responsibility for the natural disasters that our reckless use of nature provokes. Drawing distinctions between natural catastrophes and human evils makes increasingly little sense in a world where we have the power to call up natural forces that could be nature’s undoing—whether intentionally, through the nuclear weapons whose use is less regulated than ever before, or thoughtlessly, through the looming environmental disasters that could make recent storms seem like a gentle warning. When human heedlessness stokes destruction, then leaves the world’s poorest people at its mercy, it isn’t merely tragic; it’s evil. And nothing but the most banal of intentions is required for it to occur. If the events of recent years make us realize that, they could provoke a change in our understanding as important as the one that occurred after the Lisbon earthquake.

□ □ □

The final thing I would change in this book is larger in scope, and I am still at work on its repercussions. Hiroshima was not entirely absent from Evil in Modern Thought, and I noted then that “for a good two decades after World War II, the conviction that limits had been crossed in ways from which we would never recover was captured more by the word Hiroshima than by Auschwitz.”11 (EMT, p. 251) I also wrote that it wasn’t necessary that Auschwitz came to stand for an event that ended modern ways of thinking about evil, any more than Lisbon was necessarily the event that began them. My goal was not to establish hierarchies of evil, as if it were possible to measure mass technological murder on some absolute scale. Rather, I set out to trace changes in modern (Western) consciousness as reflected in our perceptions of evil from the eighteenth through the twentieth centuries. And since the early 1970s, when the immediate threat of nuclear war seemed to recede, Auschwitz has surely loomed larger in European and American minds than Hiroshima or Nagasaki.

I did not ask why the consciousness of Hiroshima, so present in its aftermath, had been so thoroughly overshadowed by our consciousness of Auschwitz until asked to contribute an essay for a volume in preparation for the seventieth anniversary of the bombing of Hiroshima and Nagasaki. I knew neither the publication nor its editor, but felt morally bound to respond to a request that sought to devote to those bombings a small portion of the critical thought that has been devoted to Auschwitz. (Critical thought, alas, that hardly takes place inside philosophy departments, as I noted in Evil in Modern Thought; there the subject is largely absent. Historians, sociologists, literary scholars, and journalists, however, have produced enough reflections on Nazi crimes to fill whole libraries.) I had an idea or two on the subject which I thought could become an essay within a couple of weeks, but decided that I should do a little empirical research on the background of the bombings before beginning to reflect.

Two months and many books later, I was shocked by what I’d learned. Until that moment I believed what most of my teachers and friends, whether American or European, believed. Here is John Rawls’s summary in a short essay written to commemorate the fiftieth anniversary of Hiroshima:

The bomb was dropped to hasten the end of the war. It is clear that Truman and most other allied leaders thought it would do that. Another reason was that it would save lives where the lives counted are the lives of American soldiers. The lives of Japanese, military or civilian, presumably counted for less. Here the calculations of least time and most lives saved were mutually supporting.12

Nevertheless, Rawls concluded,

Both Hiroshima and the firebombing of Japanese cities were great evils that the duties of statesmanship require political leaders to avoid in the absence of the crisis exemption. (Ibid.)

Rawls’s conclusion was all the more moving in light of his biography: as a soldier in the Pacific, his was one of the lives that might well have been lost in an invasion of Japan. Even nobler was his refusal to mention that fact. Other writers, like Günther Anders, emphasized that the consequences of the bombing went far beyond the Japanese lives lost. As Arendt wrote about Auschwitz, the impossible became true, and it’s the kind of possibility that cannot be undone. Once the complete destruction of the earth through nuclear weapons becomes an option, we live with it as a fact that becomes so commonplace that we barely notice its brutality. Were the leaders who ordered the bombing that inaugurated the atomic age nonetheless acting with the best of intentions? So much the worse for intention.

But what if Truman and other allied leaders did not think the bomb necessary to end the war and save lives, as Rawls, and I, and nearly every other recent observer believed?

As it turns out, this was clear not only to those historians who have carefully traced the decision to drop the bomb.13 Though they foraged in archives, examining documents and diaries that weren’t readily available, the fact that the bomb was not dropped to avoid an invasion or end the war was so widely known in 1945 that it crossed political lines as deep as those between John Foster Dulles and Albert Einstein. A 1945 survey commissioned by Truman, whose directors included Paul Nitze and John Kenneth Galbraith, concluded:

Based on a detailed investigation of all the facts, and supported by the testimony of the surviving Japanese leaders involved, it is the Survey’s opinion that certainly prior to 31 December 1945, and in all probability prior to 1 November 1945, Japan would have surrendered even if atomic bombs had not been dropped, even if Russia had not entered the war, and even if no invasions had been planned or contemplated…The Hiroshima and Nagasaki bombs did not defeat Japan, nor by the testimony of the enemy leaders who ended the war did they persuade Japan to accept unconditional surrender.14

Even before the survey, just weeks after Hiroshima, Truman himself publicly declared that the bomb was not necessary to win the war.15

The widely held contrary opinion was carefully and deliberately constructed. Conspiracy theories are rightly viewed with suspicion. In this case, however, historians have provided detailed information that reveals exactly how contemporary consciousness of the bomb was manufactured. It began with a September 1946 letter from James Conant, president of Harvard and a member of the Interim Committee created to advise Truman on the use of nuclear arms. Though a majority of Americans supported the decision to bomb Hiroshima and Nagasaki, a protest by most of the scientists who had worked on the Manhattan Project had begun to raise doubts; John Hersey’s New Yorker report from Hiroshima describing the aftermath of the bombings had an even wider impact. “This type of sentimentalism,” wrote Conant, “for so I regard it, is bound to have a great deal of influence on the next generation. The type of person who goes into teaching, particularly school teaching, will be influenced a great deal by this type of argument.”16

He was writing to Henry Stimson, who had been secretary of war from 1940 to 1945. The recently retired and widely respected Stimson was an ideal choice for the task Conant had in mind, and he was persuaded to write an article that created the legend most of us take for granted: after judicious weighing of all the alternatives, Truman authorized an atomic attack in order to avoid an invasion of Japan, which was “expected to cost over a million casualties alone” (ibid.). Stimson knew better than anyone that those numbers were wrong. He had been among those officials who argued against a nuclear attack, along with high-ranking military advisors including Generals Eisenhower and LeMay and Admiral Leahy, then chairman of the Joint Chiefs of Staff. They were overruled by the new secretary of state, conservative South Carolinian James Byrnes, who persuaded Truman to drop the bomb. We will probably never know what convinced Stimson to write an account he knew to be false, and which private correspondence suggests he later regretted. But the 1947 Harper’s essay had consequences any writer would envy: it was almost singlehandedly responsible for creating the myth that still reigns today.

Still reigns. There are, as I mentioned, a number of excellent books devoted to deconstructing the myth and setting historical records straight. The best of them, Lifton and Mitchell’s Hiroshima in America, was even published as a trade paperback. But they have little claim on popular consciousness—where “popular” includes a host of well-read and well-meaning critical intellectuals, the present author included. Any literate person will have a rough but generally accurate grasp of the events that took place in World War II’s European theatre; a mixture of secondary school education and an unending stream of popular movies, television, and radio programs ensure that you needn’t be a historian to know basic facts about Auschwitz. Indeed, to avoid information about Auschwitz, you would need to have spent the last thirty years in a hermitage. By contrast, the amount of material available about the war in the Pacific, and the bombing of Hiroshima and Nagasaki, is easy to overlook. The information is there—primarily in books and documents, far less in film or other media—but you have to seek it out. Or to put the matter differently: why is there an entire museum devoted to the Holocaust on the Washington Mall, while the Smithsonian was unable to hold a temporary exhibit about Hiroshima?17

Let me take an anti-Semitic bull by its horns and address the most common suspicion. There is indeed a Jewish lobby, more accurately known as AIPAC, which seeks to support right-wing Israeli governments who deflect responsibility for their own policies by emphasizing the ways in which Jews have been victims, particularly at the hands of the Nazis. But it is not responsible for the movement of popular consciousness from Hiroshima to Auschwitz. For decades following the war, survivors of Auschwitz were viewed with shame and even disgust; the newly founded Jewish state wanted heroes, not victims.18 Jewish interests are not the ones served by the forgetting of Hiroshima, nor were Jews among the congressional leaders who refused to fund a proposed 1995 Smithsonian exhibit examining the event. To be sure, it’s easy to argue that the Jews were wholly innocent victims in a way the Japanese were not. While ordinary Japanese cannot be held entirely responsible for their government’s racist and militarist policies, only a handful of Japanese communists actively opposed them, and emigration during the period was nonexistent.19 By contrast, even in Germany, small but definite resistance to the Nazis existed, and Marlene Dietrich and Hannah Arendt were but two of the thousands of emigrants, both voluntary and involuntary, who maintained a commitment to another Germany throughout the Nazi era. The fact that no such comparable Japanese contingent existed makes it harder to accept Japanese claims of victimhood—though no sane person would argue that the thousands of schoolchildren melted at Hiroshima did anything to deserve their fate.

The difference between Japanese and Jewish victimhood gives us one clue to the difference in the amount of attention paid to Hiroshima and Auschwitz. Not the fact that the heirs of Jewish victims are in a better position in the media to emphasize their victimhood, but the fact that their victimhood was as unambiguous as could be imagined, has led so many to focus on it.20 There may be no other event in recent history that has such simple dramatic properties. If the villains of the story were more complicated human beings than even the better Hollywood efforts, like Schindler’s List, know how to portray, their crimes were as evil as any screenwriter could construct and their victims entirely free of guilt for the fates they suffered. Even better, the Nazis lost the war, and though six million Jews were murdered in it, the people as a whole survived and flourished. Evil was punished, innocence rewarded. Such a narrative serves not just dramatic needs but—broadly speaking—religious ones. If the wicked suffer, and the good are saved, then the world as a whole makes sense.

Whatever deep-seated needs for theodicy the story of Auschwitz may serve, it clearly serves political ones. Focus on a crime that is devoid of moral ambiguity gives us a picture of absolute evil next to which any other example falls short. The repetition of images of ordinary people being herded first into boxcars and then into gas chambers makes us feel we know what real evil is, making everything else appear merely unfortunate. The more we focus on simple models of evil, the less practice we have in recognizing more complicated ones. And it is the more complicated forms of evil that have dominated U.S. foreign policy in the last fifty years, from the bombing of Hiroshima and Nagasaki to the disastrous invasion of Iraq. The suppression of facts about Hiroshima has had more of an impact on our moral perception than we know how to acknowledge. Lifton and Mitchell write that Americans have a vague, unfelt half-knowledge of Hiroshima that

increased ordinary Americans’ sense of being out of control of their own destiny, of being out of control of the large forces that determined their future…We have to ask ourselves how much of our rising mistrust for politicians and officials of all kinds, for our government and just about all who govern us—how much this angry cynicism so evident in our public life in recent years is an outcome of the Hiroshima and post-Hiroshima nuclear deceptions and concealments.21

I now believe that the suppression of Hiroshima, and the focus on Auschwitz, has distorted our moral vision: like extremely near-sighted people, we can only recognize large bold objects; everything else remains vague and dim. I do not think this is accidental, but I do not think it is the work of those who claim to represent Jewish interests. Rather, it is U.S. interests that are served by promoting a picture of evil that is so indisputably self-evident. In the light of the fires of Auschwitz, the forms of evil perpetrated by U.S. policies in Iran, Guatemala, Congo, and Chile—just to name several that occurred in the first decades after the bomb was dropped—seem so pale and subdued as to have nearly vanished from view. Or to put the matter in psychoanalytic terms: the focus on Auschwitz is a form of displacement for what we don’t want to know about Hiroshima.

The argument sketched here is set out in more detail in the essay mentioned, and I will continue to expand these thoughts in future work.22 In the 2004 preface to the paperback edition of this book I wrote:

Perhaps the very worst legacy of the Nazis was to leave everything else in the shadow of the death camps; Auschwitz was so extreme that next to it, too much can look benign. But focusing on its singularity is no less dangerous than any other such focus. Evils come in too many forms to confine. (Ibid., xv)

But my own ignorance of the background of Hiroshima, and my willingness to believe the official version of it, show how easy it is to accept conventional wisdom, and how surely the history of concepts must be informed by real history if it is not to go astray.

□ □ □

While Evil in Modern Thought has sparked discussions of contemporary evil to which I have, at best, partial answers, its original aims were more limited. Though trained by some of the best philosophers any twentieth-century student could have the good fortune to encounter, I sought a narrative of philosophy’s history that was different from the one I had learned. Increasingly frustrated by the gap between the expectations which most people bring to philosophy—be it for an afternoon or a lifetime—and the actual practice of most professional philosophers, I looked to the history of philosophy to see if the gap had always been there. And though sophistry and scholasticism were present in every age, what I found was even richer than I’d hoped. The great minds of modern philosophy were not preoccupied with proving the existence of the external world or other minds—questions Kant described as “cobwebs…that are of interest only to the Schools” (Critique of Pure Reason, Bxxxv). Their interest in the difference between appearance and reality was not driven by the skeptical worry that the world might turn out to be different from the way it seems, but rather, precisely the opposite. The fear that the manifold appearances of evil cannot be explained or redeemed fueled the major works of philosophers as different as Leibniz, Rousseau, Kant, Hegel, and Marx. Their skeptical opponents, like Bayle and Hume, were troubled not by skepticism about billiard balls, or even about ordinary moral questions, but by attempts to defend or justify a cosmic architecture that appears woefully deficient.

My own subsequent reading and reflection have only confirmed this reading of philosophy’s history, and left me more perplexed: what drove twentieth-century philosophers to read their predecessors in a manner so weltfremd? (Though the German language is often full of impossibly long constructions, English has no equivalent for this precise way of denoting alienation from the world.) Why ignore philosophy’s engagement with a profound and pressing problem to focus on questions that Kant knew should only interest pedants? Why turn a field that is riveting into one that seems lifeless, and finally boring? It’s hard to find an answer, particularly when many of the philosophers responsible for the standard narrative, like Bertrand Russell, were hardly themselves weltfremd. Nor can the answer lie in an overzealous use of the principle of charity, which relegates the problem of evil to religious concerns that philosophy is thought to have outgrown. For the problem of evil is not a religious problem; religion, rather, is one sort of response to the problem of evil.

It’s hard to find an answer, but it’s easy to close with a warning. Many people in many places are hungry for philosophical reflection. We may laugh at fashion labels called “Theory” or even “Philosophy,” or sneer at the sales figures of some of the trendier postmodern sages. But they point to a need that will be filled by woolier heads when critical ones abdicate. The digital era has raised a host of new questions, but people still wrestle with questions that moved Kant, and if professional philosophy ignores them, they will seek their answers elsewhere. May this book continue to call philosophy home.