
The Bush Doctrine

A testing case for the doctrine of preventive war, and for any jurisprudence of prevention, is the Bush administration’s decision to attack, invade, and occupy Iraq in March 2003. In September 2002—six months before we attacked Iraq and a year after the September 11 attacks on us—the administration published a major policy paper entitled “The National Security Strategy of the United States of America,” in which it tried to make the case for employing “preemptive actions to counter a sufficient threat to our national security.”

The paper, which outlined what has come to be known as the Bush Doctrine, began with a discussion of deterrence as the major strategy in dealing with the Soviet Union and its allies during the Cold War. It emphasized that the nature of that conflict, confronting a “risk-averse” nuclear power adversary, reasonably led the United States “to emphasize deterrence of the enemy’s use of force, producing a grim strategy of mutual assured destruction. . . .”1 The paper went on to argue that the type of deterrence that proved successful against the risk-averse Soviet Union was insufficient in dealing with international terrorism because “the threat of retaliation is less likely to work against leaders of rogue states more willing to take risks, gambling with the lives of their people, and the wealth of their nations. . . .”

It contrasted our enemy during the Cold War, in which “weapons of mass destruction were considered weapons of last resort whose use risked the destruction of those who used them,” with today’s enemies—terrorists and rogue states—who see “weapons of mass destruction as weapons of choice.” They also see these weapons as capable of blackmailing the United States and its allies “to prevent us from deterring or repelling the aggressive behavior of rogue states.” For example, if Saddam Hussein’s Iraq had had nuclear weapons when it invaded Kuwait in 1990, it would have been far riskier for the United States to force it back. Nor would deterrence work “against a terrorist enemy whose avowed tactics are wanton destruction and the targeting of innocents; whose so-called soldiers seek martyrdom in death and whose most potent protection is statelessness.” This new reality, concluded the paper, “compels us to action.”

The action contemplated by this analysis included preemptive military attacks against stateless terrorists and the states that harbor and support them: “We must be prepared to stop rogue states and their terrorist clients before they are able to threaten or use weapons of mass destruction against the United States and our allies and friends. . . . The inability to deter a potential attacker, the immediacy of today’s threats, and the magnitude of potential harm that could be caused by our adversaries’ choice of weapons, do not permit [the] option [of merely reacting]. We cannot let our enemies strike first.”

The paper then sought to justify the use of preemptive military measures in select situations by adapting existing international law to the new reality of international terrorism: “For centuries, international law recognized that nations need not suffer an attack before they can lawfully take action to defend themselves against forces that present an imminent danger of attack. Legal scholars and international jurists often conditioned the legitimacy of preemption on the existence of an imminent threat—most often a visible mobilization of armies, navies, and air forces preparing to attack.”

But this limited concept—preemption in the face of an “imminent” threat—would not justify preventive wars designed to head off longer-range and less certain, but still serious, threats. To address this problem, the paper recommended adapting the traditional concept of imminence “to the capabilities and objectives of today’s adversaries.” Therein lies the leap that is so controversial, the leap that lacks a firm jurisprudential basis in international or domestic law. The paper tried to bridge this gap by invoking the danger of terrorism, pointing to our new enemies’ reliance “on acts of terror and, potentially, the use of weapons of mass destruction—weapons that can be easily concealed, delivered covertly, and used without warning.” The paper pointed to 9/11 as proving that the infliction of “mass civilian casualties is the specific objective of terrorists and these losses would be exponentially more severe if terrorists acquired and used weapons of mass destruction.” It then extrapolated from precedent: “The United States has long maintained the option of preemptive actions to counter a sufficient threat to our national security. The greater the threat, the greater is the risk of inaction—and the more compelling the case for taking anticipatory action to defend ourselves, even if uncertainty remains as to the time and place of the enemy’s attack. To forestall or prevent such hostile acts by our adversaries, the United States will, if necessary, act preemptively.” By using the words “preemptive” and “preventive” interchangeably, the paper papered over real differences between these concepts and differences in the jurisprudential basis for employing them.

Finally, the paper assured friends and enemies alike that the United States will “always proceed deliberately, weighing the consequences of our actions. To support preemptive options, we will”:

• build better, more integrated intelligence capabilities to provide timely, accurate information on threats, wherever they may emerge;

• coordinate closely with allies to form a common assessment of the most dangerous threats; and

• continue to transform our military forces to ensure our ability to conduct rapid and precise operations to achieve decisive results.

The purpose of our actions will always be to eliminate a specific threat to the United States or our allies and friends. The reasons for our actions will be clear, the force measured, and the cause just.

Although these cautionary, prudential assurances are useful, they are a far cry from any carefully thought-out jurisprudence or even specific guidelines for action. Nor has historical precedent provided these missing elements. The paper was correct in observing that the United States “has long maintained the option of preemptive action” in extreme cases. President Clinton had considered several preemptive military actions during his administration and had, in fact, authorized at least three such actions and threatened one more.2 But until the Bush Doctrine was officially propounded, this option had not been publicly articulated as a central aspect of our national security policy.3 Nor was it used, prior to the attack on Iraq, as a justification for a full-scale war designed to prevent an enemy from securing weapons of mass destruction, disseminating such weapons to terrorists, and producing other evil and dangerous results at unspecified future times4—in other words, for a preventive war, as distinguished from a preemptive attack.

Because the war in Iraq has been so controversial, especially in light of the failure to find weapons of mass destruction (WMDs) or proof of a clear link to international terrorism, the current discussion regarding preemptive or preventive military action has been understandably skewed by our nation’s first major use (some would argue misuse) of the doctrine. For the most part, the debate over preemption, and especially over preventive war, has become a debate over the legitimacy of the attack on Iraq and its subsequent occupation. Moreover, the very different issues of singular surgical preemptive attacks (such as Israel’s bombing of the Osirak reactor), as contrasted with full-scale preventive war (such as the invasion of Iraq) have become conflated in at least some of the discussion growing out of the invasion of Iraq. Thus, although the invasion of Iraq is indeed a testing case for the doctrine of preventive war, it should not be regarded as the testing case for that important and controversial doctrine. A far wider array of potential situations must be evaluated before an acceptable general jurisprudence can be articulated.

The Prewar Debate over Prevention and Preemption

In the months leading up to the attack on Iraq, and in the months since, there has been a close association in the minds of many between the policy of preemptive or preventive military action and the implementation of that policy in Iraq. Though this association is understandable, since the attack was justified in preemptive terms as part of an ongoing “doctrine,” and since it is the first and only example of a full-scale preventive war fought by the United States, it is important to delink the general issues of preemption and prevention from the specific application of these policies in the one controversial case of Iraq. It is also important to recall the significant differences in degree between a preemptive surgical strike and a full-scale preventive war. It is certainly possible that a preventive policy of some kind may be sound if properly implemented in appropriate situations and that the attack on Iraq did not fit the criteria for appropriate use. On the other hand, it is impossible to ignore or downplay the Iraq experience, since it has been the primary testing case and it may well demonstrate the dangerous potential for misuse of any doctrine of military prevention or preemption.

In the run-up to the invasion of Iraq, the debate over the Bush Doctrine in general and its preemptive aspect in particular took on a polemical quality because of the immediacy of the threat of a dangerous and controversial war in Iraq and the strong opposition to this prospect by many in religious, academic, and peace communities. The questions were often posed in either/or terms: Are you for preemption or against it? Do you favor preventive war or do you oppose it? Would a preventive or preemptive attack be legal or illegal under international law? Would it be just or unjust under principles of “just warfare”? Is a policy of deterrence preferable to a policy of preemption?

With an invasion of Iraq at stake, the answers to these overbroad questions tended to come in rather absolute terms, often reflecting the ideological or political predisposition of the writer. Professor Robert A. Pape of the University of Chicago insisted: “Preventive war by the United States would violate one of the most important norms of international politics—that democracies do not fight preventive wars. [The United States] has never attacked to stop a state from gaining military power. Iraq would be the first preventive war for the United States.”5

Professor Pape may well have been correct descriptively, but it does not follow that his normative conclusion is likewise correct. Had the United States (or Great Britain) attacked German military targets in the 1930s, before Germany’s aggression against Poland, and had it prevented the German conquest of Europe with its enormous casualties, the historical assessment of preventive war might well have been quite different. Indeed, it is the inaction of Chamberlain that has become a paradigm of immoral and ineffective appeasement. But if Hitler’s Germany had been destroyed or disarmed by preventive military action, the world would never have experienced the horrors of Nazi aggression. It would have experienced only British “aggression,” which, despite Germany’s violation of its treaty obligations, might have seemed disproportionately harsh without actual knowledge of what might have been. This is the paradox of prevention: When it is employed successfully, we rarely can be sure what it prevented. When it is not employed, it is difficult to assess if it could actually have prevented the horrors that did occur. With the benefit of hindsight, it now seems relatively clear that the democracies should have taken preventive action, including, if necessary, the waging of a preventive war against Nazi Germany. It does not follow from that conclusion, of course, that the United States should have fought a preventive war against Iraq, despite the flawed analogies often invoked between Hitler’s Germany and Saddam’s Iraq. What does follow is that any categorical statement about the impropriety of democracies’ ever fighting preventive wars is not supported by the verdict of history.

The U.S. Conference of Catholic Bishops also expressed its deep concern about “recent proposals to expand dramatically traditional limits on just cause to include preventive uses of military force to overthrow threatening regimes or to deal with weapons of mass destruction.”6 Yet there were some in the Vatican hierarchy who, during the early years of the Cold War, favored preventive war against the Soviet Union because of the fear that this atheistic regime, which had stifled the practice of Catholicism throughout Eastern Europe and endangered values important to the church, would spread its godlessness throughout Europe and the rest of the world.7

Professor Richard Falk of Princeton, who is generally critical of Israeli actions, distinguished the use of preemption by Israel in 1967 from the preventive war threatened by the United States against Iraq: “Preemption . . . validates striking first—not in a crisis as was done by Israel with plausible, if not entirely convincing, justification in the 1967 war, when enemy Arab troops were massing on its borders after dismissing the UN war-preventing presence, but on the basis of shadowy intentions, alleged potential links to terrorist groups, supposed plans and projects to acquire weapons of mass destruction, and anticipations of possible future dangers. It is a doctrine without limits, without accountability to the UN or international law, without any dependence on a collective judgment of responsible governments and, what is worse, without any convincing demonstration of practical necessity.”8

Professor Falk may have been right about Iraq, but I strongly suspect that he would have taken a somewhat different view with regard to the “shadowy intentions” of Nazi Germany in the mid-1930s.9 The “practical necessity” of purely preventive military action was proved by subsequent events and was anticipated by some even at the time. Germany had, after all, armed itself to the teeth in open violation of its treaty obligations. Its designs on neighboring lands were plain, its involvement in the Spanish Civil War was obvious, and Hitler’s program for the Jews was set out in detail for all to read in Mein Kampf. “By regularly linking acts of aggression with assurances of peaceful intentions . . . [Hitler] continually duped and paralysed his opponents. Before their partly horrified, partly helpless eyes . . . he succeeded in everything he undertook, from the withdrawal from the League of Nations in October 1933 through the introduction of universal military service and the occupation of the Rhineland to Vienna, Munich and Prague.”10 Yet only with the benefit of hindsight could anyone have been certain of the unprecedented extent of Germany’s predation between September 1939 and May 1945.

Although the use and threatened use of preventive or preemptive force are associated with the Bush Doctrine, the current debate over these controversial issues began in earnest during the Clinton administration. The focus of that debate was not, however, Iraq; it was Libya, which was believed to be constructing an underground chemical plant at Tarhunah.11

As a 1997 report put it, “Preemption—that meaningful and value-packed word that excites journalists and sends politicians scrambling for cover—was a clear policy option, according to the Secretary of Defense.”12 The report, like the Bush Doctrine paper that followed it five years later, did not really distinguish between short-term preemption and longer-term prevention. It generally used the word “preemption” to cover both. Indeed, its primary focus was on the possibility of an attack on a facility that did not seem to pose an imminent danger. Nor was it about to become unattackable (like the Osirak reactor in June 1981). The attack therefore would have been closer to a preventive than preemptive military action.

The 1997 report, written by a group of military officers for the National Security Program at Harvard’s Kennedy School, outlined the policy considerations involved in any decision to use preemptive military force against a dangerous adversary. It predicted that “preemption will increasingly face the decision-maker as an option” for two reasons: First, the costs of not acting preemptively are rising, because “the death, destruction and dislocation that [rogue states, terrorist groups, and transnational organized criminals] can bring upon America is likely to be so massive and debilitating that the act cannot be allowed to occur if there is any possibility of preventing it,” and second, we are more capable of acting preemptively, because of enhanced information processing, intelligence and “strategic strike capabilities.”13

The report went on to catalog a series of general criteria that might be relevant in any preemptive military decision:

Specific, accurate, and timely intelligence is a critical precursor—an enabler and a catalyst—for preemption. Intelligence sharing with non-traditional domestic and international partners may be necessary to achieve the sufficient evidence for justification necessary for timely and feasible preemption. Intelligence assets must focus on capabilities and intentions of potential adversaries, and on their opportunity to carry out those intentions. The convergence in time and space of an adversary’s capability, intention, past history, opportunity, and current actions are [sic] key to the decision to preempt. These five critical intelligence insights bear directly and decisively on the key decision variables. Without these, the possibilities for moral and legal justification, as well as political and military feasibility, are tenuous.

The report then concluded: “U.S. leaders must consider preemption as a viable, sometimes necessary option to prevent unacceptable loss of life or damage to essential institutions.”14

Applying the criteria they had articulated, the authors of this report opposed military preemption against the Libyan chemical factory, favoring instead a combination of diplomatic, political, and economic pressures. Military preemption was not attempted, and in the end Libya seems to have accepted a diplomatic resolution, coerced perhaps by the threat of military force.15 In that case, the decision not to preempt seems to have worked. In hindsight, the decision seems correct, though we might have a different view if the Libyan military had subsequently used chemical weapons against U.S. targets. It is also difficult to determine what reprisals might have taken place if the chemical factory had been attacked.16

The difference, therefore, between the approaches taken by the Clinton and Bush administrations lies not so much in the policies each articulated—they both supported preemption when American security was threatened—but rather in the implementation of the policies. The Clinton administration appears to have been more cautious, demanding a higher level of threat. It was also prepared to try other means first and for a longer period of time. The Bush administration, on the other hand, seemed prepared to employ military preemption earlier in the process. It is not certain, however, that Bush would have come out differently on Libya or that Clinton would have come out differently on Iraq, based on the available intelligence.17

There is another important difference as well. The Kennedy School report posed the question of whether, if preemption is included as an option, it should be announced publicly as part of a strategic policy. This is what the report recommended: “A deliberately ambiguous policy that neither rejects nor recommends preemption may be best. To our potential opponents, what we do is far more important than what we say. For the American public and the rest of the world, what we say is important.”18

The Clinton administration seems to have adopted that policy. The Bush administration, as we shall see, did not.

Whether to Announce a Doctrine of Preemption

The Bush Doctrine with regard to Iraq was twofold: first, to announce that the United States now had an explicit policy of stopping “rogue states and their terrorist clients before they are able to threaten or use weapons of mass destruction” and second, to act on that policy by actually attacking Iraq. Most of the discussion surrounding Iraq has naturally focused on the second issue because guns speak louder than words. But the first issue—the decision to announce a policy of prevention or preemption—also warrants consideration.

There seems to be agreement among unbiased observers that there is a difference between retaining the option of preemptive attack in certain limited situations, on the one hand, and announcing and implementing a broad policy of preemptive war, on the other. Following the horrific deaths during the school hostage siege in Beslan, Russia went out of its way to announce a new policy of preemption: “Russia warned it could launch preemptive strikes on terror bases anywhere in the world and put a bounty on two top Chechen rebels after the broadcast of a chilling video of the school hostage siege.” The Russian chief of staff, General Yury Baluyevsky, announced that his military “will take steps to liquidate terror bases in any region.” He also noted that “the doctrine of preventive military action against terror targets had been spelled out publicly before and said such steps were only an ‘extreme measure’ that did not include use of nuclear force.” Despite his reference to prior articulation of the doctrine, his announcement was understood to reflect “a hardening mood in Moscow a day after one television network aired video footage of scared children and parents sitting in the school gym in southern Beslan as masked militants rigged bombs over their heads.” The European Union reacted negatively to the Baluyevsky announcement, seeking to “downplay the warning on preemptive action against terror targets and said statements like those from Baluyevsky ‘are not the first instrument that will bring results’ in combatting terrorism.”19

Several days after the statement by General Baluyevsky, Vladimir Putin, the president of Russia, confirmed that preemption against terrorism was now the official Russian policy: “Terrorists must be eliminated directly in their lairs, and if the situation requires it they must be attacked, including abroad.”20

There are considerable potential downsides to any policy of military preemption, announced or unannounced, as well as to any preemptive military actions, threatened or taken. Some of these downsides, such as the risk of false positives, are obvious. Others may be more subtle. For example, an announced policy of preemption may drive an enemy who is contemplating the development of nuclear weapons to speed up its program. The New York Times, in an editorial on September 19, 2004, made this observation: “Mr. Bush once lumped Iraq, Iran and North Korea together as an ‘axis of evil.’ Bush’s decision to invade Iraq limited the diplomatic and military tools left available to influence North Korea and Iran—which were undoubtedly taught by the Iraq experience that the best protection against a preemptive strike is a nuclear arsenal.” This may or may not be true in any particular case, as the Libyan experience demonstrates. In that case the threat of a preemptive attack, coupled with diplomatic pressures and the desire of Colonel Muammar Khadafy to avoid further sanctions and international ostracism, brought about a nonmilitary resolution to a dangerous threat.

There are probably situations in which a rogue nation has been deterred from developing a nuclear military option by the credible threat of preemption. For example, Syria knew that Israel had the military capacity as well as the announced intention of preempting any Syrian effort to develop a nuclear bomb that could target Israeli population centers or shift the balance of power. As far as can be determined, Syria had made no serious effort to develop a nuclear bomb, though it almost certainly wanted one, and it did develop chemical and perhaps biological weapons.21

There is no question, however, that a rogue nation that manages to win the nuclear preemption race, that develops a nuclear arsenal before it can be preempted, will be in a better position to deter a preemptive attack from its enemy by threatening a nuclear response to an unsuccessful preemptive attack.22 North Korea has learned this lesson, and Iran is seeking to emulate its success. In this high-stakes game of threat and counterthreat, timing may be everything.

An announced policy of preemption can also serve as a counterdeterrent by incentivizing a pre-preemptive first strike by an enemy who fears a preemptive attack. “On November 9, 2004, Iran announced it was capable of mass-producing the Shahab-3 missile and reserved the right to use the missile in preemptive strikes should its nuclear facilities be threatened with attack.”23 Iran has thus threatened to preempt any preemptive attack. A system of international law and morality that accepts preemption may encourage a potential target of preemption to strike first. Had Egypt known that Israel was planning to attack its air force on the ground in 1967, it would certainly have taken some action, perhaps even attempted a preemptive strike itself against the Israeli Air Force.24 The same might have been true of the United States had it known of the Japanese intention to attack its naval base at Pearl Harbor. This potential downside is more theoretical than practical, since nations generally take actions based on actual intelligence about enemy intentions rather than on announcements of policy or concerns about vague principles of international law and morality. But an announced policy of preemption can serve as post facto justification, if not an actual reason, for a preemptive first strike.

These observations may be more relevant to the decision whether to announce a policy of preemption, as the United States, Russia, Israel, and Australia have now done,25 than to the decision to take an unannounced, surprise preemptive action, such as the attack on the Iraqi nuclear reactor by Israel in 1981. But a singular preemptive military action without an announced policy can also have significant downsides, even when it succeeds in the short term. In the first place, any preemptive military action that is taken necessarily serves as a signal that preemption has become part of that nation’s policy options. Thus all the disadvantages (as well as advantages) of an announced policy of preemption follow from a preemptive attack. To put it another way, a nation can secure the advantages of a surprise preemptive action only once before also being burdened by its possible disadvantages.

As the authors of the 1997 Kennedy School report on military preemption concluded:

Preemption may have significant deterrence value, especially after a rogue state or terrorist has once been preempted. It may, however, drive a determined adversary to extreme measures of retaliation and reprisal. These consequences could be worse than the original threat that was preempted.

Paradoxically, although preemptive actions may eliminate immediate threats, they may also cause the adversary to go underground, figuratively and literally, thereby masking future activities from detection and making future interdictive efforts much more difficult. The impact of this “preemption paradox” is that decision-makers must have a clear understanding of their objectives and of what constitutes success: in some cases, preemption serves only to delay or prolong crisis in favor of “buying time” to seek alternative solutions.

The Bush administration, in contrast with prior administrations, decided to announce a distinct policy of military preemption under specified circumstances. If this announcement was intended to deter Iraq from continuing its refusal to cooperate more fully with weapons inspectors, it failed. The administration then carried out its threat to act preemptively and invaded Iraq.

The Postattack Debate

Following the invasion of Iraq and the subsequent developments—no WMDs found, a difficult occupation with many casualties on both sides, much international condemnation—the nature of the debate over preemption became even more charged. Criticism of the attack on Iraq often turned into condemnation of the doctrines of preemption and preventive war.

An article in the Financial Times declared that “for all Bush’s bravado, the preemption doctrine is dead” and the president’s assertion of “America’s right to preemptive action ‘before threats materialize’ had a hollow as well as a hubristic ring.” The Bush Doctrine of preemption, “unveiled in the wake of September 11, 2001 survives only in name.”26 The Los Angeles Times headlined an article SHOOTING FIRST: THE PREEMPTIVE WAR DOCTRINE HAS MET AN EARLY DEATH IN IRAQ.27 Its text concluded that the problems that have plagued the occupation “make it highly unlikely that preemption is a tactic that he will employ elsewhere anytime soon.” It too declared that “Bush’s doctrine of preemption is, for all intents and purposes, dead.”

A lead editorial in the New York Times of September 12, 2004, was headlined PREVENTIVE WAR: A FAILED DOCTRINE. It observed that the preventive war doctrine had had only “one real test” and that the failure to find any weapons of mass destruction or any link between the Iraqi regime and al-Qaida had shown it to be a failure: “The real lesson is that America dangerously erodes its military and diplomatic defenses when it charges off unwisely after hypothetical enemies.”28 The editorial sought to draw precisely the distinction between “preventive war” and “preemptive attacks” that the Bush Doctrine had elided.29 It noted that American policy had always “left room for pre-emptive attacks” when the “nation’s vital interests were actively threatened” and all reasonable diplomatic efforts had been tried and failed:

America is under no obligation to sit and wait, if it is clear that some enemy is actually preparing to strike first. But it correctly drew the line at preventive wars against potential foes who might, or might not, be thinking about doing something dangerous. As the administration’s disastrous experience in Iraq amply demonstrates, that is still the wisest course and the one that keeps America most secure in an increasingly dangerous era.

The terrorist attacks of Sept. 11, 2001, plainly ushered in a new era of catastrophic threats to the American homeland. If these are to be met effectively, major changes in national security policy will be required. But a shift toward preventive wars is not one of them.30

Some writers continued to favor preemption, despite (or because of) the Iraqi experience. Another article in the Los Angeles Times predicted that preemption as an option “will not go away” and that even Kofi Annan was now suggesting that discussions begin about “the criteria for an early authorization of coercive measures to address certain types of threats.”31 Professor Ruth Wedgwood observed that the “question remains whether a state can ever resort to the use of preventive force in unique cases—when intelligence is reliable and timing is sensitive, multilateral authorization is not practically available, and a state is sponsoring or hosting a network acquiring weapons of mass destruction. . . . [T]he abstract answer to many strategists is yes—a given regime might have a record of conduct so irresponsible and links to terrorist groups so troubling that the acquisition of WMD capability amounts to an unreasonable danger that cannot be abided.”32

Miriam Sapiro, in an article in the American Journal of International Law, suggested: “Perhaps Iraq will be the case that highlights the risks of adopting a new doctrine of preventive war, while persuading skeptics that the constraints governing the more traditional justification of anticipatory self-defense make sense.”33

Whether or not Iraq will persuade policy makers that the risks associated with preventive war outweigh its benefits—at least as a general matter—there is little doubt that our first full-scale preventive war will have considerable impact on how we think about preventive and preemptive warfare. It may, however, require some distance from the passions generated by that controversial and divisive war, and some historical perspective, before the lessons of Iraq can be fully understood and incorporated into a general jurisprudence. In the meantime, the process of evaluating an ongoing war will continue even if it provides only a first draft of the ultimate historical conclusions.

Evaluating Decisions Relating to Iraq

Applying the criteria outlined in this book to the United States’ Iraq policy requires that we examine a number of factors that were involved, or should reasonably have been known, at the time that critical decisions were made and implemented. Subsumed within this broad framework are issues relating to the gathering, evaluation, and dissemination of intelligence, both within the government and to the public.

We should begin by asking whether, had all the allegations relied on by the government turned out to be accurate, the action would have been justified. We know, in retrospect, that many, perhaps most, of these allegations could not be confirmed: No weapons of mass destruction were found; no hard links between the specific al-Qaida attacks of 9/11 and the Saddam Hussein regime could be established; no proof exists that more innocent lives were saved by toppling the Hussein dictatorship than by allowing him to continue in his murderous ways; and there was no evidence of American “liberators” being widely greeted with appreciative gestures rather than with determined resistance. On a more general level, it is difficult to prove or disprove the claim that in the long run, democracy will be promoted, terrorism reduced, and other broad values served by our actions.

But what if all these allegations had proved to be valid? It is nearly impossible now to conjure the image of Iraqi citizens lining up to shower our troops with flowers, as captured nuclear weapons are paraded through the streets, victorious Americans produce hard evidence of direct links between the toppled dictator and Osama bin Laden, and the American military ends its brief occupation while Iraq quickly moves on the road to democracy, followed by Syria, Iran, and Saudi Arabia. However, let us suspend the bitter reality for a moment and imagine a utopian result. Even then the case for a full-scale preventive invasion is subject to debate.

In the run-up to the attack I personally believed most of the administration’s claims: the presence of weapons of mass destruction, including nuclear weapons; possible links to terrorists sufficient to cause at least some concern about Iraqi weapons of mass destruction getting into the hands of some terrorists; a more welcoming reception by most Iraqis to the ending of a brutal dictatorship; and a possible warning to other Islamic and Arab dictatorships that their days were numbered. I did oppose a full-scale invasion of Iraq, but my conclusion was not open-and-shut. I publicly stated that my opposition was “a 51–49 percent” matter, with the deciding 2 percent based on the law of unintended consequences.* I made references to the transition from the dictatorship of the relatively secular shah of Iran to the far worse dictatorship of the religiously extremist cabal that succeeded him, and I pointed out that in general it is better to deal with secular dictators who fear death than with religious zealots who welcome it. I worried about a long occupation, with many casualties on both sides and with the strengthening of Islamic fundamentalism (such as occurred in Israel, following the long-term occupation of the Gaza and West Bank).

I claim no special prescience, only the general pessimism that results from long experience watching the best-laid (and best-intentioned) plans of men and women go awry. The law of unintended consequences is more powerful, more certain, and more enduring than most other human laws. It must be factored into any predictive equation regarding the consequences of complex human actions. But there are unintended consequences to inaction as well, as the tragic events of World War II demonstrate.

There are always lessons to be learned from a military action that has gone wrong, as the invasion and occupation of Iraq seem to demonstrate. These lessons must become part of our collective thinking as we move toward constructing a jurisprudence and morality of preventive and preemptive war. The lessons of Iraq will also have an influence on any decision regarding preemptive or preventive military action against Iran. But it would be a mistake to build an entire jurisprudence on one experience that has divided reasonable people and entire nations. The lessons of Iraq are ongoing and incomplete. The end of the story has not yet been written in the blood of the many who will yet suffer. Elections, suicide bombings, trials, the establishment of a stable government, assassinations, progress toward a constitution, victories, defeats—all these occur on a daily basis. The one lesson that is clear is that wars of this kind—unlike singular preemptive attacks—rarely go smoothly or come without very high costs. There are other lessons as well, ones that mandate extreme caution before embarking on full-scale preventive wars, invasions, and occupations but that also mandate that no option, not even full-scale preventive war, should be taken off the table if the dangers are grave enough, if the certainty is high enough, and if other options are unrealistic enough.

One vexing question—and perhaps another testing case—is how to apply these lessons and cautionary principles to the far more difficult case of the multiple nuclear facilities in Iran, which are protected, spread out, and deliberately located near population centers. The case is also more complex because of the different internal dynamics within Iran. It is this issue to which we now turn, before we try to articulate a general jurisprudence capable of informing preemptive and preventive decisions.

* I also said that I did not regard the decision to invade, despite my opposition to it, as a “marchable” event, warranting shrill public protest, because I believed that reasonable people could disagree about its wisdom and morality.