CHAPTER 1

THE FOREIGN POLICY OF AMERICAN PRIMACY

What a difference two decades can make. In 1980, America was obsessed with its rivalry with the Soviet Union and brought cold-warrior Ronald Reagan to Washington to make sure it didn’t fall dangerously behind its communist adversary. Ten years later, with the Soviet Union imploding, the United States found itself without a Great Power rival but quickly began to worry about Japan’s seemingly unstoppable economic momentum. These fears evaporated when the Japanese bubble burst, and the sclerotic Japanese political system seemed unable to implement meaningful reforms. While Japan stagnated, the United States enjoyed eight years of robust economic growth, its preeminence in the world more apparent with each passing year. Most of the world’s major powers were now U.S. allies, and its principal enemies were a handful of minor “rogue states” and a shadowy terrorist network whose ambitions and abilities were at that point only partly understood. As the new millennium dawned, the United States stood alone at the pinnacle of world power.

Although careful, at first, not to risk too much blood or treasure, America did not simply sit back and savor its privileged position. Rather, U.S. leaders saw the unchecked power at their disposal as an opportunity to mold the international environment, to enhance the U.S. position even more, and to reap even greater benefits in the future.

How did the United States do this? In broad terms, America’s leaders have sought to persuade as many countries as possible to embrace their particular vision of a liberal-capitalist world order. States that welcomed U.S. leadership were rewarded; states that resisted it were ignored or punished. America’s overarching objective, reports Richard N. Haass, former head of the State Department’s Policy Planning Staff and now president of the Council on Foreign Relations, was to integrate other countries “into arrangements that will sustain a world consistent with U.S. interests and values and thereby promote peace, prosperity, and justice.”1

These broad objectives were pursued by all three post–Cold War presidents—George H. W. Bush, William Clinton, and George W. Bush—though critical differences exist between the approaches undertaken by the first two and the third. In general, neither the main goals of U.S. foreign policy nor the strategies used to achieve them changed fundamentally from the first Bush administration to the Clinton administration. Each sought to preserve or increase U.S. power and influence, to prevent the spread of weapons of mass destruction, to further liberalize the world economy, and to promote the core U.S. values of democracy and human rights. Both administrations pursued these goals by working within the preexisting Cold War order—and especially the multilateral institutions created since 1945—while seeking to maximize U.S. influence within these arrangements. Although the absence of a unifying threat and concerns about U.S. power created occasional frictions with other countries—including some long-term U.S. allies—America’s standing in most of the world was quite positive as the twentieth century ended.

By contrast, President George W. Bush’s approach to foreign policy marked a clear departure from the policies of his predecessors. Although Bush also sought to enhance U.S. power, oppose the spread of weapons of mass destruction, and promote an open world economy and democratic values, his administration was more skeptical of existing international institutions—including America’s Cold War alliances—and far more willing to “go it alone” in foreign affairs. Convinced that other states would follow if U.S. leadership was clear and uncompromising, and emboldened by a surge of domestic and international support following the terrorist attacks on September 11, 2001, Bush chose to use American power—and especially its military power—to eliminate perceived threats and to promote U.S. ideals around the world. Bush’s basic goals were not radically different from those of his predecessors, but his willingness to use U.S. muscle to achieve them—and to act alone—was new, and startling. Predictably, this new approach to foreign policy alarmed many other countries and sparked a steady decline in the U.S. image abroad.

The Foundations of American Primacy

The U.S. position in the current world order is best understood as one of primacy. The United States is not a global hegemon, because it cannot physically control the entire globe and thus cannot compel other states to do whatever it wants.2 Indeed, as we shall see throughout this book, even relatively weak states retain considerable freedom of action in the face of U.S. power. Nonetheless, the United States is also something more than just “first among equals.” By virtually any measure, the United States enjoys an asymmetry of power unseen since the emergence of the modern state system. Some leading powers in the past had gained an advantage in one dimension or another—for example, in 1850 Great Britain controlled about 70 percent of Europe’s wealth while the number-two power, France, controlled only 16 percent—but the United States is the only Great Power in modern history to establish a clear lead in virtually every important dimension of power.3 The United States has the world’s largest economy, an overwhelming military advantage, a dominant position in key international institutions, and far-reaching cultural and ideological influence. Moreover, these advantages are magnified by a favorable geopolitical position.4 If primacy is defined as being “first in order, importance or authority,” or holding “first or chief place,” then it is an apt description of America’s current position.5

Economic Dominance

Economic strength is the foundation for national power. The United States has been blessed with the world’s largest economy for over a century. The U.S. share of global production ballooned to nearly 50 percent after World War II—reflecting the damage that other countries had suffered during the war—and then gradually declined as the rest of the world recovered. Nevertheless, it has hovered between 25 and 30 percent from 1960 to the present, and the U.S. economy is still roughly 60 percent larger than its nearest rival, Japan.6 Moreover, the U.S. economy has grown more rapidly than the economies of most of the other major industrial powers since the mid-1990s.

The U.S. economy is also more diverse and self-sufficient than the other major economic powers, making it less vulnerable to unexpected economic shifts. As Thomas Friedman and others have emphasized, American society remains unusually open to immigration and innovation, which makes it more adept at adapting to new conditions.7 According to MIT’s “Technology Review Patent Scorecard,” American companies were the top innovators in the automotive, aerospace, biotechnology, pharmaceutical, chemical, computer, telecommunications, and semiconductor industries between 1998 and 2002, and trailed Japan (the number-two innovator) only in electronics.8 Even if these trends were to reverse, it would take several decades before any other country could acquire a gross national product (GNP) equal to America’s, let alone a similar combination of size and per capita income.9

The size and diversity of the U.S. economy give it considerable political leverage. Although the United States is more dependent on the outside world than it was a generation ago, it still depends far less on others than they depend on it. In 2000, for example, only three countries had lower ratios of trade to gross domestic product (GDP) than the United States, and only one of them was a major military power.10 This relatively low level of interdependence insulates the United States from foreign pressure and strengthens America’s political clout. Does anyone really wonder why China behaved with such restraint following the forced landing of a U.S. reconnaissance aircraft on Chinese soil in April 2001? The answer is simple: Chinese exports to the United States were a whopping 5 percent of Chinese GDP (19 percent of total Chinese exports) and critical to Chinese economic growth. U.S. exports to China, by contrast, were a mere 0.16 percent of U.S. GDP.11 Because Beijing could not afford a disruption in economic ties, it agreed to release the crew and return the damaged plane to the United States after offering little more than a token protest over U.S. spying.

A state can be wealthy without being powerful, of course—think of Brunei, Kuwait, or Switzerland—but it is impossible to be a Great Power without a large and diverse economy. In particular, a strong economy enables a state to create and equip a powerful military force. Today, the United States is not only the world’s foremost economic power; it is clearly the dominant military power as well.

Military Supremacy

While America’s economic advantages are manifold, its military lead is simply overwhelming. Virtually all the major powers (except China) reduced their defense spending when the Cold War ended, and many continue to do so, but the decline in the U.S. military budget was smaller than in most other countries. U.S. defense spending turned upward again in the late 1990s and has continued to rise ever since. As a result, U.S. defense expenditures in 2003 were nearly 40 percent of the global total and almost seven times larger than those of the number-two power (China). To put it another way, U.S. defense spending was equal to the amount spent on defense by the next thirteen countries combined. And because many of these countries are close U.S. allies, these figures if anything understate the U.S. advantage.12 The United States also spends more to keep itself in the vanguard of military technology. The U.S. Department of Defense now spends over $50 billion annually for “research, development, testing, and evaluation,” an amount larger than the entire defense budget of Germany, Great Britain, France, Russia, Japan, or China.13

The relative efficiency of the U.S. military increases this daunting gap in military investment even more. For all the complaints about “waste, fraud, and abuse” and “no-bid” contracts in the Pentagon, the United States gets more battlefield bang for its defense bucks than other major powers do. For example, the combined defense budgets of America’s European allies are roughly two-thirds of the U.S. defense total, but the EU is not yet able to put 60,000 well-equipped troops in the field within sixty days and keep them there for a year. By contrast, the United States deployed more than 500,000 troops in the Persian Gulf for Operations Desert Shield and Desert Storm; mobilized substantial air, ground, and naval forces in Kosovo in 1999 and in Afghanistan in 2001; and then deployed more than 180,000 troops and other personnel to topple Saddam Hussein in 2003.

America’s military preeminence is both reflected by and enhanced by its global military presence. As of 2004, the United States had roughly 250,000 soldiers, sailors, and airmen deployed in more than a hundred countries. It has 1,000 or more troops in at least a dozen countries, not counting the forces currently occupying Iraq.14 Smaller contingents are also active in dozens of countries, and the United States provides military training for personnel from over 130 countries.15 The United States maintains hundreds of military bases and other facilities around the world, with an estimated replacement value of $118 billion.16 The United States has the largest and most sophisticated arsenal of strategic nuclear weapons, and it is the only country with a global power projection capability, stealth aircraft, a large arsenal of precision-guided munitions, and integrated surveillance, reconnaissance, and command-and-control capabilities.17 U.S. military personnel are also far better trained.18

U.S. success in its post–Cold War military endeavors thus needs little further explanation. The United States has fought three opponents since 1990: Iraq (twice), Serbia, and the Taliban regime in Afghanistan. Each of these wars was a gross mismatch: total U.S. defense spending was more than fifty times greater than Iraq’s, more than two hundred times greater than Serbia’s, and more than a thousand times that of the Taliban. The occupation of Iraq reminds us that supremacy on the battlefield does not guarantee effective postwar reconstruction or an untroubled peace, but the United States can still be confident about its ability to defeat any other country in a direct test of military strength.

Given these disparities, the United States could have defeated any of its recent foes without active military assistance from any other country. Indeed, the “coalitions” that the United States has organized and led during this period have been decidedly one-sided affairs. With the partial exception of Great Britain, its various allies have provided token forces largely for symbolic purposes. By 2001, the United States was refusing to let even its closest allies take on meaningful combat roles in Afghanistan (except for postwar peacekeeping), so that it would not have to coordinate its military activities with any other country. Historian Paul Kennedy correctly termed this a “Potemkin alliance,” where “the U.S. does 98 percent of the fighting, the British 2 percent, and the Japanese steam around Mauritius.”19 The U.S. Air Force performed the lion’s share of the patrol duties over the “no-fly zones” in Iraq (with a modest assist from Great Britain), and the U.S. military has also provided logistical support for peacekeeping operations in Africa, East Timor, and elsewhere. The gap was perhaps most apparent in the invasion of Iraq: the United States supplied over 80 percent of the occupying force and used over 10 percent of its total military manpower. By contrast, the other members of the coalition used less than 1 percent of their personnel.20

Clearly, no single state can hope to match the combined U.S. economic and military capabilities, and even a large coalition would find it difficult to amass a comparable portfolio of power. This imbalance does not mean, however, that the United States can simply issue orders to the rest of the world and expect instant obedience, as the recent U.S. experiences in Iraq, Iran, and North Korea remind us, and the strains of its current activities could undermine U.S. superiority over the longer term. For the foreseeable future, however, America’s military predominance will be an essential element of its position of primacy.

Institutional Influence

States use international institutions to coordinate and regulate certain joint activities. Peacekeeping, for example, falls under the auspices of the United Nations; international trade is regulated by the World Trade Organization (WTO); and international finance and global development are guided (in part) by the International Monetary Fund (IMF) and the World Bank. Other international organizations deal with global health issues, the environment, and a host of other common concerns. Institutions neither substitute for state power nor exert powerful constraints on Great Power behavior, but they can provide useful mechanisms for overcoming dilemmas of collective action and other obstacles to multilateral cooperation.21

The norms and rules that govern these institutions prevent any single state (or group of states) from controlling them completely, yet the United States plays a unique role in the most important global organizations. In the United Nations, the United States is one of five permanent members of the Security Council, and thus retains veto rights over all matters falling within the council’s purview. Because of its military dominance, the United States can also ignore the Security Council when it wants to, as it did in waging war in Kosovo in 1999 and in Iraq in 2003. The United States is also the dominant power within the North Atlantic Treaty Organization (NATO). The position of NATO Supreme Allied Commander Europe (SACEUR) is always held by an American officer—who also commands all U.S. forces in the European theater. Further, the United States can at any time reduce its commitment to Europe’s security because it is not on the same continent; the Europeans do not have that option. Although “political consultation does exist within the North Atlantic Council,” notes Guillaume Parmentier, the former head of external relations at NATO headquarters, “these negotiations are very often no more than a formality…. It has almost always been American initiatives that have brought about changes in the Alliance.”22

More important, the rules governing the operation of many global institutions strengthen the hands of U.S. officials. In addition to its Security Council veto, for example, the United States also contributes the largest share (22 percent) of the UN’s annual budget.23 When the United States withholds some of its assigned share (as it did in the 1990s), the shortfall has a direct effect on what the UN is able to accomplish. When a U.S. president decides that the UN secretary-general is acting contrary to U.S. interests—as Bill Clinton did during the tenure of Boutros Boutros-Ghali—he can orchestrate his or her removal.24 Similarly, U.S. contributions to IMF and World Bank capital subscriptions entitle the United States to a voting share on their executive boards sufficient to veto any major policy change. Thus, while the United States does not “control” these institutions, both are highly sensitive to Washington’s wishes. As one student of the IMF and the World Bank puts it, “The record of lending from both institutions strongly suggests a pattern of U.S. interests and preferences.”25

Cultural and Ideological Impact

Another key advantage for the United States is its ability to shape the preferences of others—to make them want what America wants—through the inherent attractiveness of U.S. culture, ideology, and institutions. This “soft power” remains hard to define or measure, but there is little doubt that the United States casts a long cultural and ideological shadow over the rest of the world.26 Not only is English increasingly the lingua franca of diplomacy, science, and international business, but the American university system is a potent mechanism for socializing foreign elites.27 There were nearly 600,000 foreign students studying at U.S. universities in 2002–3, for example, roughly double the total from two decades earlier.28 In addition to becoming familiar with U.S. norms, foreign students in the United States absorb prevailing U.S. attitudes about politics and economics, especially the emphasis on competitive markets, democratic institutions, and the rule of law.29 As Die Zeit editor Josef Joffe puts it, “If there is a global civilization, it is American. Nor is it just McDonald’s and Hollywood, it is also Microsoft and Harvard. Wealthy Romans used to send their children to Greek universities; today’s Greeks, that is, the Europeans, send their kids to Roman, that is, American, universities.”30 Even in the Arab world, where the United States is presently unpopular, America’s educational institutions continue to attract students and continue to serve as an inspirational model.31 This sort of influence can have unexpected payoffs: for example, Libya’s decision to end its prolonged isolation and to abandon its efforts to acquire weapons of mass destruction was due in part to the influence of reform-oriented Prime Minister Shukri Ghanem, who received a PhD in economics from the Fletcher School of Law and Diplomacy in Medford, Massachusetts.32

The effects of America’s preeminent position in higher education are reinforced by the pervasiveness of U.S. mass media.33 As of 2004, the top twenty-five highest-grossing films of all time were U.S. productions, even if one omits U.S. ticket sales and looks solely at foreign revenues. American consumer products and brand names are ubiquitous, along with U.S. sports and media figures. The end of the Cold War, followed by the economic boom of the 1990s, made free markets and democratic governance more appealing worldwide. American primacy, therefore, extends into the cultural realm, though the ultimate impact of this level of penetration is unclear. While American ideas and cultural icons seem to enjoy greater appeal today than in the past, many societies remain wary of “Americanization,” and a few are openly hostile to it.

The Blessings of Geography

Economic, military, institutional, and cultural dominance may define U.S. primacy, but its geopolitical situation is the icing on the cake. The United States is the only Great Power in the Western Hemisphere, and it is physically separated from other major powers by two enormous oceanic moats. As Jules Jusserand, French ambassador to the United States from 1902 to 1925, once observed, America is “blessed among the nations. On the north, she had a weak neighbor; on the south, another weak neighbor; on the east, fish; and on the west, fish.”34 America’s geographic separation from the other major powers reduces the threat of direct invasion while mitigating the sense of threat that U.S. power might pose to others, thereby lowering their incentives to join forces against the United States. And because the other major powers lie in close proximity to one another, they are inclined to worry more about each other than they do about the United States. In fact, the United States has long been the perfect ally for many Eurasian states. Its power ensures that its voice is heard and its actions felt, but it lies a comfortable distance away and does not threaten to conquer its allies. Over thirty years ago, Chinese leader Mao Zedong justified the rapprochement between China and the United States by saying, “Didn’t our ancestors counsel negotiating with faraway countries while fighting with those that are near?” Ten years ago, a European diplomat told an American scholar that “A European power broker would be a hegemon. We can agree on U.S. leadership, but not on one of our own.”35

Geography also helps explain why it would be difficult to conjure up an anti-American coalition combining Russia, China, and India. These states share lengthy borders and troubled pasts and still regard each other with considerable suspicion. Although one cannot entirely rule out the possibility of a grand anti-American coalition were the United States to behave too aggressively, it would take a remarkable feat of American diplomatic incompetence to bring it about.

How Has the United States Used Its Power?

The Virtues of Primacy

It may be lonely at the top, but the view is compelling. Having achieved a preeminent global position, U.S. leaders have been eager to preserve and protect it. They understand, as do most Americans, that primacy confers important benefits. Primacy makes other states less likely to threaten the United States or its vital interests, and it gives the United States the power to defend these interests if challenges do arise. By dampening Great Power competition and giving the United States the capacity to shape regional balances of power, primacy also contributes to a more tranquil international environment. That tranquility in turn fosters global prosperity, because investors and traders can operate more widely when the danger of war is remote. Primacy also gives the United States a greater capacity to work for positive ends—the advancement of human rights, the alleviation of poverty and disease, the control of weapons of mass destruction, etc.—although it provides no guarantee of success.

The United States relishes its position of primacy, and no presidential candidate ever campaigns on a pledge to “make America number two.” Instead, the central goal of U.S. foreign policy since the end of the Cold War has been to consolidate and, where possible, to enhance its preeminent position. The United States has acted for humanitarian reasons on occasion—as when it intervened in Bosnia and Kosovo, for example—but the main thrust of its policy has been to advance its own interests and increase its relative power.

As we shall see, however, there are important differences between the exercise of primacy by George H. W. Bush and Bill Clinton on the one hand and George W. Bush on the other. All three have worked to improve the U.S. position and to use the power at their disposal to shape the world in the U.S. image, but they have done so in markedly different ways.

Primacy and Security: Promoting a Favorable Imbalance of Power

Preserving U.S. dominance. International politics is a dangerous business, and having more power is generally preferable to having less. Being the strongest state does not protect the United States from all dangers—September 11, 2001, certainly proved that—but it does give the United States a greater capacity to respond to dangers that do arise. Primacy also deters many challenges to U.S. interests, because potential adversaries know that the United States is strong enough to respond if pressed. Secretary of Defense Donald Rumsfeld captured this benefit of primacy perfectly by saying, “We want to be so powerful and so forward-looking that it is clear to others that they … ought not to be doing things that are imposing threats and dangers to us.”36

American primacy did not just happen, of course. U.S. leaders recognized America’s potential strength from the first days of independence, and successive presidents focused on expanding across a continent, preserving national unity, building industrial power, and eventually excluding other Great Powers from the Western Hemisphere—all with an eye toward establishing the United States as the dominant power in its own hemisphere and as a major power on the global stage. Good luck played a role in America’s rise to world power, but it was also the conscious objective of U.S. policy.37

The same perspective has informed U.S. foreign policy for the past half-century. Indeed, U.S. leaders sought a preponderance of power in America’s favor from the very beginning of the Cold War. As the Policy Planning Staff at the State Department concluded in 1947, “[T]o seek less than preponderant power would be to opt for defeat. Preponderant power must be the object of U.S. policy.”38 Thus, the collapse of the Soviet Union was not just a lucky accident of history; it was the intended result of four decades of sustained U.S. effort. Although “containment” was the official label of Cold War policy, the real U.S. objective was Communism’s complete defeat.39

When the irreversible erosion of Soviet power eventually forced Mikhail Gorbachev to offer unprecedented concessions in the mid-1980s, the United States didn’t respond by offering reciprocal concessions or by trying to keep bipolarity intact. Instead, U.S. leaders increased their demands as Moscow’s position grew weaker and eventually negotiated the dissolution of the Warsaw Pact and the reunification of Germany despite strong Soviet misgivings. When Gorbachev complained in 1986 that “U.S. policy is one of extorting more and more concessions,” U.S. Secretary of State George Shultz merely replied, “I’m weeping for you.”40 And when the Soviet Union began to come apart in 1990, the United States did not lift a finger to hold it together. Instead, President George H. W. Bush welcomed the disappearance of a longtime rival, believing that “the best arrangement would be diffusion, with many different states, none of which would have the awesome power of the Soviet Union.” The result, Bush later recalled, was the United States “standing alone at the height of power … with the rarest opportunity to shape the world.”41

The desire for primacy also explains why the United States did not dismantle its vast military arsenal once the Soviet Union collapsed. Instead, each U.S. administration made it clear that it intended to maintain a clear margin of superiority over the rest of the world. In 1992, George H. W. Bush’s administration prepared a draft “Defense Planning Guidance,” arguing that the United States “must sufficiently account for the interests of the advanced industrial nations to discourage them from challenging our leadership or seeking to overturn the established political and economic order.” The way to accomplish this goal was equally clear: The United States should maintain military capabilities large enough to discourage potential rivals from even trying to compete.42

The Clinton administration placed less rhetorical emphasis on U.S. military power, but it also sought to “maintain the best-trained, best-equipped and best-led military force in the world” and to ensure “that U.S. forces continue to have unchallenged superiority in the 21st century.”43 This goal became even clearer under George W. Bush, whose official National Security Strategy declared that U.S. military forces will be “strong enough to dissuade potential adversaries from pursuing a military build-up in hopes of surpassing, or equaling, the power of the United States.”44

The desire for primacy can also be seen in U.S. efforts to limit the spread of weapons of mass destruction (WMD) and especially nuclear weapons, while maintaining a large and sophisticated nuclear arsenal of its own. Because the United States has the world’s strongest conventional forces, halting or slowing the spread of WMD is obviously in its own self-interest. If weaker states obtain WMD, they may be able to deter the use of U.S. conventional forces, thereby undercutting U.S. military superiority and eroding U.S. global influence.45 U.S. leaders also fear that rogue states armed with WMD might behave more aggressively, thereby making regional conflict more likely and threatening U.S. interests directly, or that the spread of WMD would make it easier for terrorists to obtain them.46

For all these reasons, the United States has conducted a broad campaign to limit the spread of WMD and to maintain a dominant position at the highest levels of strategic competition. It used its economic and political clout to persuade the former Soviet republics of Kazakhstan, Belarus, and Ukraine to give up the nuclear arsenals they had inherited when the Soviet Union dissolved. The United States subsequently launched a broad array of programs to bring Russia’s vast and poorly monitored arsenal under more reliable control.47 The United States supported the permanent extension of the Nuclear Non-Proliferation Treaty (NPT) in 1995 and actively tried to discourage countries such as Libya, North Korea, India, and Pakistan from pursuing a nuclear option.48 The United States has also pressured Russia not to sell a nuclear reactor to Iran and has favored strong measures to persuade Iran to end its clandestine nuclear programs. These efforts have been accompanied by a parallel campaign to deny potential proliferators access to modern ballistic missile technology, which would merely compound the perceived threat from WMD.49

The final element in the U.S. quest to preserve its position of primacy is its own strategic weapons program. The United States has made significant cuts in its nuclear arsenal since the end of the Cold War, but it has done this primarily by retiring outmoded or superfluous systems. At the same time, it has continued to modernize its remaining strategic forces while negotiating parallel reductions in Russia’s aging arsenal. In addition, the first Bush administration and the Clinton administration continued to fund research and development for missile defenses, and the second Bush administration took the further step of abrogating the 1972 Anti-Ballistic Missile Treaty (despite strong opposition from many U.S. allies) and declared that the United States would begin deploying missile defenses as soon as they could be developed. The second Bush administration also completed a Nuclear Posture Review in 2002 recommending development of a new generation of nuclear weapons, including bunker-busting “mini-nukes.”50

The underlying purpose of these initiatives, as several analysts have noted, is to preserve U.S. superiority and thus maintain America’s overall freedom of action.51 At a minimum, overwhelming nuclear superiority—including the development of some form of national missile defense—is intended to keep so-called rogue states from deterring the use of U.S. conventional forces by threatening to launch a handful of nuclear weapons at U.S. territory. At a maximum, the combination of highly accurate offensive forces, an increasingly unreliable Russian arsenal, and improved strategic defenses might provide the United States with a genuine “first-strike” capability. Needless to say, countries with smaller nuclear arsenals would be even more vulnerable. Even if U.S. leaders cannot be 100 percent sure that such an attack would succeed, other states may fear this possibility and be more inclined to defer to U.S. wishes. America’s strategic ambitions even extend into outer space. As Undersecretary of the Air Force Peter B. Teets told an Air Force Association exposition in September 2004, “the fact is, we need to reach for that goal [of space supremacy]. It is the ultimate high ground.”52

The bottom line is clear: The United States remains committed to maintaining—and, if possible, enhancing—its position of primacy, at virtually all levels of strategic competition.

Expanded global influence. A similar desire to improve its global position can also be seen in U.S. alliance policy. When the Soviet Union collapsed, U.S. leaders might have concluded that there was no longer a major threat to “contain,” and thus less need for a global network of formal military alliances. Indeed, this logic led many Europeans and Asians to worry about a precipitous U.S. withdrawal. Instead of liquidating its existing commitments, however, the United States chose to expand its formal ties in Europe and Asia while retaining its central leadership role.

In Europe, the United States initiated and led a process of expansion that increased NATO from sixteen to twenty-six members by 2004.53 At the same time, the United States remained wary of developments that might diminish its influence. In particular, U.S. officials sought to prevent other European security institutions from supplanting NATO, because NATO was still seen as the best vehicle for retaining U.S. influence on European security issues. The role of SACEUR remained in U.S. hands, and the United States blocked a French proposal to assign NATO’s Southern Command to a European officer in 1996.54 American officials in all three post–Cold War administrations remained ambivalent about European efforts to strengthen their own defense capabilities, warning that such efforts would be desirable only if they did not “undermine” the alliance.

The demands that the Balkan wars placed on NATO brought the dominant position of the United States into sharp relief. The United States controlled the negotiations that ended the Bosnian civil war and sent the largest contingent to the stabilization force that arrived to implement the accord. The United States also took the lead role in the negotiations that led to war against Serbia in 1999 and designed the military strategy that eventually brought Serbia’s capitulation. America’s European allies complained during both episodes, but they could do little to stop the United States from imposing its preferences upon them.55

Maintaining a favorable position in Asia has been equally important. Despite fears that the Soviet collapse would lead to a U.S. withdrawal, the United States has maintained a substantial military presence and worked to strengthen its security relations with several regional powers.56 In addition to renegotiating its security treaty with Japan and maintaining its long-standing alliance with South Korea, the United States also opened diplomatic relations with Vietnam in 1995 and obtained expanded basing rights in Singapore in 1999.

China’s economic expansion poses a potential long-term threat to U.S. primacy, and the United States has tried to limit the impact of this trend in two ways. On the one hand, the United States has gone to considerable lengths to encourage China to integrate itself within global institutions such as the World Trade Organization (WTO); on the other hand, it has also sought to deter China from taking aggressive action against Taiwan or against other U.S. allies in the region.57 Concerns about China’s potential have been more evident under George W. Bush, who took office declaring that China was “a competitor, not a strategic partner,” and pledging that Beijing would be “unthreatened but not unchecked.”58 This preoccupation with China was partly eclipsed by September 11 and the new focus on terrorism, but Bush still sought to discourage China from becoming a “peer competitor.” Accordingly, Bush’s 2002 National Security Strategy welcomed the emergence of a “strong, peaceful and prosperous China” but warned that if China tried to obtain “advanced military capabilities,” it would be following an “outdated path” that “would hamper its pursuit of national greatness.”59 In Asia, as in Europe, the United States wants to remain the dominant power.

The post–Cold War era also saw the United States taking a more active role in shaping the balance of power in the Middle East and Persian Gulf. During the Cold War, the United States sought to defend its core interests—the security of Israel and access to oil—primarily by supporting local allies (Israel, Saudi Arabia, Iran, Jordan, etc.) while avoiding direct military involvement in the region. When serious threats arose—as in the aftermath of the 1979 Iranian revolution—the United States was also willing to give support to local dictatorships such as Saddam Hussein’s Iraq.60 In general, however, the United States maintained a modest military presence in the region, and its forces did not participate in direct combat operations.61

This detached approach ended when Iraq invaded Kuwait in August 1990. To prevent Iraq from threatening other oil producers or using Kuwait’s oil revenues to amass even more military power, the United States led a multinational coalition to expel Iraq from Kuwait and force Saddam Hussein to dismantle his nuclear, chemical, and biological weapons programs under the watchful eyes of UN weapons inspectors. Once the Gulf War was over, the United States increased its military presence in the region and kept these forces busy.62 U.S. aircraft patrolled Iraqi airspace continuously after the end of the Gulf War, and U.S. and British aircraft attacked Iraqi facilities repeatedly. The United States also took the lead role in maintaining economic sanctions against the Iraqi government (despite dwindling international support), in order to prevent Iraq from rebuilding the military forces and weapons programs that had been destroyed or dismantled as a result of the Gulf War. At the same time, the United States also sought to prevent Iran from increasing its own military capabilities—through a policy known as “dual containment”—largely by denying it easy access to advanced weapons technology.63 What was not fully recognized at the time, unfortunately, was the emergence of an Islamic terrorist group—al Qaeda—partly inspired by opposition to the U.S. presence on “holy” Muslim soil.

Thus, the Gulf War and its aftermath were not just vehicles to restore the status quo ante; rather, the United States saw these events as opportunities to tilt the regional balance in its favor. Under George H. W. Bush and Clinton, however, the United States did so as part of a multilateral effort, and did not try to remove existing regimes from power. As discussed at great length below, George W. Bush’s administration took a more ambitious approach to the Middle East and the Gulf, and eventually chose to use U.S. military power to begin a far-reaching transformation of the region.

Primacy and Prosperity

It is sometimes said that interdependence produces peace, but it may be more accurate to say that peace encourages greater interdependence.64 By fostering a more stable world, therefore, U.S. primacy also encourages global prosperity. Investors are more willing to send capital abroad when the danger of war is remote, and states worry less about being dependent on others when they have little reason to fear that these connections might be severed. When states are relatively secure, they will also be less concerned about how the benefits from cooperation are distributed. By this argument, U.S. primacy creates political conditions conducive to expanding international trade and investment.

U.S. leaders have also sought to foster an international economic order that would enhance U.S. prosperity and advance other U.S. interests. Specifically, the United States has used its power to reduce barriers to trade and investment, and to create and maintain the institutions on which the current international economic order rests. In doing so, it has also sought to persuade other countries to adopt domestic institutions that are compatible with basic U.S. practices. Accordingly, what we now think of as “globalization” is itself partly an artifact of U.S. primacy.65

Trade policy. As the world’s largest economy, the United States has a powerful interest in free trade. Reducing artificial barriers to free trade brings lower prices for U.S. consumers and facilitates U.S. exports, thereby raising U.S. living standards and fostering U.S. economic growth. Accordingly, the United States has generally been a proponent of trade liberalization, except in particular sectors with significant domestic political influence (e.g., steel, textiles, certain farm products, etc.) that would be hurt by increased foreign competition.

On balance, the United States has pursued the goal of continued liberalization during the first decade of the unipolar era. Notable achievements include the North American Free Trade Agreement (NAFTA), which was signed by President George H. W. Bush and ratified under President Clinton, and the successful completion of the Uruguay Round in 1994, which established the World Trade Organization. The United States also helped organize and convene other global and regional forums (such as the Asia Pacific Economic Cooperation group (APEC) and the Transatlantic Economic Partnership (TEP) forum), and conducted an ambitious export promotion policy. All three post–Cold War administrations have negotiated bilateral trade agreements with particular countries and have used the threat of retaliatory sanctions to persuade other countries to open their markets to U.S. goods.66 The results were generally favorable: U.S. exports increased by about 11 percent between 1992 and 1999, and this expansion accounted for roughly 20 percent of U.S. GDP growth during that time.67

The liberal theory of free trade tells us that all countries benefit from reducing barriers to trade, and U.S. efforts to create a more liberal international economic order were probably not intended to make the United States richer and others poorer. Yet the agreements that were reached were particularly favorable to U.S. interests. In addition to lowering tariffs on industrial and manufactured goods, for example, the Uruguay Round also extended trading rules to the service and agriculture sectors, provided new guarantees for intellectual property rights, and reached a partial agreement on “trade-related investment measures,” in each case covering an area where the United States had a strong competitive position. It also created a new dispute-resolution system that favors the United States, because the new system depends in part on each state’s ability to retaliate and the United States has by far the world’s most potent retaliatory capacity. The Uruguay Round also took aim at a number of nontariff barriers to trade, which other countries have relied upon more than the United States has, and the United States also stood to reap disproportionate benefits from agreements on telecommunications, information technology, and financial services, because the United States enjoys a competitive advantage in these sectors.

By contrast, the United States successfully resisted efforts to end agricultural subsidies, and although it did agree to phase out the protectionist Multi-Fiber Arrangement, its own tariffs on textiles remained high. Nor did the WTO agreement bar the United States from threatening unilateral sanctions or prevent U.S. trade negotiators from pursuing an array of bilateral and regional arrangements outside the WTO framework.

Furthermore, U.S. negotiators have increasingly insisted that trade agreements also cover areas that were previously regarded as purely domestic issues (e.g., labor standards, environmental protection, intellectual property, government regulations, etc.), and both NAFTA and the WTO incorporated these issues (albeit to varying degrees). As a result, these agreements require signatories to bring a host of domestic institutions and procedures into conformity with specific treaty-defined standards. For example, joining NAFTA forced Mexico to submit its enforcement of labor and environmental standards to international oversight, and China’s accession to the WTO required it to commit to major domestic economic reforms and a host of specific rules on tariffs, nontariff barriers, anti-dumping, subsidies, and technology transfers. Moreover, these adjustments have to be made rapidly, which increases the degree of internal disruption. The United States and Western Europe were already in compliance with many of these principles, so adjusting to them was comparatively easy.

Many of these reforms are desirable for their own sake, and entering a bilateral or multilateral agreement is one way for reformist politicians to “lock in” a more market-oriented approach. Yet the fact remains that these institutions were largely “made in America” (and, to a lesser extent, in Europe and Japan), and countries that joined later—such as China—have had to accept a set of rules that they had little role in shaping and that did not reflect their own ideal preferences.68

The use of U.S. economic clout was also apparent in what former Bush administration Trade Representative Robert Zoellick termed a strategy of “competitive liberalization.” Although the Bush team was committed to advancing the Doha Round of multilateral trade talks, it also pursued an ambitious array of bilateral trade negotiations.69 These bilateral talks were used to reward governments that were cooperating closely with the United States, and to encourage other states to adopt social and political institutions that the United States favored. The sheer size of the U.S. economy gave it considerable leverage over smaller exporting countries, because other states have to worry about being excluded from the U.S. market if they do not cooperate on U.S. terms.70 As Zoellick put it, “The strategy is simple: The U.S. is spurring a competition in liberalization. . . . Our FTAs [free trade agreements] are encouraging reformers … they establish prototypes for liberalization in areas such as services, e-commerce, intellectual property for knowledge societies, transparency in government regulation, and better enforcement of labor and environmental protections.”71 Thus, the Bush administration’s trade strategy sought to maximize U.S. leverage and encourage other states to adopt institutions and practices that are largely “made in America.”

International finance. Liberalization on U.S. terms has been the general goal in the realm of international finance as well. Capital mobility has increased dramatically in the past two decades, due to declining communications costs, the spread of market systems, and expanded trade.72 U.S. leaders have encouraged these trends, partly because they believed that integrating more countries into global financial markets would encourage more rapid economic growth and partly because U.S. banks and financial-services firms were likely to benefit from the removal of political barriers in the financial sector. Unfortunately, these same trends may also have increased the volatility of the international financial system and helped produce financial crises in Mexico, Brazil, and Russia, as well as the Asian financial crisis of 1997.

These events underscored the need for new international arrangements to reduce the costs and risks of financial volatility. The United States has supported these efforts, while emphasizing the importance of domestic institutional reform in countries with corrupt, opaque, or underregulated financial sectors. As with trade, this objective means that other states will have to bring their domestic financial practices more in line with those of the United States and other advanced industrial countries. Toward this end, the Bush administration has called for reform of the major international development banks (e.g., the World Bank), focusing in particular on the need for “greater emphasis on measurable results and activities that increase productivity, including private sector development.”73

The second Bush administration has sought to pursue the same goals in an even more assertive fashion through its Millennium Challenge Account (MCA) initiative. If funded, the MCA will increase annual U.S. development assistance by $5 billion by 2006 (an increase of roughly 50 percent), with the additional aid targeted at countries “that are pursuing policies and building institutions that adhere to the principles of good governance.” “Good governance,” however, means conforming to a set of pre-specified benchmarks based on U.S. practices, including strengthening the rule of law, liberalizing economic and political institutions, and generally conducting progrowth economic policies.74 Reduced to its essentials, the campaign to reform existing development institutions and the decision to tie U.S. aid to political reform are part of a broader effort to use U.S. economic power to encourage other countries to become more like the United States. These remedies may be wholly desirable from an economic point of view, but they also provide another example of America’s seeking to impose its own preferred solutions on others.

Primacy, Democracy, and Human Rights

In addition to shaping the global economy, America’s attempt to mold a favorable world order included a continued effort to promote U.S. ideals of democracy and human rights. These are hardly new goals, of course, as U.S. leaders have always claimed to uphold democratic values and have on occasion made good on this pledge.75 Since 1991, however, the United States has been more visibly committed to spreading democracy and preventing large-scale human-rights abuses. Under President George H. W. Bush, for example, the U.S. Agency for International Development formally adopted the promotion of democracy as one of its core objectives, and adherence to democratic principles became an important criterion for admission to key Western institutions. In 1995, the Clinton administration made “enlarging” the sphere of democratic rule a centerpiece of its own national security strategy.76 President George W. Bush declared in 2002 that “freedom, democracy, and free enterprise” are the world’s “single sustainable model for national success,” and his second inaugural address focused on the promotion of freedom and democracy worldwide.77 And, as just noted, the United States has also increasingly linked foreign aid and trade concessions to the adoption of democratic norms and institutions.

The United States favors the spread of democracy in part because it believes that other peoples would be better off if they were able to select their own leaders in free and fair elections, and because it believes that this development will be good for the United States as well.78 In particular, democracies are believed to be less likely to fight each other, more likely to engage in free trade, and more inclined to support the United States in the international arena. As the Clinton administration’s 1995 National Security Strategy report put it, “The more that democracy and … liberalization take hold in the world … the safer our nation is likely to be and the more our people are likely to prosper.”79 To this end, the United States and its allies have also made membership in institutions such as NATO conditional on the implementation of democratic reforms—thereby giving aspiring members a greater incentive to establish genuine democratic institutions—and they have provided direct assistance to pro-democracy groups.80 These efforts can claim some degree of success, insofar as the number of democratic states increased sharply in the 1990s, and the “level of freedom worldwide” reached its highest recorded level by the end of that decade.81 Finally, the United States used its armed forces to remove or defeat authoritarian leaders in Haiti (1994), Bosnia (1995), Kosovo (1999), Afghanistan (2002), and Iraq (2003), albeit for both self-interested and humanitarian reasons.

The end of the Cold War facilitated these efforts in several ways. The collapse of the Soviet Union removed the main threat to U.S. national security and made it less necessary to tolerate anticommunist dictatorships. Victory in the Cold War made U.S. ideals and values more appealing to others and bolstered U.S. faith in its own virtues. The defeat of Communism also opened the door to democratic transitions in Eastern Europe and parts of Central Asia, reinforcing a trend that was already underway in Latin America and elsewhere.

For all these reasons, promoting democracy has become a more visible issue on the U.S. foreign-policy agenda. Yet an increased rhetorical commitment to these ideals has not been matched by a similar commitment of time, money, troops, or political capital. In the 1990s, for example, the U.S. government spent roughly $2 billion each year to “promote democracy,” a sum equal to one-fifth of the U.S. military-assistance budget and less than 1 percent of the U.S. defense budget.82 Presidents George H. W. Bush and Bill Clinton paid scant attention to the lack of democracy in China, Egypt, Saudi Arabia, and much of the Middle East, because these states were either too powerful or too strategically vital to pressure on these issues. President George W. Bush has gone further in calling for the spread of democracy in the Arab and Islamic world, but like his predecessors, he has been willing to work with nondemocratic governments in Pakistan, Uzbekistan, China, Saudi Arabia, Egypt, and elsewhere when U.S. interests so required.

Consistent with its ideological commitment to individual freedom, the United States has also sought to foster a world in which basic human rights are respected and large-scale abuses (such as torture, extrajudicial execution, the loss of civil liberties, and genocide) do not occur. The rhetoric was strong, but the will to act remained weak. On the positive side, the first Bush administration did send troops to distribute food in Somalia in 1992, even though U.S. strategic interests in the region were negligible. The United States also intervened in Haiti in 1994 in order to restore President Jean-Bertrand Aristide to power and took the lead role in fashioning the 1995 Dayton Accords that ended the civil war in Bosnia. The United States also orchestrated NATO’s effort to halt Serbian human-rights abuses in Kosovo and provided logistical support for the Australian-led peacekeeping force that entered East Timor in 1999.

On the other hand, no U.S. president has been willing to risk much blood or treasure solely to promote democracy or to advance human rights. The Clinton administration withdrew U.S. forces from Somalia after a single engagement left eighteen U.S. soldiers dead, and this debacle helped discourage Clinton from intervening to prevent a subsequent genocide in Rwanda.83 Clinton also learned that pressing China on human-rights issues merely poisoned relations with Beijing and sparked protests from U.S. businesses, and his administration generally downplayed these issues. Despite ample evidence of human-rights violations, the United States intervened in Bosnia with great reluctance and eventually transferred most of the burden of implementing the Dayton Accords to its European allies. The United States used an intensive air campaign to compel Serbia to grant autonomy to Kosovo, but its obvious reluctance to commit ground troops reinforced the perception that America would not risk its own soldiers’ lives in order to save the lives of foreigners. U.S. reluctance was further highlighted during the 2000 presidential campaign, when candidate George W. Bush repeatedly criticized the Clinton administration for its “nation-building” efforts and made it clear that he did not think such activities were in the U.S. national interest. The United States has also reacted mildly to Israeli repression on the West Bank and in the Gaza Strip and has done little to halt Russia’s brutal treatment of its Chechen minority. Although President Clinton eventually signed the convention to create the International Criminal Court (whose main purpose is to prosecute individuals accused of crimes against humanity), President George W. Bush formally removed the U.S. “signature” and put strong pressure on a number of other countries to reject the treaty. Along with most other major powers, the United States has also been slow to act to halt mass killings in the Darfur region of Sudan.

Although America rarely lives up to its idealistic rhetoric, the belief that the United States should try to promote U.S. ideals abroad has helped reinforce the domestic consensus in favor of an activist foreign policy. Conservative supporters of U.S. primacy can favor using U.S. power to stabilize key regions and to discourage the emergence of “peer competitors,” while liberal internationalists can endorse using U.S. power to halt large-scale human-rights abuses and to promote democratic rule. Thus, a broad spectrum of Americans now supports efforts to remake the world in the U.S. image, albeit for somewhat different reasons.

Primacy, Hubris, and September 11

When a country is as strong as the United States now is, its leaders inevitably will be tempted to pursue far-reaching objectives abroad. If they are initially successful, they may succumb to “victory disease,” taking on ever-more-ambitious goals in the belief that past successes will be easy to duplicate. The danger of hubris grows apace, because it is hard to identify the limits of one’s power in advance. As the late Senator Richard B. Russell (who was hardly an isolationist) warned in 1967, “We should not unilaterally assume the function of policing the world. If it is easy for us to go anywhere and do anything, we will always be going somewhere and doing something.”84

The hubris born of prior success helps us understand the shifts in U.S. foreign policy after 2000, and especially the U.S. response to the terrorist attacks on September 11. As noted above, Presidents George H. W. Bush and William J. Clinton sought to take advantage of U.S. primacy within the existing geopolitical structures bequeathed by the Cold War. Both were “conservative” administrations insofar as they sought to enhance the U.S. position in the world while preserving the alliances, institutional commitments, and broad multilateralist approach that had won the Cold War. Both administrations also enjoyed a run of easy victories: Iraq was swiftly defeated in the Gulf War, the United States and Britain enforced “no-fly zones” there for more than a decade without losing a single plane, NATO’s intervention in Bosnia ended a lengthy civil war with few allied casualties, and even the unexpectedly messy war in Kosovo turned out to be a rather low-cost affair. Most important, both administrations managed to use U.S. primacy in ways that minimized global opposition to U.S. power.85

Like his predecessors, President George W. Bush also sought to maintain U.S. primacy, enhance U.S. prosperity, and encourage the spread of U.S. values to other countries. But it soon became clear that Bush was less committed to the traditional Cold War structures and more willing to use U.S. power to alter the international status quo.86 The Bush team was firmly convinced that history was on America’s side, and that U.S. power should be used to reinforce what then-National Security Advisor (now Secretary of State) Condoleezza Rice termed the “powerful secular trends moving the world toward … democracy and individual liberty.”87

Confidence in U.S. power also implied a greater willingness to “go it alone” in foreign affairs, even if this approach led to major disruptions in the existing international order. By September 2001, the Bush administration had formally rejected the Kyoto Protocol on global warming, begun an active campaign to discredit the International Criminal Court, and derailed efforts to negotiate a stronger verification protocol for the Biological Weapons Convention. Bush also opposed a new international effort to restrict the global trade in small arms and reiterated U.S. opposition to the landmines convention. This string of unilateralist gestures led the normally pro-American Economist to ask, “Has George Bush ever met a treaty that he liked? … It is hard to avoid the suspicion that it is the very idea of multilateral cooperation that Mr. Bush objects to.”88

The al Qaeda terrorist attacks on September 11 encouraged the Bush administration to take this approach to a new level. Bush became a “war president” overnight, and international terrorism occupied the rhetorical center of U.S. foreign policy. September 11 also reinforced Bush’s unilateralist inclinations and his tendency to divide the world into friends and enemies. In declaring war on all terrorists “of global reach,” Bush made it clear that other states and political groups had to choose sides. “We will make no distinction between the terrorists who committed these acts and those who harbor them,” he said on the evening of September 11, a point he emphasized again in a speech to Congress on September 20. “Every nation, in every region, now has a decision to make,” he declared. “Either you are with us, or you are with the terrorists.”89

September 11 also allowed Bush’s advisers to articulate a fundamental revision in U.S. national security strategy, one that reflected their underlying belief that contemporary international norms did not operate to America’s advantage. This new approach downplayed the traditional U.S. reliance on deterrence and emphasized the need to “preempt” potential threats before they emerged. Declaring that “the gravest danger to freedom lies at the crossroads of radicalism and technology,” the new Bush strategy warned that “the United States can no longer rely solely on a reactive posture as we have in the past.” In particular, the new National Security Strategy called for “anticipatory action to defend ourselves, even if uncertainty remains as to the time and place of the enemy’s attack.” In other words, the United States was now declaring that it would use military force to prevent certain states (or terrorist groups) from acquiring potentially dangerous weaponry. Although the administration called this a strategy of “preemption,” it was in fact articulating a rationale for preventive war.90

The invasion of Iraq in March 2003 offered the first demonstration of the new “Bush Doctrine” in action. Unfortunately, the decision for war was based on a false reading of the prewar intelligence and inaccurate judgments about the likely consequences of a U.S. invasion. Iraq turned out not to have any weapons of mass destruction, and Saddam Hussein did not have meaningful ties to al Qaeda. Prewar hopes that U.S. forces would be welcomed as liberators and that it would be easy to form a viable post-Saddam government proved equally illusory. The U.S.-led invasion had little trouble defeating Iraq’s third-rate army, but the occupation forces soon found themselves confronted by a resilient and deadly insurgency, a collapsed Iraqi economy, and a simmering political rivalry among Iraq’s Shi’ite, Sunni, and Kurdish communities. Iraqi resentment at the U.S. presence increased as the occupation dragged on, and President Bush began his second term in a quagmire of his own creation.

Most important of all, the war in Iraq reinforced global concerns about the unchecked nature of U.S. power. The Bush administration’s earlier acts of unilateralism had upset a number of key U.S. allies and made it more difficult for Bush to rally support within the UN Security Council. But the decision to use force against Iraq—in defiance of the Security Council and widespread global opposition— brought the problem of U.S. primacy into sharp relief. The issue was straightforward: how can other states be comfortable and secure when U.S. decisions affect all of their interests, and when the United States is strong enough to act pretty much as it wishes? Saddam Hussein was by all accounts a despicable tyrant, but the war and its aftermath led other countries to question the desirability of one state’s declaring that it will wage preventive war whenever it chooses, and on the basis of its own interpretation of evidence. The war in Iraq showed how the United States was using its position of primacy, and why this approach was deeply alarming to others.

Conclusion

The end of the Cold War left the United States in an extraordinary position—one that many states might envy but none could match. Rather than relaxing at the pinnacle of power, the United States sought to extend its dominant position and to mold a world that would favor U.S. interests even more. Because it was already in remarkably good shape, U.S. efforts under Presidents George H. W. Bush and William J. Clinton were relatively modest and were conducted within the confines of existing multilateral arrangements. From the moment that President George W. Bush took office, but especially after September 11, he sought to take advantage of U.S. primacy and to use it in an unconstrained way, in order to eliminate potential threats, further enhance America’s global position, and encourage the spread of American ideals and institutions.

Primacy makes many things possible, and most countries would be happy to trade places with the United States. Yet primacy also brings with it at least one obvious danger. No matter how noble the aims of the United States may be, its position in the world and its activities abroad are likely to alarm, irritate, and at times anger others. As the world’s most powerful country, the United States will inevitably face greater suspicion and resentment than it did when it was one of several Great Powers (as it was from 1900 to 1945), or even when it was one of two superpowers (as it was from 1945 to 1989). In a world of independent states, the strongest one is always a potential threat to the rest, if only because they cannot be entirely sure what it is going to do with the power at its command. This tendency may be muted if the United States acts wisely, but what appears to Americans as wisdom does not always play that way overseas. What hangs in the balance, in short, is the way the rest of the world perceives and responds to U.S. primacy. America may be a genuinely benevolent force in today’s world, but the rest of the world does not always see it that way. In the next chapter, I examine why this is so.