It is in the nature of democracies to have, for the most part, the most confused or erroneous ideas on external affairs, and to decide questions of foreign policy on purely domestic considerations.
Alexis de Tocqueville, Democracy in America (1835)
The American narrative is morally unresolvable because the society that saved humanity in the great conflicts of the twentieth century was also a society built on enormous crimes—slavery and the extinction of the native inhabitants.
Robert D. Kaplan, Earning the Rockies (2017)
The Berlin Wall fell in 1989, exactly two centuries after George Washington assumed the presidency of the first large-scale republic in modern history. The collapse of the Soviet Union two years later marked a truly historic moment, which was not merely the end of the Cold War but also the triumph of the republican framework that Washington had led into a deeply skeptical world.
What came to be called the liberal tradition—representative government, popular sovereignty, a market economy, the rule of law—proceeded to vanquish the European monarchies in the nineteenth century, then the totalitarian dictatorships led by Germany, Japan, and the Soviet Union in the twentieth. The implosion of the Soviet Union could plausibly be described as the final chapter in a long story, in which the American model for the nation-state defeated communism, the only remaining autocratic alternative to democracy, thereby clinching victory for the vision Washington had glimpsed so long ago. In this triumphant spirit, Francis Fukuyama proclaimed that “liberal democracy may constitute the end point of mankind’s ideological evolution,” even “the end of history,” a phrase almost designed to provoke the gods.
More specifically, the end of the Cold War left the United States as the undisputed superpower. This claim was not a rhetorical exaggeration. In 1991 the United States generated 26 percent of global GDP with less than 5 percent of the world’s population. The dollar was the global currency; English was the global language. Militarily, America spent more on defense than the next twenty nations combined, projecting its power abroad with over 600,000 combat troops and support garrisons in more than forty countries. No imperial power in recorded history, neither the Roman empire nor the British empire at its zenith, enjoyed such supremacy.
The first occasion for the United States to display its new superpower status came with the Gulf War (1991), which became a poster child for America’s hegemonic role in the post–Cold War era. Under a United Nations mandate to remove Iraqi forces from Kuwait, the United States deployed an overwhelming force of half a million troops that also enjoyed the support of a thirty-nation coalition. The military campaign was swift and successful, American casualties were low, and the costs were shared by the coalition countries, all of whom welcomed American leadership in securing the flow of oil through the Persian Gulf. President George H. W. Bush resisted the advice to pursue the Iraqi army back to Baghdad, calling it an unnecessary expansion of America’s main mission. All in all, the new American leadership model was multilateral rather than unilateral, limited rather than unlimited, and decisive rather than protracted.
But hindsight, the historian’s crystal ball, exposes four ominous features in the Gulf War story that would haunt America’s superpower status in the ensuing decades. First, in his speech to the American people justifying the robust military commitment, President Bush invoked the term “new world order,” implying that the collapse of the Soviet Union had created the unique opportunity to remake the world in America’s image. Second, in the debate leading up to the war, Bush claimed the sole authority to commit American troops as commander in chief, a continuation of expanded executive power during the Cold War, in open defiance of language in the Constitution vesting that responsibility and role in Congress. Third, the insertion of American combat troops into the Middle East established a modern precedent for Western intrusion into the Muslim world that was eerily reminiscent of the ill-fated European Crusades so long ago. And fourth, the Gulf War short-circuited the debate over what was called the “peace dividend,” meaning a reduction in defense spending that should occur now that the demise of the Soviet Union had removed the major rationale for the massive military establishment created during the Cold War.
The fourth feature is most mystifying because it was a nonevent. But the failure to develop a comprehensive strategy for American foreign policy once the Cold War ended proved to be the most consequential event of all. It can be explained in part by the fact that the sudden collapse of the Soviet Union took most American intelligence officials by surprise, itself a failure rooted in exaggerated estimates of the military and economic prowess of our designated Evil Empire. And then the Gulf War generated an immediate crisis that crowded out any strategic deliberation. As a result, deciding the direction of American foreign policy soon became an inherently improvisational process that developed on a case-by-case basis in response to crises that popped up like blips on the global radar screen.
One might envision the United States as a sailor venturing into vast and turbulent seas without a compass but with two heartfelt articles of faith to guide him: first, that the liberal model embodied in the American ship of state was sailing with the winds of history at its back, so that it was now, at long last, possible to act decisively on the Wilsonian promise to make the world safe for democracy; second, that the overwhelming military superiority of the American juggernaut meant that victory was inevitable, not only because the cause was just but also because the firepower at its disposal was invincible. Given these priceless assets, what could possibly go wrong?
The most jolting answer to that question was first sighted in the skies over New York and Washington on September 11, 2001. The attack by Islamic terrorists on 9/11 represented the end of America’s “splendid isolation,” the first assault on the mainland of the United States since the British invasion during the War of 1812. (American national security had in fact been vulnerable to foreign attack since the mid-1960s, when the Soviet Union deployed intercontinental ballistic missiles capable of reaching American cities.) In what might be called the diplomacy of catharsis, the administration of President George W. Bush channeled this anxiety into support for an invasion of Iraq by postulating a link between 9/11 and the regime of Saddam Hussein. As America’s top counterterrorism expert, Richard Clarke, described the decision, it was as if, in response to the Japanese attack at Pearl Harbor, the United States had invaded Mexico.
More ominously, the American government had inadvertently discovered a worthy successor to the Soviet Union as the new Evil Empire. The prevailing belief that Islamic terrorism represented an existential threat to national security comparable to a nuclear exchange during the Cold War was rationally ridiculous but, in the immediate aftermath of 9/11, emotionally compelling. Defense spending surged as a vast new Homeland Security and intelligence bureaucracy materialized almost overnight. All plans for reductions in the defense budget after the Cold War evaporated. The national debt soared past $12 trillion and kept climbing after the Great Recession of 2008.
Meanwhile, events in the Middle East made a mockery of any global version of Manifest Destiny. The overthrow of Saddam Hussein provoked a civil war between Sunni and Shia factions that exposed the awkward fact that no such entity as the “Iraqi people” existed. Loyalties were resolutely sectarian and tribal, which effectively meant that the Shia majority, once elected under a new American-sponsored constitution, proceeded to persecute and alienate the Sunni minority, many of whom went over to the emerging coalition of radical Islamists after most American troops departed in 2011. In neighboring Afghanistan, where the physical terrain was even more treacherous and the sectarian culture more splintered, the United States spent more on nation-building than it had spent on the Marshall Plan for all of Europe, with little to show for it.
If we step aside to scan the historical landscape during the quarter century after the Cold War, the most distinctive feature is nearly perpetual war, and the most distressing fact is that the overwhelming military superiority of the United States did not produce successful outcomes. Quite the opposite. In the most significant theater of American intervention, the Middle East, the strategic consequences have proven worse than disappointing, indeed counterproductive, by driving Iraq into the orbit of Iran and radicalizing Iraq’s Sunni population, which became the nucleus for ISIS. Beyond the Middle East—in Somalia, Kosovo, and Libya—developments have also failed to follow the script that liberal prophets predicted. The American political recipe for success has proven unpalatable for people and places with different histories, religions, and political traditions.
There was no “peace dividend” because there was no peace, and there was no peace in part because the United States made going to war easy. It was supposed to be hard, what Washington had called “the dernier resort,” requiring a declaration of war by the full Congress. The last time the United States went to war as the Constitution stipulated was December 8, 1941. The common practice was for Congress to delegate the authority to the president, who then made the decision as commander in chief.
Nor was any public sacrifice required. Since President Richard Nixon ended the draft in 1973, the burden of military service has fallen on a small minority of working-class men and women. Nothing like the mass protests against the Vietnam War occurred in the post–Cold War era, for the simple reason that two generations of middle-class, college-age citizens have never had to factor military service into their personal agendas. A serious but unspoken moral dilemma lurks beneath this convenient arrangement, in which the chief beneficiaries of America’s status as the dominant world power are immunized from the obligations that come with those benefits.
An analogous disconnect has also been institutionalized by concealing the costs of war. The budget for the military campaigns in Iraq and Afghanistan, for example, was kept on a separate ledger, as if on a credit card. Instead of a tax increase to cover a portion of the costs, President Bush pushed through a major reduction in taxes, thereby passing the bill (or buck) to posterity. Such financial and accounting chicanery reinforced the larger pattern of deception in the post–Cold War era, making war almost painless: it is not declared, few have to fight, and no one has to pay. Only a small group of veterans, who are currently committing suicide in record numbers, experience firsthand that the United States is engaged in a perpetual war.
It is difficult to focus critically on the strategic failure of American foreign policy in the post–Cold War years when the central problem is the absence of any comprehensive strategy at all. It does seem clear that the Middle East has become a disproving ground for assumptions about the inevitable triumph of the liberal order. And that painful realization raises larger questions about the wisdom of military interventions that require the transformation of entire societies, usually a long-term, expensive, and thankless task for which the short attention span of the American public is unprepared. Moreover, in the absence of any explicit articulation of America’s strategic priorities, control over the direction of foreign policy has effectively shifted to the Pentagon in what became a global version of “mission creep.” The creation of six military regions covering the entire world, each under a senior general or admiral who functions as a kind of proconsul in the mode of the Roman Empire, represents an unspoken but de facto commitment to defend ill-defined American interests anywhere and everywhere. This defense-driven expansion of America’s global obligations is also reflected in the 16-to-1 ratio of military to diplomatic spending throughout the post–Cold War era.
The combination of limitless goals and a fully militarized foreign policy has troubling implications because it mirrors the fateful pattern of rise and fall that has caught almost all preceding world powers in its web. The authoritative account of this imperial syndrome is The Rise and Fall of the Great Powers (1987) by Paul Kennedy, which identified the fatal flaw as “imperial overstretch,” meaning the excessive growth of military costs required to manage far-flung obligations, which then saps the economic strength responsible for the earlier ascendance. (Several members of the founding generation, John Adams most prominently, were thoroughly familiar with the classical version of this cyclical pattern from the works of Tacitus on the Roman Empire and viewed the misguided policies of the British Empire as the modern version of the same imperial story.) Lurking in the decline-and-fall syndrome is the implication that all empires, like all mortals, must come and go, and that the chief reason for their demise is that the world is an inherently unmanageable place that eventually devours the strength of any and all superpowers that history selects for what is, in effect, an impossible mission. Based on the first quarter century of its reign as the sole superpower, the United States, which tends to regard itself as the “exceptional nation,” is not proving an exception to one of history’s most enduring narratives.
If we are seeking guidance from the past on how to make foreign policy decisions in a deliberative rather than improvisational fashion, preferably like past decisions that we now know proved prescient, George Kennan deserves special notice. Kennan was the chief architect of the doctrinal framework for American strategy in the Cold War that came to be called “containment.” His central insight, made public in “The Sources of Soviet Conduct” (1947), was to turn the ideological prophecy of Marxism-Leninism on its head, for he argued that communism, not capitalism, was sown with the seeds of its own destruction, thereby making any direct military conflict with the Soviet Union unnecessary. If the United States and its allies blocked Soviet expansion into Western Europe and confined its sphere of influence to the Eurasian continent, the inherent contradictions within Soviet-style communism would, over time, make both economic growth and political control unsustainable. Kennan opposed the Vietnam War as an unnecessary and misguided waste of American resources, then lived long enough to celebrate the vindication of his vision with the implosion of the Soviet Union in 1991.
How can we explain the analytical prowess of Kennan’s strategic thought process? He was stationed at the American embassy in Moscow, observed firsthand the brutality of Stalin’s regime as a witness to the Moscow Trials during the late 1930s, and realized earlier than most that our Soviet ally in World War II would become an implacable enemy once the war was won. He was fluent in Russian and deeply read in Russian history and literature as well as in the political treatises of Marx, Engels, and Lenin. He was an unabashed realist who thought historically, meaning that he had little patience with theoretical models or moralistic assumptions with utopian expectations; indeed, those were the very flaws that made communism an inherently delusional ideology. More pragmatically, he realized that no nation, not even the United States, possessed unlimited resources, so any realistic strategy needed to prioritize its commitments, to identify what must be done and what must not be done. On that score, there was no need to wage a costly war against the Soviet Union once you truly believed that, left to its own devices, it was doomed to dissolution.
There is another voice from further back that merits our attention, though not for the obvious reasons. George Washington’s prescription for an isolationist foreign policy enjoyed a lengthy life-span but became anachronistic by the end of the nineteenth century, when the frontier era ended and the United States surpassed Great Britain as the world’s dominant economy. The irrelevance of isolationism was not fully exposed until the interwar years (1920–40), when the global order collapsed in the absence of an American international presence, leading to the ascendance of totalitarian regimes in Germany, Japan, and the Soviet Union. Ever afterward isolation ceased to exist as a viable American option. The question was not whether the United States should play its role as a world power but how to do so.
Washington’s enduringly relevant message in the Farewell Address can be found in the realistic reasoning that shaped his isolationist vision, which emphasized the unique geographic and demographic conditions the United States enjoyed. Unlike Jefferson, who believed that the liberal values the founding generation had discovered were universal principles destined to spread throughout the world, Washington believed they were products of a highly distinctive set of historical circumstances unlikely to be duplicated elsewhere. Adams tended to concur with Washington and engaged in a friendly debate with Jefferson during their twilight years, arguing that America’s political values would never take root in Latin America because Spanish and Catholic traditions predisposed that region to hierarchical political systems. Adams’s son, John Quincy Adams, the most prominent and influential foreign policy thinker of the antebellum era, also agreed that the United States could serve as a role model for republican institutions but must never attempt to become a messianic missionary or, even worse, an imperious intruder in the British mode. As he put it in a Fourth of July speech (1821) that George Kennan loved to quote, “America goes not abroad in search of monsters to destroy.”
Beyond the legacy of isolationism, then, Washington and his generation left a legacy of American exceptionalism that meant exactly the opposite of what that term came to mean in the twentieth century. In effect, precisely because the conditions shaping the American founding were unique, it was highly problematic to presume that the American model was transportable beyond the borders of the United States. As for Jefferson, whose formulation of the founding legacy foresaw its global triumph (i.e., “May it be to the world, what I believe it will be, to some parts sooner, to others later, but finally to all”), even he assumed that liberal principles, by definition, could never be imposed by force but only discovered by distant peoples in their own time and in their own ways.
The phrase that has come to capture this neoisolationist tradition is “city on a hill,” which also, like “American exceptionalism,” abounds in ironies. John Winthrop borrowed it from the Sermon on the Mount in 1630 to describe a Puritan paradise where each person had a predestined place in a fixed social hierarchy; it was a medieval rather than a democratic vision. President Ronald Reagan frequently used the phrase “shining city on a hill” without seeming to fully realize that it described a United States that remained a distant beacon to the world and focused its fullest energies on improving itself rather than on overseeing the global order.
These voices from the past speak from different contexts with distinctive political accents, but they constitute a chorus in sounding three clear notes. First, the United States has committed the predictable mistakes of a novice superpower, most of them rooted in overconfidence bordering on arrogance; second, wars have become routinized because foreign policy has become militarized at the same time as the middle class has been immunized from military service; and third, the creedal conviction that American values are transplantable to all regions of the world is highly suspect and likely to draw the United States into nation-building projects beyond its will or capacity to complete. If we ever have a sustained conversation about America’s role in the world, in effect the conversation we did not have at the end of the Cold War, these three lessons learned over the last quarter century should be placed on the table at the start.
The conversation has already begun within scholarly circles and the foreign policy establishment. There is a discernibly chastened tone to the dialogue, a shared sense that the world has not followed the liberal script that so many expected in the immediate aftermath of the Cold War. Hindsight now makes clear that the Cold War imposed a coherent framework on the international world (i.e., West versus East, democracy versus communism) that provided a measure of strategic, even moral certainty for the designated champion of the West. Specific diplomatic decisions fit within a larger strategic scheme that framed choices in unambiguous, nonnegotiable terms. And these terms conveniently coincided with the Jeffersonian side of the American Dialogue (i.e., freedom versus tyranny), which required no explanation to an American audience.
That era has ended. The Cold War has proved to be a temporary interlude that gave a false sense of credibility to the term world order. The global landscape has recovered its baffling, multilayered complexity and no longer fits within a bimodal frame. It’s as if the gods replaced their binoculars with a kaleidoscope. Whether the United States is historically equipped to lead in the new global context remains an open question.
The improbable election of Donald Trump has placed an exclamation point after that question. His presidential campaign featured the promise to “Make America Great Again,” with “Again” deliberately vague. For his white supremacist supporters, it meant before the civil rights movement. For voters in Appalachia and the Rust Belt, it meant before globalization took away their jobs. His other promise, to put “America First,” echoed the slogan of those who opposed American entry into World War II, suggesting a return to the isolationist America of the interwar years (1920–40). And by renouncing the American commitment to the Paris climate accord, which set targets for reducing carbon emissions, endorsing the British exit from the European Union, questioning the viability of NATO, and threatening to withdraw from the North American Free Trade Agreement, Trump effectively announced that the United States was relinquishing its role as the designated superpower.
It is clear that Trump’s controversial presidency is an American version of Brexit, a “back to the future” retreat to fortress America in reaction to the disorienting and destructive forces of the global marketplace. It is also clear that Trump embodies, in almost archetypical form, the demagogic downside of democracy. In the ancient world, for example, Thucydides warned about the vulnerability of the Athenian citizenry to the jingoistic rhetoric of Cleon. Cicero delivered a similar warning about the conspiratorial tactics of Catiline, which threatened the survival of the Roman Republic. Washington echoed the same message in his Farewell Address, lamenting the manipulation of popular opinion by the Republican opposition to the Jay Treaty and the intrusion of inflammatory domestic disputes into foreign policy.
Throughout history, then, the fate of nations with political frameworks based on public opinion has always been haunted by the specter of charismatic charlatans with a knack for exploiting popular fears. In that sense, the Trump presidency, while wholly unexpected, was eminently predictable, almost overdue. (Historians are the undisputed champions at after-the-fact wisdom that reveals why unexpected outcomes were always inevitable.) Much like meteors streaking across the horizon, demagogues tend to enjoy only limited life-spans, so the Trump presidency is likely to resemble the proverbial blip on the historical radar screen.
But whatever his duration, Trump has exposed the deep pools of isolationist sentiment that always lurked beneath the surface in the rural regions of the American heartland, now thrown into relief by residents who see themselves as victims rather than beneficiaries of the globalized marketplace America is defending. Moreover, the very fact that a person with Trump’s obvious mental, emotional, and moral limitations could be chosen to lead the free world casts a dark shadow of doubt over the credibility and reliability of the United States as the first democratic superpower.
In addition to being the first democracy, the United States also carries two burdens that no previous world power had to bear: it is the first superpower with anti-imperial origins, and it is the first superpower to assume that role in the postimperial era. These unique legacies, with morally admirable implications that flow from the founding, pose political and strategic problems that limit America’s conduct in its role as the dominant power.
The British historian Niall Ferguson provides the fullest treatment of America’s anti-imperial problem in Colossus: The Price of America’s Empire (2004). He poses the problem with a distinctive British accent, asking whether the United States is up to the task of replacing Great Britain as the guarantor of global order. On the basis of its sheer size and resources, Ferguson judges America even better equipped, economically and militarily, to play that leadership role than his British ancestors were. But he concludes that the United States is “an empire in denial” that lacks the will to stay the course in Iraq and Afghanistan in the same reliable fashion Great Britain displayed in its sustained occupations of India, Egypt, and South Africa.
One can only imagine George Washington rolling over in his grave. For the very suggestion that the British Empire should serve as the role model for American leadership in the world defies the core values of the American founding. A republic, by definition, cannot be an empire. This is a principled conviction deeply embedded in America’s DNA. It is the chief reason why protracted wars have seldom enjoyed popular support in American history, and the underlying reason why nation-building projects in faraway places invariably become politically unpalatable. For they closely resemble, and often are, imperial projects.
It is useless to accuse the United States of lacking “the will to power,” as Ferguson does, since that purported deficiency is rooted in what Lincoln called “the mystic chords of memory” that enshrined the successful war against British imperialism as the primal source of American patriotism. There are exceptions to the larger pattern—the occupation of the Philippines, for example. But as a general rule there is a clock running on all American military occupations of foreign countries that our British, Spanish, Ottoman, and Roman predecessors did not need to contemplate.
Also unprecedented is the problem the United States faces as the first superpower of the postimperial era. All previous world powers were empires whose global ascendance to that status was a collateral consequence of the power acquired by extracting wealth from the colonies they controlled. Great Britain did not occupy South Africa and fight the Boer War primarily to shoulder “the white man’s burden” in some Kiplingesque commitment to racial justice. It did so in order to ensure British control of the lucrative diamond market.
In the postimperial age, no analogous economic incentive exists for the United States. When Donald Trump, speaking as a presidential candidate, proposed that the United States should have “taken the oil” before exiting Iraq in 2011, the very suggestion produced incredulous criticism from all sides, and he quickly dropped the idea. As a former colony in an anti-imperial age, the United States cannot do colonization.
As a result, the role of superpower in the twenty-first century is unlikely to prove cost-effective. When the ledger is closed on the military budget for Iraq and Afghanistan, the cost will approach $4 trillion. Such a sum, if spent on domestic priorities, could have shored up Medicare for a generation and paid for the restoration of America’s aging infrastructure. These are difficult trade-offs to justify in a democracy. Moreover, as already noted, a sizable minority of America’s working class are victims of the very global order the United States is spending so much to sustain, the very constituency that made Trump’s presidency possible.
There are, then, some compelling questions surrounding America’s future as a superpower based on long-standing legacies. Do our origins as a republic based on popular consent impose limitations that make American reliability as a world leader problematic? Is the neoisolationist message of the Trump presidency a harbinger of the future or a temporary aberration? Is America’s seventy-year reign at the top ending? Are there any voices from the founding still sufficiently resonant to point in a different direction?
One prominent student of American foreign policy, Robert D. Kaplan, has addressed this cluster of questions in a distinctive format. His intriguingly titled Earning the Rockies (2017) is a memoir of his coast-to-coast trek in 2015 that quickly became a meditation on America’s role in the world, prompted by his conviction that “the answers to our dilemmas overseas lie within the continent itself.” Kaplan is self-consciously echoing and updating Washington’s earlier version of American continental empire as a geographic asset destined to dictate the future direction of American history.
What Washington viewed as a providential gift that permitted the infant republic to flourish as a secure, self-sufficient nation, Kaplan describes as a providential platform from which the mature American republic can project its unmatched economic and military prowess abroad. Technology has removed distance from the strategic equation, rendering isolation impossible; the Atlantic and Pacific are no longer protective shields but unique points of access to both Europe and Asia. Kaplan adopts Washington’s realistic approach to foreign policy, still driven by geography, but now revised to fit a wholly globalized world.
His analysis of the forces propelling the United States outward is almost gravitational. It is not a matter of choosing a direction. America’s fate is built into its location, natural resources, and the coastal contours of the continent. For Kaplan, America’s destiny as a superpower is just as manifest as its earlier expansion across the Rockies. Any attempt to resist that destiny will prove as futile as the effort to rescue the victims of globalization in the American heartland from their sad but inevitable fate. There are two Americas out there, one connected to the global economy, the other marooned in interior islands of joblessness, despair, and addiction. There is no question which one owns the future.
Kaplan also goes back to the founding for his forecast of America’s conduct in its fated role as unrivaled world power, which once again channels Washington’s experience. We should expect great triumphs and great tragedies during America’s continued reign at the top, he argues, because the oscillation between these two sides of our domestic history is likely to persist in our future foreign policy. “The American narrative is morally unresolvable,” he observes, “because the society that saved humanity in the great conflicts of the twentieth century was also a society built on enormous crimes—slavery and the extinction of the native inhabitants.” If our role in the world is clear, how we play that role will probably defy any semblance of coherence or moral clarity. We are both fated to lead and fated to do so erratically and impulsively. Alexis de Tocqueville’s words at the end of Democracy in America come to mind: “I am full of apprehension and hope.” Our ongoing dialogue should pay respect to both sides of that thought.