9 The Decline of Force
A state is a human community that successfully claims the monopoly of the legitimate use of physical force within a given territory.
—Max Weber, Politics as a Vocation
Frankly, I’d like to see the government get out of war altogether and leave the whole field to private individuals.
—Joseph Heller, Catch-22
Of all forms of power, perhaps the greatest is arbitrary power, because it asserts the will of an actor independent of any other influence; it is the ability to set oneself above a system that allows a ruler to be the defining force within that system. Similarly, for great states, the ability to impose their will without regard for the views of those on whom it is imposed is the ultimate standard of power. Throughout history, the linchpin of such power has been force—the ability, whether exercised personally or on behalf of a tribe, a city, or a state, to bend others to one's will. Ultimately, for this reason, the ability to amass and project force became the defining prerogative of the state, as expressed by the quotation from Weber above. Whether it was the destructive force of an army or the more limited tools required to enforce the law and preserve the peace, in the interests of "order" and the "public good" the state claimed the sole right to make, and to carry out, threats to the life, well-being, or freedom of others.
As it has other important pillars of sovereign authority, progress has altered the role that force plays in the modern life of the state. On the one hand, conflict has become so costly that few states can reflexively revert to war as an acceptable choice for the resolution of international differences. On the other, international and domestic systems of law have evolved to the point that they have in large degree eliminated the arbitrary power of those to whom the enforcement of the laws is entrusted. Law has reined in the lawmaker and war has largely pacified the warmaker.
This has been to the unquestionable benefit of society as a whole. But it has also indisputably altered the role of the state, and in some areas reduced it to a status in which it today possesses few characteristics that can distinguish it from other actors with similar resources and prerogatives who also operate peacefully within a system of laws—such as multinational corporations. It also raises questions about the evolving role of nonstate actors, who are freer to use force than state actors might be, and about how their influence might grow in a world in which the strongest are “too strong” to freely apply the forces they have at their disposal.
Every Man Against Every Man
According to the tradition of Hobbes, Locke, and Rousseau, man began in a state of nature that was also permanently a state of war, which, for Hobbes, was one of "every man against every man." As Hobbes described it, "the right of nature" was every man's right to do whatever he needed to preserve himself "even to one another's body." So long as this condition was maintained, security was impossible, and since that was intolerable, men were willing to give up the right of nature in order to ensure their own well-being. Each thereby agreed to concede "so much liberty against other men as he would allow other men against himself." This "social contract" required an enforcer, some "coercive power to compel men equally to the performance of their covenants … and to make good that propriety which by mutual contract men acquire in recompense of the universal right they abandon: and such power there is none before the erection of a commonwealth."
Similarly, Locke agreed that avoidance of the state of war was "one great reason of men's putting themselves into society and quitting the state of nature." Locke's man transferred to the community the right to judge or punish others "even with death itself," thus creating civil society. For Locke, one of the principal goals was to preserve not just the well-being but the property of members of this society, a task to which all members collectively apply themselves. In the conception of both Hobbes and Locke, the right to the use of force is in most instances transferred to the state, which in turn may apply it as it serves the public interest. Of course, this was just political theory. In reality, it was the brute application of force that created the first tribal leaders, whether among the clans of primitive Scandinavia, the horsemen of Mongolia, or the Franks of Central Europe. From prehistory through Charlemagne to totalitarian regimes today from Myanmar to Zimbabwe, fear has been a tool of leaders because they were seen to be willing and able to claim life or property with impunity.
For us, the core question is whether something has changed in the comparatively recent past with regard to the application of force that has permanently altered the nature of the state, or most states, and changed its relationship with other independent actors in the world. Or, to put it more directly: in a world in which the state is limited by its own laws as well as emerging international ones, in which multinational private actors can influence the formation of those laws or dodge their application altogether through global operations, in which global threats and challenges are increasingly beyond the ability of national actors to manage, and in which states and corporations are actors on equal footing whether before courts or in terms of the resources or tactics they can bring to bear to achieve a desired outcome, has something changed so fundamentally that it affects even the nature of the social contract and irreversibly alters the way global civil society is likely to work?
To answer this question, we need to consider the changes that might be characterized as the decline of force as an option most states can realistically or successfully employ, and at the same time ask whether in this area too, private or nonstate actors are actually gaining a new and different role through which they assume some of the prerogatives that were once exclusively the domain of sovereigns.
Mercenaries 1.0: Setting the Record Straight on the State’s “Monopoly” on Power
Readers of this book need only think back to the stories of the army of the British East India Company to realize that the idea of states having a monopoly on power is a comparatively recent one. Indeed, for much of history, armies were largely made up of mercenaries, or they were groups with an affiliation not to any political entity but to an individual, a family, a region, a church, or even a guild. As we have seen, the story of the miners of the Great Copper Mountain illustrates this phenomenon well. Prior to the unification of Sweden, the men of Dalecarlia resolved disputes and kept the peace with swords and battle-axes. They protected the mines in the same way, and when they saw a threat to their autonomy from Gustav Vasa as he attempted to unify Sweden, they rebelled three times. Among their other privileges, the master miners of the late Middle Ages were granted the right to carry “sword, shield, iron hat and mailed gloves”—not mining tools exactly, but a sign of both their status and their role in a Europe in which men of standing were periodically called upon to bear arms to resolve the issues of the day.
When the miners followed the Continental trend and formed their own guild, named after their patron saint, Saint George, or in the Swedish transliteration, Saint Orjen, the rules were strict, but the powers were also clear. The group had its own independent judicial functions and required members to support one another in times of sickness or in the face of physical threats. In fact, the bonds were so strong that they were actually expected to help one another out even if one of them turned out to be a criminal, by helping him along the first twenty miles of his escape or by providing a boat. Loyalty to the guild transcended that to local law, “and the word of the Master of the Guild weighed heavier than that of a judge.”
During the Thirty Years’ War, the men of the mine took their forceful independence to its next step, albeit one that went considerably further than the vast majority of other private enterprises of the era. They organized their own military unit. It consisted of seven infantry companies and a squadron of cavalry. Workaday hierarchies were maintained: miners were the foot soldiers, and shareholders were the officers and presided over their own military court. Among its other responsibilities, the unit policed and defended the mine, which, throughout the period of the war, was considered a national-security priority. But for all the innovations associated with the mines, the development of a private mercenary force was not one of them. Mercenaries had been common throughout Europe since the decline of feudal armies in the thirteenth century. As towns and cities increased in importance, their rulers and elites would hire privately organized military units. These units, called “free companies,” would travel around the Continent offering their services to those with a battle to fight and a treasury to finance it. They reached their heyday during the Hundred Years’ War between England and France. In 1362, France’s king John II the Good tried to wipe out the free companies by raising a feudal army, but the companies banded together and defeated the king’s soldiers. They were considered such a threat by subsequent rulers that the next few French kings launched wars against neighboring kingdoms just to get the mercenaries out of France.
Italy too, with its welter of constantly battling cities and principalities, became a mercenary magnet. In The Prince, Machiavelli offers an opinion of them that one can imagine captures the view of the aristocratic leaders he advised:
Mercenaries and auxiliaries are at once useless and dangerous, and he who holds his State by means of mercenary troops can never be solidly or securely seated. For such troops are disunited, ambitious, insubordinate, treacherous, insolent among friends, cowardly before foes and without fear of God or faith in man. Whenever they are attacked defeat follows; so that in peace you are plundered by them, in war by your enemies. And this because they have no tie or motive to keep them in the field beyond their paltry pay, in return for which it would be too much for them to give up their lives.
With the Hundred Years' War over, John the Good's great-grandson, the new French king Charles VII, came to a similar conclusion. To combat the perceived problem, he created a new tax on the merchants of the kingdom, which he used to hire several of the companies. He then offered these mercenaries regular pay if they would kill off the competing companies. They did, and the survivors became the core of a standing French army.
This soldiery for hire also took to the seas in the form of privateers, who played an important role in the battles between the English and the Spanish. Some of them, such as Sir Francis Drake and Sir Walter Raleigh, won the favor of Queen Elizabeth I and made their fortunes thanks to their treasure raids against Spanish convoys; Drake became one of the wealthiest commoners in England.
Similarly, when Sweden’s king Gustavus Adolphus led the country to its considerable victories in the Thirty Years’ War, he did so not just with the assistance of the miners and the copper they mined for weapons or trade, but also with the benefit of a navy partially provided by Louis de Geer, the Dutchman who was managing the mines at the time. Of the seventy-three thousand men with whom Gustavus attacked the empire in 1630, thirty thousand were paid German mercenaries. The Dutch East and West India companies also provided private armies to help expand the Dutch empire, and the Imperial and Catholic League armies employed thousands of mercenaries during the war.
A century later, half the Prussian army was of foreign birth. Four out of every ten members of the British army were not British. A quarter of the members of the Spanish and French armies were also not native to the countries for which they fought. This centuries-old dependence on swords and guns for hire ultimately began to change in ways that would have the direst consequences in the one country that had gone to perhaps the greatest lengths to be rid of the mercenaries: France.
Mass Armies, Mass Destruction: From the Storming of the Bastille to Nagasaki
The beheading of Louis XVI in January 1793, followed that October by the execution of his wife, Austria's Marie Antoinette, understandably made Europe's other monarchs uneasy. In an effort to contain the radical republicanism that had brought France's people to a boil, these monarchs, who were eager to maintain heads on which to rest their crowns, assembled the so-called First Coalition against France. The alliance included Austria (which had been at war with France since the spring of 1792), Prussia, Russia, Britain, Spain, Portugal, the Dutch Republic, and several of Italy's many principalities.
Confronted with the daunting task of defending France against what seemed to be the rest of the Continent, the Committee of Public Safety made a decision that may have been more revolutionary than anything else they had undertaken. Rather than relying on France's Bourbon army, which they saw as a vestige of old and discredited aristocratic ways and whose leaders' allegiance they seriously doubted, the Committee determined that what they needed was a true people's army. In August 1793, they declared the levée en masse, requiring the service of the entire population and the conscription of all unmarried males between the ages of eighteen and twenty-five:
From this moment until such time as its enemies shall have been driven from the soil of the Republic, all Frenchmen are in permanent requisition for the services of the armies. Young men will go forth to battle; married men will forge weapons and transport munitions; women will make tents and clothing and serve in the hospitals; children will make lint from old linen; and old men will be brought to public squares to arouse the courage of soldiers while preaching the unity of the Republic and hatred against kings.
The French army swelled from 290,000 men to 700,000 within four months of the decree. Thus was born the first truly national army of the modern era, and a precedent was set that would change the face of warfare and politics.
When Napoleon came to power, he used the state's resources to field an army that in 1808 consisted of 300,000 troops in Spain, 100,000 in France, 200,000 in the Rhineland, and another 60,000 in Italy. Four years later, this force had grown to one million men. Thanks to the revolution's having swept the aristocracy out of the leadership of the military, broad reforms of France's national army became possible, since changes to military structure were no longer seen as threats to the broader social or political order. Napoleon's genius for organization and his ability to make use of the early capabilities of the industrial era to supply and manage such an army led to the creation of a new standard of power in Europe. The only choice for Napoleon's adversaries was to attempt to match him step for step. Using similar approaches to marshaling armies, Britain and its allies were able to call 750,000 men to arms, about a third of whom were on the field at Waterloo when Wellington finally dispatched the French emperor once and for all.
While the United States was happily far away from these conflicts, it wanted to make sure that its citizens were not drawn into these battles, especially in ways that might spill over into North America and put the young country at risk. This led to the passage of the Neutrality Act in 1794 prohibiting "citizens or inhabitants of the United States from accepting commissions or enlisting in the service of a foreign state." In 1817, a follow-up act extended the list of foreign states to which the law applied. These laws did not limit what Americans could do overseas; they only stopped them from joining in activities that might bring problems to America's doorstep. Countries including Britain, France, Germany, Italy, Portugal, Spain, Russia, and Sweden passed similar laws, as did Brazil and Mexico. Then, eight years after the turning-point revolutions of 1848 and a year before the final undoing of the world's largest private army in the Sepoy rebellion in India, the Treaty of Paris saw most of the world's major powers agree to outlaw the practice of privateering.
The consequence was that any nation that was going to project force was going to have to mobilize a state-sponsored national army. Further, given the scope of the battles of the Napoleonic era and the growing ability to harness mass-production techniques, new communications tools, and new forms of transport to support warmaking, those militaries were going to have to be both big and expensive. While Britain’s army remained at about 250,000 throughout the century, France’s grew from 132,000 after Napoleon’s defeat to 544,000 by 1880. Russia’s army became the Continent’s largest at just under a million men. While Prussia’s army had comprised 130,000 men before the wars of German unification, afterward it was consolidated into a national force of 430,000.
The need to maintain such large forces played a role in accelerating the rise of the nation-state as the primary actor on the global stage. Whereas there had once been well over a hundred political entities in Europe at the time of the Thirty Years' War, there were just over twenty by 1900. New approaches to managing national force consolidated the power of the states. Under the leadership of Chancellor Otto von Bismarck, first Prussia and later Germany became a model of the muscular modern nation. All young Prussian men were obligated to spend three years in the regular army and then four in the reserves. Later they served in the Landwehr, which provided support away from the front lines. Schooling that left young recruits better educated than the peasant armies of some neighboring countries, together with the organizational skills of the Prussian general staff, gave the Prussian army advantages that were only amplified after Prussia drove the Habsburgs out of German affairs and consolidated the other cities and principalities of the region into a single state. Like Napoleon, Garibaldi, and other national leaders, Bismarck recognized that by identifying with the higher calling of serving the nation's greater good, he and the leadership around him could generate both legitimacy and loyalty.
This ability to harness emerging national identities to support the development of ever greater national power was not exclusive to Europe. In the wake of the Civil War, the United States emerged as a stronger union with a clearer national vision of itself and its destiny. It also emerged from a war that claimed six hundred thousand lives with a much greater understanding of and capacity to meet the technological and organizational requirements of modern militaries, skills it kept developing as it pacified its own frontiers. Similarly, after the Meiji restoration in 1868, Japan actively emulated the European model, developing a constitution that echoed Prussia's, reforming its legal, banking, and education systems and going so far as to hire the best Western experts to help rebuild its military. First with French and later with German advisers guiding its army and British advisers working with its navy, Japan instituted a draft in 1873 and began to flex its muscles in a series of conflicts including the first Sino-Japanese war, the Boxer Rebellion, the Russo-Japanese war, and its 1910 conquest of Korea. When World War I broke out, Japan discussed plans (never realized) to send one hundred thousand to five hundred thousand men into combat. By 1920, its standing army consisted of three hundred thousand troops.
By 1910, the Russians had 1.2 million men in their regular army. France and Germany had six hundred thousand each. Austria-Hungary had four hundred thousand and the British and the Italians had approximately a quarter of a million. Despite those numbers, within four years a conflict would unfold that would require an almost unfathomable quantum increase in the scale at which military affairs were conducted. During World War I, the Allied Powers—the United States, Britain, France, Russia, and Italy—mobilized a total of 40.7 million men. The Central Powers—Germany, Austria-Hungary, Bulgaria, and the Ottoman Empire—fielded 25.1 million. Between 9 and 13 million died in combat. Another 10 million or more died of starvation. The flu epidemic that followed the war claimed 50 million more, but other outbreaks of disease during the war also claimed millions. The economic costs have been estimated at approximately $350 billion but are impossible to calculate accurately.
World War II required the mobilization of more than 100 million men and women. It is estimated that between 50 and 70 million people died in the conflict. Globalization and the Industrial Revolution had combined to create the most devastating man-made disaster in human history. What is more, the intensity of the conflict, which involved virtually every major power on the planet, produced massive investments in the technology of warfare that resulted, by war's end, in developments that ensured that any future global conflict would come at unspeakably higher cost. Nuclear, biological, and chemical weapons had been used in battle, and billions were spent on developing the infrastructure to maintain, produce, and deliver such weapons should they be required for future conflicts. The detonation of a single bomb, code-named Fat Man, over the city of Nagasaki at 11:02 a.m. on August 9, 1945, caused in an instant the utter destruction of the city and led to the deaths of almost seventy-five thousand people and grievous injuries to seventy-five thousand more. It also sent a message that, as incomprehensible as the scope of the first two state-sponsored world wars had been, that incomprehensibility was no longer an obstacle to a world capable of the most unspeakable crimes against itself, particularly since the definition of "total war" had "progressed" from referring to conflicts between armies to including the destruction of large segments of an enemy's entire society.
What Happens When the Great Powers Make the Laws to Keep the Peace?
War had been a legitimate and accepted means of resolving the differences between states throughout the period of their early development. But the nationalization of the mobilization and management of armies, and the great national resources brought to bear to support those armies, raised the costs of war to levels that beggared the imagination. The search for alternatives to war has occupied the thoughts and efforts of men and women throughout history, but in the wake of each of the global conflicts of the twentieth century, new efforts unprecedented in their geographic scope and ambition were undertaken. While far from wholly successful, they have had an impact in making war more difficult for some. And, combined with the costs of modern warfare, they may also have had the consequence of changing what kind of warfare remains possible.
The first set of approaches sought to use the fragile, fledgling idea of international law to set guidelines for how states could behave. This was a step beyond individual treaties or agreements regarding norms. In the wake of World War I, when Woodrow Wilson went to Europe on the first foreign mission undertaken by any American president (and only the second trip overseas by any American president, following Theodore Roosevelt’s journey to Panama in the prior decade), he had the ambition of forging an agreement unlike any other the world had known. He sought nothing less than a constitutional world order that would not only base the existence of states on the right of self-determination but also ensure standards of treatment and behavior among those states that would forestall future conflict. He imagined a new kind of social contract among states in which they would give up power to preserve security in much the same way that individuals did as they emerged from the Hobbesian state of nature. As important, Wilson’s rules envisioned a League of Nations that would protect states from aggression.
As is often the case in human behavior, it takes a crisis to produce real change. Or rather, it typically takes more than one crisis, because even after the "war to end all wars" the sense remained that gross catastrophes are anomalous. Tens of millions dead and hundreds of billions lost were not sufficient motivation for the leaders of the world's major powers to give up on their traditions of sovereignty in the way that was required by the League. European leaders and Republican opponents of the idea back in the United States dragged their feet, and ultimately the idea ran out of steam.
But in the wake of the Second World War, it became clear that the folly of the League of Nations lay not in Wilson’s conception of it but in its rejection. For this reason, on April 25, 1945, representatives of fifty nations gathered in San Francisco to contemplate an international organization that might actually be effective in preventing future conflict and clarifying the rights and roles of nations. The group embraced a term used by U.S. president Franklin Delano Roosevelt, originally to refer to the members of the Atlantic alliance: “united nations.” This was a clue that the new organization was going to be organized as much to consolidate the postwar gains of the victors as to advance any egalitarian rights of the states involved, but the initiative nonetheless proved considerably more successful than the one undertaken by Wilson.
In the U.N. Charter, which came into effect in October 1945, the resort to force is explicitly outlawed several times, and, where its use is deemed acceptable, the right to authorize it is reserved to the U.N. Security Council. Article 2(4) reads:
All members shall refrain in their international relations from the threat or use of force against the territorial integrity or political independence of any state, or in any other manner inconsistent with the purposes of the United Nations.
Article 39 says that the U.N. Security Council shall “determine the existence of any threat to the peace, breach of the peace or act of aggression and shall make recommendations or decide what measures shall be taken … to maintain or restore international peace and security.” Article 42 permits the Security Council to “take such action by air, sea, or land forces as may be necessary to maintain or restore international peace and security.”
These principles have been further amplified in the intervening years by the International Court of Justice, which was also established by the U.N. Charter. In 1949, it ruled that intervening in the domestic matters of foreign states is forbidden even if international organizations are incapable of addressing the situation. In 1986 the court went further in the case of The Republic of Nicaragua v. The United States of America, in which it stated that the "principle of non-use of force … may thus be regarded as a principle of international law" outside of any proscriptions of force that are included in the U.N. Charter. Subsequently, legal scholars have asserted that a broad prohibition against the use of force "has the nature of a peremptory norm of international law," which is to say it cannot be modified or excepted.
While these laws would seem to put all states on an equal footing, in practice that has not been the case. One reason for this is the reluctance of the United Nations to use force even when it is clearly warranted. Thus the one institution that is designed to take up arms on behalf of smaller powers and to level the playing field for them does not do so except rhetorically. Meanwhile, because the U.N.’s structure gives special powers to the largest of the post–World War II victors, these larger powers, which can veto any resolutions against them, have acted as though they were less constrained by the United Nations, its rules, and the views of its other members.
For example, in the case of Nicaragua versus the United States cited above, which focused on U.S. support for a Contra army (yes—they never entirely left the picture) that violated Nicaragua’s “sovereignty, territorial integrity and political independence,” the ICJ ruled against the United States. The Reagan administration shrugged off the decision, first arguing that the court lacked jurisdiction and then simply refusing to pay the required billions in reparations to the Nicaraguans. Later, the United States pressured a new Nicaraguan government to drop the case in exchange for continued aid payments.
Writing about the case, Harvard professor Laurence Tribe asserted that "at home and abroad, the Reagan Administration has made it clear that it will not be inconvenienced by mere laws; it will do as it likes." In this respect, President Reagan had much in common with the leaders of the world's other major powers in the years since the formation of the United Nations. The Soviets vetoed the Security Council's resolution condemning their invasion of Afghanistan. Britain and France vetoed resolutions calling for their withdrawal from the Suez Canal. NATO bombed Yugoslavia without the fig leaf of U.N. support because Russia had made it clear it would veto any effort to authorize the mission.
Major countries also twist the law to suit their purposes because there is no power able to enforce a contrary opinion. When the United States invoked its "right of self-defense" to preemptively attack Iraq, it was more than a stretch; it was a clear misrepresentation. There was no threat.
While some small countries have been able to use great-power sponsorship to help them dodge pressure from the United Nations, many feel less free to do so. This is due in part to the fact that these smaller entities are more vulnerable to economic pressure, whether in the form of funds withheld by international financial institutions (in which the great powers dominate under the voting rules), the imposition of sanctions, or the establishment of a no-fly zone, as in the effort to pressure Libya's Muammar Qaddafi to desist from the wholesale slaughter of opponents of his regime. Similarly, while some smaller states have deftly used the United Nations and its one nation–one vote structure to gain an even larger influence than they might otherwise have—as Sweden and Norway have on environmental issues, for example—this is seldom the case with security-related concerns. International institutions can therefore be said to have been successful in constraining the impulse to use force by nations, but only by some of them—the smaller ones. As it turns out, of course, that didn't actually make smaller nations safer than large ones; quite the contrary.
Two factors fed this paradox. First, large countries cornered the market on the ability to project meaningful force. Second, the decline of the vast majority of countries into what might be called "semi-states" unable to project force or, in most instances, even defend themselves, has led to growing instability among some members of their ranks—especially the weakest failing and failed states.
Who Can Bear the Costs of Modern War?
It is estimated that America’s interventions in Iraq and Afghanistan will cost, when all is said and done, between $2 trillion and $3 trillion. At the time of this writing, the United States was spending approximately $2 billion a week in Afghanistan to support approximately a hundred thousand troops. Those troops are supported by supply chains that stretch back to the United States and by technologies that employ advanced capabilities from outer space to beneath the surface of the oceans, from unmanned drones to complex information systems. These forces may not ultimately achieve the goal of bringing democracy to Afghanistan or stabilizing the region, but they have capabilities no other armed force has.
In fact, in the first two decades of the twenty-first century, there is only one nation that has the ability to project force anywhere in the world from space, air, land, or sea. That country is the United States, which annually spends on defense as much as all the other countries of the world combined. Its closest rival, China, spends only one-tenth as much as does the United States. In fact, China, France, Great Britain, and Russia—the other members of the top five in defense spending—each accounts for only 4 to 6 percent of total global defense spending. The ten countries after that are responsible, among them, for only about one-fifth of global defense spending. Those fifteen countries are really the only countries on the planet that can be considered to have a serious military capability, the ability to successfully project force for any period of time. Even among them, most have only an extremely limited ability to do so. The total defense spending of the remaining 175 countries of the world accounts for only 18 percent of all defense spending.
Beyond the issue of overall budgets, there is the fact that modern warfare requires special and difficult-to-acquire technological capabilities. For example, there are only nine nuclear powers in the world, and only five official “nuclear weapons states.” Only twenty states are believed to have ballistic missile capabilities, and most of these are very limited and of dubious effectiveness for any prolonged conflict. What is more, the cost of most weapons systems is prohibitive. It is estimated that the real dollar cost of new bomber aircraft has risen approximately 2,000 percent since 1933, an estimated annual growth rate of 13.3 percent, far faster than the growth of all but one or two national economies. Fighter aircraft have increased in cost by 9.9 percent a year. Aircraft carriers, unaffordable by most nations, increase at about 10 percent a year. At the same time, personnel costs have also climbed, leaving less and less for new weapons or replacement parts.
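To give a sense of how quickly such rates compound, here is a minimal back-of-the-envelope sketch, an illustration of compound growth in general rather than a calculation drawn from any defense budget data; the 13.3 percent rate echoes the figure cited above, and the time spans are arbitrary assumptions chosen for illustration.

```python
# Illustrative only: how a constant annual growth rate in real weapons costs
# compounds over decades. The 13.3 percent rate echoes the figure cited above;
# the 10-, 25-, and 50-year spans are arbitrary assumptions.

def compound_multiplier(annual_rate: float, years: int) -> float:
    """Return the factor by which a cost grows at `annual_rate` (e.g., 0.133) over `years`."""
    return (1.0 + annual_rate) ** years

if __name__ == "__main__":
    for years in (10, 25, 50):
        factor = compound_multiplier(0.133, years)
        print(f"{years:3d} years at 13.3%/yr -> cost multiplies roughly {factor:,.0f}x")
```

At that pace a system's real cost more than triples every decade, which helps explain why only a handful of treasuries can keep buying each new generation of weapons.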
As a consequence, most states have been priced out of the ability to project force. Only forty of almost two hundred countries have active forces that can muster more than a hundred thousand troops. Only twenty-three have more than two hundred thousand. Currently, for example, Sweden’s active duty military is down to just over thirteen thousand troops, which, despite reserves of a couple hundred thousand, still leads to halfhearted jokes among insiders that there are more admirals and generals in the Swedish military today than there are ships, tanks, planes, or men for them to command.
Whether the reason that fewer states have the ability or the inclination to either project force or repel aggression lies in the unthinkable nature of modern conflict, its financial costs, or international laws, one thing is certain: over the centuries, wars involving great powers have grown markedly rarer. The twenty-first seems as if it will continue the trend. According to Charles Tilly, the sixteenth century saw 34 such wars, the seventeenth saw 29, the eighteenth saw 17, the nineteenth saw 20, and the twentieth saw 15. The average duration of the wars has fallen from 1.6 years to 0.4, and the proportion of years of the century during which wars were under way has fallen from 95 percent to 53 percent.
It is clear that by accident or design, for the moment at least, the great powers have made the world safer for themselves. Smaller countries, however, must contend with both a diminished ability to influence affairs and, increasingly, the reappearance of private actors who are taking many forms, some ancient and some quite new.
Wars of Giants, Wars of Pygmies
Winston Churchill said, "When the war of the giants is over the wars of the pygmies will begin." As prophetic as that may sound, the reality is that the wars of pygmies have been going on all along. In an era in which giants do not fight, smaller conflicts gain in profile and relative importance. Of course, after the consolidations of the nineteenth century, the post–World War II era saw the concept of the right of self-determination and the collapse of empire produce a proliferation of states. At the first meeting of the United Nations in 1945, only 51 states were represented. Today, there are more than 190 members. Many of these newer states are clearly among the most limited in exercising the most basic of state powers, from controlling their borders to projecting force to creating a currency. And in many of the places where effective state force was lacking, a void compounded by the ineffectiveness of the United Nations and the inattentiveness of the greater powers, nonstate actors filled the vacuum, seeking to flex their muscles, gain an advantage, engender change, profit, or simply take refuge. Sometimes the greater powers actually collaborated in exacerbating such situations, encouraging these destabilizing activities because they perceived them as benefiting the bigger states' broader strategies.
From Colombia to the Great Lakes region of Africa, rebel groups, narcotics traffickers, and even powerful street gangs have played dominant roles in writing the history of violent conflict in the years since the great wars. One place that exemplifies the modern phenomenon is no stranger to the turmoil caused when the world's most important countries seek to impose global order. This place, Afghanistan, experienced such turmoil in the time of Alexander the Great, and it experienced it in the days when the most important force in its region was the British East India Company. In fact, it was the forces of the British East India Company entering Afghanistan via the Bolan Pass that set in motion the first Anglo-Afghan war in 1839, the opening move in what was called "the Great Game," a competition for regional influence between the British and Russian empires.
From the middle of the twentieth century through the 1970s, Afghanistan was ruled by a king who kept the largely tribal and ethnically fragmented country more or less unified and functional. After a series of coups, however, the country’s government allied itself with the Soviet Union and began implementing secular laws. This triggered a backlash by the more religious and culturally conservative members of the country’s population. The United States, reflexively trying to contain any extension of Soviet influence, seized this opportunity and backed the country’s Islamist opposition.
In 1979, the Soviets invaded Afghanistan. They had an easier time shrugging off the condemnation of the United Nations than they did the tenacious mujahideen on the ground in Afghanistan—especially given the increasingly active support that the United States was smuggling to them from Pakistan. The conflict became the Soviet equivalent of Vietnam, and the hunger for candid talk about it was one of the triggers for glasnost and a tipping point in the series of reforms and upsets that ultimately undid the Communists in Moscow.
In the chaos that followed the Soviet withdrawal, the Taliban, heirs to the Islamist movement America had backed on the ground, took control of the country and began imposing strict fundamentalist rule. They also provided a haven for terrorist groups, such as Al Qaeda, which itself had been a beneficiary of U.S. support during the war with the Soviets. It was, of course, Al Qaeda's attacks on the United States on September 11, 2001, that made the world understand the potential impact of nonstate actors. Al Qaeda's attack also threw into stark relief the difficulties associated with entering into a conflict against an enemy without a border, a capital city, or a formal army. The United States illustrated this by invading not one but two countries to go after Al Qaeda, and in the decade since, Al Qaeda has demonstrated the advantages of its structure by not only evading the United States in the mountains between Afghanistan and Pakistan and forcing the United States to spend great amounts of money, but also developing a network structure with nodes cropping up in weak states from Yemen to the Sudan to North Africa and even cells in the United States and Europe. By June 2010, the United States had been involved longer in its war with Al Qaeda than in any other war in its history. Almost certainly before the war is over, the United States will have spent more on it than on any war in its history. And it is a war with an entity that almost certainly has fewer members and affiliates than, say, the Swedish army, most of whom are impoverished, on the run, under-armed, or all of the above.
Meanwhile, back in Afghanistan, a U.S.-supported government with a massive Western coalition at its side is finding it nearly impossible to exert sufficient force on its own people to bring stability—despite the fact that the U.S. Senate Foreign Relations Committee has estimated that in 2011 coalition spending in Afghanistan accounted for a staggering 97 percent of the country's GDP. Porous borders, an inability to project force, total dependence on aid or illicit trade for the bulk of its economic prosperity, social and political instability—to call Afghanistan a true state is a classic case of the triumph of hope over experience. Indeed, it is something less than even a semi-state, more a failed state that has during extended periods of its recent history ceded control over significant portions of its territory to nonstate actors from tribal warlords to terrorist extremists. Despite the presence of the world's most advanced military and all its new technologies, the power vacuum created by the weakness of that state has produced the kind of chaotic situation that has occurred throughout history in ungoverned or ineffectively governed regions.
Similarly, Somalia, despite having been colonized by both Britain and Italy, has never shed its clan-based structure and has thus had a very hard time developing the necessary functions of a nation-state. In the early 1990s, the chaos there drew in U.S. troops fresh from their triumph in the first Gulf war, a mission that ended in humiliation with the blunder- and heroism-filled "Black Hawk Down" incident of 1993 and its aftermath. Years later, order was briefly imposed by Islamists who held the local warlords in check for a few months, but then the United States backed an Ethiopian intervention to remove the Islamists and prevent the spread of Al Qaeda to yet another orderless corner of the globe. The Ethiopians attempted to impose the fourteenth new government the "country" had seen since 1991, but the cobbled-together coalition of thugs and gangsters who had been recruited to prop up the regime came apart and once again chaos ensued.
The world turned its back on the situation until groups of pirates—warlords of the African littoral—began to emulate the heroes of Britain’s golden age such as Drake: these pirates raided ships that passed through their waters off Somalia’s coast. Twenty thousand such vessels make their way to and from the Suez Canal by the Gulf of Aden each year, and the pickings were rich. Though the pirates were lightly armed and few in number, trying to contain them has motivated an effort incorporating support from the navies of the United States, China, India, Italy, Russia, France, Denmark, Saudi Arabia, Malaysia, Greece, Turkey, Britain, and Germany. The results have been disappointing at best, illustrating that while failed states can do little to defend themselves, they can offer excellent cover for nonstate actors, a far more difficult-to-penetrate cloud of disorder that represents one of the most prominent threats in a world of dramatically shifting power relationships.
It is certain that those relationships will change even further the moment one of those nonstate groups gets its hands on a weapon of mass destruction and deploys it against a more powerful foe and then retreats into, say, Waziristan or Somalia to plan its next move. In a series of interviews with leading terrorism experts, virtually all reckoned the possibility of such an event’s occurring in the next two decades as between “likely” and, in the words of one who is currently working with the U.S. government, “a certainty. We live in a world in which no one is as strong as us but in which virtually everyone is equally vulnerable.”
Mercenaries 2.0: The Reprivatization of Force
The redistribution of military power in the world was also abetted by the forces that led to the redistribution of economic power from the state to the market. Even as the U.S. government was dismantling Glass-Steagall and keeping the CFTC from monitoring the derivative instruments that would ultimately trigger the biggest financial crash in three-quarters of a century, it was also working on privatizing its national security capabilities. A report released in 1996 by the Defense Science Board, an advisory body to the Department of Defense, urged consideration of outsourcing noncombat defense functions. It argued that such measures could "reduce the cost and improve the performance of [DoD's] support activities." The next year, the Defense Department's Quadrennial Defense Review announced intentions to "adopt and adapt the lessons of the private sector."
The champion for many of these reforms was an avowed disciple of Milton Friedman, someone who attended Friedman’s lectures while he served in the U.S. Congress and who actively supported Friedmanesque ideas such as an all-volunteer army. His name was Donald Rumsfeld. And, as secretary of defense under George W. Bush, he would oversee the accelerated privatization of many American defense capabilities that had historically been purely the responsibility of the state. In this he was, of course, abetted by Vice President Dick Cheney, who, prior to joining the government, was the CEO of one of the biggest beneficiaries of this process—Halliburton.
When the Bush team took office, defense spending was at its lowest level since 1979. Therefore, turning to the private sector for assistance with the rapid ramp-up "required" in the wake of 9/11 was easily defended—to the extent that anyone was even questioning the moves. And few were. Outsourcing, which had previously been used for administrative support, was now being widely and increasingly applied to field operations. In an interview with Middlebury professor Allison Stanger for her definitive study of this phenomenon, One Nation Under Contract, Theresa Whelan, then deputy assistant secretary of defense for African affairs, argued that "we crossed the Rubicon in 2002 when we allowed Northrop Grumman to do training for peacekeeping in Africa. Before then we had used contractors for training in the classroom and for computer simulation exercises, but never before had they been deployed on the field."
The move was also embraced by many within the military because they recognized that the increasingly technologically sophisticated nature of the modern military required skills that were more readily found within the private workforce. Further, those same technologies make it easier to network together disparate and far-flung public and private groups and have them effectively work as cohesive units. This in turn allows military leaders to focus their attention on core operations that can only be performed by government personnel.
Of course, it is becoming harder to know where such core operations end and "appropriate" activities for contractors begin. According to Stanger, over 80 percent of Department of Defense funds go to contractors, and an even greater share of State Department funds goes to private sector actors. Over 90 percent of U.S. Agency for International Development resources are channeled through private companies. The American and international publics were generally unaware of these activities until March 31, 2004, when Sunni insurgents killed four contractors in Fallujah, Iraq. The men all worked for a company based in the swamps of North Carolina called Blackwater. Once a low-profile private security company that performed increasingly "mission-critical" assignments in the Iraqi war zone, Blackwater has since been renamed Xe Services in an effort to regain the anonymity that once served it well but was squandered through a series of abuses.
Despite the fact that individual Blackwater foot soldiers were reportedly earning salaries greater than many of the generals who were giving them their orders, the company's work in Fallujah that day in March 2004 was pretty humdrum. In the middle of a very volatile region, it was providing armed escorts for a convoy of trucks that were operated by yet another contractor—Eurest Support Services, a food services firm working under Halliburton's subsidiary KBR that supplied kitchen equipment to the U.S. military. The operation had all the hallmarks of a screwup in waiting. The Blackwater men had never worked together. One had just arrived in the country. The team was undermanned in terms of what their contract required. The vehicles weren't armored, a violation of Blackwater's contract with ESS. And, unknown to all, someone had leaked the details of the convoy to insurgents. Slowed by the aftereffects of a roadside bomb as they entered the city, they were ambushed. A grenade was thrown. Machine gun fire rang out. Two of the Blackwater men were shot while trying to evade the attack in their jeep. They were torn apart. Two others were killed and then dragged to the Euphrates River, where their bodies were hung for ten hours before being cut down, burned, and paraded through the city.
Suddenly the private contractors were in the spotlight, even though the military was actually well on its way to hiring 1.8 million new contract employees between 2002 and 2005. (The United States is not alone in employing such contractors. It has become an increasingly common phenomenon worldwide, with British, South African, Russian, and Israeli companies among those noted for providing services that once were undertaken primarily by direct employees of national governments.) Among these was a new class of "military provider firm" that provided both support and combat services. As the families of the slain Blackwater employees found out, and as other victims of contractor missteps have in the years since, one of the most serious problems associated with shifting so much of the responsibility for the defense of the world's most powerful nation to such people is their lack of accountability.
After the Fallujah incident, the families of the four Blackwater contractors sued the company, alleging that it mismanaged the entire affair and then tried to cover it up. Blackwater argued it could not be sued because it was part of the U.S. “Total Force,” so any lawsuit would violate the president’s power as commander-in-chief. KBR, a Halliburton subsidiary, filed an amicus brief supporting this view. Blackwater used the same argument again when sued after an aircraft crash that claimed the lives of three U.S. soldiers. This time the company also made the astonishingly brazen argument that it was not liable for tort claims under the doctrine that grants the government sovereign immunity for tort suits when servicemen are injured in situations that “arise out of or are in the course of activity incident to service.” At the same time, the company has also asserted that its employees are not liable as they might be under the Uniform Code of Military Justice (UCMJ) because of their private status. Thus Blackwater has attempted to wedge itself into a space between existing laws in which it is accountable to no one and in fact would be less accountable than the government for which it was working.
Since these developments, Congress has attempted to add language to recent appropriations legislation that would make contractors liable under the UCMJ. Further, one of my former Kissinger Associates colleagues, L. Paul Bremer, the U.S. “viceroy” in Iraq who was famously protected not by the military but by Blackwater contractors, has asserted that his Order 17 giving immunity to all contractors did not make Blackwater and others immune from the law because they could still be subject to Iraqi law—a not wholly satisfactory argument, given the low likelihood that the United States would allow such a trial to take place or that Blackwater would allow its employees to wait in the country long enough to find out what would happen. The U.S. Justice Department avoided wading into these waters until September 2007, when five Blackwater men killed seventeen Iraqi civilians and wounded twenty others at a busy traffic circle in Baghdad. Charges were filed fifteen months later.
Another problem associated with private contractors is that the profit motive will not necessarily lead them in the same direction as the public interests of the United States. This kind of conflict of interest came to light when one of the company's cofounders, Al Clark, a former mentor of Blackwater CEO Erik Prince, quit because of a difference regarding a training assignment. Clark stated that he believed the company should share with the law enforcement officials it was training every bit of knowledge it had, while Prince argued that that was a business loser because it meant the officials would have no incentive to come back for more training. Other instances have seen contractors illegally inflate their invoices to the government, as in the case of the bizarrely named Custer Battles Company that was found to have defrauded the Coalition Provisional Authority in Iraq of millions of dollars. At the time the gouging was discovered, the company had already been awarded $100 million in contracts with the United States.
In Africa, another private contractor, Executive Outcomes, which employed former elite soldiers of the South African military and police forces, made its name by offering itself for hire to regimes in West Africa to protect them from uprisings. In 1995, as rebel forces closed in on the capital of Sierra Leone and the governments of the United States and the United Kingdom as well as the United Nations refused to intervene, EO was retained by the country's young military ruler, an army captain named Valentine Strasser, who had read about it in Soldier of Fortune magazine. It had come highly recommended by a former British Special Forces soldier named Anthony Buckingham, an executive at a local diamond mine. Since the regime couldn't afford EO's fees, a deal was struck in which Buckingham offered to pay EO in exchange for concessions in the diamond mining region.
Within a month, 160 EO personnel arrived with their own helicopter gunships. Within just over a week, the rebels were pushed deep into the jungle. Hundreds of rebels were killed or wounded. Once the battle was won, EO kept going straight to the diamond fields. There it held its ground and protected the operations of the mining company. Periodically, it reengaged on behalf of the government against the rebels, repeatedly defeating them. Strasser was ultimately pushed from office and elections were held. The new government invited the foreign mercenaries to depart. EO warned this would lead to problems for the government, and when the request to leave was not retracted, it departed, followed ninety-five days later by the new government, which was deposed by the rebels that EO had previously kept from the capital.
Finally, another problem cited by Stanger and other experts regarding the privatization of military affairs through the hiring of contractors is that the government has far too few people working on procurement to effectively oversee contracts. Consequently, contracts often call for the contractors themselves to monitor the activities of their subcontractors as the only, or at least the primary, means of ensuring the fulfillment of the government's requirements. Needless to say, this too leads to undeniable conflicts of interest. Wrongdoing often escapes notice until a scandal hits the headlines. For example, twenty-seven of the thirty-seven interrogators at Abu Ghraib prison were contractors. DynCorp, a prominent contractor, was found to operate a sex slavery ring in the former Yugoslavia in which "girls as young as twelve were sold on an hourly, daily, or permanent basis." And in August 2009, two former Blackwater employees testified in federal court that Blackwater's owner, Erik Prince, may have "murdered or facilitated the murder of individuals who were cooperating with federal authorities investigating the companies." They added that Blackwater was illegally smuggling weapons into Iraq, with Prince's knowledge. When these revelations came to light, thanks to a public interest lawsuit, others followed, including the allegation that Blackwater, and specifically Prince, was involved in a covert assassination program run by the CIA and the U.S. Joint Special Operations Command.
The CIA program became public when the agency’s director, Leon Panetta, alerted Congress that he had discovered and canceled the program, which had, among its objectives, the assassination of Osama bin Laden—one set of CIA-sponsored private operatives in pursuit of another from a prior era. It was subsequently revealed that Blackwater had participated in other intelligence-gathering operations run out of Pakistan in cooperation with the military, engaging in “snatch and grabs” and assassinations of Taliban and Al Qaeda operatives. Allegedly, unlike the program initially reported by Panetta, the second program was actually implemented and operated under a classified no-bid contract.
The Next War, Non-War, and Stora’s Brazilian Army
Controversy surrounds the use of private contractors by the U.S. national security establishment and major powers around the world. But no one predicts that the practice will be abandoned. This is due in part to the fact that virtually all developed world powers face major budgetary constraints and consequent pressure to reduce permanent government overhead, a condition that typically leads to hiring contractors. Further, the nature of future warfare is likely to grow more complex and to require more public-private collaboration. In fact, that collaboration may take new forms. For example, following the widely reported Stuxnet malware attack on the Iranian nuclear program, Iran accused Siemens, whose industrial control software ran the targeted Iranian facility and was penetrated by the worm, of helping the United States and Israel to launch the cyberattack. An informed source who has investigated the matter has told me that he believes but could not prove that the attack would not have been possible without some assistance from Siemens. Siemens denies any involvement in that attack.
As the total war of the past is replaced by the invisible, highly technical “non-war” of tomorrow—constant probing and penetrating via advanced information technologies—what options will governments have when all the best and brightest programmers and technologists are working for the better salaries and benefits offered by the private sector? Similarly, as information technology providers are seen as offering the connective tissue of the electronic polities of tomorrow, how much of intelligence and security work will require similar collaborations—and create real ethical dilemmas for the businesses involved?
At the same time, it is unlikely that most of the states of the world will see a substantial enhancement of their ability to project force or even to protect their own borders from incursions by more powerful or technologically resourceful adversaries or rivals. For the most part, such force as they do possess will be used to maintain the peace at home.
Here again we arrive at another new public-private power equation, in which public power is used to lure global corporations to invest at home. There are countless examples each year of local authorities putting down protests or stabilizing regions in order to enable investors to move in. Take for example the case in rural Brazil in which women of the peasant network Via Campesina occupied lands they believed had been illegally purchased by a foreign shell company in contravention of announced Brazilian land reform policies intended to benefit the poor. The military was called in to suppress the protests. Protesters allege that the reason the authorities acted as they did was a too-cozy relationship between the local Brazilian government and the foreign company seeking to expand its pulp and paper business into that developing part of Brazil's state of Rio Grande do Sul.
The global company that was earning profits for its international shareholder base thanks to the Brazilian government’s local monopoly on the legitimate use of force was, of course, Stora Enso. It was no longer in need, it seems, of its own private army if it could influence a host government to use its own on Stora’s behalf.