3

POWER IN THE BALANCE

WITH A TIGHTLY INTEGRATED GLOBAL ECONOMY AND A PLANET-WIDE digital network, we are witnessing the birth of the world’s first truly global civilization. As knowledge and economic power are multiplied and dispersed far more widely and swiftly than by the Print and Industrial Revolutions, the political equilibrium of the world is undergoing a massive change on a scale not seen since the decades following Europe’s linkage by sea routes to the Americas and Asia 500 years ago.

As a result, the balance of power among nations is changing dramatically. Just as the Industrial Revolution led to the dominance of the world economy by Western Europe and the United States, the emergence of Earth Inc. is shifting economic power from West to East and spreading it to the new growth economies developing throughout the world. China, in particular, is overtaking the U.S. as the center of gravity in the global economy.

More importantly, just as nation-states emerged as the dominant form of political organization in the wake of the printing press, the emergence of the Global Mind is changing many of the social and political assumptions on which the nation-state system was based. Some of the sources of power traditionally wielded primarily by nations are no longer as firmly under their exclusive control. While our individual political identities remain primarily national, and will for a long time to come, the simultaneous globalization of information and markets is transferring power once reserved for national governments to private actors—including multinational corporations, networked entrepreneurs, and billions of individuals in the global middle class.

No nation can escape these powerful waves of change by unilaterally imposing its own design. The choices most relevant to our future are now ones that confront the world as a whole. But because nation-states retain the exclusive power to negotiate policies and implement them globally, the only practical way to reclaim control of our destiny is to seek a global consensus within the community of nations to secure the implementation of policies that protect human values. And since the end of World War II—at least until recently—most of the world has looked primarily to the United States of America for leadership when facing the need for such a consensus.

Many fear, however, that the ability of the U.S. to provide leadership in the world is declining in relative terms. In 2010, China became the world’s leading manufacturing nation, ending a period of U.S. leadership that had lasted for 110 years. An economic historian at Nuffield College, Oxford, Robert Allen, said this milestone marked the “closing of a 500-year cycle in economic history.” When China’s overall economic strength surpasses that of the United States later this decade, it will mark the first time since 1890 that any economy in the world has been larger than the American economy.

Worse, not since the 1890s has U.S. government decision making been as feeble, dysfunctional, and servile to corporate and other special interests as it is now. The gravity of the danger posed by this debasement of American democracy is still not widely understood. The subordination of reason-based analysis to the influence of wealth and power in U.S. decision making has led to catastrophically bad policy choices, sclerotic decision making, and a significant weakening of U.S. influence in the world.

Even a relative decline in the preeminence of the U.S. position in the world system has significant consequences. It remains “the indispensable nation” in reducing the potential for avoidable conflicts—keeping the sea lanes open, monitoring and countering terrorist groups, and playing a balancing role in tense regions like the Middle East and East Asia, and in regions (like Europe) that could face new tensions without strong U.S. leadership. Among its many other roles, the United States has also exercised responsibility for maintaining relative stability in the world’s international monetary system and has organized responses to periodic market crises.

At the moment, though, the degradation of the U.S. political system is causing a dangerous deficit of governance in the world system and a gap between the problems that need to be addressed and the vision and cooperation necessary to address them. This is the real fulcrum in the world’s balance of power today—and it is badly in need of repair. In the absence of strong U.S. leadership, the community of nations is apparently no longer able to coalesce in support of international coordination and agreements that establish the cooperative governance arrangements necessary for the solution of global problems.

Meetings of the G20 (which now commands more attention than the G8) have become little more than a series of annual opportunities for the leaders of its component nations to issue joint press releases. Their habit of wearing matching colorful shirts that represent the fashion motif of the host nation recalls the parable of the child who noticed that the emperor has no clothes. Except in this case, the clothes have no emperor.

Largely because of U.S. government decisions to follow the lead of powerful domestic corporate interests, once-hopeful multilateral negotiations—like the Doha Round of trade talks (commenced in 2001) and the Kyoto Protocol (commenced in 1997)—are now sometimes characterized as “zombies.” That is, they are neither alive nor dead; they just stagger around and scare people. Similarly, the Law of the Sea Treaty is in a condition of stasis.

The global institutions established with U.S. leadership after World War II—the United Nations, the World Bank, the International Monetary Fund, and the World Trade Organization (formerly the General Agreement on Tariffs and Trade)—are now largely ineffective because of the global changes that have shaken the geopolitical assumptions upon which they were based. Chief among them was the assumption that the U.S. would provide global leadership.

So long as the United States offered the vision necessary for these institutions—and so long as most of the world trusted that U.S. leadership would move the world community in a direction that benefited all—these institutions often worked well. If a nation’s leadership is seen as motivated by the pursuit of goals that are in the interest of all, its political power is greatly enhanced. By contrast, if the nation offering leadership to the world is seen as primarily promoting its own narrower interests—the commercial prospects of its corporations, for example—its capacity for leadership is diminished.

Two thirds of a century after their birth, these multilateral institutions face criticism from developing countries, environmentalists, and advocates for the poor because of what many see as “democratic deficits.” Major decisions in both the World Bank and the International Monetary Fund require the support of member nations holding 85 percent of the voting rights. Since the United States alone holds more than 15 percent of the voting rights in both organizations, it has effective veto power over those decisions. Similarly, some countries ask why France and the United Kingdom are still among only five permanent members of the U.N. Security Council when Brazil, whose GDP now rivals theirs, and India, which will soon be the most populous country in the world, are not.

The significant loss of confidence in U.S. leadership, especially since the economic crisis of 2007–08, has accelerated the shift in the equilibrium of power in the world. Some experts predict the emergence of a new equilibrium with both the United States and China sharing power at its center; some have already preemptively labeled it the “G2.”

RELATIVE OR ABSOLUTE DECLINE?

Other experts predict an unstable, and more dangerous, multipolar world. It seems most likely that the increasing integration of global markets and information flows will lead to an extended period of uncertainty before global power settles into a new more complex equilibrium that may not be defined by poles of power at all. The old division of the world into rich nations and poor nations is changing as many formerly poor nations now have faster economic growth rates than the wealthy developed nations. As the gap closes between these fast-growing developing and emerging economies on the one hand and the wealthy mature economies on the other, economic and political power are not only shifting from West to East, but are also being widely dispersed throughout the world: to São Paulo, Mumbai, Jakarta, Seoul, Taipei, Istanbul, Johannesburg, Lagos, Mexico City, Singapore, and Beijing.

Whatever new equilibrium of power emerges, its configuration will be determined by the resolution of several significant uncertainties about the future of the United States, China, and nation-states generally: First, is the United States really in a period of decline? If so, can the decline be reversed? And if not, is it merely relative to that of other nations, or is there a danger of an absolute decline? Second, is China likely to continue growing at its current rate or are there weaknesses in the foundations on which its prosperity is being built? Finally, are nation-states themselves losing relative power in the age of Earth Inc. and the Global Mind?

There is a lively dispute among scholars about whether the United States is in decline at all. The loss of U.S. geopolitical power has been a recurring theme for far longer than many Americans realize. Even before the U.S. became the most powerful nation, there were episodic warnings that American power was waning. Some argue that concerns about China overtaking the United States in forms of power other than economic output represent just another example of what happened when so many were concerned about Japan Inc. in the 1970s and 1980s—and even earlier concerns when the former Soviet Union was seen as a threat to U.S. dominance in the 1950s and 1960s.

For more than a decade following World War II, many strategic thinkers worried that the U.S. was in danger of quickly falling from the pinnacle of world power. When the USSR acquired nuclear weapons and tightened its grip on Eastern and Central Europe, these fears grew. When Sputnik was launched in 1957, making the USSR the first nation in space, the warning bells rung by declinists were heard even more loudly.

Many of the alarms currently being sounded about the decline of U.S. power are based on a comparison between our present difficulties and a misremembered sense of how completely the U.S. dominated global decision making in the second half of the twentieth century. A more realistic and textured view would take into account the fact that there was never a golden age in which U.S. designs were implemented successfully without resistance and multiple failures.

It is also worth remembering that while the U.S. share of global economic output fell from 50 percent in the late 1940s to roughly 25 percent in the early 1970s, it has remained at that same level for the last forty years. The rise of China’s share of global GDP and the economic strength of other emerging and developing economies has come largely at the expense of Europe, not the United States.

The rise of the United States as the dominant global power began when it became the world’s largest economy in the 1890s, when President Theodore Roosevelt aggressively asserted U.S. diplomatic and military power at the start of the twentieth century, and when it played the crucial role in determining the outcome of World War I under President Woodrow Wilson. And of course after providing the decisive economic and military strength to defeat the Axis powers in World War II, the United States emerged as the victor in both the European and Pacific theaters and was recognized as the leading power in the world. The economies of the European nations had been devastated and exhausted by the war. Those of Japan and Germany had been destroyed. The Soviet Union, having suffered casualties dozens of times greater than those of the United States, had been weakened. Whatever antithetical moral authority it might once have aspired to under Lenin had long since been destroyed by Stalin’s 1939 pact with Hitler and his exceptional cruelty and brutality toward his own people.

Moving quickly, the United States provided crucial leadership to establish the postwar institutions for world order and global governance. These included the Bretton Woods Agreement, which formalized the U.S. dollar as the world’s reserve currency, and a series of regional military self-defense alliances, the most important of which was NATO, the North Atlantic Treaty Organization. By using foreign aid and generous trade agreements that provided access to U.S. markets, the United States grew into an even more dominant role. And the United States promoted democratic capitalism throughout the noncommunist parts of the world.

It catalyzed the emergence of European economic and political integration by midwifing the European Coal and Steel Community (which later evolved into the Common Market and the European Union). And the visionary and generous Marshall Plan lifted the nations of Europe that had been devastated by World War II to prosperity and encouraged a commitment to democracy and regional integration. Secretary of State Cordell Hull, who was described by FDR as “the father of the United Nations,” was an advocate of freer reciprocal cross-border trade in Europe and the world, reasoning—in a maxim often associated with him—that “when goods cross borders, armies do not.” By presiding over the reconstruction, democratization, and demilitarization of Japan, the United States also solidified its position as the dominant power in Asia.

In 1949, when the Soviet Union became the world’s second nuclear power and China embraced communism after the victory of Mao Zedong, the four-decade Cold War imposed its own dynamic on the operations of the world system. The nuclear standoff between the U.S. and the USSR was accompanied by a global struggle between two ideologies with competing designs for the organization of both politics and economics.

For several decades, the structure of the world’s equilibrium of power was defined by the constant tension between these two polar opposites. At one pole, the United States led an alliance of nations that included the recovering democracies of Western Europe and a reconstructed Japan, all of whom advocated the ideology of democratic capitalism. At the other pole, the Union of Soviet Socialist Republics led a captive group of nations in Central and Eastern Europe in advocating the ideology of communism. This abbreviated description belies more complex dynamics, of course, but virtually every political and military conflict in the world was shaped by this larger struggle.

When the Soviet Union was unable to compete with the economic strength of the United States (and was unable to adapt its command economy and authoritarian political culture to the early stages of the Information Revolution), it imploded. With the fall of the Berlin Wall in 1989 and the subsequent breakup of the Soviet Union two years later (when Russia itself withdrew from the USSR), communism disappeared from the world as a serious ideological competitor.

U.S. HEGEMONY IN the world thus reached its peak, and the ideology of democratic capitalism spread so widely that one political philosopher speculated that we were seeing “the end of history”—implying that no further challenge to either democracy or capitalism was likely to emerge.

This ideological and political victory secured for the United States universal recognition as the dominant power in what appeared to be, at least for a brief period, a unipolar world. But once again, the superficial label concealed complex changes that accompanied the shift in the power equilibrium.

Well before the beginning of World War II, Soviet communism had run afoul of a basic truth about power that was clearly understood by the founders of the United States: when too much power is concentrated in the hands of one or a small group of people, it corrupts their judgment and their humanity.

American democracy, by contrast, was based on a sophisticated understanding of human nature, the superior quality of decision making to be found in what is now sometimes called the wisdom of crowds, and lessons learned from the history of the Roman Republic about the dangers posed to liberty by centralized power. Unhealthy concentrations of power were recognized to be detrimental to the survival of freedom. So power was separated into competing domains designed to check and balance one another in order to maintain a safe equipoise within which individuals could maintain their freedom to speak, worship, and assemble freely.

The ability of any nation to persuade others to follow its leadership is often greatly influenced by its moral authority. In the case of the United States, it is undeniably true that since the ratification of its Constitution in 1788 and its Bill of Rights in 1791, its founding principles have resonated in the hearts and minds of people throughout the world, no matter the country in which they live.

Since the end of the eighteenth century, there have been three waves of democracy that spread throughout the world. The first, in the aftermath of the American Revolution, produced twenty-nine democracies. When the Great Liberator, Simón Bolívar, led democratic revolutions in South America in the early decades of the nineteenth century, he carried a picture of George Washington in his breast pocket.

This was followed by a period of decline that shrank the number to twelve by the beginning of World War II. After 1945, the second wave of democratization swelled the number of democracies to thirty-six, but once again this expansion was followed by a decline to thirty from 1962 until the mid-1970s. The third wave began in the mid-1970s and then accelerated with the collapse of communism in 1989.

The struggle within the United States over policies that promote the higher values reflected in the U.S. Constitution—individual rights, for example—has often been lost to business interests and calculations of realpolitik. When Western European countries began to grant independence to their overseas colonies and pull back from the spheres of influence they had established during their imperial periods, the United States partially filled the resulting power vacuums by extending aid and forming economic, political, and military relationships with many of the newly independent nations. The United States feared that the withdrawal of France from its colonial role in Vietnam might lead to the expansion of what some mistakenly viewed as a quasi-monolithic communist sphere; this misunderstanding of Ho Chi Minh’s fundamentally nationalist motivation contributed to the tragic miscalculation that resulted in the Vietnam War.

Nevertheless, in spite of its strategic mistake in Vietnam (following the earlier long and costly stalemate in the Korean War), heavy-handed military interventions in Latin America, and other difficult challenges, the U.S. consolidated its position of leadership in the world. The unprecedented growth of U.S. prosperity in the decades following World War II—along with its continued advocacy of freedom—made it an aspirational model for other countries. It is difficult to imagine that human rights and self-determination could have made as much progress throughout the world in the post–World War II era without the U.S. being in a dominant position.

More recently, the spread of democracy has slowed. Since the market crisis of 2007–08, there has been a decline in the number of democratic nations in the world and a degradation in the quality and extent of democracy in several others—including the United States. But even though the world is still in a “democratic recession,” some believe that the Arab Spring and other Internet-empowered democratic movements may signal the beginning of a fourth wave of democratization, though the results are still ambiguous at best.

In any case, it is premature to predict an absolute decline in U.S. power. Among positive signs that the United States may yet slow its relative decline, the U.S. university system is still far and away the best in the world. Its venture investment culture continues to make the U.S. the greatest source of innovation and creativity. Although the U.S. military budget is lower as a percentage of GDP than it has been for most of the post–World War II era, it has increased in absolute terms to the highest level since 1945. The U.S. military is still by far the most powerful, best trained (by the best officer corps), best equipped, and most lavishly financed armed force the world has ever seen. Its annual budget is equal to the combined military budgets of the next fifty militaries in the world and almost equal to the military spending of the entire rest of the world put together.

AS SOMEONE WHO was frequently described as a pro-defense Democrat during my service in the Congress and in the White House, I have seen how valuable it has been for the United States and for the cause of freedom to maintain unquestioned military superiority. However, after more than a decade of fighting two seemingly endless wars, while simultaneously maintaining large deployments in Europe and Asia, U.S. military resources are strained to the point of breaking. And the relative decline of America’s economic power and wealth is beginning to force the reconsideration of such large military budgets.

The same global trends that have dispersed productive activity throughout Earth Inc. and connected people throughout the world to the Global Mind are also dispersing technologies relevant to warfare, which used to be monopolized by nation-states. The ability to launch destructive cyberattacks, for example, is now being widely spread on the Internet.

Some of the means of waging violent warfare are being robosourced and outsourced. The use of drones and other semiautonomous robotic weapons proliferated dramatically during the wars in Iraq and Afghanistan. The U.S. Air Force now trains more pilots for unmanned vehicles than it trains pilots of manned fighter jets. (Interestingly, the drone pilots suffer post-traumatic stress disorder at the same rate as fighter pilots, even though they see their targets on a video screen thousands of miles away.)

On several occasions, drones have been hacked by the forces they are targeting. In 2009, intelligence analysts found that Islamic militants in Iraq had used commercially available software selling for $26 to hack into the unencrypted video signals coming from U.S. drones and watch in real time the same video that was being sent to the drones’ controllers in the United States. In Afghanistan, insurgent forces were able to do the same thing, and at the end of 2011, Iran hacked into the control system of a U.S. stealth drone and commanded it to land on an airstrip in Kashmar, Iran.

A new generation of robotic weapons in the air, on the land, and in the sea is being rapidly developed. More than fifty countries are now experimenting with semiautonomous military robots of their own. (A new legal doctrine of “robot rights” has been developed by U.S. military lawyers to give unmanned drones and robots the legal right to unleash deadly fire when threatened, just as a fighter pilot has the right to fire at a potential attacker as soon as he is alerted to the fact that a targeting radar has “lit up” his plane.)

At the same time, some dangerous combat missions are being outsourced. During the war in Iraq, the United States shifted significant operations in the war zone to private contractors.* After the unpopular Vietnam War, the United States abandoned the draft and has since relied on a professional volunteer army—which many claim emotionally insulates the American people from some of the impact wars used to have on the general population.

THE CHINA ISSUE

Meanwhile, China’s military budgets—while still only a fraction of U.S. defense spending—are increasing. Yet there are questions about the sustainability of China’s present economic buildup. Many feel that it is premature to predict a future in which China becomes the dominant global power, or even occupies the center of a new power equilibrium alongside the United States, because they doubt that the social, political, and economic foundations in China are durable. In spite of the economic progress in China, experts warn that the lack of free speech, the concentrated autocratic power in Beijing, and the high levels of corruption throughout China’s political and economic system raise questions about the sustainability of its recent growth rates.

For example, at the end of 2010, there were an estimated 64 million empty apartments in China. The building bubble there has been attributed to a number of causes, but for several years visitors have remarked upon the large number of subsidized high-rise apartment buildings that spring up quickly and remain unoccupied for long periods of time. According to research by Morgan Stanley, almost 30 percent of the wind turbines constructed in China are not connected to the electrical grid; many have been placed in remote locations with strong winds but no economical way to extend the grid to them. China’s success in building its capacity to construct low-cost renewable energy systems has benefited both China and the global market, but as with the many empty apartment buildings, the idle turbines serve as a warning that some trends in the Chinese economic miracle may not continue at the same pace. China’s banking system suffers from similar distortions of state manipulation. Some state-owned banks are recycling their allocations of credit into black market lending at usurious and unsustainable interest rates.

There are also questions about China’s social and political cohesion during what has already been a disruptive economic transition, accompanied by the largest internal migration in history and horrendous levels of pollution. Although precise statistics are hard to verify, a professor at Tsinghua University, Sun Liping, estimated that in 2010 there were “180,000 protests, riots and other mass incidents.” That number reflects a fourfold increase from 2000. Numerous other reports confirm that social unrest appears to be building in response to economic inequality, intolerable environmental conditions, and opposition to property seizures and other abuses by autocratic local and regional leaders. Partly as a result of dissatisfaction and unrest—particularly among internal migrant workers—wages have been increasing significantly in the last two years.

Some scholars have cautioned against a Western bias in prematurely predicting instability in countries whose governments do not derive their legitimacy from democratic elections. In China, according to some experts, legitimacy can be and is derived from sources other than the participatory nature of the political system. Since Confucian times, legitimacy has been gained in the eyes of the governed when the policies implemented are successful and when the persons placed in positions of power are seen to have earned their power in a form of meritocracy and demonstrate sufficient wisdom to seem well chosen.

IT IS PRECISELY these sources of legitimacy that are now most at risk in the United States. The sharp decline of public trust in government at all levels—and in nearly all large institutions—is based in large measure on the perception that they are failing to produce successful policies and outcomes. The previous prominence of reason-based decision making in the U.S. democratic system was its greatest source of strength. The ability of the United States, with only 5 percent of the world’s people, to lead the world for as long as it has is due in no small measure to the creativity, boldness, and effectiveness of its decision making in the past.

Ironically, the economic growth in China since the reforms of Deng Xiaoping, launched in 1978, was brought about not only by his embrace of a Chinese form of capitalism but also by his intellectual victory within the Chinese Central Committee in advocating reason-based analysis as the justification for abandoning stale communist economic dogma—and his political skill in portraying this dramatic shift as simply a reaffirmation of Maoist doctrine. In a speech to the All-Army Conference in the year his reforms were begun, Deng said, “Isn’t it true that seeking truth from facts, proceeding from reality and integrating theory with practice form the fundamental principle of Mao Zedong Thought?”

One reason for the rise of the United States over its first two centuries to the preeminent position among nations was that American democracy demonstrated a genius for “seeking truth from facts.” Over time, it produced better decisions and policies to promote its national interests than the government of any other nation. The robust debate that takes place when democratic institutions are healthy and functioning well results in more creative and visionary initiatives than any other system of government has proven capable of producing.

Unfortunately, however, the U.S. no longer has a well-functioning self-government. To use a phrase common in the computer software industry, American democracy has been hacked. The United States Congress, the avatar of democratically elected national legislatures in the modern world, is now incapable of passing laws without permission from the corporate lobbies and other special interests that finance its members’ campaigns.

THE LONG REACH OF CORPORATIONS

It is now common for lawyers representing corporate lobbies to sit in the actual drafting sessions where legislation is written, and to provide the precise language for new laws intended to remove obstacles to their corporate business plans—usually by weakening provisions of existing laws and regulations intended to protect the public interest against documented excesses and abuses. Many U.S. state legislatures now routinely rubber-stamp laws that have been written in their entirety by corporate lobbies.

Having served as an elected official in the federal government for the last quarter of the twentieth century, and having observed it closely before that period and since, I have felt a sense of shock and dismay at how quickly the integrity and efficacy of American democracy has nearly collapsed. There have been other periods in American history when wealth and corporate power have dominated the operations of government, but there are reasons for concern that this may be more than a cyclical phenomenon—particularly recent court decisions that institutionalize the dominance and control of wealth and corporate power.

This crippling of democracy comes at a time of sweeping and tumultuous change in the world system, when the need for U.S. advocacy of democratic principles and human values has never been greater. The crucial decisions facing the world are unlikely to be made well, or at all, without bold and creative U.S. leadership. It is therefore especially important to restore the integrity of U.S. democracy. But in order to do so, it is necessary to accurately diagnose how it went so badly off track. The shift of power from democracy to markets and corporations has a long history.

In general, political freedom and economic freedom have reinforced one another. The new paradigm born in the era of the printing press was based on the principle that individuals had dignity and, when armed with the free flow of information, could best chart their own destinies in both the political and economic realms—by aggregating their collective wisdom through regular elections of representatives, and through the “invisible hand” of supply and demand.

Throughout history, capitalism has been more conducive to higher levels of political and religious freedom than any other way of organizing economic activity. But internal tensions in the compound ideology of democratic capitalism have always been present and frequently difficult to reconcile. Just as America’s founders feared concentrated political power, many of them also worried about the impact on democracy of too much concentrated economic power—particularly in the form of corporations.

The longest-running corporation was chartered in Sweden in 1347, though the legal form did not become common until the seventeenth century, when the Netherlands and England allowed a proliferation of corporate charters, especially for the exploitation of trade to and from their new overseas colonies. After a series of spectacular frauds and other abuses, including the South Sea Company scandal (which gave birth to the economic concept of a “bubble”), England banned the formation of new corporations in 1720. (The prohibition was not lifted until 1825, when the Industrial Revolution required the capitalization of railway companies and other new firms to exploit emerging technologies.)

The American revolutionaries were keenly aware of this history and originally chartered corporations mostly for civic and charitable purposes, and only for limited periods of time. Business corporations came later, in response to the need to raise capital for industrialization.

Referring to the English experience, Thomas Jefferson wrote in a letter to U.S. Senator George Logan of Pennsylvania in 1816, “I hope we shall take warning from the example and crush in its birth the aristocracy of our monied corporations which dare already to challenge our government to a trial of strength and bid defiance to the laws of our country.”

Between 1781 and 1790 the number of corporations expanded by an order of magnitude, from 33 to 328. Then in 1811, New York State enacted the first of many statutes that allowed the proliferation of corporations without specific and narrow limitations imposed by government.

So long as the vast majority of Americans lived and worked on farms, corporations remained relatively small and their impact on the conditions of labor and the quality of life was relatively limited. But during the Civil War, corporate power increased considerably with the mobilization of Northern industry, huge government procurement contracts, and the building of the railroads. In the years following the war, the corporate role in American life grew quickly, and the efforts by corporations to take control of the decisions in Congress and state legislatures grew as well.

The tainted election of 1876 (deadlocked on election night by disputed electoral votes in the state of Florida) was, according to historians, settled in secret negotiations in which corporate wealth and power played the decisive role, setting the stage for a period of corrupt deal making that eventually led the new president, Rutherford B. Hayes, to complain that “this is a government of the people, by the people and for the people no longer. It is a government of corporations, by corporations, and for corporations.”

As the Industrial Revolution began to reshape America, industrial accidents became commonplace. Between 1888 and 1908, 700,000 American workers were killed in industrial accidents—approximately 100 every day. In addition to imposing brutal working conditions, employers also held wages as low as possible. Efforts by employees to obtain relief from these abuses by organizing strikes and seeking the passage of protective legislation provoked a fierce reaction from corporate owners. Private police forces brutalized those attempting to organize labor unions, and corporate lawyers and lobbyists flooded the U.S. Capitol and state legislatures.

When corporations began hiring lobbyists to influence the writing of laws, the initial reaction was one of disgust. In 1853, the U.S. Supreme Court voided and made unenforceable a contingency contract involving lobbying—in part because those providing the money did so in secret. The justices concluded that such lobbying was harmful to public policy because it “tends to corrupt or contaminate, by improper influences, the integrity of our … political institutions” and “sully the purity or mislead the judgments of those to whom the high trust of legislation is confided” with “undue influences” that have “all the injurious effects of a direct fraud on the public.”

Twenty years later, the U.S. Supreme Court addressed the question once again, invalidating contingency contracts for lobbyists with these words: “If any of the great corporations of the country were to hire adventurers who make market of themselves in this way, to procure the passage of a general law with a view to the promotion of their private interests, the moral sense of every right-minded man would instinctively denounce the employer and employed as steeped in corruption, and the employment as infamous. If the instances were numerous, open and tolerated, they would be regarded as measuring the decay of the public morals and the degeneracy of the times.” The state of Georgia’s new constitution of 1877 went further, explicitly declaring the lobbying of legislators a crime.

Nevertheless, the “promotion of private interests” in legislation grew by leaps and bounds as larger and larger fortunes were made during the heyday of the Industrial Revolution—and as the impact of general laws on corporate opportunities grew. During the Robber Baron era of the 1880s and 1890s, according to the definitive history by Matthew Josephson, “The halls of legislation were transformed into a mart where the price of votes was haggled over, and laws, made to order, were bought and sold.”

It was during this corrupt era that the U.S. Supreme Court first designated corporations as “persons” entitled to some of the protections of the Fourteenth Amendment in an 1886 decision (Santa Clara County v. Southern Pacific Railroad Company). The decision itself, in favor of the Southern Pacific, did not actually address the subject of corporate “personhood,” but language that some historians believe was written by Justice Stephen Field was added in the “headnotes” of the case by the court reporter, who was the former president of a railway company. The chief justice had signaled before hearing the oral arguments that “the court does not wish to hear argument on the question of whether … the Fourteenth Amendment … applies to these corporations. We are all of the opinion that it does.” (This backhanded precedent for the doctrine of corporate personhood was relied upon by conservative Supreme Courts in the late twentieth century for extensions of “individual rights” to corporations—and in the Citizens United decision in 2010.)

This pivotal case has an interesting connection to the first nerve endings of the worldwide communications networks that later became the Global Mind. The brother of Justice Field, Cyrus Field, laid the first transoceanic telegraph cable in 1858. A third Field brother, David (whose large campaign contributions to Abraham Lincoln had resulted in Stephen’s appointment to the Supreme Court), happened to be in Paris with his family during the Paris Commune in 1871, and used the telegraph cable to send news of the riots, disorder, and subsequent massacre back to the United States in real time. It was the first time in history that an overseas news event was followed in the United States, as it unfolded, on a daily basis.

Though the Paris Commune had complex causes (including the bitter emotions surrounding the French defeat in the Franco-Prussian War earlier that year and the struggle between republicans and monarchists), it became the first symbolic clash between communism and capitalism. Karl Marx had published Das Kapital just four years earlier and wrote The Civil War in France during the two months of the Commune, saying that it would be “forever celebrated as the glorious harbinger of a new society.” A half century later, at Lenin’s funeral, his body was wrapped in a torn and tattered red and white flag that had been flown by Parisians during the two months of the Commune.

But as much as the Paris Commune inspired communists, it terrified elites in the United States, among them Justice Field, who was obsessively following the daily reports from his brother and journalists in Paris. The Paris Commune received more press coverage—almost all of it hostile—than any other story that year besides government corruption. The fear provoked by the Commune was magnified by labor unrest in the U.S., particularly by many who had arrived since the 1830s from the poorer countries of Europe in search of a better life but had been victimized by the unregulated abuses in low-wage industrial jobs. Two years later, the U.S. was plunged into a depression by the bankruptcy of financier and railroad entrepreneur Jay Cooke. Wages fell even lower and unemployment climbed even higher. The New York Times warned, “There is a ‘dangerous class’ in New York, quite as much as in Paris, and they want only the opportunity or the incentive to spread abroad the anarchy and ruin of the French Commune.”

According to historians, Justice Field was so radicalized by the Commune and what he feared were its implications for U.S. class warfare that he decided to make it his mission to strengthen corporations. His strategy was to use the new Fourteenth Amendment, which had been designed to confer the constitutional rights of persons on the freed slaves, as a vehicle for extending the rights of persons to corporations instead.

By the last decade of the nineteenth century, concentrated corporate power had attained such a shocking degree of control over American democracy that it triggered a populist reaction. When the Industrial Revolution resulted in the mass migration of Americans from farms to cities, and public concern grew over excesses and abuses such as child labor, long working hours, low wages, dangerous work environments, and unsafe food and medicines, reformers worked within the democracy sphere to demand new government policies and protections in the marketplace.

The Progressive movement at the turn of the twentieth century began implementing new laws to rein in corporate power, including the first broad antitrust law, the Sherman Act of 1890, though the Supreme Court sharply limited its application and enforcement, as it did with virtually all Progressive legislation. In 1901, after the pro-corporate president William McKinley was assassinated only six months into his second term, Theodore Roosevelt unexpectedly became president, and the following year launched an extraordinary assault on monopolies and abuses of overbearing corporate power.

Roosevelt established the Bureau of Corporations inside his new Department of Commerce and Labor. He launched an antitrust suit to break up J. P. Morgan’s Northern Securities Corporation, which at the beginning of the twentieth century included 112 corporations worth a combined $571 billion (in 2012 dollars)—“twice the total assessed value of all property in thirteen states in the southern United States.” This was followed by forty more antitrust suits. A seemingly inexhaustible source of presidential energy, Roosevelt also won passage of the Pure Food and Drug Act and protected more than 230 million acres of land, including the Grand Canyon, the Muir Woods, and the Tongass forest reserve—all while building the Panama Canal and winning the Nobel Peace Prize for mediating an end to the Russo-Japanese War.

Roosevelt made a fateful pledge, on winning election in his own right in 1904, not to run for another full term in 1908, noting that by then he would have served almost the full eight years that George Washington had established as the “wise custom” by serving only two terms. When Roosevelt’s handpicked successor, William Howard Taft, abandoned many of TR’s reforms, the march of corporate power resumed. In response, Roosevelt began to organize his Bull Moose Party campaign to replace Taft as president in the election of 1912.

In August 1910, Roosevelt said, “Exactly as the special interests of cotton and slavery threatened our political integrity before the Civil War, so now the great special business interests too often control and corrupt the men and methods of government for their own profit.” Eighteen months later, in the midst of the campaign, he said that his party was engaged in a struggle for its soul:

The Republican party is now facing a great crisis. It is to decide whether it will be, as in the days of Lincoln, the party of the plain people, the party of progress, the party of social and industrial justice; or whether it will be the party of privilege and of special interests, the heir to those who were Lincoln’s most bitter opponents, the party that represents the great interests within and without Wall Street which desire through their control over the servants of the public to be kept immune from punishment when they do wrong and to be given privileges to which they are not entitled.

After Roosevelt lost that campaign to Woodrow Wilson (Taft came in third), he continued to speak out forcefully in favor of Progressive reforms and a rollback of corporate power. He said that the most important test of the country remained “the struggle of free men to gain and hold the right of self-government as against the special interests, who twist the methods of free government into machinery for defeating the popular will.” He proposed that the U.S. “prohibit the use of corporate funds directly or indirectly for political purposes,” and in speech after speech, argued that the Constitution “does not give the right of suffrage to any corporation.” Thanks in part to his vigorous advocacy, the Progressive movement gained strength, passing a constitutional amendment to reverse the Supreme Court’s prohibition against an income tax, enacting an inheritance tax, and enacting numerous regulations to rein in corporate abuses.

The many Progressive reforms continued during Woodrow Wilson’s presidency, but the pendulum shifted back toward corporate dominance of democracy during the Warren Harding administration—remembered for its corruption, including the Teapot Dome scandal in which oil company executives secretly bribed Harding administration officials for access to oil on public lands.

Following three pro-corporate Republican presidents, President Franklin Roosevelt launched the second wave of reform when he took office in 1933 in the midst of the suffering caused by the Great Depression that was triggered by the stock market crash of 1929. The New Deal expanded federal power in the marketplace to a formidable scale and scope. But once again the conservative Supreme Court stopped many of the Progressive initiatives, declaring them unconstitutional. Theodore Roosevelt had declared the justices “a menace to the welfare of the nation” and FDR essentially did the same. But he went further, proposing a court-packing plan to add to the number of justices on the court in an effort to dilute the power of the pro-business majority.

Historians differ on whether Roosevelt’s threat was the cause or not, but a few months later the Supreme Court reversed course and began approving the constitutionality of most New Deal proposals. To this day, some right-wing legal advocates refer to the court’s switch as a “betrayal.” In the twenty-first century, right-wing judicial activists are trying to return court rulings to the philosophy that existed prior to the New Deal.

In spite of FDR’s initiatives, the U.S. found it difficult to escape hard times, and slipped back into a sharp recession in 1937–38. Then, when America mobilized to respond to the totalitarian threat from Nazi Germany and Imperial Japan, the Depression finally ended. After the U.S. emerged victorious, its remarkable economic expansion continued for more than three decades. By then, the consensus in favor of an expanded role for the federal government in addressing national problems was supported by a majority of voters across the political spectrum.

In the turbulent decade of the 1960s, however, the seeds of a corporate-led counterreform movement were planted. After the assassination of President John F. Kennedy in the fall of 1963, a variety of social reform movements swept the nation—driven in part by the restless energy and idealism of the huge postwar baby boom generation just entering young adulthood. The civil rights movement, the women’s movement, the first gay rights demonstrations, the consumer rights movement, Lyndon Johnson’s War on Poverty, and the escalating protests against the continuation of the ill-considered proxy war against communism in Southeast Asia all combined to produce a fearful reaction by corporate interests and conservative ideologues.

Just as the Paris Commune had radicalized Justice Stephen Field 100 years earlier, the social movements in the U.S. during the 1960s also awakened a fear of disorder, radicalized a generation of right-wing market fundamentalists, and instilled a sense of mission in soon-to-be Supreme Court Justice Lewis Powell. Powell, a Richmond lawyer then best known for representing the tobacco industry after the surgeon general’s 1964 linkage of cigarettes to lung cancer, wrote a lengthy and historic 1971 memorandum for the U.S. Chamber of Commerce in which he presented a comprehensive plan for a sustained and massively funded long-term effort to change the nature of the U.S. Congress, state legislatures, and the judiciary in order to tilt the balance in favor of corporate interests. Powell was appointed to the Supreme Court by President Nixon two months later—though his plan for the Chamber of Commerce was not disclosed publicly until long after his confirmation hearings. A former president of the American College of Trial Lawyers, Powell was widely respected, even by his ideological opponents. But his aggressive expansion of corporate rights was the most consequential development during his tenure on the court.

Justice Powell wrote decisions creating the novel concept of “corporate speech,” which he found to be protected by the First Amendment. This doctrine was then used by the court to invalidate numerous laws that were intended to restrain corporate power when it interfered with the public interest. In 1978, for example, Powell wrote the opinion in a 5–4 decision that for the first time struck down state laws prohibiting corporate money in an election (a citizens referendum in Massachusetts) on the grounds that the law violated the free speech of “corporate persons.” Thirty-two years later, in Citizens United, the U.S. Supreme Court relied on Powell’s opinion to allow corporations to spend unlimited amounts in election campaigns—spending that, in practice, often cannot be traced to its donors—and further expanded the 1886 Southern Pacific precedent declaring corporations to be persons.

While it is true that corporations are made up of individuals, the absurdity of the legal theory that corporations are “persons”—as defined in the Constitution—is evident from a comparison of the essential nature and motives of corporations with those of flesh-and-blood human beings. Most corporations are legally chartered by the state with an ironclad mandate to focus narrowly on the financial interests of their shareholders. They are theoretically immortal and often have access to vast wealth. Twenty-five U.S.-based multinational corporations have revenues larger than the GDPs of many of the world’s nation-states. More than half (53) of the 100 largest economies on Earth are now corporations. ExxonMobil, one of the largest corporations in the world measured by revenue and profits, has a larger economic impact than the nation of Norway.

Individuals are capable of decisions that reflect factors other than their narrow financial self-interest; they are capable of feeling concern about the future their children and grandchildren will inherit—not just the money they will leave them in their wills; America’s founders decided as individuals, for example, to pledge “our Lives, our Fortunes, and our Sacred Honor” to a cause deemed far greater than money. Corporate “persons,” on the other hand, now often seem to have little regard for how they can help the country in which they are based; they are only concerned about how that country can help them make more money.

At an oil industry gathering in Washington, D.C., an executive from another company asked the then CEO of Exxon, Lee Raymond, to consider building additional refinery capacity inside the United States “for security” against possible shortages of gasoline. According to those present, Raymond replied, “I’m not a U.S. company and I don’t make decisions based on what’s good for the U.S.” Raymond’s statement recalls the warning Thomas Jefferson wrote in 1809, barely a month after leaving the White House, about “the selfish spirit of commerce, which knows no country, and feels no passion or principle but that of gain.”

With the emergence of Earth Inc., multinational corporations have also acquired the ability to play nation-states off against one another, locating facilities in jurisdictions with lower wages and less onerous restrictions on their freedom to operate as they wish. The late chairman of the libertarian Cato Institute, William Niskanen, said, “corporations have become sufficiently powerful to pose a threat to governments,” adding that this is “particularly the case with respect to multinational corporations, who will have much less dependence upon the positions of particular governments, much less loyalty in that sense.” In 2001, when President George W. Bush was asked by the government of India to influence ExxonMobil’s pending decision on allowing India’s state-owned oil company to participate in a joint venture including the oil company and the government of Russia, Bush replied, “Nobody tells those guys what to do.”

Those who advocate expanding the market sector at the expense of democratic authority believe that governments should rarely have the power to tell corporations “what to do.” For the last forty years, pursuant to the Powell Plan, corporations and conservative ideologues have not only focused on the selection of Supreme Court justices favorable to their cause and sought to influence Court opinions, they have also pursued a determined effort to influence the writing of laws and the formation of policies to expand corporate power. They dramatically increased corporate advertising aimed at conditioning public opinion. They significantly expanded the number of lobbyists hired to pursue their interests in Washington, D.C., and state capitals. And they significantly increased their campaign contributions to candidates who pledged to support their agenda.

In only a decade, the number of corporate political action committees exploded from fewer than 90 to 1,500. The number of corporations with registered lobbyists increased from 175 to 2,500. Since then, the number has continued to increase dramatically; recorded expenditures by lobbyists increased from $100 million in 1975 to $3.5 billion per year in 2010. (The U.S. Chamber of Commerce continues to top the list of lobbying expenditures, with more than $100 million per year—more than all lobbying expenditures combined when the Powell Plan was first conceived.) One measure of how quickly attitudes toward lobbying have changed in the political culture of Washington: in the 1970s, only 3 percent of retiring members of Congress gained employment as lobbyists; now, more than 50 percent of retiring senators and more than 40 percent of retiring House members become lobbyists.

Corporate coffers were far from the only source of funding for efforts consistent with the Powell Plan. Several wealthy conservative individuals and foundations were also radicalized by the 1960s, which Powell had described as “ideological warfare against the enterprise system and the values of Western society.” When he called for an organized, well-funded response by “business to this massive assault upon its fundamental economics, upon its philosophy, upon its right to manage its own affairs, and indeed upon its integrity,” many conservative business leaders rose to answer Powell’s charge.

John M. Olin, for example, reacted to the armed takeover by militant black students of a campus building at Cornell University, his alma mater, by refocusing his wealthy foundation to support right-wing think tanks and a variety of right-wing efforts to change the character of American government. He embarked on a plan to not only spend the annual income from his foundation’s endowment but to spend down the entire principal as quickly as possible in order to have maximum impact. Numerous other right-wing foundations also financed efforts consistent with the Powell Plan, including the Lynde and Harry Bradley Foundation and the Adolph Coors Foundation.

Perhaps the most effective part of the heavily funded conservative strategy has been their focus on populating the federal courts—particularly the U.S. Supreme Court—with ideological allies. The Powell Plan had noted specifically, “Under our constitutional system, especially with an activist-minded Supreme Court, the judiciary may be the most important instrument for social, economic and political change.… This is a vast area of opportunity for the Chamber … if, in turn, business is willing to provide the funds.”

Subsequently, corporate interests became particularly active and persistent in lobbying to place judges on the bench who would be responsive to conservative legal theories that diminish individual rights, constrict the sphere of democracy, and elevate the rights and freedom of action for corporations. They have also established conservative law schools to train an entire generation of counterreformist advocates, and a network of legal foundations to influence the course of American jurisprudence. Two U.S. Supreme Court justices have even taken corporate-funded vacations at resorts where they were treated to legal instruction in seminars organized by wealthy corporate interests.

Meanwhile, the highly organized and well-funded counterreform movement also created and funded think tanks charged with producing research and policy initiatives designed to further corporate interests. In addition, it financed the creation of political movements at the local, state, and national levels. In the 1980s and 1990s, this movement launched fierce battles to place opponents of robust government policies in state legislatures, in Congress, and in the White House. Ronald Reagan’s defeat of Jimmy Carter was its first watershed victory, and the takeover of Congress in the mid-1990s solidified its ability to bring most Progressive reform to a halt.

In part, the policies of FDR—which had been, in the main, supported by presidents and Congresses of both parties for several decades—were victims of their own success. As tens of millions were lifted into the middle class, many lost their enthusiasm for continued government interventions, in part because they began to resist the levels of taxation necessary to support a more robust government role in the economy. Labor unions, one of the few organized forces supporting continued reform, lost members as more jobs migrated from manufacturing into services, and as outsourcing and robosourcing hollowed out the U.S. middle class. The nature and sources of America’s economic strength have also changed over these decades as manufacturing has declined: the U.S. branch of Earth Inc. is now propelled less by wages paid to workers and more by returns on invested capital, a tilt that is important and too little noted.

Slowly at first, but then with increasing momentum, the prevailing ideology of the United States—democratic capitalism—has shifted profoundly on its axis. During the decades of conflict with communism, the internal cohesion between the democratic and capitalist spheres was particularly strong. But when communism disappeared as an ideological competitor and democratic capitalism became the ideology of choice throughout most of the world, the internal tensions between the democratic sphere and the capitalist sphere reappeared. As economic globalization accelerated, the imperatives of business were relentlessly pursued by multinational corporations. With triumphalist fervor and the enormous resources made available for a sustained implementation of the Powell Plan, corporate and right-wing forces set about diminishing the role of government in American society and enhancing the power of corporations.

Market fundamentalists began to advocate the reallocation of decision-making power from democratic processes to market mechanisms. There were proposals to privatize—and corporatize—schools, prisons, public hospitals, highways, bridges, airports, water and power utilities, police, fire, and emergency services, some military operations, and other basic functions that had been performed by democratically elected governments.

By contrast, virtually any proposal that required the exertion of governmental authority—even if it was proposed, debated, designed, and decided in a free democratic process—was often described as a dangerous and despicable step toward totalitarianism. Advocates of policies shaped within the democratic sphere and implemented through the instruments of self-government sometimes found themselves accused of being agents of the discredited ideology that had been triumphantly defeated during the long struggle with communism. The very notion that something called the public interest even existed was derided and attacked as a dangerous concept.

By then, the encroachment of big money into the democratic process had convinced many Democrats as well as almost all Republicans to adopt the new ideology that supported the contraction of the democratic sphere and the expansion of the market sphere. It was during this same transition period that television supplanted newspapers as the principal source of information for the majority of voters, and the role of money in political campaigns increased, giving corporate and other special interest donors an even more unhealthy degree of power over the deliberations of the United States Congress and state legislatures.

When the decisions of the United States result not from democratic debate but from the influence of powerful special interests, the consequences can be devastating to the interests of the American people. Underfunded and poorly designed U.S. social policies have produced a relative decline in the conditions of life. Compared to the other nineteen advanced industrial democracies in the Organisation for Economic Co-operation and Development (OECD), the United States has the highest inequality of incomes and the highest poverty rate; the lowest “material well-being of children” according to the United Nations’ index, the highest child poverty rate, and the highest infant mortality rate; the biggest prison population and the highest homicide rate; and the biggest expenditures on health care alongside the largest percentage of citizens unable to afford health care.

At the same time, the success by corporate interests in reducing regulatory oversight created new risks for the U.S. economy. For example, the deregulation of the financial services industry, which accompanied the massive increase in flows of trade and investment throughout the world, led directly to the credit crisis of 2007, which caused the Great Recession (which some economists are now calling “the Second Great Contraction” or “the Lesser Depression”).

The international consequences of that spectacular market failure dramatically undermined global confidence in U.S. leadership of economic policy and marked the end of an extraordinary period of U.S. dominance. Nations had generally accepted the so-called Washington Consensus as the best formula for putting their economies on sound footing and building the capacity for sustainable growth. Although most of the policy recommendations contained in the consensus were broadly seen as reflecting sound economic common sense, they tended to expand the market sphere in domestic economies as they removed barriers to global trade and investment flows.

Two other factors combined with the 2007–08 economic crisis to undermine the leadership of the United States: first, the rise of China’s economy, which did not follow the prescriptions of the Washington Consensus but succeeded instead through a uniquely Chinese form of capitalism; and second, the catastrophic invasion of Iraq, launched for reasons that later proved to be false and dishonestly presented.

Within the United States, it is a measure of how distorted the “conversation of democracy” has become that in the aftermath of the economic catastrophe, the most significant “populist” reaction in the U.S. political system was not a progressive demand for protective regulations to prevent a recurrence of what had just happened, but instead a right-wing faux-populist demand by the Tea Party for less government regulation. This movement was financed and hijacked by corporate and right-wing lobbyists who took advantage of the sense of grievance and steered it toward support of an agenda that promoted corporate interests and further diminished the ability of the government to rein in abuses. Extreme partisanship by congressional Tea Party Republicans almost produced a default of the U.S. government in 2011, and threatened to do so again at the end of 2012.

The sudden growth of the Tea Party was also due in significant measure to its promotion by Fox News, which under the ownership of Rupert Murdoch and the leadership of a former media strategist for Richard Nixon—Roger Ailes—has realized the Powell Plan’s emphasis on changing the nature of American television beyond its author’s wildest dreams. Powell had proposed that “The national television networks should be monitored in the same way that textbooks should be kept under constant surveillance.” He called for the creation of “opportunity for supporters of the American system” within the television medium.

The inability of American democracy to make difficult decisions is now threatening the nation’s economic future—and with it the ability of the world system to find a pathway forward toward a sustainable future. The exceptionally bitter partisan divide in the United States is nominally between the two major political parties. However, the nature of both Democrats and Republicans has evolved in ways that sharpen the differences between them. On the surface, it appears that Republicans have moved to the right and purged their party of moderates and extinguished the species of liberal Republicans that used to be a significant minority within the party. Democrats, according to this surface analysis, have moved to the left and have largely pushed out moderates and the conservative Democrats who used to play a prominent role in the party.

Beneath the surface, however, the changes are far more complex. Both political parties have become so dependent on business lobbies for the large sums of money they must have to purchase television advertisements in order to be reelected that special interest legislation pushed by the industries most active in purchasing influence—financial services, carbon-based energy companies, pharmaceutical companies, and others—can count on large bipartisan majorities. The historic shift of the internal boundary between the overlapping capitalist and democratic spheres that make up America’s reigning ideology, democratic capitalism, has resulted in increased support within both parties for measures that constrain the role of government.

This shift has now moved so far to the right that it is not unusual for Democrats to propose ideas that originated with Republicans a few years ago, only to have them summarily rejected as “socialist.” The resulting impasse threatens the future of hugely popular entitlement programs, including Social Security and Medicare, and is heightening partisan divisions on questions considered basic and nonnegotiable on both sides. The tensions have grown more impassioned and bitter than at any point in American history since the decades leading up to the Civil War.

“Market fundamentalism” has acquired, in the eyes of its critics, a quasi-religious fervor reminiscent of the zeal that many Marxists displayed before communism failed—although those to whom the label applies feel that liberals and progressives pursue the ideology of “statism” with a single-minded devotion. U.S. self-government is now almost completely dysfunctional, incapable of making important decisions necessary to reclaim control of its destiny.

James Madison, one of the most articulate of America’s extraordinary founders, warned in his Federalist No. 10 about the “propensity of mankind to fall into mutual animosities” and cluster into opposing groups, parties, or factions:

The latent causes of faction are thus sown in the nature of man; and we see them everywhere brought into different degrees of activity, according to the different circumstances of civil society. A zeal for different opinions concerning religion, concerning government, and many other points, as well of speculation as of practice; an attachment to different leaders ambitiously contending for pre-eminence and power; or to persons of other descriptions whose fortunes have been interesting to the human passions, have, in turn, divided mankind into parties, inflamed them with mutual animosity, and rendered them much more disposed to vex and oppress each other than to co-operate for their common good.

Madison noted that this tendency in human nature is so strong that even “the most frivolous and fanciful distinctions have been sufficient to kindle their unfriendly passions and excite their most violent conflicts.” But he went on to highlight the “most common and durable source of factions” as “the various and unequal distribution of property.” The inequality in the distribution of wealth, property, and income in the United States is now larger than at any time since 1929. The outbreak of the Occupy movement has been driven by the dawning awareness of the majority of Americans that the operations of democratic capitalism in its current form are producing unfair and intolerable results. But the weakened state of democratic decision making in the U.S., and the enhanced control over American democracy by the forces of wealth and corporate power, have paralyzed the ability of the country to make rational decisions in favor of policies that would remedy these problems.

These two trends, unfortunately, reinforce one another. The more control powerful and wealthy interests gain over democratic decision making, the more they are able to ensure that policy decisions enhance their wealth and power. This classic positive feedback loop makes inequality steadily worse, even as it makes democratic solutions for inequality less accessible.
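One way to see why such a loop compounds rather than self-corrects is a deliberately stylized bit of arithmetic (the symbols below are illustrative assumptions, not measurements from any study cited here). Let $W_t$ stand for the share of policy influence held by concentrated wealth in year $t$, and suppose each unit of influence secures rule changes that add a small fractional gain $g$ to that influence the following year:

$$W_{t+1} = (1+g)\,W_t \quad\Longrightarrow\quad W_t = (1+g)^t\,W_0 .$$

Even a modest $g$ compounds geometrically over decades; it is that compounding, rather than any single decision, that gives a positive feedback loop its force.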

The issue of inequality has become a political, ideological, and psychological fault line. Neuroscientists and psychologists have deepened the understanding of political scientists about the true nature of the “left-right” or “liberal-conservative” divide in the politics of every country. Research shows conclusively that these differences are also “sown in the nature of man,” and that in every society there is a basic temperamental divide between those who are relatively more tolerant and others who are relatively less tolerant of inequality.

The same divide separates those for whom it is relatively more or relatively less important to care for the weak and victimized, maintain respect for authority—particularly when disorder is threatening—prioritize loyalty to one’s group or nation, demonstrate patriotism, and honor the sanctity of symbols and objects that represent group values. Both groups value liberty and fairness but think about them differently. Recent research indicates that these temperamental differences may be, in part, genetically based, but perhaps more importantly, the differences are reinforced by social feedback loops.

The issue of inequality also lies on the ideological fault line between democracy and capitalism. For those who prioritize capitalism, inequality is seen as an obvious and necessary condition for the incentivization of productive activity. If some receive outsized rewards in the marketplace, that is a beneficial outcome not only for those so rewarded but for the capitalist system as a whole, because it demonstrates to others what can happen if they too become more productive.

For those who prioritize democracy, the tolerance of persistent inequality is far more likely to stimulate demands for change in the underlying policies that consistently produce unequal outcomes. Inheritance taxes have become a flashpoint in American politics. Why, ask liberals, is there a social value in failing to redistribute some portion of great fortunes when a wealthy person dies? Yet for conservatives, the ability to pass on great wealth at death is just another part of the incentive to earn great wealth in the first place. And they view the imposition of what they call a “death tax” (a label coined by a conservative strategist who conducted deep research on what language would most trigger feelings of outrage) as an encroachment upon their freedom. In my own view, it is absurd to eliminate inheritance taxes; they should be raised instead. The extreme concentration of wealth is destructive to economic vitality and to the health of democracy.

Any legislative effort to address inequality with measures that require funding through taxes of any sort has also come to mark the political fault line dividing the United States into two opposing factions. The corporate-led counterreform movement that began in the 1970s adopted as one of its key tenets a cynical strategy known as “starve the beast”; while proclaiming the importance of “balancing the budget” and “reducing deficits,” the movement pushed massive tax cuts as the initial step in a plan to use the resulting funding gap as an excuse to force sharp reductions in the role of government. This was part and parcel of the larger effort to diminish the democratic sphere and enhance the market sphere.

What is most troubling to advocates of American democracy is that the radically elevated role of money in politics has given the forces representing wealth and corporate power sufficient strength to advance their agenda even when a sizable majority of the American people oppose it. In effect, those who zealously advocate expanding the role of markets, while demanding that people in democracies be constrained from enacting policies to address the abuses and disruptive risks that often accompany unrestrained market activity, are posing a threat to the internal logic of the nation-state itself.

America’s middle class has been hollowed out by, among other causes, the emergence of Earth Inc., the increasing proportion of retired Americans, and advances in the availability of expensive health care technologies. The result is a fast-growing fiscal crisis that threatens the ability of the United States to provide world leadership. U.S. government indebtedness relative to GDP threatens to spiral out of control. According to a study by the nonpartisan Congressional Budget Office, debt held by the public stands at roughly 70 percent of GDP in 2013, and total debt already exceeds GDP if the money the government owes to itself is added.

Although a highly publicized credit downgrade by the bond rating firm Standard & Poor’s in 2011 had no perceptible effect on the demand for U.S. bonds, experts have warned that a sudden loss of confidence in the dollar and in the viability of U.S. finances cannot be ruled out in the coming decade. Partly due to the weakness of the euro and a lack of trust in the Chinese yuan, or renminbi (RMB), the U.S. dollar remains the world’s reserve currency. For those and other reasons, the United States is still able to borrow from the rest of the world at extremely low interest rates—as of this writing at less than 2 percent for ten-year bonds.

Yet the looming financial troubles are potentially large enough to provoke a sudden loss of confidence in the future of the dollar, and with it a sudden increase in the interest rates the U.S. government would be required to pay to holders of its debt. Even a rise of one percentage point above projected interest rates on the debt would add approximately $1 trillion to interest payments over the next decade.
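The arithmetic behind that estimate is straightforward. As a back-of-envelope sketch (the $10 trillion figure is an illustrative assumption about the average level of publicly held debt over the coming decade, not a number taken from the CBO study cited above):

$$0.01 \times \$10\ \text{trillion} \times 10\ \text{years} \approx \$1\ \text{trillion}$$

in added interest payments, which matches the order of magnitude of the estimate.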

The strength of any nation’s economy is, of course, crucial for the exertion of power in multiple ways. It undergirds the ability to finance weapons and armies, and to use foreign aid and trade concessions to build necessary alliances. It enables the building of superior infrastructures and the provision of public goods such as education, job training, public safety, pensions, enforceability of contracts, quality of the legal system, health care, and environmental protection. It also allows for the creation of a superior capacity for research and development, now crucial to gain access to the fruits of the accelerating scientific and technological revolution.

More broadly, the ability of any nation to wield power on a sustained basis—whether military, economic, political, or moral—depends upon multiple additional factors, including:

    •  Its ability to form intelligent policies and implement them effectively in a timely manner, which usually requires reason-based, transparent decision making and the forging of a domestic consensus in support of policies—particularly if they require a long-term commitment. The Marshall Plan, for example, would not have been possible without bipartisan support in the Congress and the willingness of the American people to commit significant resources to a visionary plan that required decades to implement.

    •  The cohesion of its society, which generally requires the perception of fairness in the distribution of incomes and net worth, and a social contract within which real needs are satisfactorily met and governmental power is derived from the genuine consent of the governed. The maintenance of cohesion also requires alertness to and sustained respect for the differing experiences and perspectives of minorities, and a full understanding of the benefits from the absorption of immigrants.

    •  The protection of property rights, the enforcement of contracts, and opportunities to invest money without an unreasonable risk of losing wealth.

    •  The development and enforcement of sustainable fiscal and monetary policies and bank regulations that minimize the risk of market disruptions and do not accentuate swings in the business cycle. Economic success also requires investments in infrastructure, research and development, and appropriate antitrust enforcement.

    •  The development of its human capital with adequate investments in education and job training, health care and mental health care, and nutrition and child care. The Information Revolution has enhanced the importance of investments in human capital, even as it requires a regular updating of appropriate strategies.

    •  The protection, conservation, and stewardship of natural capital with environmental protection and energy efficiency. The global climate crisis requires extensive planning for adaptation to the big changes coming, and much greater attention to the need for rapid reductions in global warming pollution.

The United States is now failing to satisfy many of these criteria. But it is not the only nation-state that is in danger of dissipating its ability to make sound decisions about the future. The larger and more significant change in the balance of power throughout the world is the relative decline in the effective power of nation-states generally. In the words of Harvard professor Joseph Nye, “the diffusion of power away from government is one of this century’s great political shifts.”

NATION-STATES IN TRANSITION

One of the principal reasons for the steady decline in the effective power of nation-states has been a rise in the power of multinational corporations. The redistribution of economic power and initiative to multinational corporations operating in many national jurisdictions simultaneously (even while exerting increased influence over the domestic policies of the nations in which they are based) has significantly diminished the role of nation-states.

With their ability to outsource and robosource their labor inputs, many corporations no longer have the same incentive to support improvements in national education systems and other measures that would enhance labor productivity in their home nations. And with the astonishing increase in trade and investment flows, multinational corporations are playing a far more significant role than ever before. Some political scientists have asserted that the influence of corporations on modern governance is now roughly analogous to the influence of the medieval church during the era of feudalism.

The integration of the global economy has shifted power profoundly toward markets. The massive flows of capital over digital networks in Earth Inc. have made some national economies highly vulnerable to the sudden outflow of “hot money” if and when global markets reach a negative judgment about the viability of their fiscal and monetary policies. International banks and bond rating firms have become more significant players in national debates about taxing and spending. Greece is only the best known of many examples of countries no longer able to make decisions for themselves: it must first get permission from the European Union, which supports it, and from the international banks that hold its debt.

The historic decline in the power, influence, and prospects of the Eurozone countries (those European nations that have joined in a monetary union) stems in large measure from a now widely recognized fatal flaw: the nations entering the monetary union gambled that they could delay the tight integration of their fiscal policies (without which a single currency is ultimately not viable) until the political momentum toward unity made that difficult step possible.

Recently released documents confirm that when the Eurozone was founded, there was widespread awareness, particularly in Germany, that Southern European countries were not even close to the fiscal conditions that would have reduced the risk of monetary integration. Yet then-chancellor Helmut Kohl and other European leaders decided that the benefits of European unity were worth the gamble that cohesion could be maintained until there was sufficient Europe-wide support for tighter fiscal unity. When the financial crisis of 2007–08 exposed the fatal flaw, the global credit markets essentially called Europe’s bet.

Broadly speaking, Europe now faces two options. First, it can acknowledge the failure of the Eurozone experiment and sharply contract the number of nations that remain in the Eurozone alongside Germany and France—the core of Europe’s economy. This option is unattractive for several reasons: there are no legal procedures for the withdrawal of a country from the Eurozone; the transition from the euro back to a national currency—for a country like Greece, for example—promises to be exceptionally painful and expensive; and Germany would find itself once again threatened with competitive devaluations—in nations like Italy, for example—whenever the strength of the German economy significantly outpaced that of its neighbors.

The second option is to move quickly and boldly forward to a fiscal unification of the Eurozone, notwithstanding the disparities in strength and productivity between Germany’s economy and those of Southern Europe. However, the only way to maintain anything remotely approaching parity in living standards in a fiscally unified Europe would be for Germany to make transfer payments (essentially budget subsidies) to the weaker European countries for at least a generation. Even though this might prove a long-term economic bargain for Germany, the relatively more prosperous taxpayers of the former West Germany have already shouldered the burden of subsidies to the weaker former East Germany for the two decades since reunification, at an estimated cost of $2.17 trillion, and their appetite for taking on a new burden of this kind is accordingly quite low.

The inability of Europe’s leaders to establish the needed fiscal integration and move more quickly toward a unified Europe has created a significant political and economic crisis that threatens to undo one of the most important U.S. successes in the aftermath of World War II. The weakening of political cohesion and economic dynamism in Western Europe (along with the long-running political paralysis and economic slowdown in Japan) have also contributed to the new difficulties faced by the United States in providing world leadership.

As with the compound ideology of democratic capitalism, the political concept of a nation-state is made up of two ideas that overlap one another. The idea of a nation is based on the common identity of the people who live in a national territory; whether or not they share the same language (often the vast majority do), they usually share the same feeling that they are members of a national community. The state, by contrast, is an administrative, legal, and political entity that provides the infrastructure, security, and judicial basis for life within its territory. When these two concepts overlap, the result is the nation-state, the form we commonly think of as the principal unit by which global civilization is organized.

There is a rich historical debate about the origins of nation-states. The first large “states” emerged around 5,400 years ago when the Agricultural Revolution first produced large food surpluses in areas endowed with plant varieties that were especially suitable for cultivation: the Nile River Valley in Egypt, the Yellow River Valley in China, the Indus River Valley in India, the Tigris and Euphrates river valleys, and the Fertile Crescent (and in nearby Crete). These states also appeared in several other areas of the world, including Mexico, the Andes, and Hawaii.

The marriage of state and nation occurred much later. In a very real sense, modern nation-states were created as an outgrowth of the Print Revolution. Throughout most of human history, the nation-state was not the dominant form of organization; empires, city-states, confederations, and tribes coexisted across large areas of the Earth for millennia. Although there are a few examples of nation-states that existed prior to the Print Revolution, the rise of the modern nation-state as the dominant form of political organization occurred when the spread of printed books and pamphlets in shared forms of national languages stimulated the emergence of common national identities.

Prior to the Print Revolution, languages such as French, Spanish, English, and German, among others, featured a multiplicity of dialects and forms that were so distinctive that speakers of one form often had difficulty communicating with speakers of other forms. After the Print Revolution gained momentum, however, the economic imperatives of mass mechanical reproduction of texts provided a powerful push toward a common dialect of each tongue that was then adopted as a common language within each national territory. The emergence of group identities in regions where the majority of people spoke, read, and wrote in the same language created the conditions that led to the emergence of modern nation-states.

The Reformation and the Counter-Reformation unleashed passions that combined with these new national identities to trigger a long series of bloody wars that finally culminated in the Treaty of Westphalia in 1648—the treaty that formalized the construction of a new order in Europe based on the primacy of nation-states, and the principle of noninterference by any nation-state in the affairs of another.

Soon thereafter, the dissemination of news—printed in national languages and presented within a distinctively national frame of reference—further strengthened national identities. Over time, the wider availability of civic knowledge also led to the emergence of representative democracy and elected national legislatures. When the people of nations gained political authority over the making of laws and policies, the functions of the state were married to those of the nation.

During the Industrial Revolution, the introduction of transportation networks such as railroads and highways further expanded the political role of nation-states, and further consolidated national identities. At the same time, the nature and scale of industrial technologies expanded potential points of conflict between the operations of the market and the political prerogatives of the state.

The internal cohesion of modern nation-states was also strengthened by the introduction of national curricula in schools that not only reinforced the adoption of a common national dialect but also spread a common understanding of national histories and cultures—usually in ways that emphasized the most positive stories or myths in each nation’s history, while often neglecting narratives that might diminish feelings of nationalism. (For example, Japanese textbooks that minimize Japan’s invasion and occupation of China and Korea have regularly become sources of tension in Northeast Asia.)

Transnational technologies such as the Internet and satellite television networks are exercising influence in spheres that used to be dominated primarily by the power of nation-states. Many regional satellite television networks dispense with national frames of reference in presenting news. And the Internet, in particular, is complicating many of the strategies formerly relied upon by nation-states to build and maintain national cohesion. Just as the printing press drew speakers toward single standardized versions of national languages and thereby solidified national identities, the Internet is making the knowledge of every country available to the people of every other country. Google Translate, the largest of many machine translation services, now operates in sixty-four languages and translates more documents, articles, and books in one day than all of the human translators in the world translate in a full year.

And of course, the number of texts translated by computers is increasing exponentially. Seventy-five percent of the web pages translated are from English into other languages. It is often, and inaccurately, said that English is the language of the Internet; in fact, there are more Chinese-language users of the Internet than there are people in the United States. But the content now being dispersed throughout the world still mainly originates in English.

The narratives of national histories that have dominated the curricula of mandatory public education systems now have competition from alternative narratives widely available on the Internet. And they often have the persuasive ring of truth—for example, for minorities within nation-states whose historical mistreatment can no longer be as easily obscured or whitewashed.

For these and other reasons, the glue holding some nations together in spite of their ethnic, linguistic, religious and sectarian, tribal and historical differences appears to be losing some of its cohesive strength. Belgium, for example, has reallocated the power once vested in its national government to its component regional governments. Flanders and Wallonia are not technically nation-states but might as well be.

In many parts of the world, identity-driven subnational movements are becoming more impatient and, in some cases, aggressive, in seeking independence from the nations of which they are now part. Nation-states have been described as “imagined communities”; it is, after all, impossible for citizens of a nation-state to interact with all other members of the national community. It is their common identity that forms the basis of their national bonds. If those bonds no longer lay as strong a claim on their imagination, their identity bonds may attach elsewhere—often to older identities that predated the formation of the nation-state.

In many regions, the growth of fundamentalism is also connected to the weakening of the psychological bonds of identity in the nation-state. Muslim, Hindu, Christian, Jewish—even Buddhist—fundamentalisms are all sources of conflict in the world today. This does not come as a surprise to historians. After all, it was the desperate need to control religious wars and sectarian violence that led to the formal codification of nation-states as the primary form of governance in the first place.

In the midst of the English Civil War, Thomas Hobbes proposed one of the first and most influential arguments for a “social contract” to prevent the “war of every man against every man” by giving a monopoly on violence to the nation-state and granting to the sovereign of that state—whether a monarch or an “assembly of men”—the sole authority “to make war and peace … and to command the army.”

Nationalism became a potent new cause of warfare over the three centuries between the Treaty of Westphalia and the end of World War II. As the weaponry of war was industrialized—with machine guns, poison gas, tanks, and then airplanes and missiles—the destructive power unleashed led to the horrendous loss of life in the wars of the twentieth century. And the imposition of order by nation-states within their own borders sometimes created internal tensions that led their leaders to use the projection of violence against neighboring nation-states as a means of strengthening internal cohesion by demonizing “the other.” Tragically, the monopoly on violence granted to the state was also sometimes brutally directed at disfavored minorities within its own borders.

In the wake of World War I, a number of nation-states were formed in the imagination of the United States, the United Kingdom, and other European nations that were seeking to create stability in regions like the Middle East and Africa, where tribal, ethnic, sectarian, and other divisions threatened continued destabilizing violence. One of the premier examples of an imagined community was Yugoslavia. When the unifying ideology of communism was imposed on this amalgam of separate peoples, Yugoslavia functioned fairly well for more than four decades.

But when communism collapsed, the glue of its imagined nation could no longer hold it together. The great Russian poet Yevgeny Yevtushenko described what happened next with the metaphor of a prehistoric mammoth found frozen in the ice of Siberia: when the ice melted and the mammoth’s flesh thawed, ancient microbes within the flesh awakened and began decomposing it. In like fashion, the ancient antagonisms among Serbian Orthodox Christians, Croatian Catholics, and Bosnian Muslims dissolved the bonds that had held together what is now referred to as the “former Yugoslavia.”

Not coincidentally, the border between Serbia and Croatia had marked the border 1,500 years ago between the Western and Eastern Roman empires, while the border between Serbia and Bosnia marked the fault line between Islam and Christendom 750 years ago. As Yugoslavia began to come apart, Serbia’s leader, Slobodan Milošević, went to the disputed territory of Kosovo in 1989 to mark the 600th anniversary of the great battle there in which the Serbian Empire was defeated by the Ottoman Empire; in a demagogic and warmongering speech, he revivified the ancient hatreds wrapped in memories of that long-ago defeat and went on to launch genocidal violence against both Bosnians and Croats.

The legacy of empires has continued to vex the organization of politics and power in the world long after nation-states became the dominant form of political organization. In the last three decades of the nineteenth century, European countries colonized 10 million square miles of land in Africa and Asia, 20 percent of all land in the world, putting 150 million people under their rule. (Indeed, several modern nation-states continued to govern colonial empires well into the second half of the twentieth century.) To pick one of many examples, the breakup of the Ottoman Empire in the aftermath of World War I resulted in the decision by Western powers to create new nation-states in the Middle East, some of which pushed together peoples, tribes, and cultures that had not previously been part of the same “national” community, including Iraq and Syria. It is not coincidental that both of these nations have been coming apart at the seams.

With the weakening of cohesion in nation-states, wherever peoples feel a strong and coherent identity that is separate from the one cultivated by the nation-state that contains them, there is new restlessness. From Kurdistan to Catalonia to Scotland, from Syria to Chechnya to South Sudan, from indigenous communities in the Andean nations to tribal communities in Sub-Saharan Africa, many people are shifting their primary political identities away from the nation-states in which they lived for many generations. Although the causes are varied and complex, a few nations, like Somalia, have devolved into “post-national entities.”

In many parts of the world, nonstate terrorist groups and criminal organizations, such as those now wielding power in so-called narco-states, are aggressively challenging the power of nation-states. These nonstate actors overlap: nineteen of the forty-three known terrorist groups in the world are linked to the drug trade. The market for illegal narcotics is now larger than the national economies of 163 of the world’s 184 nations.

It is significant that the most consequential threat to the United States in the last three decades came from a nonstate actor, Osama bin Laden’s Al-Qaeda. A malignant form of Muslim fundamentalism was the primary motivation for Al-Qaeda’s 9/11 attack. (According to numerous reports, bin Laden was outraged by the presence of U.S. military deployments in Saudi Arabia, the custodian of Islam’s holiest sites.)

The damage done by the attack itself—the murder of nearly 3,000 people—was horrible enough, but the tragic response it provoked, the misguided invasion of Iraq, which, as everyone now acknowledges, had nothing whatsoever to do with the attack on the United States, was ultimately an even more serious blow to America’s power, prestige, and standing. Hundreds of thousands died unnecessarily, $3 trillion was wasted, and the reasons given for launching the war in the first place were later revealed as cynical and deceptive.

The decision by the United States government to abandon its historic prohibitions against the torture of captives and the indefinite detention of individuals without legal process has been widely seen around the world as diminishing its moral authority. In a world divided into different civilizations, with different religious traditions and ethnic histories, moral authority is arguably an even greater source of power than armed might. Even though the ideologies of nations vary widely, justice, fairness, equality, and sustainability are prized by the people of every nation, even if they often define these values in different ways.

The apparent rise of fundamentalism in its many varieties may be due, in part, to the pace of change, which naturally causes many people to embrace orthodoxies of faith more tightly as a source of spiritual and cultural stability. The globalization of culture—not only through the Internet, but also through satellite television, compact discs, and other media—has also been a source of conflict between Western societies and conservative fundamentalist societies. When cultural goods from the West depict gender roles and sexual values in ways that conflict with traditional norms in fundamentalist cultures, religious leaders condemn what they view as their socially destabilizing impact.

But the impact of globalized culture goes far beyond issues of gender equity and sexuality. Cultural goods serve as powerful advertisements for the lifestyles that are depicted, and promotions for the values of the country where such goods originate. In a sense, they carry the cultural DNA of that country. As the global middle class is exposed to images of homes, automobiles, appliances, and other common features of life in industrial countries, the pressure they exert for changes in their own domestic political and economic policies often grows accordingly.

The longer-term impact may well be to break down differences. A recent study in Cairo found that there is a strong correlation between the amount of television watched and the decline of support for fundamentalism. One of the sources of the enhanced influence of Turkey in the Middle East is the popularity of its movies and television programs. The dominance of American music has enhanced the impression of the United States as a dynamic and creative society. The ability to influence the thinking of peoples through the dissemination of cultural goods such as movies, television programs, music, books, sports, and games is increasing in an interconnected world where consumption of media is rising every year.

WAR AND PEACE

The second half of the twentieth century saw a decline in the number of people killed in wars, and a decline in the number of wars in every category, international and civil—even though millions continued to die because of the pathological behavior of dictators. The decline has continued in this century, leading some to argue that humankind is maturing, humane values are spreading, and military power is less relevant in an interconnected world. It is a measure of this change that the people of the United States feel a palpable loss of national power at a time when the U.S. military budget is larger than those of the next ten nations combined. However, self-described foreign policy “realists” (who believe that nation-states always compete in an inherently anarchic international system) warn that similar predictions made in past eras proved to be false.

History provides all too many examples of unwarranted optimism about the decline of war during previous eras when a new appreciation for the benefits of peace seemed to be on the rise. The best-selling book globally in 1910 was The Great Illusion by Norman Angell, who argued that the increased economic integration that accompanied the Second Industrial Revolution had made war obsolete. Less than four years later, on the eve of World War I, Andrew Carnegie, the Bill Gates of his day, wrote a New Year’s greeting to friends: “We send this New Year Greeting, January 1, 1914 strong in the faith that International Peace is soon to prevail, through several of the great powers agreeing to settle their disputes by arbitration under International Law, the pen thus proving mightier than the sword.”

Human nature has not changed and the history of almost every nation contains sobering reminders that the use of military power has often been decisive in changing their fate. Nationalist politicians in many countries—including the United States and China—will, of course, seek to exploit fears about the future—and the fear of one another—by calling for the buildup of military strength. In the present era, some Chinese military strategists have written that a well-planned cyberattack on the United States could allow China to “gain equal footing” with the U.S. in spite of U.S. superiority in conventional and nuclear weaponry. And as has often been the case in history, fear begets fear; the buildup of a capacity for war leads those against whom it might be used to infer that there is an intent to do just that.

The fear of a surprise military attack has itself had a distorting influence on the priority given to military expenditures throughout history, and it is a fear that is inherently difficult for the people and leaders of any nation to keep in proper perspective. That is one reason why national security depends more than ever on superior intelligence gathering and analysis, both to protect against strategic surprise and to maintain alertness to strategic opportunities.

In addition, new developments in technology have frequently changed the nature of warfare in ways that have surprised complacent nations focused on the technologies that were dominant in previous wars. The Maginot Line painstakingly constructed by France after World War I proved impotent when Nazi Germany’s highly mobile tank divisions simply went around it. Military power now depends more than ever on the effective mastery of research and development to gain leverage from the still accelerating scientific and technological revolution, which has an enormous impact on the evolution of weaponry.

While the utility of military power may indeed be declining in significance in a world where the people and businesses of every nation are more closely linked than ever before, the recent decline in warfare of all kinds, particularly war between nation-states, may have less to do with a sudden outbreak of empathy in mankind than with the role played by the United States and its allies in the post–World War II era: mediating conflicts, building alliances, and sometimes intervening with a combination of limited military force and economic sanctions, as they did, for example, in the former Yugoslavia to limit the spread of violence among Serbia, Croatia, and Bosnia.

Supranational entities have also been playing an ever growing role, intervening in nations unable to halt violent conflicts and mediating the resolution of disputes. These international groups include not only U.N.-sponsored global efforts, but also, increasingly, efforts by regional supranational entities like the African Union, the Arab League, the European Union, NATO, and others. Nongovernmental organizations, faith-based charitable groups, and philanthropic foundations are playing an increasingly significant role in providing essential public goods in areas where nation-states are faltering. When sustained military operations are necessary and established supranational entities are unable to reach consensus, “coalitions of the willing” have been formed.

But in many of these interventions—particularly where NATO and coalitions of the willing were involved—the United States has played a key organizing and coordinating role, and has often provided not only the critical intelligence collection and analysis but also the decisive military force as well. If the equilibrium of power in the world continues to shift in ways that weaken the formerly dominant position of the United States, it could threaten an end to the period some historians have labeled the Pax Americana.

The recent decline in war may also be related to two developments during the long Cold War between the United States and the USSR. First, when these two superpowers built vast arsenals of nuclear bombs mounted on intercontinental ballistic missiles, submarines, and bombers, the quantum increase in the probable consequences of all-out war became so obviously and palpably unacceptable that both the U.S. and the USSR soberly backed away from the precipice. The escalating cost of maintaining and modernizing these arsenals also became a burden for both superpowers. (The Brookings Institution has calculated that since 1940 the U.S. has spent $5.5 trillion on its nuclear war-fighting capability—more than on any other program besides Social Security.) Though the risk of such a war has been sharply reduced by arms control agreements, the partial dismantling of both arsenals, and enhanced communications and safeguards (including a recent bilateral nuclear cybersecurity agreement), the risk of an escalation in tensions must still be continually managed.

Second, during the last third of the twentieth century, both the U.S. and the USSR had bitter experiences in failed efforts to use overwhelming conventional military strength against guerrilla armies using irregular warfare tactics, blending into their populations and fighting a war of attrition. The lessons learned by the superpowers were also learned by guerrilla forces. Partly as a result, the continued spread of irregular warfare tactics is now seriously undermining the nation-state monopoly on the ability to use warfare as a decisive instrument of policy.

The large excess inventories of rifles and automatic weapons manufactured during previous wars are increasingly available not only to insurgent guerrilla forces, but also to individuals, terrorist groups, and criminal organizations. When a new generation of weapons is manufactured, the older generation is not destroyed; rather, those weapons find their way into the hands of others, often magnifying the bloodshed in regional and civil wars. Unfortunately, the lobbying power and political influence of gun and munitions manufacturers and defense companies have contributed to this spread of weapons throughout the world. President Barack Obama reversed U.S. policy in 2009 and resumed advocacy of a treaty to limit this destructive trade, but progress is slow at best because of opposition from several countries and the dysfunctionality of global decision making.

The U.S. continues to dominate the international trade in weapons of all kinds—including long-range precision weapons and surface-to-air missiles—some of which end up being trafficked in black markets. In his farewell address as president, Dwight Eisenhower warned the United States about the “military-industrial complex.” As the victorious commanding general of the Allied forces in Europe in World War II, Eisenhower could hardly have been accused of being soft on national security. Although there are undeniable benefits to the United States from weapons deals, including an enhanced ability to form and maintain useful alliances, it is troubling that more than half (52.7 percent in 2010) of all the military weapons sold to countries around the world originate in the United States.

More significantly, the dispersal of scientific and technological knowledge and expertise throughout Earth Inc. and the Global Mind has also undermined the monopoly exercised by nation-states over the means of inflicting mass violence. Chemical and biological agents capable of causing mass casualties are also on the list of weapons now theoretically accessible to nonstate groups.

The knowledge necessary to build weapons of mass destruction, including nuclear weapons, has already been dangerously dispersed to other nations. Instead of the two nuclear powers that faced off at the beginning of the Cold War, there are now thirty-five to forty countries with the potential to build nuclear bombs. North Korea, which has already developed a handful of nuclear weapons, and Iran, which most believe is attempting to do so, are developing longer-range missile programs that could over time give them the ability to project intercontinental power. Proliferation experts are deeply concerned that the spread of nuclear weapons to some of these countries could markedly increase the risk that terrorist groups could purchase or steal the components they need to make a bomb of their own. The former head of Pakistan’s nuclear program, A. Q. Khan, developed extensive ties with Islamic militant groups. North Korea, strapped for cash as always, has already sold missile technology, and many believe it is capable of selling nuclear weapons components as well.

National security experts are also concerned about regional cascades of nuclear proliferation in regions like the Persian Gulf and Northeast Asia. In other words, the development of a nuclear arsenal by Iran would exert pressure on Saudi Arabia and potentially other countries in the region to develop their own nuclear arsenals in order to provide deterrence. If North Korea were to gain the credible ability to threaten a nuclear attack against Japan, the pressure on Japan to develop its own arsenal would be intense in spite of Japan’s historic experience and opposition to nuclear weapons.

Because leadership in the community of nations is essential, there is an urgent need to restore the integrity of democratic decision making in the United States. And there are hopeful trends, not least the awakening of reformist activism on the Internet. Throughout the world, the Internet is empowering the rapidly growing ranks of the global middle class to demand the kinds of accountability and reform from their governments that middle-class citizens have historically been more likely to demand than the poor and underprivileged. Stanford political science professor Francis Fukuyama notes that this is “most broadly accepted in countries that have reached a level of material prosperity sufficient to allow a majority of their citizens to think of themselves as middle class, which is why there tends to be a correlation between high levels of development and stable democracy.”

The trends associated with the emergence of Earth Inc.—particularly robosourcing, the transfer of work from humans to intelligent interconnected machines—threaten to slow the rise of the global middle class by diminishing aggregate wages. But a recent report from the European Strategy and Policy Analysis System (ESPAS) calculates that the global middle class will double in the next twelve years from two billion to four billion people, and will reach almost five billion people by 2030.

The report adds: “By 2030, the demands and concerns of people in many different countries are likely to converge, with a major impact on national politics and international relations. This will be the result mainly of greater awareness among the world’s citizenry that their aspirations and grievances are shared. This awareness is already configuring a global citizens’ agenda that emphasizes fundamental freedoms, economic and social rights and, increasingly, environmental issues.”

The awareness of higher living standards, higher levels of freedom and human rights, better environmental conditions, and the benefits of more responsive governments will continue to spread within the Global Mind. This new global awareness of the myriad ways in which the lives of billions can be improved is certain to exert a profound influence on the behavior of political leaders throughout the world.

Already, the spread of independence movements committed to democratic capitalism in states throughout the former Soviet Union and the explosive growth of the Arab Spring in nations throughout the Middle East and North Africa serve as examples of the real possibility that such changes can occur even more quickly in a world empowered by its connections to the Global Mind.

With the ongoing emergence of the world’s first truly global civilization, the future will depend upon the outcome of the struggle now beginning between the raw imperatives of Earth Inc. and the vast potential of the Global Mind to empower people of conscience to insist that those excesses be constrained by the imposition and enforcement of standards and principles that honor and respect human values.

Lest this sound impractical or hopelessly idealistic, there are many examples of new global norms having been established by this mechanism in the past—well before the enhanced potential we now have for promoting new global norms by using the Internet. The abolition movement, the anti-apartheid movement, the promotion of women’s rights, restrictions on child labor, the anti-whaling movement, the Geneva Conventions’ prohibitions against torture, the rapid spread of anticolonialism in the 1960s, the ban on atmospheric nuclear testing, and successive waves of the democracy movement—all gained momentum from the sharing of ideas and ideals among groups of committed individuals in multiple countries who pressured their governments to cooperate in the design of laws and treaties that led to broad-based change in much of the world.

No matter the nation in which we reside, we as human beings now face a choice: either to be swept along by the powerful currents of technological change and economic determinism into a future that may threaten our deepest values, or to build a capacity for collective decision making on a global scale that allows us to shape that future in ways that protect human dignity and reflect the aspirations of nations and peoples.


* Mercenary armies have always been present in the history of warfare, but are more prominent than ever in some long-running conflicts, such as those that have killed 400,000 people in the Democratic Republic of the Congo.

† Though Marx wrote in The Communist Manifesto that the 1848 French revolution had been the first “class struggle.”