1
THE AMERICAN EXCEPTION
For every migrant should well consider, that in a country like the United States of America, . . . where no princes and their corrupt courts represent the so-called “divine right of birth,” in spite of merit and virtue—that in such a country the talents, energy and perseverance of a person must have a far greater opportunity for display, than in monarchies, where the evils above mentioned have existed for centuries, and with their sad effects exist still.
—F. W. Bogen, The German in America (Boston, 1851)
 
 
 
 
WHAT DETERMINES PUBLIC SUPPORT FOR CAPITALISM? A recent study shows that in any given country it is positively associated with the perception that hard work, not luck, determines success and negatively correlated with the perception of corruption.1 These correlations go a long way toward explaining public backing for America’s capitalist system. According to another recent study, only 40 percent of Americans think that luck rather than hard work plays a major role in income differences. Compare that with the 75 percent of Brazilians—or the 66 percent of Danes and 54 percent of Germans—who think that income disparities are mostly a matter of luck, and you begin to get a sense of why American attitudes toward the free-market system stand out.2

WHAT IS SO SPECIAL ABOUT THE UNITED STATES?

Some scholars argue that this public belief in capitalism’s legitimacy is merely the result of a successful propaganda campaign for the American Dream—a myth embedded in American culture. And it’s true that there is scant evidence that rates of social mobility are higher in the United States than in other developed countries. But while the difference in economic openness of the American system does not show up clearly in aggregate statistics, it is powerfully present at the top of the income distribution—which also shapes people’s attitudes most extensively. Even before the Internet boom of the late 1990s gave us many young billionaires, one out of four billionaires in the United States could be described as “self-made”—compared to just one out of ten in Germany. In fact, in Europe self-made people are often referred to as parvenus (newcomers). This is a derogatory expression implying that such people are not as “classy” as those who have inherited money and did not have to work hard to earn it. In other words, in Europe wealth tends to be seen as a privilege, not a reward for effort.
Self-made billionaires also exist outside of the States, of course, but the way they have made their money is often quite different from the way America’s very rich did. The wealthiest self-made American billionaires—from Bill Gates and Michael Dell to Warren Buffett and Mark Zuckerberg—have made their fortunes in competitive businesses, not much affected by government regulation, whereas in most other countries the wealthiest people frequently accumulate their fortunes in regulated businesses in which success often depends more on having the right government connections than on having initiative and enterprise. Think about the Russian oligarchs or Silvio Berlusconi in Italy and Carlos Slim in Mexico. They all got rich in businesses that are highly dependent on governmental concessions: energy, real estate, telecommunications, mining. In much of the world, in fact, the best way to make lots of money is not to come up with brilliant ideas and work hard at implementing them but, instead, to cultivate a government ally. Such cronyism is bound to shape public attitudes about a country’s economic system. When asked in a recent study to name the most important determinants of financial success, Italian managers put “knowledge of influential people” in first place (80 percent considered it “important” or “very important”).3 “Competence and experience” ranked fifth, behind characteristics such as “loyalty and obedience.” These divergent paths to prosperity reveal more than just a difference of perception. Capitalism in the United States is distinct from its counterparts in Europe and Asia for reasons that reach deep into history, geography, culture, and the institution of federalism.

Historical Factors

In America, unlike in much of the rest of the West, democracy predates industrialization. By the time of the second industrial revolution in the latter part of the nineteenth century, the United States had already enjoyed several decades of universal (male) suffrage and widespread education. These circumstances forged a public with high expectations—one unlikely to tolerate evident unfairness in economic policy. It is no coincidence that the very concept of antitrust law—a promarket but sometimes antibusiness idea—was articulated in the United States at the end of the nineteenth century and the beginning of the twentieth.
American capitalism also arose at a time when government involvement in the economy was quite weak. At the beginning of the twentieth century, when modern American capitalism was taking shape, US government spending was only 3 percent of gross domestic product.4 After World War II, when modern capitalism took hold in Western European countries, government spending in those countries was, on average, 30 percent of GDP. Until World War I, the United States had a tiny federal government compared to national governments in other countries. This was partly due to the fact that the United States faced no significant military threat, so the government had to spend only a relatively small proportion of its budget on the military. The federalist nature of the American regime, by empowering states, also played a role in limiting the size of the national government.
When government is small and relatively weak, the most effective way to make money is to start a successful private-sector business. But the larger the size and scope of government spending, the easier it is to make money by diverting public resources. After all, starting a business is difficult and involves a lot of risk. Getting a government favor or contract is easier, at least if you have connections, and is a much safer bet. Thus, in nations with large and powerful governments, the state usually finds itself at the heart of the economic system, even if the system is relatively capitalist—an arrangement that confounds politics and economics, both in practice and in public perceptions: the larger the share of capitalists who acquire their wealth thanks to their political connections, the greater the perception that capitalism is unfair and corrupt.
Another distinguishing feature of American capitalism is that it evolved relatively untouched by foreign influence. Although European (and especially British) capital did play a role in America’s nineteenth- and early-twentieth-century economic expansion, Europe’s economies were not more advanced than America’s—and thus while European capitalists could invest in or compete with American companies, they could not dominate the system. As a result, American capitalism developed more or less organically and, indeed, still shows the marks of those origins. The American bankruptcy code, for instance, exhibits significant prodebtor biases, because the United States was born and grew up as a nation of debtors.
Things are very different in nations that became capitalist economies after World War II. These countries—in non-Soviet-bloc continental Europe, parts of Asia, and much of Latin America—industrialized under the giant shadow of American power. Local elites felt threatened by the potential for economic colonization by American firms that were far more efficient and better capitalized than their own firms were. To protect domestic companies from foreign ownership, local establishments created various forms of indigenous cross-ownership (from the Japanese keiretsu to the Korean chaebol). These structures encouraged collusion and corruption. They have also proven resilient in the decades since: once economic and political systems are built to reward relationships instead of economic efficiency, it is extremely difficult to reform them, since the people in power are the ones who would ultimately lose the most.
Another explanation for the United States’ openness to a promarket agenda instead of a probusiness agenda is that the nation was largely spared the direct influence of Marxism, though it is possible that the nature of American capitalism is the cause, as much as the effect, of the absence of strong Marxist movements in this country. Either way, this difference from other Western regimes significantly affected Americans’ attitudes toward economics. In countries with prominent and influential Marxist parties, defenders of free markets were compelled to combine their forces with large businesses, even when they did not trust them. If one faces the prospect of nationalization (i.e., the control of resources by a small political elite), even relationship capitalism—which involves control of those resources by a small business elite—becomes an appealing alternative. At least in relationship capitalism there are private owners, who lose out as a result of inefficiency and thus have an incentive to stay competitive.
Because they could not afford to divide the opposition to Marxism, many of these countries could not develop a more competitive and open form of capitalism. And the free-market banner wound up completely appropriated by probusiness forces, which were better equipped and better fed. Even as the appeal of Marxist ideas faded, this confusion of promarket and probusiness forces remained in place. After decades of fighting side-by-side with and being financed by the large industrialists, the promarket forces could no longer separate themselves from the probusiness camp. Nowhere is this scenario more evident than in Italy, where the free-market movement is almost literally owned by one businessman, Silvio Berlusconi, who also happened to be prime minister for much of the nation’s recent history. Until he had to resign from political office in 2011, Berlusconi had basically run the country in the interest of his own business.

Geographical Factors

Besides historical factors, geography and demography have also played significant roles in shaping America’s unique form of capitalism. Initially, what drove Europe’s colonization of much of the Americas was the quest for gold and silver. In Central and South America, the Spanish sent their nobles and viceroys to preside over the extraction of precious metals, transplanting European hierarchies and institutions in the process. North America was lucky that the Europeans did not find gold right away. At this point in its history, the continent offered relatively inhospitable plains and forests. What attracted colonists here was not the search for gold but the search for freedom. In coming to America, immigrants left behind not only their relatives but also oppressive institutions. They arrived here determined to build a better system of government.5
They were also helped in this goal by the fact that the United States was relatively underpopulated. In Old Europe the scarce factor was land. Those who controlled the land could enjoy an economic rent; in other words, they could live off it without adding any value. This is what enabled the European aristocracy to thrive and to control the state. European (especially Continental European) institutions were designed to enshrine the power of the aristocracy. The Europeans created not only governments of the landlords, by the landlords, and for the landlords but also an economic system of the landlords, by the landlords, and for the landlords. Even though European countries slowly moved toward more democratic institutions, they initially granted the vote only to landowners and made education accessible only to the children of the upper class.
What made the difference in America was competition. Even with their wonderful new institutions, the original thirteen colonies might have degenerated into a more rigid, European-style society if not for the openness of the American frontier. The frontier made it easy for people to move, fundamentally undermining the power of American governments vis-à-vis their citizens. Unlike Europeans, Americans were free to choose where to live. No American state enjoyed a monopoly over its citizens, since it faced the competition of other states. And so American states have always had to compete, in terms of improving institutions, to attract the best and the brightest, just as businesses must attract customers in order to survive and flourish. Universal franchise and universal education, it is worth noting, were introduced initially in the western states, which were eager to attract a workforce from the eastern ones. Thus, the United States became not just a government by the people but also a government for the people.
Such is the power of competition, which transforms even the political state—the Leviathan—into an instrument for the people. By contrast, monopoly can transform private enterprises into a destructive form of the Leviathan. A terrifying example is the Congo Free State at the end of the nineteenth century. When Belgium showed little interest in colonial expansion, its king, Leopold II, decided to pursue it on his own. The Congo Free State was not a colony of Belgium but, rather, a personal property of the king, who ran it as his own private company. After initial problems, the enterprise became extremely profitable, making Leopold II one of the richest monarchs in Europe. Unfortunately, this occurred at the expense of both the local people and the environment. In 1904, British consul Roger Casement published a report of all the atrocities that took place in the Congo Free State.6 Eventually, international pressure forced Leopold II to surrender his private state to Belgium, leading to an improvement of the living conditions of the local population. Nevertheless, Congo’s institutions still reflect their sad origin as tools for the most ruthless extraction of resources ever recorded. This unfortunate legacy continues to permeate the culture of Congo as well. Even after independence Congo continued to suffer under brutal dictatorships.

Cultural Factors

America’s Constitution fittingly starts with “We the People.” Unlike the countries of Europe, whose various foundings depended on monarchs allegedly vested with power by God, the United States of America is vested with power arising from the people. This popular, if not populist, foundation shaped the prevailing American culture for the better.
In the United States, juries and elected judges have always helped to limit the power and influence of money. And the common-law system itself, with its appeal to commonly shared values such as fairness, has always been a limit to lobbying power. Special interests often find it easy to corrupt the legislative process, but they cannot as easily change the notion of fairness applied by popularly elected judges. For the same reason, common law provides a better shield against legislative corruption than a code of law, the system prevailing outside Britain and the former British colonies. In a system where law is enshrined in a rigid code (like the Civil Code of France and all of Continental Europe), little discretion is left to the judge, whose role is simply to map codified legal norms onto real-world situations. This system creates a strong incentive for various interests to lobby legislators.7 Whoever “captures” lawmakers can more readily dictate the outcomes in future contingencies, gaining great benefit. By contrast, in a common law system the legislature is supposed to provide only general principles, limiting the payoff that lobbyists can obtain.
Another manifestation of the American populist bent that tempers the power of big business is the institution of class action lawsuits. Though they can be and have been abused, such suits not only provide an incentive for lawyers to fight in defense of powerless people but they also create an alternative lobby. In countries with no tradition of class action suits—such as France and Italy—the legal profession, completely captured by moneyed interests, becomes an apologist for large and powerful corporations and individuals.
Finally, although Americans have historically avoided anticapitalist biases, they have nurtured something of a populist antifinance bias—that is, an opposition to excessive concentration of financial power. A healthy financial system is crucial to any working market economy. And widespread access to finance is essential to harnessing the best talents, allowing them to prosper and grow, drawing new entrants into the system, and fostering competition. But the financial system also has the ability to allocate power and profits. As the old saying goes, whoever has the gold makes the rules—and banks are where the gold is. More important is that the financial system, by influencing entry into the market, affects the profitability of the industrial sector.8 Thus, if this system is not fair, there is little hope that the rest of the economy can be. And the potential for unfairness or abuse in the financial system is always great. Americans have long been sensitive to such abuse.
Throughout American history, the populist antifinance bias has led to many political decisions that were inefficient from an economic point of view. But this bias has also helped preserve the long-term health of America’s democratic capitalism.

Institutional Factors

Yet another ingredient of good fortune that made the United States so special is the federal nature of its government. Federalism was crucial in two important respects: it made competition among states possible, as I noted earlier, and it kept the power of individual corporations at bay. In individual states the power of certain large corporations was unlimited. Coal mines controlled West Virginia, the tobacco industry controlled Kentucky and North Carolina, and so on. Still, it was difficult for any industry to control a majority of the states. The excesses that come with absolute power were tamed.
Consider the case of Jeff Wigand, who, as mentioned in the Introduction, worked for the Brown & Williamson Tobacco Corporation. Just as he was about to blow the whistle on the company’s deliberate policy to make cigarettes more addictive, the firm obtained a restraining order from a Kentucky state judge prohibiting him from speaking about his experiences at Brown & Williamson. Such was the power of the tobacco industry in Kentucky, where the tobacco sector once employed more than seventy-five thousand people. It was only because the attorney general of Mississippi, a nontobacco state, brought a suit against the major American tobacco companies that Wigand’s testimony was eventually revealed.9
Indeed, it was often the rivalry—the competition—among states that kept companies at bay. Throughout much of American history, state bank regulations were driven by concerns about the power of New York banks over the rest of the country—and, more generally, by fears that big banks would drain deposits from the countryside in order to redirect them to the cities. To address these fears, states introduced a variety of restrictions such as unit banking (banks could have only one office), limits on intrastate branching (banks from northern Illinois could not open branches in southern Illinois), and limits on interstate branching (New York banks could not open branches in other states). From a purely economic point of view, these restrictions were completely misguided. They forced reinvestment of deposits in the same areas where they were collected, badly distorting the allocation of funds. And by preventing banks from expanding, they made banks less diversified and thus more prone to failure. Yet the new policies did have a positive side effect: by splintering the banking sector, they reduced its political power and thereby created the preconditions for a vibrant securities market.
The separation between investment banking and commercial banking introduced by the New Deal’s Glass-Steagall Act was a product of this long-standing American tradition. Unlike many other banking regulations, Glass-Steagall had an economic rationale: to prevent commercial banks from exploiting their depositors by dumping on them the bonds of firms that were unable to repay the money they had borrowed from banks. The Glass-Steagall Act’s most significant consequence, though, was the fragmentation of the banking industry. This fragmentation created divergent interests in different parts of the financial sector, reducing its political power. Over the past three decades, these arrangements were overturned, starting with the progressive deregulation of the banking sector.

THE LUCKY OUTCOME

For all of these reasons, then, the United States constructed a system of capitalism that comes closer than any other to embodying the free-market ideal of economic liberty and open competition. The image many Americans have of capitalism therefore calls to mind Horatio Alger’s rags-to-riches-via-hard-work stories, which have come to define the American Dream. In most of the rest of the world, by contrast, Horatio Alger is unknown—and the concept of social mobility is defined by Cinderella or Evita stories, in which success comes not from hard work but from luck. This goes a long way toward explaining why the level of support for capitalism in the United States is greater than in any other country and, in turn, why capitalism itself has always seemed on firmer footing in America.
The American system is far from perfect. It does not lack stories of corporate abuses and political corruption. The ITT Corporation, for instance, is famous for having influenced America’s policy toward Latin America in the 1960s and 1970s, including support for atrocious regimes. On a more personal note, I live in Illinois—two former governors of which are currently in jail for corruption.
Yet the greatest feature of the United States is its system of checks and balances. The fact that my two former governors are in jail shows that justice can prevail. A president of the United States, Richard Nixon, was forced to resign. Even more to the point, the US government was able to break up major monopolies, such as Standard Oil in 1911 and AT&T in 1984. No other country has a comparable record.