THE COVER of Time magazine on April 3, 1944, pictured a grey-haired man with blue eyes and a hopeful smile. Below the picture, the caption was short: “In this war, science is G-5.”1 Readers knew the reference. The G-5 was a torpedo boat used against submarines like those that the Nazis had deployed to terrorize Allied fleets. The technology was Soviet, but the Soviets were our allies at the time. And that was the point: Science was our vital ally in the most fateful war the world had ever seen—and in fostering American success more broadly.
The man on the cover was Vannevar Bush, director of the Office of Scientific Research and Development (OSRD), which spearheaded the Manhattan Project.2 Bush (no relation to the political dynasty) looked the part as well as he played it. Handsome, with wire-rimmed glasses and a pipe in his hand most of the time, the fifty-something civil servant was a distinguished electrical engineer and inventor who had taught for years at MIT and even developed a mechanical precursor to the modern computer.3 But he would become best known for a federal report laying out the case for massive public investments in science after World War II—a report that would help guide American policy, and spur American prosperity, for decades to come.4
The report, Science: The Endless Frontier, published in July 1945, defined an unlikely genre. It was a product of the US Government Printing Office that was not just widely read but became a national bestseller. Even more unlikely, it was the product of a free-market Republican who lionized Herbert Hoover but worked closely with FDR and whose ambitious plans for big-government science helped inaugurate the National Science Foundation, created in 1950 by bipartisan congressional majorities.5
“It has been basic United States policy that Government should foster the opening of new frontiers,” Bush wrote in the report’s introduction. “It opened the seas to clipper ships and furnished land for pioneers. Although these frontiers have more or less disappeared, the frontier of science remains. It is in keeping with the American tradition—one which has made the United States great—that new frontiers shall be made accessible for development by all American citizens.”6
Opening the “frontier of science” meant spending money—lots of money. President Harry S. Truman’s budget chief joked that the report should have been called Science: The Endless Expenditure.7 World War II had ushered in what Senator William Fulbright later disparagingly called the “military-industrial-academic complex.”8 (Despite frequent assertions to the contrary, there is no hard evidence that Eisenhower dropped academic—or scientific or congressional—from the draft version of his 1961 speech.)9 Though realized under Truman, the National Science Foundation was conceived under FDR, to whom Bush said he “developed a personal loyalty and liking that was intense.”10 As World War II was drawing to a close, Bush recounted later, “Roosevelt called me into his office and said, ‘What’s going to happen to science after the war?’ I said, ‘It’s going to fall flat on its face.’ He said, ‘What are we going to do about it?’ And I told him, ‘We better do something damn quick.’ ”11
Federal involvement in science was nothing new. From the Founding, America’s leaders had sought to expand the nation’s scientific frontiers as well as its geographic ones. Thomas Jefferson saw Lewis and Clark’s journey as in part a scientific expedition, and not only made Meriwether Lewis train with leading scientists but also devoted his own time and intellect to preparing Lewis. In the 1800s, the government standardized weights and measures, carried out research on agricultural productivity, and invested in better medical treatment.12 Under the Morrill Act of 1862 and its successors, the federal government provided huge swaths of land that states could sell if they wished to set up institutions of higher learning.13 These land grants—one of the biggest of many giveaways that in essence substituted for the ramped-up spending of fledgling European welfare states—seeded seventy-something institutions that became the core of the nation’s scientific infrastructure. Vannevar Bush’s MIT was one, as were Cornell University, the University of California at Berkeley, and the University of Minnesota, not to mention many of the nation’s historically black colleges and universities.14
Government’s role in science expanded dramatically during World War I and then again amid the programmatic flurry of the New Deal. The watershed, however, was the US entry into World War II. In a few short years, the United States became the home (in many cases the adopted home) of the world’s best researchers and theorists.15 As head of OSRD, Bush had more than six thousand scientists doing the public’s work on the public’s tab.16 And though defense technology was the focus, federal largesse was spread across medicine, engineering, agriculture, and basic scientific research. Moreover, much of the defense spending turned out to have civilian uses. This wasn’t a coincidence: While in-house scientists did some of the work, the government farmed out the vast majority of it to universities and private research labs, from Bell Labs (of AT&T) to Raytheon to Pfizer—companies eager to nab commerce as well as contracts.17 Military-industrial-academic complex indeed.
As big as the war effort to support science was, however, Bush wanted it to be bigger. “Research within the Government represents an important part of our total research activity and needs to be strengthened and expanded after the war,” he argued. “Such expansion should be directed to fields of inquiry and service which are of public importance and are not adequately carried on by private organizations.” Science was needed “for the war against disease,” “for our national security,” and “for the public welfare.” “To create more jobs,” Bush wrote, “we must make new and better and cheaper products. We want plenty of new, vigorous enterprises. But new products and processes are not born full-grown. They are founded on new principles and new conceptions which in turn result from basic scientific research.”18
Bush wasn’t alone in thinking investment in science was integral to economic prosperity as well as national security. At the close of the war, the only real disagreements were over the character and scale of public investment, not the need for it. These struggles delayed the creation of the NSF for five years. But the final 1950 law, close to Bush’s original vision, enjoyed overwhelming bipartisan support as well as Truman’s and then Eisenhower’s imprimatur.19 Indeed, the NSF was one among many new federal efforts undertaken just after World War II to spur innovation and job growth at home and meet the Soviet threat abroad: the national highway system, a massive increase in federal aid for education, civilian use of atomic energy, the creation of towering new dams, the Saint Lawrence Seaway (so big it was said to create “a fourth coast”), a national satellite program, and the establishment of the National Aeronautics and Space Administration, or NASA (which, along with the Department of Defense, spearheaded the development of satellite and communications technology).20
Nor, as we have seen, was Bush wrong about the positive effects. Taken as a whole, the much more active state that emerged out of World War II was distinctly well suited to building the foundations for future growth. Bipartisan and pluralistic—with government and markets working in tandem rather than in conflict—it was also enormously successful.21 We have already seen some of the major payoffs: radar, satellites, momentous breakthroughs in computing and medicine, and growth-boosting infrastructure. By far the most consequential, however, was the supereducated population fostered by state and federal support for the nation’s rapidly expanding universities.22 The United States wasn’t just racing ahead in the production of cars; it was also racing ahead in the production of college graduates. It was a formula tailor-made for goosing the Solow residual, that part of growth explained not by the quantity of inputs but by the capacity to transform those inputs into outputs. In the decades immediately after the war, productivity per hour worked rose at its fastest rate ever.23
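For readers who want the arithmetic behind that term, here is a minimal growth-accounting sketch in standard textbook notation; the symbols (output Y, capital K, labor L, capital share α) are illustrative assumptions, not figures from this chapter. With a Cobb-Douglas production function $Y = A K^{\alpha} L^{1-\alpha}$, the Solow residual is the growth in $A$ left over once the contributions of capital and labor are subtracted:

\[
\underbrace{\frac{\Delta A}{A}}_{\text{Solow residual}} \;=\; \frac{\Delta Y}{Y} \;-\; \alpha\,\frac{\Delta K}{K} \;-\; (1-\alpha)\,\frac{\Delta L}{L}
\]

If, for example, output grows 4 percent while capital grows 3 percent and labor 1 percent, and the capital share is one third, the residual is $0.04 - \tfrac{1}{3}(0.03) - \tfrac{2}{3}(0.01) \approx 0.023$: roughly 2.3 percentage points of growth attributable not to more inputs but to using inputs more productively.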
Looking back today, what seems most remarkable is how unremarkable it all was. Federal backing of organized labor was the great exception—a bitter struggle that, after the war, the antiunion forces dominated. As for the rest, consensus reigned on Capitol Hill. The NSF had two-thirds or greater support in the House and Senate.24 And it was one of the more controversial bills: Eisenhower’s ambitious highway legislation passed with one dissenting Senate vote (Russell Long of Louisiana, who opposed the three-cents-a-gallon gas tax) and a voice vote in the House.25
Of course, Eisenhower and the congressional coalition that backed his initiatives were not European social democrats. Nor, to be sure, were they enthusiasts of state planning. But neither did they treat active government as the enemy. The political mainstream flowed toward a mixed economy based on publicly regulated and supported markets and an active, though often decentralized, state role in providing public goods and overcoming market failures. On matters of race and anti-Communism and organized labor, fierce fights abounded. Yet what wasn’t fought over was at least as important as what was. Permanent mass taxation to fund big government was the new status quo. So, too, were Social Security and other components of America’s budding welfare state. And so, too, was the highly active role for government in supporting science and infrastructure and developing workers’ capacities that Vannevar Bush had spearheaded.
All of this sounds jarring to the contemporary ear. If you were to ask most Americans why the United States got rich, “effective government”—much less “the mixed economy”—probably wouldn’t make the top ten, or even the top hundred. Most would likely start with Americans’ famous work ethic. Then perhaps our great inventors and the great inventions they created, our great frontier and the great frontier spirit it fostered, and our great victory in World War II and the Greatest Generation that won it.
If you were to ask those who study economic development why nations get rich, however, you would hear a different story. They would tell you something that is, on reflection, both obvious and true: Effective governance is at the heart of sustained growth. These analysts would come back with a list very different from the idiosyncratic explanations to which most people are drawn. There would be disputes, to be sure, but the following ten factors would make most analysts’ lists:
1. private property rights and legally secure contracts backed up by an independent legal system;
2. a well-functioning financial system, including a central bank to provide a common currency, manage the macroeconomy, and serve as lender of last resort;
3. internal markets linked by high-quality communications and transportation infrastructure;
4. policies supporting and regulating external trade and financial flows;
5. substantial public investment in R&D and education;
6. regulation of markets to protect against externalities, such as pollution, and help consumers make informed decisions;
7. public provision of goods that won’t be provided at all or sufficiently if left to markets, such as public health;
8. inclusion of all sectors of society in the economy, so that human capital isn’t wasted;
9. reasonably independent and representative political institutions, so that elite capture and rent seeking aren’t rife; and
10. reasonably capable and autonomous public administration—including an effective tax system that citizens view as legitimate—so that items 1 through 9 can be carried out in relatively efficient and unbiased ways.
What do all these ingredients have in common? They require active, effective government. A libertarian “night-watchman state” that merely provides defense and protects property rights isn’t going to cut it. It’s not that national defense and property rights aren’t important. It’s just that they’re not sufficient, and even achieving them requires a state much more active than the libertarian vision suggests. As one economist concludes after examining the experience of ninety developing countries, “Too often, policy makers lose sight of how much effort is entailed in creating and sustaining the institutional framework for private enterprise. Big government may be part of the problem, to paraphrase former US President Ronald Reagan, but the solution is competent, not always smaller, government—government that does the jobs where it is paramount.”26
The “East Asian miracle” of rapid economic growth in nations such as South Korea provides a revealing case study. As one of the most influential analyses concludes, countries that sought “to follow all the advice of the visiting preachers of the free market . . . too often failed to grow.”27 Instead, successful nations were “mixed economies” in which government actively ensured macroeconomic stability, encouraged and directed investment, regulated markets to make them work (especially financial markets, which were “repressed” to discourage speculation), and even created markets where they didn’t yet exist. As two economists from Hong Kong—an economy often lauded as the free-market ideal—explain in an article entitled “The Night-Watchman State’s Last Shift”: “The market needs a strong state to manage it. This means that whether a government is ‘big’ or ‘small’ is less important than . . . whether the state is able to ensure high-quality market order. Current policy debates have largely neglected this aspect of the state’s role, because Western thinkers take their countries’ [market infrastructure] for granted, especially their regulatory and judicial systems, which have benefited from hundreds of years of development.”28
And remember, these analysts are concerned principally with just one question: How fast does an economy grow overall? But as we have learned, increasing national income isn’t synonymous with increasing human welfare. Additional government policies are needed if political leaders are to preserve clean air and water or provide basic social protections against poverty and risk. But even if growth were the only goal, the overwhelming weight of the evidence and theory is on the side of a mixed economy with an extensive state role.
So it would be surprising indeed if America got rich without or in spite of government. Yet that is what we are often told. Here, for example, is Milton Friedman writing in 1980 about the wondrous Gilded Age: “an era of rugged, unrestrained individualism,” “an era with the closest approximation to pure economic laissez-faire in American history,” “an era in which there was, for most of it, no ICC, no FCC, no SEC, and you pick out any other three letters of the alphabet, and it wasn’t there either.” “Far from being a period in which the poor were being ground under the heels of the rich and exploited unmercifully,” Friedman insisted, “there is probably no other period in history, in this or any other country, in which the ordinary man had as large an increase in his standard of living as in the period between the Civil War and the First World War, when unrestrained individualism was most rugged.”29
In fact, there is another period: the period through which Friedman lived, from the mid-1940s well into the 1970s. The late nineteenth century saw healthy increases in average income, but they were nothing extraordinary by later standards. Friedman penned his paean to “unrestrained individualism” at the end of the tumultuous 1970s—which featured per capita GDP growth a third again as fast as the wondrous 1870s.30 More important, the late nineteenth century was an era of enormous hardship as well as outsized fortunes, with inequality rising, life expectancy stagnating, and average heights actually falling.31 Life for “the ordinary man” improved markedly around the turn of the century—but it did so precisely because “pure economic laissez-faire” (never even closely approximated) lost even more ground. If we truly want to understand why ordinary Americans saw their standard of living skyrocket, we need to remember what four decades of revisionism and government bashing have caused us to forget.
Milton Friedman was trying to shock his readers when he cast the Gilded Age as the Golden Age. A generation later, however, his revisionism looks mild. Whole books now torture history to show that nineteenth-century American economic development was a lost Eden of unfettered capitalism. In these accounts, the Constitution was meant to create a highly limited government intended primarily to protect property rights and enable market exchange.32 Only in the twentieth century were Americans cast from the free-market garden as their leaders abandoned these founding ideals.
Calling in the Founders for ideological reinforcement is a long tradition, but not one that displays much commitment to historical accuracy. The Founders disagreed with one another, often fiercely. They changed their views, often fundamentally. They were smart but hardly infallible; public spirited but hardly immune to selfish or blinkered thinking (exhibit A: slavery). Above all, they were operating in a context far removed from our present era. Statements about how they would respond to current challenges are often just restatements of the speakers’ prejudices.
So it is with the notion of the Founders as apostles of laissez-faire. Of all the ways to misunderstand their thinking, to see them as protolibertarians may be the most profound. To the contrary, they were enthusiastic state builders whose primary concern was creating a government strong enough to protect and regulate a fledgling nation.33
The fifty-five men who gathered in Philadelphia viewed the weak Articles of Confederation, established in the midst of the Revolutionary War, as a disaster. The economy was in shambles—locked in a depression as deep as that of the 1930s—as states pursued beggar-thy-neighbor policies.34 A federal government unable to tax teetered on the edge of bankruptcy. With no funds except the paltry sums volunteered by the states, the federal government was incapable of mounting a force that could defend itself. At one point, the national army fell to just eighty men. Spain closed the Mississippi River to American commerce, and other powerful nations were positioned to take advantage.35 To James Madison and his allies, the federal government needed sweeping new powers. Madison did not win on every issue: Opponents defeated his proposal to give the federal government an absolute veto over state laws, for example—a proposal contemporary conservatives who celebrate Madison conveniently overlook. But all the key Founders shared his conviction that the problem was too little central power, not too much.36
You don’t have to review the Founders’ deliberations to see how misplaced the celebration of them as free-market fundamentalists is. You can also look at the choices that they made once those deliberations ended. After all, to a degree unique in American history, the men who designed the nation’s new institutions were also the men who contested for power and led within those institutions. We do not just have to listen to what they said, in other words. We can also watch what they did.
And here is what they did: They created a stable national currency and central bank; promoted domestic manufacturers; created a national army for external protection and internal expansion; purchased and seized land and laced it with new roads, navigable waterways, and systems of communication, including the world’s largest public postal network; guaranteed the expansion of public education by setting aside land to be sold to finance local schooling; expended federal funds again and again to relieve distress from natural disasters and major economic losses; and bound together all these efforts with a national legal system that would eventually give birth to the modern corporation.37
As the great American historian Henry Adams wrote in 1879, “A people which had in 1787 been indifferent or hostile to roads, banks, funded debt, and nationality, had become in 1815 habituated to ideas and machinery of the sort on a grand scale.”38
Popular histories tell this story today as a grand struggle between two traditions: the friendly-toward-government tradition of Alexander Hamilton and the skeptical-of-government tradition of Thomas Jefferson. Mostly, however, this ongoing “rivalry has been resolved by putting the Jeffersonians in charge of the rhetoric and the Hamiltonians in charge of policy”—as the economist Erik Reinert puts it wryly.39
The truth is that leaders in both camps used the state aggressively, if not always visibly, to promote economic development. George Washington’s famous indictment of the “baneful effects of the spirit of party” apparently failed to convince America’s incipient parties to stop fighting.40 But his proposals for binding the nation through a better-paid federal workforce, federal support for university education, a national military academy, and federal promotion of domestic industries and agricultural productivity all would be realized in one form or another—under both parties—over the coming century.41
Even Jefferson wasn’t all that Jeffersonian. He never uttered the famous quote attributed to him: “That government is best which governs least.”42 And while he did come into office promising an end to Hamiltonian policies, fortunately for the nation’s economy, that didn’t happen. The new administration cut a few hundred federal workers and held the line on taxes and spending.43 Yet as the historian Brian Balogh points out, “Republicans were more than willing to use the latent authority of the General Government once they moved from minority to majority status. Though Jefferson’s victory was hailed as a ‘revolution,’ Republicans eventually embraced a large part of Hamilton’s economic policy.”44
Examples abound: Jefferson’s secretary of the Treasury, Albert Gallatin, continued the national bank, Jefferson drew up plans for a national road system, and Gallatin and others successfully advocated a whole host of other “internal improvements.”45 Over the years spanning his presidency, 1801 to 1809, the federal government also expanded the tiny postmaster’s office overseen by Benjamin Franklin during the Revolution into the biggest postal system in the world, vital to commerce as well as communication (which, as a federal official, Jefferson used free of charge into his retirement). In the early nineteenth century, the United States had seventy-four post offices for every hundred thousand people. By comparison, Great Britain had seventeen per hundred thousand; France, four.46
The greatest example of the Hamiltonian Jefferson is the Louisiana Purchase: a project of federally guided national expansion, launched with ambiguous constitutional authority, financed through complex debt transactions, and bought at a price that, while a steal, was 40 percent larger than all federal revenues at the time.47
Jefferson and Gallatin recognized that the public land they acquired could substitute for the public taxes they disliked. It was Jefferson, after all, who had drafted the Land Ordinance of 1785, which set aside land that could be sold to finance local schooling—a provision also included in the better-known Northwest Ordinance of two years later.48 Gallatin, according to the historian Thomas McCraw, “saw more clearly than most American statesmen that the public lands were by far the government’s most valuable asset. Further, he perceived that land could be used for many different purposes, including the raising of money.”49
It was not simply, as McCraw argues, that land “became the keystone of US economic growth, because of North America’s rich soil and abundant natural resources.” These assets had to be obtained and exploited, and government made that possible. More important, land became the currency of economic statecraft—provided to settlers to encourage agriculture and enterprise (notably in the Homestead Acts of the 1860s and 1870s); to localities to build schools; to investors to finance public projects; to private companies to build rail lines and canals; and eventually to public agencies, such as the US Forest Service and the US Geological Survey, to oversee direct extraction of the nation’s resources. To give a sense of the scale of these commitments, a single railroad, the Union Pacific, received free territory equivalent in size to New Hampshire and New Jersey combined.50
Land was the medium of America’s hidden interventionism. Other growing governments mainly took through taxation and gave through spending.51 Much of US economic policy followed a track far less visible but no less effective—and no less governmental. The federal government exercised its most aggressive power literally at the borders, where new land was seized from native peoples, surveyed, developed, exploited, and sold. Inside these expanding borders, governance appeared localized, decentralized, privatized, and mostly invisible. But government was there nonetheless, forging a national market.
State and local governments got into the game, too. In 1902 the American economist Guy S. Callender pointed out what few at the time wished to acknowledge—“that this country was one of the first to exhibit [the] modern tendency to expand the activity of the State into industry.”52 Callender was speaking not of federal policies but of state governments’ efforts to aid industrial development through infrastructure investment. All told, according to Callender’s calculations, state development bonds added up to a debt greater than the federal government had ever assumed—“the first large funded debt created by the government of any country for purely industrial purposes.”53
Besides its infrastructure-linked national market, America’s biggest growth advantage was its extensive network of local schools. It’s therefore worth remembering that, as the economists Claudia Goldin and Lawrence Katz explain, “the most important features of US elementary and secondary education around 1900 were its public funding and public provision”—which, again, often depended on federal grants of land.54 While local governments pioneered primary schooling, state governments took the lead on public higher education, with the help of the Morrill Act and subsequent laws providing federal land for colleges and universities.55
America’s emergence as a world economic power in the latter half of the nineteenth century featured plenty of enterprising citizens seizing on the opportunities for economic advancement the Constitution protected. But the role of the Founders and their political heirs was much more direct. They built a state with the power to tax, spend, enforce, defend, and expand. Once in office, they and their fellow leaders helped create a vast nation linked by infrastructure, governed by a federal legal system, and hosting the most educated workforce in the world, using land rather than spending as their main policy tool.56 These were the seeds of America’s economic flowering, but their budding branches would soon face stiff new winds from concentrated corporate power.
For much of the twentieth century, America’s leaders looked back on the Gilded Age as a cautionary period in American history. In recent decades, however, the era of extremes has received something of a makeover. The critics were wrong, says a growing chorus, to impugn the giants of industry and finance who dominated the period. “Never before—or (arguably) since—have opportunities been so ripe for those capable of mobilizing capital and organizing large enterprises,” argues the historian Maury Klein in the conservative Manhattan Institute for Policy Research’s in-house journal. “The entrepreneurs who came to dominate this scramble were to the American economy what the Founding Fathers were to the political system.”57 Others have lauded the philanthropic efforts of these titans.58 The implication is obvious: Laissez-faire works, government doesn’t, and today’s billionaires uphold a tradition of innovation and generosity pioneered by the captains of commerce of a century ago.
In fact, the era of “opportunities” is a casebook on the limits of markets—and not just because its greatest achievements rested on public foundations. The Gilded Age demonstrated that a modern industrial economy could not function without independent national authority. And no one made the case more powerfully, if inadvertently, than a railroad tycoon named Jay Gould.
Gould was the quintessential robber baron, as the muckraking journalist Matthew Josephson would term the economic princes of his day. Like nearly half of the superrich of the era—men with wealth greater than twenty thousand times that of the average worker—Gould was a railroad magnate.59 Yet his real skill was finance. Ruthless, corrupt, and brilliant, Gould ignored the mechanics of rail and focused on the money.
Gould was not alone. The railroad fortunes that dominated turn-of-the-century America came not from moving goods but from moving securities.60 They also came from moving politicians and judges with the proceeds. The most famous example—both brazen and emblematic—was the “Erie War” of the late 1860s, fought between Gould and Commodore Cornelius Vanderbilt for control of the lucrative Erie Railway Company.61 Vanderbilt ran the rival New York Central Railroad. Capturing the Erie would give him a monopoly on railroad traffic from New York westward. Defending Erie against Vanderbilt’s hostile takeover, Gould first diluted its stock by printing millions in new securities. Vanderbilt responded by buying off a judge to rule against Gould. Eventually the fight ended up in the New York legislature, where—as a contemporary account puts it—an “open auction” ensued, “which Gould won by paying higher bribes to more legislators.”62 Said the Railway Times of another of Gould’s more audacious financial exploits, in 1884: “No pickpocket, either ancient or modern, has been more successful.”63
Later Gould would throw the US gold market into turmoil with a scheme that implicated top public officials in the administration of President Ulysses Grant, as well as Grant’s own brother-in-law. Only by selling $4 million in gold did the federal government avert the plot and protect the financial backbone of the national economy. (A Treasury Department insider tipped off Gould, and he got out before the collapse.)64
The economic chaos that ensued was a sign of just how unstable the nation’s financial system was. Indeed, with the country lacking either deposit insurance or a national bank (abandoned under President Andrew Jackson), financial crises were the norm, not the exception. Major bank panics rocked the nation at least six times between 1873 and 1907.65 Meanwhile, the country was in a depression or recession in 1865–67 (featuring a 24 percent decline in business activity), 1869–70 (10 percent), 1873–79 (34 percent), 1882–86 (33 percent), 1887–88 (15 percent), 1890–91 (22 percent), 1893–94 (37 percent), 1895–97 (25 percent), and 1899–1900 (15 percent).66 Deflation—that strange inversion of the normal rule of rising prices—was common, encouraging investors to hoard cash and wreaking havoc on farmers, who turned increasingly toward antibusiness populism.
Today’s defenders of the Gilded Age try to divide the barons into good and bad apples. One account, The Myth of the Robber Barons, argues that crony capitalists gave the real entrepreneurs a bad name—as if everything would have been fine if the government had just stayed out of the way.67 Yet all the robber barons depended on government and their manipulation of it. Another railroad financier, Collis Huntington—who, along with Leland Stanford (yes, the founder of the California university), rounded up support for the Central Pacific Railroad—managed to see higher purpose in his bribery of politicians: “If you have to pay money to have the right thing done, it is only just and fair to do it.”68
Trying to separate the good barons from the bad also misses the bigger problem: a massive and growing imbalance between the power of America’s new economic elite and the capacities of the public officials who were supposed to defend broader interests. No era in American history featured concentrations of income, wealth, and power as extreme. By 1896, railroads alone accounted for a larger share of the economy than all levels of the US government put together. And ownership and control of these massive resources were increasingly in the hands of a few, as inequality rose to new heights that would not be approached again for a century.69
The consolidation of the corporate economy in the late nineteenth and early twentieth centuries challenged the republican vision of the Founders. Though members of America’s upper class, early American leaders nonetheless saw the broad distribution of wealth (among white men) as essential not just for economic dynamism but also for democratic equality. Jefferson (a slaveholder) wrote to Madison that “enormous inequality” caused “much misery to the bulk of mankind” and therefore that “legislators cannot invent too many devices for subdividing property.”70 John Adams believed that “equal liberty and public virtue” rested on making “the acquisition of land easy to every member of society,” in part because “a natural and unchangeable inconvenience in all popular elections” is “that he who has the deepest purse, or the fewer scruples about using it, will generally prevail.”71 Noah Webster, the writer and lexicographer who issued influential calls for federal union, lauded the proposed Constitution because he believed its design protected “the very soul of the republic” and “the whole basis of national freedom”—“a general and tolerably equal distribution of . . . property.”72
America’s expanding industrial economy didn’t exactly produce such a distribution. Most of America’s corporate giants were companies with huge economies of scale. In these sectors—such as railroads, steel, oil, timber, and their affiliated financial arms—the tendency toward monopoly was overwhelming. At their peaks, U.S. Steel controlled nearly 70 percent of its market; Eastman Kodak, more than 70 percent; General Electric, 90 percent; International Harvester, 70 percent; DuPont, 65 percent to 75 percent; American Tobacco, 90 percent.73 Strong economic pressures encouraged consolidation and bigness. Cunning and corruption ensured there would be no government pushback.
To be sure, with cunning and corruption came much creativity as well. Inventors and entrepreneurs such as Alexander Graham Bell and Thomas Edison played a critical role, though less critical than suggested by popular accounts that ascribe technological breakthroughs to a single visionary. The role of such enterprising figures rested on a public foundation, including the federal patent office.74 But America’s proto–mixed economy, like the much more advanced version that would follow, depended on private fortune seeking as well as public inducement. Congress funded the development of the telegraph industry, subsidized it through the Pacific Telegraph Act of 1860, and nationalized and expanded it during the Civil War.75 Yet private corporations—eventually, a single corporation, with the formation of the Western Union monopoly—ran the service and laid much of the cable. As it happened, this private monopolization of publicly seeded innovation allowed Jay Gould to take over the industry and add to his considerable fortune.76
So while the titans of the new industrial age were skillful in ways both laudable and despicable, they were also just plain lucky. They came along when national markets were finally possible, benefited from public land grants and loan guarantees, capitalized on economies of scale that allowed early movers to bury rivals, and then monetized future profits (likely or imagined) through volatile and manipulable financial markets. These fortunate circumstances help explain why “a group of divided, quarrelsome, petulant, arrogant, and often astonishingly inept men,” to quote one recent colorful description, managed to amass so much economic and political power.77
All of which suggests the most important point about the mythical age of laissez-faire: The monopoly capitalism that emerged was unsustainable—economically, politically, and, though few paid attention to it at the time, ecologically. Government policies were successful in promoting development. Without them, building the railroads likely would have taken decades longer, with a huge economic loss.78 But these policies fostered concentrated corporate power that the federal government lacked the capacity to govern effectively, and the costs of that incapacity to American society were skyrocketing.79
Moreover, increased interdependence brought new social problems. Workplace accidents soared as industrial and rail work expanded.80 The toxic financial assets of the era proved even more toxic than those of our own, making economic crises the norm. With industrialization came social and environmental costs that the nation could not ignore for long. To take one forgotten example, the railroads that Jay Gould and his ilk commandeered led to massive environmental degradation, especially in the plains, where bison were driven almost to extinction and replaced with open-range cattle farming that devastated wild habitats.81 Courts, of course, provided little recourse, whether to victims of fraud, monopolies, accidents, or tainted food or medicine. Buying justice was simply a cost of business for the powerful interests causing harm.82 And so long as government sat on the sidelines, the harms just kept multiplying.
It was time for the mixed economy.
Theodore and Franklin Roosevelt were distant cousins and very different men. But they shared a conviction that government had to be strengthened to rebalance American democracy and ensure broadly distributed gains. Either could have said what TR declared in 1910: “The citizens of the United States must effectively control the mighty commercial forces which they have called into being.”83
Theodore Roosevelt would not achieve that goal during his lifetime. The list of major reforms enacted in the first two decades of the twentieth century (under Woodrow Wilson as well as TR) is neither short nor trivial: enfranchisement of women, the direct election of senators, the nation’s first income tax, workers’ compensation, the Clayton Antitrust Act, the Federal Reserve, the first restrictions on money in politics, the first serious attempts at environmental preservation, and extensive new national regulations, including the Pure Food and Drug Act of 1906, which established the US Food and Drug Administration (FDA).84 Yet TR died in 1919 on the eve of another decade of financial speculation and runaway inequality, during which public authority decayed while problems festered—until, of course, an economic crisis made continued inaction untenable.
Despite the interregnum of the 1920s, however, it makes sense to think of the two Roosevelts as bookending a long Progressive Era. It was progressive because at crucial moments, nearly everyone in a position of high public leadership came to believe that the American social contract needed updating. It was long because challenging entrenched elites is so difficult, and only persistent agitation and huge disruptions to the American political order allowed the translation of these new beliefs into new governing arrangements.
TR’s Republicans—heirs to the Hamiltonian tradition—led the charge. Yet, as in the nineteenth century, Democrats often ended up completing the task (and would eventually switch ideological places with their partisan rivals). No wonder conservatives of our time, from Glenn Beck to George Will, so abhor Wilson. With the Federal Reserve, the FTC, the nation’s first income tax, the modern estate tax, and other transgressions against the antigovernment creed, Woodrow Wilson signaled more clearly than any economic tract could that the emerging mixed economy was a necessary adaptation to modern capitalism—one that no democrat, or Democrat, could long evade.85
Few figures were more important in this intellectual shift than Louis Brandeis. Nominated by Wilson to the Supreme Court in 1916, Brandeis was the first Jewish justice.86 Yet his appointment was controversial not so much because of the widespread anti-Semitism of his day as because Brandeis was so unpopular among business interests and the politicians closest to them. More clearly than any other legal thinker of the time, Brandeis saw that economic interdependence demanded public action to protect workers from abuse, manage systemic risks, and ensure that big corporations did not dominate politics. He also saw that such action required freeing government from the vise grip of powerful private interests. “We must make our choice,” Brandeis famously wrote. “We may have democracy, or we may have wealth concentrated in the hands of a few, but we can’t have both.”
The question that consumed Brandeis was how to effectively regulate corporations in a national industrial economy.87 In a 1933 dissent, he lamented that “men of this generation” had acted at times “as if the privilege of doing business in corporate form were inherent in the citizen.” Corporations were possible only because of state law, he argued, and it was reasonable to regulate them to limit “monopoly,” “encroachment upon the liberties and opportunities of the individual,” “the subjection of labor to capital,” and, above all, the “concentration of economic power” through which “so-called private corporations are sometimes able to dominate the state.”88
Within a few years of Brandeis’s impassioned dissent, the court had sanctioned the New Deal and, with it, the modern mixed economy. Most of us take the results so much for granted that it’s hard to grasp the scope of the transformation. But consider just a few of the profound ways in which the United States got a new deal.
The first and most important change was the creation and deployment of effective levers for managing the macroeconomy. Economists are not of one mind about what caused the Great Depression. But most agree that it was worsened greatly by the Fed’s failure to expand the money supply. Reorganized by statute in 1935, the Fed became an integral part of a set of national regulations, institutions, and practices that created remarkable new macroeconomic stability relative to the past.89 These reforms included bank deposit insurance, financial regulations that limited speculation and systemic risk, and, increasingly, the use of fiscal policy (taxes, spending, and deficits) as a counterweight when deep downturns loomed—the prescription urged by the great English economist John Maynard Keynes in his 1936 book The General Theory of Employment, Interest, and Money. Virtually overnight, bank failures ended, and an era of moderate economic cycles, rather than wild seesaws, began.90
While FDR did not end the Depression, the New Deal did stabilize the economy as its policies reorganized it. Expansionary fiscal policy was abandoned prematurely in 1937, driving unemployment back up. The fiscal sluice gates reopened, but states and localities continued to cut back as tax coffers stayed light, prolonging the downturn.91 Still, the 1930s and 1940s saw the fastest sustained growth in the productivity of the economy yet.92 At the same time, the rapid improvement in life expectancy that began around 1900 continued, even accelerating in the wartime 1940s. Between 1870 and 1900, life expectancy increased between one and two years each decade, on average. Starting in 1900, that number rose to between three and six years each decade.
Taxes and spending were used not just for macroeconomic management. They were also used to finance new programs, especially to provide relief and promote development. These efforts included the public works projects that built much of America’s infrastructure in the 1930s, the widespread electrification that made modern technological expansion possible, and, of course, the military buildup to fight and win World War II. The Works Progress Administration built more than a half million miles of road, more than a hundred thousand bridges, more than forty thousand miles of sewers and water mains, and more than a thousand airports. It also built places of governance and learning: courthouses, public offices, libraries—a reminder that not all new state activity was directed at recovery.93 Another reminder: In 1938 Congress transformed the FDA into the powerful agency it is today with authority to review drugs prior to sale, after at least seventy-three deaths from a single adulterated medicine.94
Taxes were also a tool of social reform. The national income tax had existed since Wilson’s presidency, but FDR and congressional Democrats used it more actively to tamp down extremes of inequality. And inequality did fall—first because so many fortunes vanished in the market crash, then because of Roosevelt’s policies, and finally because of the broad labor mobilization and corporate restraint that accompanied World War II.95 Yet populist rhetoric notwithstanding, taxes on all classes expanded. The Social Security payroll tax focused on blue-collar workers, and the income tax became, for the first time, a “mass tax” during World War II.96
The mention of payroll taxes brings up the most controversial aspect of Roosevelt’s legacy: the American welfare state. States and the federal government were more involved in the provision of social welfare prior to the New Deal than legend suggests.98 Still, it was only in the 1930s that the United States created a national framework of social insurance, available to workers and their families as a right rather than as charity. This development came later than in most rich democracies, in large part because of the limits on national authority in the United States prior to the Great Depression.99 But during the 1930s, the United States made up for lost time and was (briefly) seen as a world leader in the development of unemployment, retirement, survivors’, and antipoverty benefits, as well as work programs.100
The rationale for social insurance was, and is, that private markets cannot adequately insure many of the major risks that citizens face. The Great Depression laid bare the lack of individual savings for unemployment and retirement, and the unreliability of company-based benefits when the economy falters. Indeed, even most contemporary critics of these programs do not argue that such risks can be tackled with voluntary action; instead, they want government to compel individuals to insure themselves (through private Social Security accounts) or subsidize private efforts (with tax breaks for workplace health and retirement benefits). Or they argue that benefits should be less generous. Not even Friedrich Hayek, now a libertarian icon, contended that private insurance could do the job. In The Road to Serfdom, written in the early 1940s, he saw no reason “why the state should not be able to assist the individual in providing for those common hazards of life against which, because of their uncertainty, few individuals can make adequate provision.” Lest his readers doubt his meaning, he continued, “The case for the state’s helping to organize a comprehensive system of social insurance is very strong.”101
The New Dealers set out to rescue and reform capitalism, not replace it. The academic who oversaw the development of the Social Security Act, Edwin Witte, said of it, “Only in a very minor degree did [the Act] modify the distribution of wealth, and it does not alter at all the fundamentals of our capitalistic and individualist economy.”102 The welfare state softened the sharp edges of capitalism without tight restrictions on economic dynamism. “Necessitous men are not free men,” Roosevelt declared in 1936.103 With protests against the dismal economy rocking Washington, he didn’t have to add that necessitous men are not natural supporters of the market, either.
Among the popular forces pressing from the left was the nation’s growing labor movement, the final critical pillar in the emerging mixed economy. As we have seen, the 1930s brought fundamental policy shifts. Yet the long Progressive Era was also about shifting the balance of power, and labor unions were pivotal in this transformation.
The politician most responsible for elevating organized labor was not FDR but another New York progressive: Senator Robert F. Wagner. The four-term senator grew up in the immigrant tenements of New York City. From 1904 until the end of World War I, he served in state government, where he sometimes butted heads with a fellow Democrat named Franklin Roosevelt. While in state government, he investigated the Triangle Shirtwaist Factory fire, the 1911 tragedy that helped galvanize progressive reformers. By the time his former state senate colleague had become president, Wagner was in the US Senate. He would become, in FDR’s own words, “the copilot of the New Deal.”104
With regard to labor law, however, Wagner was mostly flying solo. FDR and his labor secretary, Frances Perkins, did not place as high a priority on new union protections. But as Wagner laid the groundwork for the 1935 law that appropriately bears his name, he won the administration’s strong backing.105 The Wagner Act (formally, the National Labor Relations Act of 1935) contained now-familiar provisions: the National Labor Relations Board (NLRB) and its process for union certification, guarantees of collective bargaining rights, and bans on employer practices that contravened them. But its core message was that workers could band together legally to serve as a countervailing power to corporations and their associations.
The results were dramatic. Unions were struggling at the end of the 1920s, their membership having declined from over 5 million to under 3.5 million in a decade—a tenth of the nonagricultural workforce (roughly where total union membership stands today, though private-sector membership is significantly lower than it was in 1929).106 By the mid-1940s, with the Wagner Act’s protections in place and wartime labor markets tight, unions covered a third of the workforce, and an even larger share of nonagricultural households had a union member in the family.107
Most business leaders had little affection for unions. But many would, in time, come to accept and even work with them. With their broad membership in the most concentrated sectors of the economy, unions proved a valuable source of countervailing power to the large industrial organizations of the era, helping to ensure that increased worker productivity translated into rising wages, as it had not during the 1920s.108
It would be a mistake, however, to see the effects of unions as limited to union members, much less as purely economic. As a growing body of research shows, unions reduced the sharp inequalities of the 1920s among nonunion workers as well as union members.109 Even more important, labor unions constituted a unique political movement that, for all its shortcomings—stubborn racism, mob connections, poor leadership—did more than any other organized force in American politics to address the concerns of less affluent citizens. In “the tripartite arrangement of a robust labor movement, an active state, and large employers,” explains the sociologist Jake Rosenfeld, unions did not simply “counterbalance corporate interests at the bargaining table.” They also served “as a powerful normative voice for the welfare of nonelites” and the “core equalizing institution” during “the ‘golden age’ of welfare capitalism.”110
Let’s look at the world they helped create.
Vannevar Bush, the conservative who became FDR’s national science czar, was never all that fond of organized labor. As an MIT professor, he witnessed the Boston police strike of 1919, and cheered on Governor Calvin Coolidge as he replaced the strikers. But reflecting on Coolidge’s presidency at the end of the 1960s, Bush was harsh: “In the late twenties, there was ballyhoo that business was good, and it wasn’t. . . . Who would go back to the old days? Perhaps we are overdoing the welfare state. . . . But one cannot view the whole movement without feeling a bit more confident that man is learning how to govern himself, and that a political system with a growing prosperous middle class which knows its power still aims to care for the weak.”111
Vannevar Bush had a right to feel satisfaction. He had helped create the public-private partnership that fostered that “prosperous middle class.” The main contours of this engine of development are now familiar. But let us examine more closely the goose that laid the Golden Age.
Feeding the goose, of course, were taxes—and never before had tax policy changed as much as it did between 1939 and 1943. Before the war, income taxes had brought in no more than 2 percent of national income. By 1943, they raked in 11 percent. Meanwhile, the share of Americans paying income taxes skyrocketed from 7 percent to 64 percent.112 If, as Adam Smith believed, taxes were a “badge of liberty,” many more Americans were wearing that badge after the outbreak of World War II.
Most of the money went to the war effort, of course. But research in universities and industrial labs was among the main beneficiaries, too. Although Vannevar Bush’s effort to keep science from falling “flat on its face” ramped up programs and organizational forms already in place, the scale of investments dwarfed what had come before. And it paid off with major advances in medical treatment, computer technology, chemical engineering, aeronautics, communications, and much else. The seed corn for more than a generation of productivity advances and human betterment was stored up.
Crucially, just as scientists flocked to US universities to join in the action, young Americans poured into college with funding from the GI Bill.113 Science did not just build better technologies and treatments but also better-trained minds capable of capitalizing on and furthering these advances.
Rivaling these investments, both in their impressive scale and their enormous social returns, were vast government outlays for highways, airports, waterways, and other forms of infrastructure that allowed goods and people to move faster than ever. The interstate highway system, the benefits of which were described briefly in chapter 2, began with Eisenhower’s 1956 National Interstate and Defense Highways Act, which dedicated over $200 billion (in current dollars) to the cause and authorized a nationwide gas tax for highway financing. Along with continuing refinement of macroeconomic tools, postwar investments propelled the fastest sustained growth in history—which, unlike the rapid growth of Coolidge’s day, was shared broadly. Indeed, incomes increased slightly faster among families in the middle and at the bottom than among those at the top.114
New Deal programs of economic security expanded as well. With Eisenhower’s strong support, Congress extended Social Security to cover almost all Americans and made it generous enough to pull more of the elderly out of poverty, even as disability protections were added.115 By contrast, national health insurance—proposed by Truman but opposed by the growing private health industry—never made it to the floor of Congress. Nonetheless, wartime price and wage controls that permitted supplemental benefits, the spread of collective bargaining, and tax breaks for health insurance helped push private coverage up to an eventual peak of around three-quarters of Americans by the mid-1970s.116 The federal government also subsidized and regulated private pensions that built on top of Social Security.117
As these tax breaks suggest, the new American state was no unchecked Leviathan. It commingled public and private, direct spending and indirect subsidies, central direction and decentralized implementation. It fostered pluralist competition for funds among researchers, contractors, and private intermediaries, as well as among states and localities. But it was enormously active and enormously successful—and soon its rewards would extend to groups that had yet to feel the warm sun of American prosperity.
In 1966 a scrappy basketball team from Texas Western College in El Paso made history by winning the National Collegiate Athletic Association (NCAA) Championship over the powerhouse University of Kentucky. Coached by the legendary Adolph Rupp, Kentucky was the overwhelming favorite. The top-ranked Wildcats had won four national titles. The Texas Western Miners—so named because the college had begun as a state mining school—were unheralded.118 Nobody but the players and their coach, Don Haskins, expected the upset. Certainly not Rupp: He called the Miners “loose-jointed ragamuffins” who were “hopelessly outclassed.”119 Yet the outclassed ragamuffins had a secret weapon that transformed basketball forever.
For those who know sports history, the secret weapon is no secret: The starting lineup of the Miners comprised five black men. Before they took to the court, no major-college team had ever started five black players in a game.120 Coaches believed that teams had to be mostly white to win—a conviction grounded in powerful currents of racism. The storied Southeastern Conference, where Rupp’s teams flourished, did not have a single black player until 1967. That player, Perry Wallace (who went on to become a distinguished law professor), would say later, “Whites then thought that if you put five blacks on the court at the same time, they would somehow revert to their native impulses. They thought they’d celebrate wildly after every basket and run around out of control, [and] you needed a white kid or two to settle them down.”121
Their rivals’ prejudice was the Miners’ powerful advantage. As recounted in the movie Glory Road, the Miners’ rise is a feel-good story of smart coaching and racial progress. Above all, however, it was an indicator of just how much exclusion stifled success. Yes, Coach Haskins had given the black players new skills. But they were good already—they just didn’t have a chance to play at the top. Kentucky’s president had encouraged Rupp to put an African American on the roster. Rupp refused, complaining to his assistant coach, “That son of a bitch is ordering me to get some niggers in here.”122 Looking at the top ranks of basketball today, one can see that Rupp’s racism was ignorant as well as shameful: He was fielding much worse teams than he could have.
We now look back on the 1960s and early 1970s as a turbulent period of redistribution: of rights, of income, of national priorities. But as with the entry of blacks into college basketball, the era also marked a long-overdue recognition of huge social costs that had been ignored or denied. Failure to grant equal rights to racial minorities and to women was not only horribly wrong but also spectacularly inefficient. Failure to give all Americans the opportunity to attend college or a shot at a middle-class life was not only unfair, it also made our nation as a whole poorer.
In expanding rights for women and minorities—through statutes, through judicial action, and through the government’s own example (most profoundly, in the armed services)—the nation was finding money on the table. When future Supreme Court Justice Sandra Day O’Connor graduated third in her class from Stanford Law School in 1952, law firms would consider her only for the secretarial pool.123 But who can doubt that the blinkered male law partners, like so many others who resisted the rights revolution, were shortchanging their own firms? Arizona Republicans were the beneficiaries: O’Connor went into politics instead and became the first female majority leader of a state senate.
Government policies also boosted the skills and opportunities of the least advantaged, where the returns on such investments were highest. We forget that most of LBJ’s War on Poverty was about expanding opportunity, not lifting poor families’ incomes directly.124 With the move toward integrated schools and increased investment, educational prospects became notably more equal, if still far from equivalent, across lines of race and class. Just as black ballplayers could now achieve their potential, the abilities of more than half the nation’s population had a greatly improved chance of being recognized and cultivated. Our society benefited as a result.
The seismic shift can be seen in the changing composition of students at elite universities. As the columnist David Brooks observes, at these bastions of old-money privilege, “admissions officers wrecked the WASP establishment.”125 Before the shift, two-thirds of all applicants to Harvard were admitted—90 percent of all applicants whose fathers were alums—and average SAT scores were dismal.126 If you came from the right background, you had a ticket. Everyone else, including, most notoriously, many brilliant Jewish students, was out of luck. But as Ivy League schools opened up, talented students from a much broader range of backgrounds had the opportunity to access the best of American higher education. Exclusion had meant mediocrity as well as marginalization. At top schools, SAT scores skyrocketed.
At least as big a change was the expansion and enhancement of public universities. A marker was the elevation of the celebrated labor economist Clark Kerr to the presidency of the University of California system in the late 1950s. Pragmatic and professorial, Kerr was nonetheless passionate about the ability of well-run organizations to foster economic and social progress. He was, in the words of journalist Nicholas Lemann, “a supreme rationalist who believed that a system could always be devised to solve a problem,” that “government ought to be the highest, biggest, and best system,” and that scientists and their students were needed to “help make it run properly.”
“Universities today are at the vital center of society,” Kerr proclaimed in his 1958 inaugural address. “We must again concern ourselves with educating an elite—if I may use this word in its true sense, free of the unhappy connotations it has acquired. But this time we must train an elite of talent, rather than one of wealth and family.”127
To drive home his message, Kerr gave his speech at the university’s obscure Riverside campus, a commuter school that was giving California’s working- and middle-class young a chance to enter the once-locked gates of higher education.128 Riverside was just one of the new public institutions rising up on the fast-changing educational landscape: In the twenty years after Kerr’s speech, the number of colleges and universities in the United States increased by more than half, the number of professors nearly doubled, and the amount of public spending (both state and federal) on higher education nearly doubled, too. The share of twenty-five- to twenty-nine-year-olds with a college degree shot up from around 5 percent in 1940 to around 25 percent in the mid-1970s (where it largely stayed for the next two decades even as other nations continued to improve, and, in some cases, race past the United States).129
Although support for this transformation came from many quarters, the federal government was the crucial catalyst. Starting with the GI Bill in 1944 and continuing through the National Defense Education Act in 1958, federal aid poured into higher education through a series of landmark initiatives that culminated in the creation of Pell Grants in 1972. The average Pell Grant covered all the tuition at an average four-year public university, with some left over for housing and other expenses. (Today it covers roughly half of tuition alone.)130 And though Claiborne Pell was a Democrat, the legislation was thoroughly bipartisan: Republicans and Democrats voted for it in almost exactly equal proportions.131 Supporters of the law believed, as Senator Pell put it, that “any student with the talent, desire, and drive should be able to pursue higher education.”132 Our nation is richer for that conviction.
As the federal government expanded, it did not merely extend opportunities to individuals on the periphery of prosperity. It also extended opportunities to places on the periphery of prosperity, injecting assistance and employment, housing and highways, development projects and defense jobs into regions previously left behind by modern economic growth. And nowhere were these enormous federal investments larger and more consequential than in the American South.
In 1938 FDR declared, “The South presents right now the nation’s number one economic problem—the nation’s problem, not merely the South’s.”133 His administration’s subsequent Report on the Economic Conditions of the South did not mince words: “The low income belt of the South is a belt of sickness, misery, and unnecessary death.” “The paradox,” the report concluded, “is that while [the South] is blessed by Nature with immense wealth, its people as a whole are the poorest in the country.”134
The South was certainly poor. When Vannevar Bush was at MIT in the 1930s, his fellow Massachusetts residents had an average income of over $12,000 (in current dollars). By contrast, residents of Mississippi, then and now the poorest state in the nation, had average annual incomes of less than $4,000.135 Among African Americans, poverty in the South was even more extreme—“pathological” was the description of the Swedish social scientist Gunnar Myrdal, who visited the region in the 1940s.136
Whether southern poverty was a “paradox” is more debatable. The main sources were no mystery. The South’s once impressive wealth came from enslavement and extraction, not innovation. Southern economies mostly missed out on the industrial revolution. With agricultural production reliant on artificially low wages even after slavery’s end, the incentives for mechanizing and diversifying production were weak. And without effective political competition in the Jim Crow South, there was little political pressure to expand the scope of opportunity to those left behind—not just blacks, who were brutally excluded from civic life, but also the majority of whites.137
By the late 1970s, however, Mississippi had come within hailing distance of Massachusetts in economic terms, with average per capita incomes roughly 70 percent of the Bay State’s.138 Nor was it alone in its rapid movement toward northern standards of living: All of the poorest American states saw dramatic income growth in the middle of the twentieth century.139 By any economic measure, these were years of remarkable convergence: The rich states got richer; the poor states got richer even faster.
Of course, convergence is what you’d expect in an integrated market with free movement of labor and capital. Yet as Roosevelt’s 1938 plea for action suggests, it was far from an automatic process. During and after the New Deal, the federal government invested massively and disproportionately in developing the South. There was rural electrification and highway building; new social programs with benefits that were scaled inversely to personal income; public health, worker safety, and environmental efforts that had their greatest positive effects in the least developed regions; and enormous earmarked funding to bring the poorest states toward a national standard.140 On top of all this, FDR and later Truman deliberately steered defense spending to the region to promote continued modernization.141
Today critics of the federal government often cite the economic success of low-tax states—more specifically, low-tax southern states—as an indictment of federal policies and proof of the superiority of limited, localized government. But this assertion is completely backward. Not only do these states continue to trail their high-tax counterparts on many measures of prosperity, they have also relied on immense resources from the federal government to promote their partial catch-up. Though these earmarked investments have declined, the region continues to receive huge net transfers from the federal government, mainly because it remains comparatively poor. For every dollar that Mississippians pay in taxes, $2.34 comes back in federal spending. South Carolina—like Mississippi, home to some of the nation’s most virulently antigovernment politicians—receives over $5 for every $1 its taxpayers send to Washington. By contrast, the five richest states—Connecticut, Massachusetts, Maryland, New Jersey, and New York—receive an average of 77 cents. New Jersey receives less than 50 cents; Delaware, just 31 cents.142
Just as America became rich not in spite of, but because of, government, the American states became rich not in spite of, but because of, government. Indeed, the states that are most likely to be held up as exemplars of free enterprise today are those that benefited the most (and continue to benefit the most) from this active federal role.
Even as the mixed economy opened its doors to tens of millions of previously neglected citizens, America’s leaders made another vital contribution to the country’s rising prosperity after World War II: a revolutionary push to address market failures associated with an increasingly dense, interconnected, and complex commercial society. The most obvious breakthroughs concerned pollution, which rapidly came to be seen as a fundamental threat to the quality of life, requiring vigorous regulation of markets.143 The federal government also improved protections for worker safety, and in response to the growing profile of activists such as Ralph Nader, it paid greatly increased attention to vulnerable consumers in areas ranging from tobacco to automobiles.144 Negative social costs imposed on others were only one focus; regulations also sought to address myopic behavior that caused grievous harm to individuals and the nation—and nowhere was this more true than with regard to smoking.
Fifty years after the War on Poverty, many conservatives are quick to declare the effort a failure (though their case is based largely on the failure of our antiquated poverty measure to include most public aid to the poor, such as the Earned Income Tax Credit, also known as the EITC, and supplemental nutrition assistance, aka food stamps).145 But they are notably silent about another fifty-year landmark that also rankled many on the right at the time: the first surgeon general’s report on the health risks of smoking.146 The silence is revealing. Today no sensible observer doubts that discouraging cigarette smoking is good policy. Yet the major efforts of federal and state governments to reduce tobacco use were opposed at every turn in the name of market freedom.
By the middle of the twentieth century, the accumulating evidence of tobacco’s toxicity—much of it revealed by federally supported research—was overwhelming. Yet manufacturers denied the evidence, spewed out misinformation, and insisted that consumers could make their own “informed” decisions.147 Consumers couldn’t. Even after government warnings were slapped on cigarette packs, and the majority of Americans recognized that smoking causes cancer, most adult smokers remained ill-informed about the risks they were taking. Moreover, most smokers start this addictive habit before age eighteen, when the future risk of dying seems even smaller and more distant.148
In the half century since the 1964 report, cigarette smoking has caused an estimated twenty million premature deaths in the United States.149 Today about half a million Americans lose their lives to smoking each year—one in five annual deaths in the United States—and smoking-related medical costs and lost productivity exceed $300 billion a year.150 Yet the toll would be far higher were it not for active government. At the time of the surgeon general’s report, over four thousand cigarettes were smoked annually for each adult American.151 In the decades since, following a ban on broadcast ads, rising federal cigarette taxes, increasingly strong warning labels, repeated FDA actions to promote smoking cessation, and a string of settlements between states and tobacco companies (along with various state-level antismoking policies), annual consumption has dwindled to just over a thousand cigarettes per adult. According to a recent careful estimate, efforts to reduce tobacco consumption over the past half century prevented more than eight million premature deaths, extending average life expectancy by a remarkable one and a half to two years.152 Whether the 1964 report was a critical catalyst or merely one important contribution to this broad shift in science and policy, it is hard to disagree with the surgeon general’s fifty-year assessment: “The epidemic of smoking-caused disease in the twentieth century ranks among the greatest public health catastrophes of the century, while the decline of smoking consequent to tobacco control is surely one of public health’s greatest successes.”153
The 1964 report defined a new model of federal scientific involvement: the independent commission, guided by scientists, seeking to provide recommendations in the best interests of the nation. It was the model that informed and guided the massive investments in medical research made after World War II, as well as the new regulations designed to protect individual health and safety, the environment, and, later, the planet itself. As the fifty-year appraisal concluded, the 1964 report did more than start the nation down a healthier path. It was also “a pioneering step toward anticipating a much larger role for government, in collaboration with scientists, to use science to inform regulatory and other policies.”154
Vannevar Bush would have been proud.
The story of America’s rise to richness is a story of an ongoing rebalancing of political institutions and economic realities, of public policies, social knowledge, and democratic demands. But the arc of that history bends toward a more extensive role for government, and for good reason: As the United States has changed from an agricultural society into an industrial society and then a postindustrial society, the scale of economic activity and the interdependence and complexity of that activity have expanded, along with the scope of the harms that this activity can yield. As America’s leaders responded to these challenges and to pressures for action and inclusion from below, they came to recognize that making Americans healthier, better educated, and freer to pursue their own dreams—regardless of race, gender, and ethnicity, whatever the circumstances of their birth—made America richer, too. For all the barriers yet to be broken, for all the ignorance and indifference that remains, it is no longer possible to pretend that exclusion is costless.
As far from its origins as it now is, America’s mixed economy still bears the marks of its founding: in the ideas, so often misunderstood, of a revolutionary generation that insisted on unified authority that could defend a rising nation and build a national economy, and in the interventionist nineteenth-century republic of bountiful land rather than bountiful spending that built on this foundation. Still, the mixed economy is unmistakably a twentieth-century creation, a modern social technology. And it is arguably the greatest social technology not just of its time but of all time.
Vannevar Bush died in 1974, on the eve of the computer age, but he would not have been surprised by the miraculous technologies his work fostered. In a 1945 article for the Atlantic Monthly, titled “As We May Think,” Bush wrote of a device with “translucent screens, on which material can be projected for convenient reading,” along with a keyboard to control access to that information.155 Data would be stored in files, each with “trails” (we would now call them links) that would allow vast amounts of knowledge to be sorted and searched at the touch of a button. Scientists could find any article, lawyers any case, doctors any clinical study, inventors like Bush any patent. The “Memex,” as he called it, is familiar to us today as the personal computer and the internet. What we too often forget is the enormous public investments that made this technology, and so many other contributors to our productivity, possible.
Near the end of World War II, an aspiring scientist named Doug Engelbart came across Bush’s 1945 article while stationed in the Philippines. Inspired, he returned to complete his undergraduate degree at Oregon State (a land-grant university); found a job at a government aerospace lab in California, where he refined his thinking; and then completed a (highly subsidized) PhD at UC Berkeley (another land-grant university). He eventually set up a research group at the university that Leland Stanford founded, with funding from the US Air Force, NASA, and the Advanced Research Projects Agency—the Defense Department arm that would create the predecessor of the internet, ARPANET.156 Twenty-three years after seeing Bush’s Memex in his mind’s eye, at a San Francisco conference bringing together more than a thousand of the world’s best computer scientists, Engelbart would hold his fellow technophiles in rapture as he showed how Bush’s vision could be made real. When Engelbart died in 2013 at the age of eighty-eight, his obituary in the New York Times captured the moment:
For the event, he sat on stage in front of a mouse, a keyboard, and other controls and projected the computer display onto a twenty-two-foot-high video screen behind him. In little more than an hour, he showed how a networked, interactive computing system would allow information to be shared rapidly among collaborating scientists. He demonstrated how a mouse, which he invented just four years earlier, could be used to control a computer. He demonstrated text editing, video conferencing, hypertext, and windowing.157
Unlike Vannevar Bush, Engelbart lived to see his futuristic ideas become mass-market products. Yet he never received much money or credit from the companies and CEOs who reaped billions from his vision. In that he was not alone: Neither did the federal government.
Given that technological breakthroughs are at least as much a result of the right environment as the right person, perhaps it’s fitting that neither Vannevar Bush nor Doug Engelbart is much remembered today. Yet we should not forget Bush’s legacy: Bush the Republican, conservative, probusiness, free-market advocate was also Bush the apostle of the mixed economy. He was not the only architect of active government who now seems an unlikely advocate. Along America’s path to the mixed economy, plenty of leaders within the business community and the Republican Party joined him as well. Undergirding a successful mixed economy was a successful politics.