Before the Civil War, the central impetus for economic development was the growth of markets. Simply put, people bought and sold more on regional, national, and international markets than they had done in the colonial period. In the eighteenth century, high transportation costs, low population densities, and shortages of cash kept many Americans from participating in these markets. A wave of nineteenth-century transportation projects (known to contemporaries as “internal improvements”) made it far easier to exchange goods and services with more distant places. Farmers took advantage of turnpikes, canals, river improvements, and railroads and produced greater surpluses of cash crops. Farmers specializing in cash crops demanded more textiles, furniture, clocks, books, and other consumer goods. Demand from the countryside created a large market for merchants and manufacturers that fueled the growth of cities and industries. Scores of newly chartered banks provided capital to entrepreneurs taking advantage of these new opportunities. Historians have called the great expansion of commerce the “market revolution.” Economists have labeled the process “Smithian growth,” after the famous eighteenth-century economist Adam Smith. Smith argued that economic specialization and the division of labor—the crux of what he called the “wealth of nations,” the title of his classic treatise—crucially depended on large markets.
The expansion of markets rested as much on political decisions as on economic actions. Scholars have increasingly viewed “institutions”—what might be thought of as the myriad formal and informal rules that shape economic behavior—as a central component of market exchanges. Nineteenth-century Americans self-consciously adopted rules that favored the creation of markets, whether these were permissive policies toward corporate chartering or the removal of barriers to interstate trade. The “rules of the game” that Americans chose reflected a consensus that commercial and technological progress generated widespread benefits. If Americans generally agreed that commercial progress was important, they disagreed on precisely how national, state, and local governments would shape commercial markets. Two distinct visions dominated economic policy. One called for large-scale federal involvement—including national banks, national internal improvements, and a high tariff—while the other favored a decentralized approach in which state and local governments would take the lead in encouraging economic development. These two different visions of economic change would help shape political parties and economic policy before the Civil War.
When ratified in 1788, the Constitution established a favorable framework for the growth of markets. The Constitution prohibited the states from establishing trade barriers and thus created a large internal “free-trade zone” that allowed commerce to flourish. Contracts, including debts, made in one state were enforceable in another, giving businesses confidence to engage in interstate commerce. The Constitution also authorized a patent system that protected the rights of inventors. During the nineteenth century the federal government made patents cheap to obtain and easy to enforce—at least relative to European nations—which encouraged inventive activity among thousands of individuals, from humble mechanics to professional inventors. While generally favorable to development, the Constitution was also ambiguous about the role of the federal government in promoting economic development. Could the federal government establish a central bank to help stabilize the nation’s financial system? Could the federal government build roads, canals, and other internal improvements? Could the federal government enact tariffs to protect domestic industry from foreign competition?
The Federalists, led by nationalists such as Alexander Hamilton and John Adams, believed that strong national institutions could bind the fragmented republic together. A large national debt, for example, ensured that the prominent financiers and merchants who owned government securities would have a vested interest in the security and prosperity of the nation. In similar fashion, a national bank that would handle the business of the federal government, including the deposit of tax receipts, would give merchants and manufacturers a source of capital while regulating state and local banks. The Federalist vision of a strong, activist state had an undeniable modernizing element, but it sprang from a decidedly eighteenth-century view of politics and society. The Federalists self-consciously sought to concentrate economic and political power in the hands of a small group of wealthy men with the experience, expertise, and leadership to run the nation’s economy. In the Federalist vision ordinary voters would act as a check to ensure that these men did not abuse their power, but Federalists presumed that most men of modest means would defer to their economic and social superiors.
The Federalists prevailed in the 1790s, and with Hamilton’s leadership they nationalized the Revolutionary War debt (some of which had been owed by individual states), levied new taxes, and established a national bank. In the pivotal election of 1800, however, Thomas Jefferson’s Republican Party triumphed and would dominate electoral politics for the next quarter century. Republicans advocated a decentralized approach to economic policy, leaving most power in the hands of states and localities. Jefferson and his adherents interpreted the Federalist program as a ploy to concentrate power in the hands of a power-hungry cabal of would-be aristocrats. Republicans believed, for example, that a national bank would use its considerable financial resources to reward its friends and punish its enemies, thus setting into motion a cycle of corruption that threatened the very existence of the republic. Republicans tied their support for expanding democracy (at least for white males) to their critique of Federalist economic policies: ordinary farmers and artisans, not elitist financiers, should be the real drivers of economic development. Once in power, Jefferson cut taxes, decreased government spending, and reduced the size of the government debt. President James Madison, a close friend of Jefferson and a stalwart Republican, allowed Hamilton’s Bank of the United States to expire in 1811.
Despite the success of the Jeffersonians in limiting the economic role of the federal government, conflict with Great Britain (culminating in the War of 1812) led some Republicans to embrace elements of economic nationalism, albeit without Hamilton’s overt elitism. The fusion of economic nationalism with Jeffersonian democracy is best represented by Henry Clay’s “American System.” Clay argued that protecting American industry from foreign manufacturers would create large internal markets for cotton, foodstuffs, and other agricultural products. Clay and other advocates of the American System argued that high tariffs would benefit all sectors of the economy. Manufacturers would flourish, safely protected from cheap foreign goods. Farmers and planters would also prosper as a safe, dependable home market replaced uncertain foreign markets as the major outlet of American agricultural products. Merchants who coordinated the exchanges between cities and the countryside would see their business grow as well.
By 1832 Clay had added two other major elements to the American System: a government-financed system of transportation improvements and a national bank. As committed nationalists, Clay and his allies—who eventually became known as National Republicans and then Whigs—believed that federal funding of roads, canals, and railroads would connect all American localities to the domestic market. Clay and his allies also vigorously defended the Second Bank of the United States, which they believed would regulate state banks, ensure a stable currency, and supply businesses with much-needed capital. A national system of internal improvements and a national bank would also strengthen commercial ties and provide a set of common economic interests that would transcend regional loyalties. The economic and nationalistic appeals of the American System became an important part of Clay’s platform during his presidential bids in 1824, 1832, and 1844. The nationalistic wing of the Jeffersonian Party had embraced the American System in the 1820s; these National Republicans became a core constituency of the Whig Party, which formed in the early 1830s to oppose Andrew Jackson. Influential editors and writers such as Hezekiah Niles and Henry Carey strongly supported the Whig agenda, which also found considerable support in commercially developed areas with substantial manufacturing interests.
Much as Clay sought to revitalize elements of Hamilton’s nationalistic program, Andrew Jackson resuscitated the Jeffersonian critique of activist government. Elected in 1828 on a populist appeal to democracy, Jackson aggressively attacked elements of Clay’s American System. In 1830 Jackson vetoed the Maysville Road Bill, which would have provided federal funding for an important project in Clay’s own state of Kentucky. Even more important, Jackson vetoed, in 1832, a bill to recharter the Second Bank of the United States. Jackson soundly defeated Clay in the presidential election of 1832, effectively dooming the American System. Most voters (especially those in the South and West) apparently shared Jackson’s fears that the Whig economic program, in mixing economic power with political centralization, would invite political corruption. Traditional fears of political corruption reflected concerns that state activism was a zero-sum game in which some interests won and other interests lost. Such a political environment made it especially difficult for the federal government to fund transportation projects. Cities, states, and regions perceived to be on the short end of federal funding became a powerful voting bloc to oppose government investment. Federal funding thus proceeded on an ad hoc basis with little systematic planning.
A distinctly regional critique of activist government developed in the South. High tariffs were especially unpopular there: planters opposed paying higher prices for manufactured goods for the benefit of northeastern manufacturers. Some Southerners also feared that the centralization of power inherent in the American System presented a long-term threat to slavery. If the federal government had the power to protect industry, build canals, and regulate banking, they reasoned, then it might have the power to abolish slavery as well. During the winter of 1832–33, South Carolina took the dramatic step of nullifying the Tariff of 1828, which was known in the state as the “tariff of abominations.” South Carolina eventually backed down, but the state’s response indicated that Southerners saw an activist national government as a threat to slavery. As debates over slavery became more intense during the 1840s and 1850s, southern opposition to activist policies (at least those emanating from the federal government) hardened, resulting in a legislative deadlock that eliminated any real hope of passing the Whigs’ economic agenda.
Jeffersonians and Jacksonians, however critical of national initiatives, eagerly supported economic development at the state and local level. Leaving most economic policy in the hands of state and local governments, in fact, helped democratize institutions such as the corporation. European nations tended to tightly restrict corporate chartering, often giving established corporations monopoly privileges in return for sweetheart deals with government officials. In the United States, though, intense competition between states and cities worked to loosen corporate chartering, as legislatures sought to please a variety of influential local interests. The more decentralized and democratic political culture in the United States encouraged logrolling rather than legislative monopoly. Banking is an excellent example. The number of state-chartered banks grew so rapidly that, by 1820, the United States was well ahead of Great Britain and other European nations in bank capital per capita. These state banks were hardly the exclusive domain of wealthy financiers. Tens of thousands of ordinary individuals—including a good many prosperous farmers and artisans—invested in state-chartered banks. Political institutions in the United States, in short, managed to channel powerful “bottom-up” forces into economic development.
The decentralized nature of economic policy in the United States sometimes led to greater government involvement in the economy. Transportation improvements are a case in point. Fearing that private investors would avoid the risk associated with large-scale projects, the New York legislature authorized the construction of the Erie Canal as a state-run enterprise in 1817. The canal, built along a 363-mile route from Albany to Buffalo, promised an all-water route from the Great Lakes region to New York City. Completed in 1825, the Erie Canal was a stunning success. The bonanza of canal revenues allowed New York to finance politically popular branch lines. Prodded by New York’s example, state governments in Pennsylvania, Maryland, and Virginia attempted to build their own canal systems to improve links with the Trans-Appalachian West. From 1817 to 1844 Americans invested nearly $131 million to build 3,360 miles of canals. Most of the investment came from state governments, which were eager to spur economic development and expected to recoup their outlays either directly from canal revenues or indirectly via property taxes on rapidly appreciating land.
The failure of most canals to meet the grandiose expectations of their supporters led to fiscal retrenchment in many states. Part of the problem was that the Erie Canal’s exceptional profitability raised expectations that few other projects could match. Most other canals, traversing more mountainous terrain or serving smaller populations, barely covered the cost of upkeep and repairs. Heavy investments in canals created large state debts that led to a backlash against state investment. In the 1840s and 1850s, important northern states such as New York, Pennsylvania, and Ohio adopted constitutional amendments that banned government investment in private corporations. The backlash against state canal investment—combined with the continued development of financial centers such as New York City and Philadelphia—left most railroad construction in private hands. Faster and more flexible than canals, railroads captured the imagination of both private capitalists and the general public. By 1840 Americans had built 2,818 miles of track; by 1860 that figure had ballooned to 30,626 miles.
Although private capital financed much of the nation’s railroad network, railroads were hardly examples of free enterprise. In the North, municipal and county governments often purchased railroad stock to encourage local companies. In the South, the region’s slave economy prevented the growth of large financial centers that might finance railroad construction. State investment thus remained the norm. The public character of railroads, even when they were privately financed, generated a host of disputes that courts and local governments often decided. Should eminent domain compensation be determined by local juries (which often favored landowners) or special commissioners (which often favored companies)? Should railroads compensate farmers when locomotives struck livestock that were crossing tracks? Could cities regulate railroad operations to reduce noise, congestion, and pollution? Judges and legislators typically attempted to strike compromises that balanced the government’s considerable interest in encouraging railroad construction against individual and public rights.
For the most part, the federal government played a secondary role to state and local governments in the proliferation of transportation companies. National policies—especially land policy—nevertheless had an important indirect impact. The federal government made land grants to various canals and railroads in the West and South. Although small compared to the investment of state governments, these land grants aided individual enterprises, such as Indiana’s Wabash and Erie Canal. More important, federal policy regarding land sales affected the rate of western settlement and thus had an important impact on internal improvements and the rest of the economy. Some easterners (including Northerners and Southerners) feared that rapid western expansion would undermine their own political influence. They therefore supported relatively high prices for the vast tracts of western land that the U.S. government owned. Not only did high prices slow western settlement, but they also provided an important source of revenue for the federal government. Popular pressure for cheap western land, though, led to the passage of the Land Act of 1841, which gave settlers the right to buy a 160-acre homestead for as little as $1.25 per acre.
In the 1850s, the emerging Republican Party embraced the nationalistic ideas of Hamilton and Clay. Abraham Lincoln and many other Republicans had long admired Clay and his ideas. Republicans eagerly supported generous land grants to transcontinental railroads, a highly protective tariff, and a new national bank. Unlike Clay, the Republicans made slavery an economic policy issue. Believing that slavery inevitably discouraged hard work and enterprise, Republicans blamed the institution for the South’s lack of development. While southern slaveholders often generated large profits from plantation agriculture, the South lagged behind the North in urbanization, manufacturing, and other benchmarks of economic development. Republicans, fearing that slavery would poison the national economy if allowed to spread, thus opposed the extension of slavery into the West. They instead supported a homestead act that would allow free-labor households to acquire federal land at little or no cost. Debates over federal land policy played a crucial role in the coming of the Civil War.
See also business and politics; economy and politics since 1860.
FURTHER READING
Lamoreaux, Naomi R. Insider Lending: Banks, Personal Connections, and Economic Development in Industrial New England. New York: Cambridge University Press, 1994.
Larson, John Lauritz. Internal Improvement: National Public Works and the Promise of Popular Government in the Early United States. Chapel Hill: University of North Carolina Press, 2000.
Majewski, John. A House Dividing: Economic Development in Pennsylvania and Virginia before the Civil War. New York: Cambridge University Press, 2000.
Meyer, David R. The Roots of American Industrialization. Baltimore, MD: Johns Hopkins University Press, 2003.
Rothenberg, Winifred Barr. From Market-Places to a Market Economy: The Transformation of Rural Massachusetts, 1750–1850. Chicago: University of Chicago Press, 1992.
Wright, Gavin. Slavery and American Economic Development. Baton Rouge: Louisiana State University Press, 2006.
JOHN MAJEWSKI
Since the founding of the republic, Americans have debated how to make the economy work for the general welfare. Making it work, however, has always been a means to larger social and political ends. Growing the economy has never been an end in itself.
Liberty was, at the founding, construed as a simple function of property ownership: if you owned property, you were economically independent of others, and thus could be a self-mastering individual—a citizen. Americans have since redefined liberty, equality, and citizenship, but most still conceive of “real politics” in terms of their economic status.
The politics of the period from 1860 to 1920 can be viewed as a debate about how to reorganize or reject an Atlantic economy, then as a way of imagining the political future of a more cosmopolitan, a more inclusive—maybe even a global—economy. Thus conceived, the politics of the period became an argument about how, not whether, to include the federal government in the sponsorship and supervision of economic arrangements.
The nonmilitary legislation of the 1860s, for example, which enacted the Republican platform on which Abraham Lincoln ran for president, was a detailed blueprint for industrialization on a continental scale. It could be implemented only because the South had removed its representatives from the national legislature. The Morrill tariff of 1861 was the first step. It reversed the trend toward free trade, which began under President Andrew Jackson in 1832 and culminated under President James Buchanan in 1857. The new tariff imposed specific duties on imports such as steel rails and thus allowed American manufacturers to undersell their more efficient British competitors.
Other steps included the Homestead Act, which excluded slavery from the territories by encouraging white settlers to head for the new frontier in the trans-Mississippi West; the National Banking Acts of 1863 and 1864, which, by forbidding state and private banks to issue money, created a uniform national currency and a new stratum of bankers with vested interests in manufacturing and transportation; the Immigration Act of 1864, which flooded the late-nineteenth-century American labor market with exactly the kind of ornery, adventurous, and ambitious individuals who might have led the class struggle back home; and the Aid to Railroads Acts of 1864–68, which subsidized the construction of transcontinental trunk lines, thus building the necessary infrastructure of a tightly knit national market for both finished goods and investment capital. Finally, the three “freedom amendments” to the Constitution—the Thirteenth, Fourteenth, and Fifteenth Amendments—abolished slavery, guaranteed due process of law to all persons (regardless of race), and armed the freedmen with the vote. Together they created a unitary labor/property system and prevented the restoration of the South to its prewar political preeminence.
This last step, a result of Radical Reconstruction, was more important than it may seem, because the repeal of the Republican Party’s blueprint for modern America was a real possibility. Indeed, it was a programmatic imperative of the Democratic Party in the North. Even moderate Republicans with constituencies in the seaboard cities of the Northeast were anxious to restore, not reconstruct, the abject South, because the merchants and bankers who had organized and financed the antebellum cotton trade from their offices in New York, Philadelphia, and Boston wanted an immediate return to a free-trade Atlantic economy in which the pound sterling and the gold standard ruled. Many thousands of other Northerners who had been employed in and by this trade wanted the same thing.
When a majority of these merchants and bankers realized that there were more lucrative outlets for their capital in the “great and growing West” than in the restoration of the South and the resurrection of the cotton trade, the possibility of a return to a free-trade Atlantic economy was dead. So, too, was the Democratic Party’s preference for “do-nothing government” (or “laissez-faire”). So, finally, was the related notion that the future of the American economy would be determined by the scale and scope of British demand for agricultural raw materials. In this sense, the abdication of merchant capital, the political power behind King Cotton’s throne, turned agriculture into a mere branch of industry. As a result, the once regnant, even arrogant South had to reinvent itself as a political supplicant and a colonial appendage of the northeastern metropolis.
Politics in the late nineteenth century was mostly a matter of answering the money question, the labor question, and the trust question, as they were then called. Of course, party coalitions were based as much on ethnic loyalties and rivalries as on class allegiances. In the United States, race and ethnicity are never absent, or even distant, from the calculations of politicians. But the public discourse of the period 1860–1920 was a language of class conflict that kept asking not whether but how to use the powers of the state and federal governments to promote equitable economic development.
The political stalemate of the post-Civil War period had both a class component and a regional one. The regional component was a continental version of imperial politics. The northeastern metropolis stretching from Boston to Chicago could not simply impose its will on the South or the West in the 1880s and 1890s. Too many states and too many people resisted the juggernaut of industrial capitalism, sometimes with electoral initiatives and sometimes with extra-electoral activity that involved armed struggle, as in the terrorist campaigns of white supremacists in the South during Reconstruction and after.
The class component of the “great stalemate” was more important, however, because it had more profound intellectual as well as political effects. The short version of the argument is that the workers were winning the class struggle of the late nineteenth century, and the capitalists knew it. Certainly the rise of “big business” was a crucial phenomenon in the period, but corporations could not translate their obvious market power into political legitimacy without a struggle. When workers went on strike, for example, local populations usually sided with them.
Many observers of the economic scene proposed, therefore, to change things, so that labor and capital could share more equitably in the benefits of industrial accumulation. From the standpoint of capital, this change would mean that labor relinquished its control over machine production and allowed for greater efficiency and for a more minute division of labor. From the standpoint of labor, this change would mean that capital relinquished or reduced its claim to a share of national income, on the grounds that it produced nothing—that capital merely deducted its income from the sum of value produced by others.
This was the labor question of the late nineteenth century: how to allocate the benefits of economic growth in such a way that all social classes, all social strata—not just capital, not just labor—might benefit. It shaped political discourse because the answers determined not only where the parties stood but also where other organizations and institutions situated themselves, including state governments and federal courts.
The subtext was a political question: Could republican government survive the eruption of class conflict and the emergence of class society? Arthur T. Hadley, a Yale University economist who was also a member of the Pennsylvania Railroad’s finance committee, answered this way: “A republican government is organized on the assumption that all men are free and equal. If the political power is equally distributed while the industrial power is concentrated in the hands of the few, it creates dangers of class struggles and class legislation which menace both our political and our industrial order.”
The pacification of the epic class struggle of the late nineteenth century, a struggle that was more open, violent, and sustained in the United States than anywhere else, occurred only when the labor movement accepted capital as a legitimate claimant on a share of national income. This accomplishment was largely, but not only, the result of the American Federation of Labor (AFL), founded in 1886, and it was not complete until 1914, with the Clayton Act (which is usually interpreted as a concession to labor). Even then, labor was still the junior partner of capital, and would remain so until 1935, when the Wagner Act gave the federal government the power to punish businesses for refusing to deal with unions.
The money question was more complicated, and reached more constituencies, but it addressed the same problem: how to equitably allocate the benefits of economic growth. Was the money supply a cause or an effect of economic growth? What did money represent, anyway?
From the standpoint of capital, the money supply had increased in the late nineteenth century because substitutes for cash and coin had supplemented these meager (and shrinking) means of exchange—substitutes like checks, drafts, bills, securities, futures—all the financial devices that people called “credit” and understood as the foundation of a new, corporate economy. From the standpoint of labor and farmer activists, the money supply had shrunk because, after 1873, silver was no longer a precious metal to be treated as the backing for currency or the stuff of coins. The volume of national banknotes did, in fact, decline during and after the 1880s, but the procapitalist position was correct—the money supply had increased in spite of this decline.
For the critics, money was merely a means of exchange. To increase its quantity was to increase demand for goods, thereby employing more labor and growing the economy, and to ensure that price deflation did not force borrowers to pay off their loans in money worth more than what they had originally borrowed from the bank. From the standpoint of capital, money was multifarious, mysterious—it was credit, a system unto itself. As Edward Bellamy explained in Looking Backward (1888), his best-selling Utopian novel that created something of a political cult, “Money was a sign of real commodities, but credit was but the sign of a sign.”
Close regulation of the money supply and the ability to manage economic crises with such financial devices as interest rates and reserve requirements were finally accomplished with the creation of the Federal Reserve System in 1913. The answer to the money question, like the resolution of the labor question, was an adjunct to the trust question, for it made the corporation the central problem, and promise, of the politics of the period.
The trust question went like this: Can we regulate and discipline these large corporations? Are they natural monopolies or unlawful restraints of trade? (The common law made this distinction, and the political discourse of the time seized on it.) Do they signify industrial serfdom? Can we be mere employees of their huge bureaucracies and still be free men and women? If they are artificial persons, fragile artifacts of the law, does that mean we are, too?
The turn of the wheel—the end of the Great Stalemate—came in the late 1890s, when the AFL could provide its new answer to the labor question and when Democrat-Populist William Jennings Bryan lost his bid for the presidency in 1896. “Free silver” became something of a joke, and the money question was resolved in favor of a gold standard.
After 1896, the urban industrial area of the nation was the primary scene of political discourse and party conflict. The relation between the corporations and the larger society was the question that shaped, even dominated, the Progressive Era. Some historians argue that, ironically, procorporate ideologues, executives, intellectuals, and journalists wrote a lot of the legislation that regulated the new corporations—the products of the great merger movement of 1898–1903. But there is no irony in the simple fact that large business enterprises need stability and predictability, and therefore want regulation of market forces.
The anarchy of the market is anathema for all participants, in capital as well as in labor. The issue was not whether but how to regulate it, through public agencies such as the Federal Reserve and the Federal Trade Commission (FTC) or through private organizations such as trade unions.
The Progressive Era was a time when the corporation was finally “domesticated,” to borrow historian Richard Hofstadter’s term. It was finally recognized as a normal part of economic and political life in the United States. By 1920 it was subject to close regulation, juridical supervision, and consistent public scrutiny. Woodrow Wilson had successfully split the difference between William Howard Taft and Theodore Roosevelt in the 1912 campaign by rejecting both Roosevelt’s program of statist command of the market and Taft’s program of fervent trust-busting in the name of renewed competition. The FTC was the emblem of his different answer to the trust question, combining executive power, antitrust law, and regulatory zeal without a trace of nostalgia for the small business owner.
The “domestication” of the large corporation in the Progressive Era also changed the world. For if imperialism was possible in the absence of corporate capitalism, the new corporate order could not survive in the absence of imperialism. The corporations were simply too efficient in reducing costs and increasing output without a comparable increase of labor inputs. Their “scientific” managers redesigned the shop floors to economize on the costs of skilled labor (the advent of electric power helped), and improved productivity to the point where the growth of the industrial working class ceased by 1905. What then? Where would the surplus go?
By the 1880s, the domestic market was saturated with the output of American industry. New overseas markets were imperative. Industry leaders and intellectuals were trying, by the 1890s, to think through an imperial model that would avoid the idiocies of European colonialism. The point was to avoid military conquest and occupation.
At the end of the nineteenth century, U.S. policy makers and a new stratum of such “public intellectuals” as Charles Conant, Arthur T. Hadley, and Jeremiah Jenks were inventing a new kind of imperialism, so new that it inaugurated the “American Century.” With the international circulation of John Hay’s Open Door Notes in 1899–1900, there was a doctrine to go with the new thinking.
The doctrine had six core principles. First, all people, all cultures, were capable of development—there were no racial barriers to advanced civilization. Second, in view of the insurrections staged in the American South and West in the late nineteenth century, as well as the Boxer Rebellion in China, economic development was the key to creating modern social strata where there were none. It was a risky strategy because rapid growth always displaces vested interests and creates new ones, but the alternative was recurrent insurrection on behalf of people deprived of the fruits of development. Third, development meant direct investment in, or transfers of technology to, less-developed parts of the world. Trade was important, but nowhere near as important as investment.
Fourth, the sovereignty of every nation, including China, then a large but crumbling empire, was inviolable. The American experience of the nineteenth century had taught U.S. policy makers like John Hay, the secretary of state under Presidents William McKinley and Theodore Roosevelt, that a nation could absorb enormous quantities of foreign capital and still thrive as long as its political integrity was kept intact. Fifth, the seat of empire was shifting to the United States. The question was, could this shift be accomplished without war?
Yes, Hay and others said, and their answer was the sixth core principle—that of an anticolonial, open-door world. If economic growth was the means to pacify domestic class conflict and avoid international conflict over the allocation of shares of world income, they claimed, then a world without barriers to trade and investment was a world of peace. If the volume of world income grew quickly enough, there would be no fighting about respective shares. An open-door world—without exclusive “spheres of influence,” without colonies—was then the way to a postimperialist future.
This is the world for which the United States, under President Woodrow Wilson, went to war in 1917. It was not the world that resulted; the European victors were not about to give up their colonial possessions because the United States asked them to. But it is the world that American policy makers described as their goal until the end of the twentieth century, when neoconservatives proposed to go back to a military definition of power.
American entry into World War I was predicated on the politics of the new imperialism, which was almost entirely economic in nature. With U.S. troops in battle, the stalemate between the warring sides would be broken. A socialist revolution could be forestalled and German victory prevented. So, too, could the reinstatement of European colonialism proposed by the Allies, for the United States would no longer be a neutral party and could negotiate an early version of decolonization through a system of mandates.
The results were as good as could be expected in the absence of a prolonged American military presence in Europe, something that was unthinkable in 1919, when the war ended. For that reason, the United States could never have joined the League of Nations: enough senators refused to deploy American military power, such as it was, to sustain empires already in place.
Still, the future was written in the second decade of the twentieth century. An open-door world became the aim of American foreign policy. Organized labor came of age during and after the war; it waned and it waited in the 1920s, to be sure, but it was ready for power when depression struck in the 1930s. Finally, the problem of structural unemployment became crucial to the thinking of economists and politicians alike.
Here, the issue of corporate efficiency was again the driving force. The new corporations could increase output without any increase of inputs—whether of labor or of capital. So the pressing questions became how to employ a growing population and what would happen to the intellectual and political status of profit. If employment can’t be provided to a growing population through private investment, what justification is there for profit? And how else could people be employed? Public spending? These were the questions that surfaced in the second decade of the twentieth century, and we are still trying to answer them.
See also banking policy; business and politics; labor movement and politics; tariffs and politics.
FURTHER READING
Bensel, Richard. The Political Economy of American Industrialization, 1877–1900. New York: Cambridge University Press, 2002.
Chandler, Alfred D., Jr. The Visible Hand: The Managerial Revolution in American Business. Cambridge, MA: Belknap Press of Harvard University Press, 1977.
Hofstadter, Richard. The Age of Reform. New York: Vintage, 1955.
Livingston, James. Pragmatism and the Political Economy of Cultural Revolution, 1850–1940. Chapel Hill: University of North Carolina Press, 1994.
McGerr, Michael. The Decline of Popular Politics: The American North, 1865–1928. New York: Oxford University Press, 1986.
Sanders, Elizabeth. Roots of Reform: Farmers, Workers, and the American State, 1877–1917. Chicago: University of Chicago Press, 1999.
Sklar, Martin J. The Corporate Reconstruction of American Capitalism, 1890–1916. New York: Cambridge University Press, 1988.
Wiebe, Robert. The Search for Order, 1877–1920. New York: Hill and Wang, 1966.
JAMES LIVINGSTON
In 1920 Republican presidential candidate Warren Harding announced that “America’s present need is not heroics, but healing; not nostrums, but normalcy; not revolution, but restoration.” Spoken after an alarming year of widespread labor unrest, the vicious Red Scare, and a bruising congressional battle over the country’s postwar foreign commitments, the promise of “normalcy” soothed his constituents’ jangled nerves and provided them with hope for a swift end to the dislocations stirred up by the Great War. Yet Harding’s neologism signaled more than an assurance of domestic peace and a commitment to shut down the international adventures. Not only should Americans put the war and its unsettling aftermath behind them, his words suggested; so, too, should the country dispense with the previous decade of dangerously socialistic economic experimentation.
The Republican Party, as most business conservatives saw it, had been rudely unseated from national power by Democrat Woodrow Wilson’s presidency from 1913 to 1921. During that time, an unexpected alliance of rural representatives and urban labor interests had provided the congressional muscle for a sweeping array of federal regulatory legislation, transferring the Progressive reform energy of the states and municipalities to the national level and there consolidating its reach and permanence. Even more jarring to the self-appointed protectors of property and sound money in both parties was the specter of a political coalition of reform-minded Theodore Roosevelt Republicans joined in unholy partnership with agrarian inflationists indoctrinated by the Democratic-populist credos of William Jennings Bryan.
But for the next ten years, party standpatters could put these worries to rest. A decade of unsurpassed prosperity muted the previous era’s concerns for economic justice and responsive government. Both Harding and his successor, Calvin Coolidge, brought to the White House a political philosophy oriented toward private accumulation and proud of governmental inaction. “The chief business of the American people is business,” Coolidge famously intoned, a dictum perfectly matched to the country’s diminished concern with corporate concentration. Big businesses, now depicted in splashy advertisements, provided valuable and essential services that distributed everyday comforts and conveniences, brought families closer together, and satisfied the individual tastes of discerning consumers.
Indeed, after a brief recession that ended in 1921, the American economy certainly appeared capable of raising living standards quickly and dramatically, and without government interference. By 1930 manufacturing output had increased fourfold over 1900 levels. Real industrial wages, which rose almost 25 percent during the 1920s, put a portion of the accompanying productivity gains directly into the worker’s pocket. The technological and organizational feats of Fordist mass production turned out low-cost consumer durables at prices average Americans could afford. High wages and low prices most famously brought automobile ownership within the reach of millions; at decade’s end, one out of every five Americans was driving a car. But Americans were not only crowding the roads with their modern marvels; by 1930 almost two-thirds of the nation’s homes received electric service. Electrical appliances—refrigerators, ovens, radios, and vacuum cleaners—had become regular features of domestic life.
The nation’s rapidly growing urban areas, which housed a majority of the population after 1920, provided the primary setting for such dizzying consumption. The decade’s very prosperity, in fact, pushed the economic questions of the Progressive Era to the back burner and brought to the political forefront a simmering cultural divide between small-town America and the cities with their bustling immigrant communities, beer-soaked politics, and increasingly cosmopolitan tastes. Nothing showcased these geographical frictions better than the Democratic Party, reduced during the 1920s to infighting between its urban, working-class members and a Protestant, native-born, and prohibitionist wing. Furthermore, the party’s rural faction was anchored—even dominated—by the segregationist, antiunion South, whose cotton barons and low-wage employers suppressed the region’s populist heritage, warily eyed the big-city machines, and accrued ever more congressional seniority. Fistfights broke out on the 1924 Democratic Party convention floor over a petition to condemn the Ku Klux Klan and brought the proceedings to a days-long standstill. Irish Catholic Alfred E. Smith, backed by New York City’s Tammany Hall and an emblem of all the Klan reviled, lost that nomination, only to claim it four years later and to suffer a humiliating national defeat in 1928.
The culture war falls short as a guiding framework for the political world of the 1920s, however. What separated the town and the country was not merely a mismatch of values and social customs but a gaping economic imbalance. Industrial America had not yet vanquished its agrarian past. Farmers still constituted nearly one-third of the nation’s workforce, but earned, on average, one-fourth the income of industrial workers. While other Americans experienced rising wages and dazzling consumer choices, farmers received an unwelcome taste of the Great Depression to come. Emboldened by high prices and government encouragement, they had expanded production during World War I, often pushing into marginal lands where success was far from certain. Agricultural incomes dropped precipitously when overseas demand plummeted after 1920. Besieged by the postwar contraction, farmers found themselves caught between the low prices they received for farm products and the high prices they paid for nonfarm items; wheat and other commodity markets suffered from overproduction throughout the decade.
This agricultural distress prompted urgent calls for national assistance from a new congressional bloc of southern Democrats and farm-state Republicans. Their discontent found concrete expression in the McNary-Haugen Farm Bills, the focal point of farm relief legislation during the 1920s. The legislation proposed to raise domestic prices directly by selling surplus stocks abroad. Its proponents believed that this two-price system would not entail special protection for agriculture; it would simply extend the benefits of the tariff (already enjoyed by American manufacturers) to farmers. New Deal policy makers would soon criticize this export dumping and prescribe a purely domestic agricultural rescue package, but not before the bills faced decisive opposition from President Coolidge as well as Herbert Hoover, his Republican successor.
The political significance of the agricultural depression, though, lay not in the farm bloc’s defeats during the 1920s but in the growing potential to resurrect an alliance between rural Democrats and Progressive Republicans. The absence of farm prosperity and the search for a method to address the economic imbalance kept alive an issue tailor-made for uniting rural representatives of all political stripes: public power. Private utilities and electric companies had decided that most farmers simply did not have the income to put them in the customer class; it was far more profitable to serve more densely populated urban areas. No one represented the progressives’ passion for public power more than Nebraska Republican George Norris, who battled to bring the nation’s rivers and waterways under state-sponsored, multiple-purpose control. Throughout the 1920s, Senator Norris labored to keep the government’s World War I-era hydroelectric facility at Muscle Shoals, Alabama, in public hands and to use it as the starting point for a more ambitious scheme to develop the entire Tennessee River basin—a project eventually realized in the creation of the Tennessee Valley Authority. Like the McNary-Haugen Bills, however, proposals for public power faced decisive presidential opposition before the New Deal.
While the agricultural depression aroused Progressive-like calls for government assistance during a decade whose governing ethos rejected such intervention, it also contributed to the Great Depression. Of the many domestic causes of the Depression’s length and severity, the inequitable distribution of wealth ranks very high. The maladjustment was not only apparent in the disparity between rural and urban incomes; workers’ wages, despite their increase, had failed to rise in step with industrial output or with corporate profits, thus spreading too thinly the purchasing power required to run the motors of mass production. The lower 93 percent of the population actually saw their per-capita incomes drop during the latter part of the 1920s. Furthermore, far too many excited buyers had purchased all of those consumer durables on installment plans. Entire industries balanced precariously on a shaky foundation of consumer debt.
The automobile and construction sectors felt the pinch as early as 1925, and by the end of the decade, businesses struggled with unsold inventories that had nearly quadrupled in value. A speculative frenzy on Wall Street ended with a stock market crash in the autumn of 1929. Less than 5 percent of the population owned securities in 1929, but many more Americans held bank deposits, and lost those in the wave of bank failures that followed. Still, the Depression was not merely the result of income inequality or credit-fueled overexpansion. It also emerged from a complicated tangle of domestic and international circumstances—an undercapitalized banking system with too few branch operations; a Federal Reserve that restricted the money supply instead of easing credit; an unassisted Europe that defaulted on its war debts; and a world trading system that fell prey to fits of economic nationalism. Much of the blame, in fact, could be placed on the doorstep of the United States, which emerged from World War I as the world’s leading creditor nation but evaded the responsibilities of this leadership. Instead, during the 1920s, the U.S. government hindered the repayment of European debt and restricted the global movement of labor, goods, and capital by halting immigration and raising tariffs to astronomically high levels—even while aggressively promoting its own products overseas.
The president to whose lot the economic emergency first fell was Herbert Hoover, a widely respected engineer who had orchestrated European food relief efforts during World War I and who had served as secretary of commerce under Harding and Coolidge. Unlike his predecessors, Hoover was no orthodox disciple of laissez-faire. He claimed that the increasing complexity of modern society required the federal government to gather information and suggestions for the nation’s improvement. He insisted, though, that these tools be used primarily to assist the voluntary activities of those people, businesses, and local governments interested in collective self-help.
By the spring of 1933, nearly 25 percent of the labor force was unemployed, and millions more worked only part-time. Construction had slowed, manufacturing had stalled, and new investment had virtually come to a halt. Traditional sources of assistance—mutual aid societies, municipal treasuries, even national charities—were crippled by such unprecedented need. Unlike his treasury secretary, Andrew Mellon, whom Hoover later criticized as a “leave-it-alone liquidationist,” the president at first demonstrated considerable flexibility—he cajoled industrial leaders to resist lowering wages; requested public works funds from the U.S. Congress; persuaded the Federal Reserve to ease credit (though, as events would prove, not by nearly enough); and set up the Reconstruction Finance Corporation to provide billions of dollars to banks and businesses. On a few critical issues, however, Hoover dug in his heels: he refused to sanction direct federal assistance for either farmers or unemployed people, and he opposed any proposal that required the government to produce and distribute electricity without the assistance of private business.
Into this political opening marched Franklin D. Roosevelt, the Democratic governor of New York, who challenged Hoover for the presidency in 1932. Roosevelt not only pledged relief to the urban unemployed but—unlike his gubernatorial predecessor Alfred E. Smith—also demonstrated remarkable acuity for rural issues. During his governorship, Roosevelt had begun a statewide program of soil surveys and reforestation, and attempted to provide rural New Yorkers with inexpensive electricity from government-run hydroelectric power facilities. In his lustiest presidential campaign rhetoric, he promised federal development of prime hydroelectric sites across the country. The federal government would not only build the required infrastructure but would distribute the electricity to the surrounding communities at the lowest possible cost. Roosevelt also committed his party to a program of direct farm assistance that would combine production control with higher prices and natural resource conservation.
Roosevelt put rural issues up front in his campaign not simply to distinguish his record from Hoover’s but because he and his advisors attributed the Depression to an absence of farm purchasing power. Low agricultural incomes, they argued, kept factories idle and urban workers unemployed. Roosevelt had addressed the question of the urban unemployed as governor by instituting work relief and welfare benefits, anticipating the similar programs of the New Deal as well as the federal government’s recognition of organized labor. Still, the model of the Depression that he initially adopted cast the farm sector as the primary problem, with the low purchasing power of the working class a complementary concern. Whatever the genuine economic causes of the Depression—and Roosevelt can certainly be faulted for embracing an exclusively domestic analysis, especially an agrarian fundamentalist one—the underconsumptionist model was political gold. The president intended to use the nation’s economic distress to transcend the poisonous cultural divide between the country and the city and to create a more lasting reform coalition within the Democratic Party.
In the election of 1932, Roosevelt won a significant plurality of votes in the nation’s major cities, swept the South and the West, and polled his largest majorities in the farm regions. Congress, pressured by constituents at home to do something quickly, submitted to executive leadership and engaged in a remarkable flurry of legislative energy. It first passed measures repairing the country’s money and credit. To restore confidence in the financial system, Roosevelt authorized the federal government to reorganize failing banks and to issue more currency. Congress also created the Federal Deposit Insurance Corporation (FDIC) to insure regular Americans’ savings accounts. The rural wing of the Democratic Party had historically agitated for inflation so that farm prices might rise and debts be repaid with easier money. Roosevelt moved in this direction by partially severing the currency from the gold standard, which had provided a basis of trust for international financial transactions but impeded domestic growth. Funneling assistance to those in need was an equally urgent task, and the administration set up aid programs that distributed cash grants through state relief agencies, put the unemployed directly to work with government jobs, and employed needy Americans indirectly by investing in public works projects.
The primary new departure in federal policy was the New Deal’s attempt to raise prices and wages simultaneously—to get farms and businesses producing again and to supply consumers with the incomes to assist in that recovery. The Agricultural Adjustment Administration (AAA) sought to curb overproduction and raise prices by restricting the acreage planted in key commodities, providing farmers with support payments collected from taxes on agricultural processors. Such intervention was necessary, the AAA argued, to increase farm purchasing power and spur national economic recovery. On the industrial end, the National Recovery Administration (NRA) sought to foster cooperation among government, management, labor, and consumers, empowering planning boards to issue regulatory codes governing wages, prices, and profits. The NRA’s enabling act, the National Industrial Recovery Act, also included funds for the Public Works Administration (PWA). The theory was that public works money would operate alongside the AAA’s cash benefits to increase the total number of purchases nationwide, while the NRA would see to it that labor and capital shared the fruits of recovery. Policy makers initially embarked on such intricately managerial experiments because they never seriously considered the socialist alternatives—nor, for that matter, other liberal options such as massive government spending or aggressive taxation and redistribution. For most of the 1930s, balanced budgets remained the orthodoxy among New Dealers, including the president, and long-standing traditions of limited government and self-help continued to shape the beliefs of most liberals. Such ideas also continued to shape legal opinion; the Supreme Court overturned the NRA in 1935 and the AAA in 1936.
Rural issues remained central to New Deal policy. Congress created a second AAA, now financed with revenues from the general treasury. The government purchased thousands of acres of marginal and tax-delinquent farmland and added it to the national preserves and forests. Farmers also received cash benefits and technical assistance to halt soil erosion—a cause soon rendered even more urgent after the nation absorbed shocking images of the Dust Bowl—and the government financed construction of massive hydroelectric dams, irrigation facilities, and power distribution systems. Beginning with the Tennessee Valley Authority Act of 1933, every public dam-building agency (such as the TVA, the Bureau of Reclamation, and the Army Corps of Engineers) was required to produce and distribute power to serve surrounding rural areas, a task accomplished with the assistance of other New Deal creations such as the Rural Electrification Administration and the PWA. Such generous financing for rural development especially benefited the South and the West, the “Sunbelt” regions that emerged in the postwar era as economic counterweights to the Northeast and the Midwest.
While agriculture and rural resource policy remained central to New Deal policy, farm politics quickly gave way to a more urban orientation within the Democratic Party. Beginning with Roosevelt’s landslide reelection in 1936, urban majorities became decisive in Democratic victories. Roosevelt’s urban appeal lay in the federal government’s work relief and welfare benefits, and in the New Deal’s recognition of formerly marginalized religious and cultural groups. Though some reformers denounced the Democratic Party’s ties to city machines, municipal officials and urban constituencies often liberalized the party, pushing to the forefront questions of workplace justice and civil rights. Nothing better signaled the urban liberal turn in national politics than the rise of the labor movement, officially sanctioned in 1935 by the National Labor Relations Act, which guaranteed workers’ right to organize, hold elections, and establish closed shops. While business leaders resisted labor organizing, unions won enormous victories in the nation’s major industrial centers. Legal recognition and government-mandated bargaining between labor and management put the nation’s industrial policy on a more lasting footing: the government would set labor free as a countervailing power to business, rather than attempt to dictate industrial relations or to redistribute wealth directly to the working class.
The New Deal’s initial political partnership between rural and urban America assumed that each party well understood its interdependence and the need to raise purchasing power among both farmers and workers. But the urban liberal tilt in national affairs—especially the rapid escalation of labor strikes and challenges to management—reignited long-standing divisions between countryside and city. Many southern Democrats and western progressives disdained what they viewed as the nonemergency New Deal: public housing experiments, relief for the “shiftless,” unnecessary government projects, and programs that helped the rural poor—programs that only occasionally benefited African Americans but nonetheless convinced white Southerners that a direct federal assault on Jim Crow could not be far behind. The rising prices that accompanied the nation’s agricultural and industrial policies also irked middle-class consumers. After Roosevelt’s ill-judged attempt in 1937 to “pack” the Supreme Court, these tensions crystallized in Congress with the formation of a conservative coalition composed of Republicans and anti-New Deal Democrats (mainly Southerners, whose seniority granted them a disproportionate number of committee chairs). An antilynching bill backed by civil rights leaders and urban representatives drove a wedge further into the Democratic Party. Roosevelt declined to back the bill, citing the irate Southerners’ strategic positions in Congress, and instead attempted to “purge” the conservatives from his party by intervening personally in the primary elections of 1938. Not only did this effort fail miserably, but voters all over the country also rebuked the president by replacing a significant number of Democratic representatives and governors with Republicans.
Compounding the political stalemate was an economic downturn that began in the autumn of 1937. Though the economy had never regained pre-Depression levels of employment and investment, enough progress had been made after 1933 that the administration could claim some success, however uncoordinated or dictatorial the New Deal appeared to its critics. But the recession shook this confidence. As in 1929, it began with a crash in the stock market and prompted unwelcome comparisons between Roosevelt’s stalled agenda and the political fumbles of Hoover before him. More significant, it set off a debate within the administration over the causes of the downturn and the proper methods for economic revival. Some liberals blamed businesses for failing to invest; others intended to launch new legal actions against corporate monopolies. Still others drew lessons from the previous year’s budget cuts, arguing that government spending had shored up consumer purchasing power from 1933 to 1937 and that its recent reduction was the problem. In a line of economic argument that became known as “Keynesian” after British economist John Maynard Keynes, these liberals put forward the idea that consumption, not investment, drove modern industrial economies, and that public spending was the best vehicle for stimulating demand in a downturn. This analysis probably owed more to domestic champions of consumer purchasing power than to Keynes himself, whose complicated ideas Americans embraced only gradually and partially. Still, the reasoning nudged the New Deal in a different and less reformist direction. Roosevelt authorized increased spending programs in 1938 and justified these as important in their own right rather than as needed benefits for particular groups. No doubt Roosevelt also anticipated the political appeal of fiscal policy, which demanded less government intervention in the private decisions of firms, farmers, or consumers.
Public expenditures in 1938, only 8 percent of the national gross domestic product (GDP), were timid by later standards. Not until World War II, when federal spending reached 40 percent of the GDP, would public funds really ignite economic recovery. Between 1940 and 1944, the United States enjoyed the greatest increase of industrial output in its history—almost overnight, full employment returned and wages rose without government prodding. Clearly, the New Deal alone did not restore prosperity, nor had it substantially reduced income inequality. But it did remove much of the risk from market capitalism, providing a certain measure of security to farmers and businesses, homeowners and financiers, employers and employees.
This security was built along distinctly American lines. The New Deal’s signature triumph, Social Security, departed from other models of universal social insurance by financing its benefits with regressive payroll taxes and dividing its assistance into two tiers: the retirement pensions, administered at the national level and pegged to national standards; and the unemployment and supplemental welfare programs administered by the individual states, which set their own eligibility requirements and benefit levels. Along with union recognition, laws regulating hours and wages, and World War II-era support for home purchases and education, Social Security provided a framework for the upward mobility of the postwar American middle class. But the framework’s exclusions were also distinctly American. African Americans were denied many of these benefits owing to the continued power of southern congressmen, who excluded from protection the occupations in which blacks were most represented, insisted on local administrative control, and prevented the passage of antidiscrimination provisions.
During the Depression, it was not uncommon for Americans to question the national faith in unlimited expansion. Some analysts had even wondered whether the idea of a “mature economy” might not describe the country’s predicament; perhaps it was time to come to terms with an economic system that had reached the limits of its capacity to grow. But the lightning-quick recovery generated by World War II altered this mindset fundamentally and permanently. Migrants streamed across the nation to take up work in the booming defense centers in the West and Midwest, many settling in the new suburbs spreading outward from urban cores. African Americans, eager to share in the promise of economic mobility but prevented from joining this workforce on equal terms, threatened a march on Washington to protest the injustice. Emboldened by Roosevelt’s subsequent executive order forbidding discrimination in defense industries, blacks launched a “double V” campaign to fight for victory over enemies both at home and abroad. While victory on the home front would remain far more elusive than victory over Germany and Japan, the war energized a new, more assertive generation of civil rights activists.
Wartime mobilization also strengthened the conservative coalition in Congress, which targeted “nonessential” spending and dismantled relief agencies such as the Works Progress Administration, the National Youth Administration, the Civilian Conservation Corps, and the Farm Security Administration. Conservatives combined these attacks with more pointed critiques of government planning, drawing exaggerated parallels between the New Deal and the fascist regimes that had fomented the war. They also grew bolder in challenging organized labor, which faced diminished public and political support owing to unauthorized wildcat strikes. Wartime opportunities had raised labor’s hopes for rising wages and perhaps even an equal share of managerial authority, but pressured by government officials, labor lowered its sights and agreed to a “no strike” pledge and to wage increases that rose in step with inflation but not higher. Now on the defensive along with their liberal allies, union leaders adapted to the anti-statism of the war years by more firmly embracing a Keynesian model of government intervention that steered clear of micromanaging private economic institutions and instead used macroeconomic fiscal tools to support mass consumption.
The war not only swept away any lingering doubts about the economy’s ability to expand; it also buried the agrarian analysis of the Depression’s origins. Suddenly policy makers grappled with commodity shortages, rising prices, and a labor deficit in the countryside—a sharp turnaround from earlier questions of overproduction, low prices, and a potentially permanent labor surplus. Rural policy makers moved away from the idea that the nation’s economic health depended on stabilizing the existing rural population and instead defended a less reformist but still aggressive government role in expanding the country’s industrial base and increasing its aggregate purchasing power. Clearly the future was in the factories and defense plants that were running at full capacity, not on America’s small farms. But if the nation’s economic future lay in urban and suburban America, politics at the end of the war still reflected long-standing divisions between the country and the city. The Democratic Party would emerge from the war as the majority party, but one nonetheless destined to rediscover its historic fault lines of region and race.
See also banking policy; business and politics; tariffs and politics; taxation.
FURTHER READING
Badger, Anthony J. The New Deal: The Depression Years, 1933–1940. New York: Hill and Wang, 1989.
Brinkley, Alan. The End of Reform: New Deal Liberalism in Recession and War. New York: Knopf, 1995.
Cohen, Lizabeth. Making a New Deal: Industrial Workers in Chicago, 1919–1939. New York: Cambridge University Press, 1990.
Jacobs, Meg. Pocketbook Politics: Economic Citizenship in Twentieth-Century America. Princeton, NJ: Princeton University Press, 2005.
Katznelson, Ira. When Affirmative Action Was White: An Untold History of Racial Inequality in Twentieth-Century America. New York: Norton, 2005.
Kennedy, David M. Freedom from Fear: The American People in Depression and War, 1929–1945. New York: Oxford University Press, 1999.
Leuchtenburg, William E. Franklin D. Roosevelt and the New Deal, 1932–1940. New York: Harper and Row, 1963.
____. The Perils of Prosperity, 1914–1932. Chicago: University of Chicago Press, 1958.
Patterson, James T. Congressional Conservatism and the New Deal: The Growth of the Conservative Coalition in Congress, 1933–1939. Lexington: University of Kentucky Press, 1967.
Phillips, Sarah T. This Land, This Nation: Conservation, Rural America, and the New Deal. New York: Cambridge University Press, 2007.
Rauchway, Eric. The Great Depression and the New Deal: A Very Short Introduction. New York: Oxford University Press, 2008.
Schulman, Bruce J. From Cotton Belt to Sunbelt: Federal Policy, Economic Development, and the Transformation of the South. New York: Oxford University Press, 1991.
Smith, Jason Scott. Building New Deal Liberalism: The Political Economy of Public Works, 1933–1956. New York: Cambridge University Press, 2006.
SARAH T. PHILLIPS
The political economy of the United States during the first three decades after World War II can best be characterized as a liberal Keynesian regime. From the 1940s through the 1970s, the living standards of both the working class and the middle class doubled in an era that also experienced relatively rapid economic growth. This achievement rested on several key pillars: a legal framework favoring strong trade unions, a liberal welfare state entailing government promotion of economic security and labor standards, large expenditures for both the military and civilian infrastructure, and an interventionist state that regulated finance, investment, and trade. These policy initiatives were largely upheld during both Democratic and Republican administrations. As the income gap between the rich and poor narrowed—and social movements successfully extended citizenship rights to African Americans and other excluded groups within the population—the United States became a place of much greater social equality.
The New Deal created the terrain on which all post-World War II political struggles took place. It put security—economic security—at the center of American political and economic life. The enactment of federal mortgage assistance, bank deposit insurance, minimum wages, Social Security, and laws bolstering labor’s right to organize created social and economic entitlements that legitimized the modern state and vitalized an expansive citizenship within new strata of the population. New Dealers identified economic security as a grand national project, “a great cooperative enterprise” among “the citizens, the economic system, and the government.” Security necessarily entailed an element of public power. Though Roosevelt-era policy makers initially excluded many women, Latinos, and African Americans from the new entitlement state, President Lyndon Johnson’s Great Society did much to expand the social citizenship concept, even as it engendered a fierce, debilitating backlash.
Moreover, Keynesians believed that such policies would generate high levels of employment by boosting aggregate demand in the economy. Through manipulating government spending and the money supply, the federal government could stimulate the economy when it lagged or slow growth when inflation threatened. The booms and busts that had been a routine feature of American economic life would instead be turned into steady growth and more widely shared benefits.
World War II mobilization policies completed social tasks begun in the Great Depression and set the stage for a postwar social Keynesianism. In exchange for labor’s unimpeded participation in continuous war production, the federal government settled labor disputes through a National War Labor Board and facilitated union growth with a “maintenance of membership” policy that required union members in organized workplaces to keep up their membership and dues for the life of the contract. With millions of workers pouring into industrial manufacturing, national union membership soon jumped to 15 million, about 30 percent of nonfarm employment.
For American workers the fight against fascism during World War II had significance at home as well as abroad. As the new union movement, the Congress of Industrial Organizations (CIO), put it, their hard work and sacrifice could also be seen as a struggle to “Insure Your Future … For Peace—Jobs—Security.” Urging American workers and their families to “vote for Collective Bargaining and Full Employment, Lasting Peace and Security,” the CIO saw the rights of the National Labor Relations Act (NLRA), Social Security, and the international struggle for democracy as inseparable: “The broad problems of world security and the personal problems of individual security seem to merge.” President Franklin Roosevelt further legitimized the idea of a right to security when, in 1944, he proposed a Second Bill of Rights, protecting opportunity and security in the realms of housing, education, recreation, medical care, food, and clothing.
During the war the government kept a cap on both wages and prices as part of its anti-inflation strategy. American business flourished and productivity leaped forward, the consequence of nearly five years of government-subsidized investment in plant, machinery, scientific research, and physical infrastructure. Not unexpectedly, workers were ready to demand their share of wartime profits and prosperity. This demand took on particular urgency once it appeared that President Harry Truman planned to end government price controls, thereby unleashing an inflationary pulse that threatened to reduce real income for most working-class wage earners.
A huge strike wave swept through manufacturing industries in 1945 and 1946. America’s workers had also gone on strike at the end of World War I, but at that time employers, relying on the armed force of the state, crippled the postwar union movement in steel, meatpacking, and coal mining, and on the railroads. But after World War II, unions were far more secure, with almost 35 percent of nonfarm workers enrolled. The frontier of trade unionism did not advance into agriculture, domestic service, the lower ranks of factory supervision, southern textiles, or most white-collar occupations. Throughout the manufacturing, transport, utility, and construction sectors, however, legally sanctioned collective bargaining became a permanent feature of the U.S. political economy. Along with the New Dealers now in seemingly permanent command of the government policy-making and regulatory bureaucracy, union leaders contended that the key to economic growth was a regime of mass consumer purchasing power. Workers throughout the economy should be able to buy all the new goods the economy could produce, including housing, medical care, leisure, and entertainment.
In the late 1940s, labor and its New Deal allies saw the movement for greater purchasing power and security as a battle on two fronts: economic and political. Facing corporate employers, labor demanded not only higher wages, protections against inflation, and job security but also “fringe benefits”: paid vacation, sick leave, health insurance, and pensions. The National Labor Relations Board (NLRB), created by the NLRA in 1935 and eventually sustained by the Supreme Court, endorsed this bargaining strategy, thus forcing management to accept a widening sphere for collective bargaining. The strategy proved most successful in oligopolistic sectors of the economy (auto, steel, tire and rubber, airplane manufacturing), where the market power of the corporations and the bargaining clout of the unions made it possible to take wages out of competition. This generated “pattern bargaining”: when a leading corporation signed a new contract, the other top companies in the sector agreed to almost all of its main provisions. Moreover, companies that did not want to become unionized but sought to maintain stable workforces, such as Kodak, DuPont, Johnson & Johnson, and Colgate-Palmolive, also followed the pattern established by the big unions. Consequently, the presence of a strong, dynamic union movement helped drive wages up in primary labor markets across the economy. As inflation ebbed and women entered the workforce in a proportion that exceeded even that at the height of World War II, median family incomes rose in dramatic fashion, doubling in just over a generation.
Labor had a partner in the New Deal state. The Keynesian approach to economic management was embodied in the 1946 Employment Act. The federal government would be responsible for promoting “maximum employment, production, and purchasing power.” Rather than waiting for economic crisis to provoke a government response, a new Council of Economic Advisers would have a continuing role developing macroeconomic policy aimed at these goals. The federal government also drove economic growth through investment in an interstate highway system, hospitals and medical schools, universities, and hydroelectric power. Economic modernization was not to be achieved at the expense of the working class or middle class. Building on the national welfare state of the New Deal, the Servicemen’s Readjustment Act of 1944 (known more commonly as the GI Bill) offered veterans government support for home mortgages, vocational or university education, and small business start-ups. The act was administered locally, however, and black and female veterans were often passed over for these benefits.
Harry Truman had initially sought to extend the New Deal welfare state as soon as the war was over, pushing forward a comprehensive agenda that included a higher minimum wage, federal commitment to public housing, increased unemployment insurance, and national health insurance. Although the Democrats lost badly during the 1946 congressional elections, Truman and a broad-based coalition of labor, liberals, and small farmers demonstrated the majority status of the Roosevelt coalition during the 1948 elections. Interpreting his surprising reelection as a vindication of the New Deal project, Truman declared early in 1949, “Every segment of our population and every individual has a right to expect from our government a fair deal.” But a congressional bloc of Republicans and southern Democrats, known as “Dixiecrats,” turned back many Fair Deal proposals. Dixiecrats, increasingly wary that Fair Deal labor policies and civil rights initiatives would threaten white supremacy in the states of the Old Confederacy, formed a generation-long alliance with anti-New Deal Republicans.
In the eyes of American business leaders, Truman’s endorsement of a strong alliance between labor and the state smacked of European-style social democracy. They did not want to see further erosion of what they considered “managerial prerogatives,” or a further expansion of the welfare state. American employers recognized the social and political premium placed on security—economic security—as vividly as did the Democratic Party, the labor movement, and the proponents of national Social Security. While employers were willing to accede to workers’ demands for security, they insisted that the link between union power and the federal government be severed. As Business Week warned in 1950, “management, for the first time, is faced with a broad social demand—the demand for security. But if management does not use it wisely, the worker is likely to transfer his demands from the bargaining table to the ballot box.” In order to outflank the political mobilization of labor, especially in its demands for health insurance and greater social security, corporate executives imitated the state: companies would now provide social security through private pensions and insurance benefits. Mimicking the standards set by the state, American business firms and commercial insurance companies became partners in creating and expanding private alternatives to public social insurance and community-controlled social welfare institutions. “The American working man must look to management,” Ford Motor Company vice president John Bugas told the American Management Association.
This viewpoint reached fruition in 1950, when General Motors (GM) signed an unprecedented five-year contract with the powerful United Automobile Workers, an agreement that Fortune magazine dubbed “The Treaty of Detroit.” GM agreed to assume health insurance and pension obligations for its workers, blue collar and white collar, forging an employment template that many other U.S. corporations soon followed. Henceforth paid vacations, sick leave, health insurance, and pensions became standard features of blue-collar and white-collar employment. GM president Charles Wilson claimed to have achieved “an American solution for the relations of labor and industry.” The government agreed. A 1956 Senate report labeled employee benefits programs “a tribute to the free enterprise system.” By extending such security to employees, America’s largest companies headed off political alternatives, including national health insurance, more progressive public pensions, and even union control of firm-based benefits.
Those elements of the Fair Deal that stayed within the parameters of the welfare state already set—minimum wage, means-tested public housing for the poor, improvements in Social Security pensions—passed into law. Proposals that would expand the welfare state in scope and curtail emerging private markets, like health insurance, went down to permanent defeat.
Also stymied in the immediate postwar years was the government’s commitment to racial justice. African Americans had begun to make economic progress during World War II through the CIO, a newly aggressive NAACP, President Franklin D. Roosevelt’s nondiscrimination executive order for defense work, and the Fair Employment Practices Committee (FEPC). With the FEPC, the federal government legitimized African American demands for equal opportunity and fairness at work. Although the FEPC itself had little power, African Americans mobilized around it and used it to pry open the previously insulated realms of segregated employment. After the war, CIO leaders and the NAACP pushed for a permanent FEPC. Some states did establish commissions on fair employment, but the Dixiecrats made sure the U.S. Congress never did. Instead, it took a massive direct-action social movement sweeping through the South two decades later to force Congress and the president to prohibit employment discrimination, on the basis of sex as well as race, with the Civil Rights Act of 1964.
By the time Republican Dwight Eisenhower took office, the New Deal welfare state, labor reforms, and state regulatory policies were firmly established. Although a Republican with strong ties to corporate elites, Eisenhower shared the essential premises of liberal Keynesianism and the New Deal order. He signed an increase in the minimum wage, oversaw new amendments to Social Security that added disability pensions and expanded old-age coverage to new groups, including agricultural and domestic workers, and created a new cabinet department, Health, Education, and Welfare.
Eisenhower, however, viewed with skepticism the rapid expansion of another pillar of the Keynesian state, the military-industrial complex. After a brief period of demobilization at the end of World War II, national security affairs became the largest, fastest-growing sector of the American government. Truman not only turned the War Department into the Defense Department but also created domestic national security agencies. As tensions with the Soviet Union heightened in the late 1940s, and the United States took on an increasingly interventionist role in Western Europe, Greece, Turkey, Korea, and elsewhere, defense spending became a permanent, rising part of the federal budget. The defense budget hit $50 billion a year—half the total federal budget—when the United States went to war in Korea. For the first time, the United States maintained permanent military bases in over three dozen countries. The militant anti-Communist agenda abroad also translated into political purges of leftists at home.
Liberals and conservatives alike let go of their sacrosanct commitment to balanced budgets and instead came to believe that by pumping money into the economy, Americans could have “guns and butter too.” Defense spending in southern California and the Southeast—on military bases, weapons production, scientific research—built up entire local and regional economies. Military spending drove suburbanization, well-paid employment, and mass consumption in places like Orange County, California, during the 1950s and 1960s.
The American South, while eagerly digesting government largesse for military bases, defense contracts, and universities, contested the liberal political economy based on rising wages and labor rights. Through the New Deal and especially the war, Southerners had finally experienced a vast improvement in their standard of living. The Fair Labor Standards Act (which established the national minimum wage and 40-hour workweek), rural electrification, wartime spending, and economic modernization brought southern wages and consumer purchasing closer than ever to northern standards. Southern textile plants continuously raised wages in order to stay a step ahead of the Textile Workers Union of America, primarily based in the North but seeking to organize the South. After 1948, however, southern states took advantage of the Taft-Hartley Act to ban the union shop through so-called right-to-work laws, while southern employers quickly made use of Taft-Hartley’s grant of “free speech” for management during union elections. Within a few years, southern employers had stopped the postwar union movement in its tracks. By the latter half of the 1950s, it was clear the South would remain a lower-wage region. American business took the cue and began moving plants south of the Mason-Dixon Line. Over the next decade and a half, leading corporations like Westinghouse, DuPont, and RCA and a wide range of smaller companies making textiles, light fixtures, chemicals, and auto parts relocated production to southern areas where states promised lax regulation, minimal taxation, low wages, and a union-free environment.
For new migrants to American cities, such as African Americans, Puerto Ricans, and Native Americans, capital flight had frustrating and devastating effects. Five million African Americans had migrated from South to North, and now industrial jobs went to precisely the repressive and impoverishing places they had recently fled. Companies like General Motors and Ford Motor relocated plants to suburbs and small communities where African Americans were shut out by racial exclusion. In places such as Detroit, the “arsenal of democracy” that had drawn so many black migrants, black unemployment shot up as high as 18 percent by 1960. Federal “termination” and “relocation” programs moved Indians off reservations to cities like Chicago, while Chicago lost over 90,000 jobs. This new urban working class ended up in low-wage service sector jobs, as janitors, domestics, or hospital workers, or in public sector jobs, as sanitation workers, school custodians, and home-care aides.
Public sector workers, long excluded from the NLRA, at last began to win union organizing rights in a movement that crested alongside the civil rights activism of the 1960s. Two decades of public investment had dramatically expanded the government workforce at all levels. President Johnson’s War on Poverty and Great Society created even more social service jobs—in health care, education, job training. In a wave of union militancy not seen since the 1930s, public sector workers struck and won union recognition for teachers, hospital workers, police, social workers, and sanitation workers. These new unions fought for higher wages, better working conditions, dignity, and respect for tens of thousands of female and minority workers but also for expanded public services and social welfare. This movement created 4 million new union members, but it marked the last period of sustained union growth in the twentieth century.
For two generations, the public and private welfare state grew in tandem. Yet critical observers, journalists, liberal economists like John Kenneth Galbraith, civil rights activists, and feminists insisted with increasing urgency in the early 1960s that many Americans had been left out of this postwar prosperity—that poverty persisted amid plenty. John F. Kennedy, a cold warrior, focused his attention on foreign policy, although he had begun to take programmatic steps to address unemployment through job-training programs. Lyndon Johnson, however, was a New Deal liberal. When he became president after Kennedy’s death, Johnson saw an opportunity to complete the project of the New Deal—and this time, to ensure that racial justice would not be pushed to the sidelines. Declaring “unconditional war on poverty” in 1964, LBJ oversaw the passage of the Economic Opportunity Act, which established the Job Corps, the Neighborhood Youth Corps, the Adult Education Program, Volunteers in Service to America (VISTA), and the Work Experience Program for those on AFDC. Believing that macroeconomic policy had solved the problems of economic growth, War on Poverty liberals more often sought to reform workers than to restructure the labor market. Poverty and unemployment, they argued, could be overcome through expanding individual opportunity without substantial redistribution. More broadly, Johnson’s Great Society program included national health insurance for the elderly (Medicare) and medical assistance for the poor (Medicaid); public housing; fair housing legislation to overcome decades of racial discrimination in housing markets; education funding and college grants and loans; and the elimination of national origins quotas for immigrants.
It also stressed a kind of participatory democracy. Through War on Poverty grants, community action agencies pressured city governments for better services, jobs, and housing. The Office of Economic Opportunity’s Legal Services mobilized welfare rights activists to press for due process, supplemental grants, and basic citizenship rights for public assistance recipients; in California, it teamed up with the United Farm Workers to win civil rights for Mexican Americans. Women and African Americans acted through collective mobilization and unions to realize the promises of affirmative action and of Title VII of the 1964 Civil Rights Act, which declared discrimination based on “race, color, religion, sex, or national origin” an “unlawful employment practice” and backed the prohibition with new means of redress. In the following decade, domestic workers won inclusion in the Fair Labor Standards Act. Women’s groups used class-action lawsuits to force colleges and universities to change employment and admissions policies. The enactment of the Occupational Safety and Health Act (OSHA) brought the reach of the New Deal regulatory state into toxic and dangerous workplaces where many minorities worked. OSHA represented the last gasp of the New Deal.
The Keynesian New Deal order foundered in the 1970s, as economic growth slowed significantly, corporate profitability stagnated, energy costs soared, and inflation ran unabated. Manufacturing firms, which had already spent two decades shifting production to lower-wage areas within the United States, sought yet another spatial fix, moving production out of the country. Unemployment hit levels not seen since the Depression, reaching 8.5 percent in 1975. High oil prices, unemployment, and inflation produced a new toxic brew—“stagflation”—and Keynesian policies of economic stimulus did not seem to work. Thus, in part, the conditions that had sustained the liberal Keynesian order changed.
But there was an increasingly successful political assault on it as well. A new conservative political movement had been taking shape and coalescing throughout the 1960s—in corporate boardrooms, new think tanks and foundations, and churches; among suburban housewives in the Sunbelt and college students in Young Americans for Freedom. Conservative activists began shifting the Republican Party rightward with the presidential nomination of Barry Goldwater in 1964 and the election of Ronald Reagan as California governor in 1966. Inspired by Friedrich Hayek and Milton Friedman, they set out to liberate the free market from the shackles of the welfare state, regulation, and labor unions. In the 1970s, conservatives successfully chipped away at the New Deal. Employers hired anti-labor management “consulting” firms to disestablish unions. The U.S. Chamber of Commerce launched aggressive campaign financing and lobbying operations and, along with new corporate political action committees, stopped liberal reforms in Congress. President Richard Nixon attempted to check the growing tide of welfare state spending. At the same time, wages and incomes stagnated, fueling well-orchestrated tax revolts. A Democratic president, Jimmy Carter, helped set the mold for the “Reagan revolution” of the 1980s by initiating deregulation and, through the federal rescue plan for the ailing Chrysler Corporation, securing workers’ concessions at the bargaining table. A path had been cleared for the rise of a new free-market ideology.
See also business and politics; era of confrontation and decline, 1964–80; era of consensus, 1952–64.
FURTHER READING
Biondi, Martha. To Stand and Fight: The Struggle for Civil Rights in Postwar New York City. Cambridge, MA: Harvard University Press, 2003.
Cobble, Dorothy Sue. The Other Women’s Movement: Workplace Justice and Social Rights in Modern America. Princeton, NJ: Princeton University Press, 2004.
Freeman, Joshua B. Working-Class New York: Life and Labor Since World War II. New York: New Press, 2000.
Jacobs, Meg. Pocketbook Politics: Economic Citizenship in Twentieth-Century America. Princeton, NJ: Princeton University Press, 2005.
Klein, Jennifer. For All These Rights: Business, Labor, and the Shaping of America’s Public-Private Welfare State. Princeton, NJ: Princeton University Press, 2003.
Lichtenstein, Nelson. Walter Reuther: The Most Dangerous Man in Detroit. Urbana: University of Illinois Press, 1997.
MacLean, Nancy. Freedom Is Not Enough: The Opening of the American Workplace. Cambridge, MA: Harvard University Press, 2006.