Chapter 13
The Glorious Thirty Years

The 1929 crash, the slow recovery of 1930, and the ensuing spiral descent into an abyss of unemployment, bank failures, and commercial paralysis was not corrected by market processes. . . . Out of the crisis was born the American economic republic as we know it today.

—Adolf A. Berle, 19631

In Germany, the period was called the Wirtschaftswunder, the economic miracle. In 1958, John Kenneth Galbraith coined the phrase “the affluent society.”2 Looking back, Americans today often call it the Golden Age. In a 1979 book, the French economist Jean Fourastié called the era from 1946 to 1975 les trente glorieuses, the glorious thirty, a reference that evoked the Trois Glorieuses, or glorious three days of revolution in France of July 27–29, 1830.

During the thirty glorious years, the Western democracies experienced similar combinations of high growth and rapid expansion of mass middle classes, underpinned by high labor-union membership, middle-class welfare states, and highly regulated economies. By the 1950s, all the democratic societies of the North Atlantic world had “settlements” or “new deals” that were similar in combining various forms of social insurance with increased government regulation of or ownership of banking and industry. The term “mixed economy” was sometimes used for an economy that blended private enterprise with public regulation, redistribution, and in some cases public ownership. Shared prosperity gave electorates in America and other white-majority democracies the confidence to eliminate the racial-caste systems that had long made a mockery of their ideals.

The restructuring of the economy after World War II in the United States and other nations coincided with the maturation of the technologies of the second industrial revolution, electricity and the internal combustion engine. In the 1950s and 1960s, the United States completed its electrical grid system and put it under the dull but safe management of local utilities that replaced rival corporate empires. The United States also undertook one of the greatest works of civil engineering in history—the construction of the interstate highway system. Together, the two grids transformed the landscape and the American way of life, by permitting the decentralization of production, work, shopping, and homes.

While the electric grid and the highway grid were visible manifestations of the second industrial order, the unseen motors hidden in household appliances were just as revolutionary. The industrialization of the household with the help of labor-saving appliances like refrigerators, washing machines, and dishwashers allowed the members of America’s new mass middle class to spend time formerly devoted to household chores on other activities, including listening to the radio, watching television shows, and going to the movies.

Other countries caught up rapidly, as they rebuilt themselves after the devastation of the second global war in a generation. All the industrial democracies, from Japan to North America to Western Europe, created mass middle classes by using the technologies of the second industrial era for liberal and democratic ends. But it was in the United States that the promise of the liberating technologies of the second industrial revolution was first fulfilled.

THE PAX AMERICANA AND THE COLD WAR

In their planning for the postwar world, the Roosevelt and Truman administrations assumed a rapid demobilization by the United States. The US military sought to acquire new island bases around the world on the assumption that there would be no permanent US military garrisons in Asia and Europe once the occupations of the defeated Axis powers came to an end. These assumptions were slowly abandoned, as hopes for postwar cooperation between the Soviet Union and the United States were thwarted by the tensions of the Cold War.

Although it included proxy wars in Korea, Indochina, Afghanistan, Latin America, Africa, and the Middle East, arms races, espionage, sabotage, and propaganda, the Cold War waged by the United States was essentially a war of economic attrition. The United States sought to cripple the Soviet economy by two methods. The first was an embargo of advanced dual-use, or civilian-military, technology, under the direction of the Coordinating Committee for Multilateral Export Controls (CoCom), which was established in 1949 and which governed allied trade with the communist bloc until it was dissolved in 1994. The second was preventing the Soviets from controlling or intimidating West Germany and Japan and other industrial nations.

From the American perspective, the central assets at stake in the Cold War were the factories of defeated Germany and conquered Japan. As the American diplomat George Kennan pointed out, the strategy of the United States was to deny the Soviet Union control of or influence over the other centers of military-industrial power in the world, which at the time were located in Germany/Europe, Japan, Britain, and North America. If the United States directly or indirectly controlled all the centers of manufacturing outside of Soviet borders, then its internal resources alone would not allow the Soviet Union to mount a military challenge to the United States without risking bankruptcy.

During World War II, the Roosevelt administration considered a proposal by Secretary of the Treasury Henry Morgenthau to permanently “pastoralize” or deindustrialize Germany. But in early 1947, Herbert Hoover, sent to Germany on a fact-finding mission by President Harry S. Truman, was shocked by the hunger that he found. Hoover’s argument that European economic recovery depended on Germany’s economic recovery prevailed. In June 1947, at a Harvard commencement address, Secretary of State George Marshall outlined what became known as the Marshall Plan, an offer of substantial amounts of aid to the countries of Europe, including the Soviet Union.

Stalin rejected Marshall Plan aid and denied it to the Eastern European countries under the control of the Red Army. As East-West tensions rose, the American, British, and French zones of occupied Germany were merged into the Federal Republic of Germany, leaving the eastern section of Germany to become a separate country, the German Democratic Republic. The two Germanies would be reunited only after the Cold War ended, in 1990.

The ultimate solution to “the German problem” was integration—the integration of German industry into multinational systems, first the European Coal and Steel Community in 1951 and then the European Economic Community, and the integration of a rehabilitated German military into the North Atlantic Treaty Organization (NATO) alliance in 1955. The extension of multinational control over Germany’s coal and steel resources ensured that they would not be enlisted again in the service of military aggression.

West Germany and Japan rearmed, but they remained semisovereign states, subordinate to the United States and, in the case of Germany, to the other members of the European community and NATO. Having ceased to be independent military powers, these great powers devoted their energy to running up manufacturing export surpluses. By the end of the twentieth century, their new specialization would produce problems for the world economy, but in the 1950s Americans were glad to encourage their former enemies to devote their efforts to making cameras and cars instead of Zeros and Panzers.

America’s economic superiority allowed it and its allies to prevail in the Cold War, as in the world wars. The United States was able to bankrupt the Soviet Union, which spent between a third and a half of its smaller economy on the military, while spending no more than an average of 7.5 percent of GDP on defense between 1948 and 1989.3

THE GLOBAL OIL CARTEL

Among the global public goods that the United States, as the hegemonic power, supplied to its allies and protectorates was secure access to cheap and abundant petroleum, which was essential to the second industrial economy. In seeking to secure energy supplies for itself and its friends, the United States sacrificed its support of free markets and democracy to the imperatives of geopolitical strategy.

A global oil cartel dated back to the 1920s. After new production in Texas and Oklahoma caused prices to collapse, a cartel was organized by the three largest global oil companies, Standard Oil of New Jersey, Royal Dutch Shell, and Anglo-Persian Oil (later British Petroleum). At a 1928 meeting at Achnacarry Castle in Scotland, the companies negotiated the “As Is” agreement. The accord provided that, outside of the United States, where antitrust laws applied, the companies would preserve their proportional shares of the global oil market.

This was followed in 1932 by an agreement to fix quotas, enforced by fines and rebates set by a central organization in London. The cartel now included Gulf Oil, Texaco, and Standard Oil of New York (Socony/Mobil). When Standard Oil of California (Socal/Chevron) developed oil fields in Bahrain, it was pressured into joining the cartel and using the marketing organization of the existing cartel member, Texaco.4

The global oil cartel might not have survived without regulation of production in the United States, which was the world’s leading producer during World War II and for some time afterward. The greatest danger was that many small operators would lack the money and technology to cap wells, leading to the waste of a precious natural resource. The problem was illustrated on January 10, 1901, when a well drilled into the salt dome called Spindletop under Beaumont, Texas, blew. By the time it was capped on January 19, the well had spouted a two-hundred-foot column of oil into the air, wasting seventy thousand barrels a day.

The danger of waste was apparent to Herbert Hoover, who proposed a federal regulatory commission to supervise oil production. In the fall of 1930, an independent oil operator or “wildcatter,” seventy-one-year-old C. M. “Dad” Joiner, discovered the largest oil field yet known in Rusk County in East Texas; he sold out to H. L. Hunt, later one of the biggest and most reactionary Texas oil tycoons. What followed was anarchy.

Texas politicians were painfully aware of a crisis of waste, overproduction, and prices too low to encourage investment. The Texas Railroad Commission (TRC), founded in 1891 to regulate railroads in Texas, was given responsibility to regulate Texas oil and gas in 1919. By the 1930s, Texas produced half of the crude oil in the world. In 1931, the TRC began to engage in “prorationing,” that is, issuing production quotas. To enforce the railroad commission’s orders, Governor Ross Sterling, a former president of Humble Oil Company, ordered the Texas National Guard to impose martial law on East Texas oil fields.

When the smuggling of “hot oil” produced in violation of the quotas became a problem, the Roosevelt administration sought to coordinate oil production under the oil and gas provisions of the National Industrial Recovery Act (NIRA). Influential Texans in Washington, including Roosevelt’s first vice president, John Nance Garner, Senator Tom Connally, and Congressman Sam Rayburn, fought back. The Connally Hot Oil Act gave the federal government the power to enforce directives of the TRC in interstate commerce. State governments and oil companies formed a public-private cartel, the Interstate Oil Compact (IOC). The US Bureau of Mines in the Department of the Interior and the American Petroleum Institute, a trade association, collected data about the industry. This awkward, jury-rigged system succeeded from the 1930s until the 1970s; it was critical to the success of the Allied forces in World War II and became an integral part of the US-dominated global oil system in the early Cold War.

THE POWER POLITICS OF ENERGY

During World War II in 1943, a leading American geologist, Everette Lee DeGolyer, was sent to evaluate the oil-producing potential of the Persian Gulf. Previously, there had been discoveries of oil in Iran (1908), Iraq (1927), and Bahrain (1932). In 1938, Anglo-Persian and Gulf Oil had discovered oil in Kuwait, while Chevron and Texaco in the same year found oil in Saudi Arabia. DeGolyer told the Roosevelt administration, “The center of gravity of world oil production is shifting from the Gulf-Caribbean area to the Middle East—the Persian Gulf area.”5

Britain considered the Middle East to be its sphere of influence, and competed with the United States to win the favor of the Saudi monarch Ibn Saud. In 1944, the United States and Britain negotiated a petroleum agreement, which the leader of the British delegation called “a monster cartel,” while the American government used the euphemism “commodity agreement.” The agreement was withdrawn from submission to the Senate for approval because of intense opposition from independent oil producers and liberal antimonopolists.6

In 1953, the United States and Britain sponsored a coup in Iran to overthrow its democratically elected prime minister, Mohammad Mossadegh, before he could nationalize the Anglo-Iranian Oil Company. Restored to full power, Mohammad Reza Shah Pahlavi ruled Iran until he was overthrown in the revolution of 1979, presiding over a system in which the country’s petroleum industry, nominally owned by Iran, was managed and developed by a consortium of Western oil companies.7 The major oil companies similarly collaborated in developing Saudi oil through the Arabian-American Oil Company (Aramco).

In the generation that followed World War II, the global oil industry was dominated by Texaco, Gulf, Standard Oil of New Jersey (Exxon), Royal Dutch Shell, the British Anglo-Iranian Oil Company (renamed British Petroleum, or BP), Standard Oil of New York (Mobil), and Standard Oil of California (Chevron). They were called le sette sorelle, the Seven Sisters, by the Italian oilman Enrico Mattei.

According to the Federal Trade Commission, in 1952 the Seven Sisters controlled 88 percent of global oil reserves outside of the United States and the USSR.8 In 1960, the seven controlled 60 percent of global oil.9 When the Justice Department’s Antitrust Division planned a criminal suit against the Seven Sisters, President Truman ordered the investigation to be suspended. His successor, Dwight D. Eisenhower, also suppressed the litigation, although the investigation survived until it was abandoned in 1968.10

Texas alone provided more than a third of the oil used by the United States until the 1970s. The Organization of Petroleum Exporting Countries (OPEC) was founded in Baghdad in 1960, and by the early 1970s power had shifted from the TRC to OPEC. In 1973, during the Arab-Israeli war, the Arab members of OPEC demonstrated the cartel’s power by imposing an oil embargo on the United States and other allies of Israel.

GIANT POWER

The construction by the United States of a stable global oil infrastructure was accompanied by the completion of the second industrial revolution at home. While Lincoln’s Second Republic of the United States was built on railroads, Franklin Roosevelt’s Third Republic was built on electric grids and interstate highways. The two grids of the second industrial era permitted the decentralization of factories and people, within regions and across the continent.

In the 1920s, three-quarters of the electric-utility sector was controlled by eight holding companies, including Samuel Insull’s. The holding companies eluded state-level regulation and used layer on layer of complexity to baffle regulators and investors alike. Many were as highly leveraged as the House of Insull, which controlled $500 million in assets with only $27 million in equity. Gifford Pinchot, the progressive Republican governor of Pennsylvania, campaigned for Giant Power, a scheme that would replace commercial electric power empires with public utilities. The power companies promoted their private alternative, Super Power.

When the Depression struck, the fragile, overleveraged structures of holding-company systems like Insull’s came tumbling down. Insull lost his fortune, while many of the 600,000 investors in his utility empire lost their life savings. Roosevelt campaigned for the presidency in 1932 denouncing “the lone wolf, the unethical competitor, the Ishmael or Insull whose hand is against every man’s.”

Insull’s fall from grace was swift and dramatic. Shunned in Chicago, Insull and his wife fled to Europe. When Cook County indicted him for embezzlement, larceny, and other offenses, Insull was in Greece. The Greek government refused to extradite him to the United States but declined to renew his visa, forcing him to board a steamer to Romania, which turned him away. The Turkish government jailed him and extradited him to the United States, where, after all his travails, he stood trial and was acquitted. His health and spirits broken, he moved with his wife to Paris, where he died after collapsing in a metro station. As a final indignity, a Parisian thief took his wallet from his corpse.11

Following Roosevelt’s election, the federal government revolutionized the electric utility business. Responding to the abuse of holding companies as a method of escaping state utility regulation, the Public Utility Holding Company Act of 1935 put the new Securities and Exchange Commission (SEC) in charge of regulating utility holding companies, while the Federal Power Act of 1935 subjected utilities engaged in interstate activities to the jurisdiction of the Federal Power Commission (FPC).12

Rural electrification was another priority of the Roosevelt administration. The Rural Electrification Administration (REA), created by executive order in 1935 and given statutory authority by the Rural Electrification Act of 1936, provided financial assistance to rural electric cooperatives. The number of farm homes with electricity tripled between 1932 and 1941. Thanks in large part to the rural-electrification efforts of the New Deal, the share of farm households with access to electricity and to amenities like electric lighting, refrigerators, and radios increased from 10 percent at the beginning of the 1930s to 90 percent by 1950.13

The federal government also went into the electrical-power-generation business itself. The best-known example is the Tennessee Valley Authority (TVA), which constructed a series of dams to bring electricity to a depressed rural region as large as Western Europe. Roosevelt adapted as his own a proposal by Nebraska senator George Norris to create a public agency responsible for using hydroelectric power to catalyze the development of the depressed Tennessee Valley region.

The TVA was an independent public agency modeled on the Panama Canal Commission.14 Its second director, David Lilienthal, helped to create the Electric Home and Farm Authority to provide low-interest loans for manufacturing and purchasing electric appliances like waffle irons. With forty-two dams and reservoirs, the TVA lowered the price of electricity, improved freight transportation, controlled flooding, and provided irrigation and fertilizers for the farms in the river valley. In World War II, the TVA provided the power for the Oak Ridge laboratory that helped to develop the first atomic bombs as part of the Manhattan Project. The Lower Colorado River Authority (LCRA), sponsored by Franklin Roosevelt’s dynamic young protégé, Texas congressman Lyndon Johnson, was a similar success in bringing cheap energy and economic development to the impoverished Hill Country of central Texas.

Like the South, the American West was transformed by federally sponsored hydroelectric power. In the middle of the 1930s, the five biggest structures in the world were all dams under construction in the United States—the Hoover Dam (Colorado River), the Grand Coulee (Columbia River), the Bonneville (Columbia River), the Shasta Dam (Sacramento River), and the Fort Peck Dam (Missouri River). Another massive water project in the West was the Central Valley Project, which diverted water supplied by the Shasta Dam on the Sacramento River and the Friant Dam on the San Joaquin River by way of a series of canals to farms in California’s Central Valley.

The Grand Coulee Dam—more than 550 feet tall and four-fifths of a mile in width—was the largest structure ever built.15 The dam held the record for size until it was surpassed by the Itaipu Dam along the Brazil-Paraguay border in 1982, which in turn was surpassed by China’s Three Gorges Dam in 2006. In World War II, the electricity generated by the dam powered aluminum plants for aircraft. The Bonneville Power Administration hired the folk singer Woody Guthrie to celebrate the Columbia River projects with a collection of songs, including “Roll on, Columbia” and “The Grand Coulee Dam”:

Well the world has seven wonders that the trav’lers always tell,

Some gardens and some towers, I guess you know them well,

But now the greatest wonder is in Uncle Sam’s fair land,

It’s the big Columbia River and the big Grand Coulee Dam. . . .

Uncle Sam took up the challenge in the year of thirty-three,

For the farmer and the factory and all of you and me,

He said, “Roll along Columbia, you can ramble to the sea,

But river, while you’re rambling, you can do some work for me.”

Now in Washington and Oregon you can hear the factories hum,

Making chrome and making manganese and light aluminum,

And there roars the flying fortress now to fight for Uncle Sam,

Spawned upon the King Columbia by the big Grand Coulee Dam.

As impressive as they were, the dams of the New Deal era contributed only a fraction of the energy consumed by industries and homeowners in the boom years that followed World War II. Coal-powered utilities provided the majority of electricity. As an icon of modernity, the hydropower dam was soon overtaken by the nuclear power plant. The Atomic Energy Commission (AEC) worked with the private sector in the 1950s to promote the peaceful use of atomic power. In 1957, the first commercial nuclear power plant opened in Shippingport, Pennsylvania. To keep up with growing demand, utilities ordered more nuclear plants in the following decades. Following the accident at Three Mile Island in 1979, public opinion turned against nuclear energy. Even so, by 2000 there were more than a hundred nuclear power plants in the United States, which provided nearly a fifth of the nation’s electricity.

THE INTERSTATE HIGHWAY SYSTEM

In 1915, the wealthy socialite Emily Post wanted to take a trip by automobile across the United States. When she asked a friend who had made several cross-country trips by car what the best route would be, she was told, “The Union Pacific.”16

But change was coming. As early as 1893, the Post Office Rural Free Delivery Act helped to create a constituency for good roads by permitting farmers to shop using mail-order catalogs like the Sears, Roebuck catalog. (In order to prevent postmasters from telling local merchants what they were doing, mail-order customers sometimes asked for the catalogs to be delivered in plain wrappers.) In the early 1900s, farmers, bicyclists, and early automobile-club members had united in a campaign for better roads, whose first success was the Federal Aid Road Act of 1916, which provided matching federal grants to states for road construction.

The American highway system found a champion in Thomas Harris MacDonald, known as the “Chief,” who headed the Bureau of Public Roads from 1919 until 1953, when he was eased out by the Eisenhower administration. Born in Leadville, Colorado, MacDonald grew up in Iowa, where he saw his father, a lumber and grain dealer, forced to pay a “mud tax” because of poor road conditions. MacDonald’s greatest legacy would be the interstate highway system, for which the Eisenhower administration, which fired him, gets the credit. It was MacDonald’s opposition to toll roads that ensured that the federal highway system would be paid for by gasoline taxes.

Although what became the interstate highway system began under Franklin Roosevelt, Eisenhower supported the legislation that completed it. In 1919, a US Army convoy made an arduous trip that began in Washington, DC, and ended sixty-two days later in Oakland, California, having traveled chiefly on dirt roads west of Kansas City. The lesson was not lost on Eisenhower, one of the young soldiers. Later he was impressed by the German autobahn system. So intense, however, was the resistance to a modern highway infrastructure in the antistatist United States that the legislation was only passed in 1956 when it was named the National Interstate and Defense Highways Act and backed by the prestige of the former commander of America’s armies in Europe in World War II.

DECENTRALIZING POPULATION AND INDUSTRY

Like many progressives of his time, including his uncle Frederic Delano of the Regional Planning Association and of his own National Resources Planning Board, Roosevelt favored the decentralization of population and industry. In January 1933, Roosevelt had said that it was necessary to “get [the unemployed] out of the big centers of population, so that they will not be dependent on home relief.”17 Harry Hopkins, director of the Federal Emergency Relief Administration (FERA), observed, “It would be a good thing for America if large cities disappeared and their industries were scattered in a thousand small communities.”18 In September 1933, a TVA publication advocated the dispersion of small industries to “smaller communities” in order to “avoid the unfortunate social consequences of excessive urbanization.”19

The two key technologies of the second industrial revolution, electricity and the internal combustion engine, permitted decentralization to be carried out. The dream of mass-producing low-cost homes had been shared by many liberals, including Roosevelt, as head of Secretary of Commerce Hoover’s American Construction Council in the 1920s. With the help of the Truman administration, the Lustron Corporation sought to mass-produce porcelain-enameled steel houses. But that experiment ended in failure and scandal. Instead of Lustron, Levittown became the symbol of postwar suburban housing, thanks to the success of the Levitt brothers of Long Island in modeling the construction of housing developments on mass-production techniques in the auto industry.

The combination of Federal Housing Administration (FHA) loans and the GI Bill created the postwar suburbs that sprang up alongside new federal and state highways. Before the New Deal, only around 40 percent of the American population lived in a single-family home. Typical down payments were 30 to 35 percent, interest rates were often around 8 percent, and loans had to be repaid in five to ten years. Following the creation of the Home Owners’ Loan Corporation in 1933 and subsequent housing measures, the federal government standardized the thirty-year loan with down payments of 10 percent and interest no higher than 5 percent.20 In 1990, 46 percent of the US population lived in the suburbs.
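The difference these terms made can be illustrated with the standard fixed-rate amortization formula. In this sketch the $8,000 house price is a hypothetical round number, and the pre-New Deal loan is treated as fully amortizing for comparison, although many such loans were in fact interest-only with a balloon payment at the end:

```python
def monthly_payment(principal, annual_rate, years):
    """Standard fixed-rate amortization: P * r / (1 - (1 + r)^-n)."""
    r = annual_rate / 12        # monthly interest rate
    n = years * 12              # total number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

price = 8_000  # hypothetical house price

# Pre-New Deal terms from the text: ~30 percent down,
# ~8 percent interest, repaid over at most ten years
old = monthly_payment(price * 0.70, 0.08, 10)

# Standardized terms: 10 percent down, 5 percent interest, thirty years
new = monthly_payment(price * 0.90, 0.05, 30)

print(f"old terms: ${price * 0.30:,.0f} down, ${old:.2f} per month")
print(f"new terms: ${price * 0.10:,.0f} down, ${new:.2f} per month")
```

Even though the standardized loan finances a larger share of the price, stretching repayment over thirty years at a lower rate cuts the monthly payment roughly in half while reducing the required down payment by two-thirds, which is why the new terms opened homeownership to the mass middle class.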

The golden arches of McDonald’s became the international symbol of America’s postwar suburban middle-class lifestyle. The restaurant chain was founded in the 1930s by two brothers, Dick and Maurice “Mac” McDonald, who created a glass-walled drive-in restaurant that made hamburgers by assembly-line methods. A former milkshake-machine salesman from Chicago, Ray Kroc, and a financial genius, Harry Sonneborn, turned McDonald’s into the largest owner of retail real estate on the planet by the 1980s—and even opened a Hamburger University. “The french fry would become almost sacrosanct for me, its preparation a ritual to be followed religiously,” Kroc explained in his autobiography.21

The conspiracy theory that automobile companies deliberately destroyed mass transit in order to force Americans to rely on cars has no basis in fact and has been debunked repeatedly by scholars.22 Mass-transit ridership peaked in 1919 and began its rapid decline as millions of Americans bought cars.23 Cities began to switch to buses because rail transit was losing ridership and buses were more flexible than fixed-rail lines. A similar shift has taken place in other industrial countries. As automobile ownership spread in Europe and Japan, mass-transit ridership declined as well, although it is still at higher levels than in the United States, which was the first nation with mass automobile ownership.24

The new automobile-centered culture was derided by American intellectuals, many of whom were downwardly mobile children of affluent parents who could afford to live and play in expensive bohemias in New York, San Francisco, and other big cities. By the late twentieth century, the American intelligentsia was all but united in its snobbish disdain for America’s working-class and middle-class suburbs, claiming that “sprawl” deadened the spirit and threatened the environment.

FROM DIXIE TO THE SUN BELT

The greatest triumph of government-sponsored decentralization of production during the thirty glorious years was the incorporation of the South into the American mainstream. Before the New Deal the United States had been two countries—a developed or developing industrial core in the Northeast and Midwest, and a poor and primitive periphery in the South and West that served as a resource colony for the manufacturing region. The federally sponsored infrastructural and agricultural modernization of the South and West, followed by the civil rights revolution, turned these two economies into a single national economy for the first time. From Texas to California to Florida, the formerly impoverished hinterland gave way to the booming new Sun Belt. Immigrants, businesses, and investors poured into parts of the South and West that their predecessors in earlier generations had avoided. One southerner quipped, “Cotton is going West, Cattle are coming East, Negroes are going North, and Yankees are coming South.”25

In 1938, President Roosevelt explained to an audience in Fort Worth, Texas, one of the purposes of the Fair Labor Standards Act in creating a national minimum wage: “You need more industries in Texas, but I know you know the importance of not trying to get industries by the route of cheap wages for industrial workers.”26 Federal minimum wages, by making labor more expensive, forced southern planters to mechanize farming, while the farm-price-support system and federal aid to farmers helped them to carry out long-term investments in productivity. Many of the black and white farm laborers and tenants displaced by mechanization made their way to cities in the South and other parts of the country, where many found better lives even as some were trapped in urban poverty. In 1910, 89 percent of black Americans lived in the South; by 1960, that number dropped to 53 percent, with 39.5 percent in the North and 7.5 percent in the West.27

In 1920, nine of the ten largest cities were in the Northeast or Midwest. In 1990, six of the ten largest—Los Angeles, Houston, Dallas, Phoenix, San Diego, and San Antonio—were in the Southwest.28 A new technology, air-conditioning, accelerated the migration of Americans from the Northeast and Midwest to the booming states of the new Sun Belt like California, Texas, and Florida. But climate was less important as a factor than federal public investment. During World War II and the Cold War, the federal government built great numbers of army bases, air bases, naval facilities, and defense-production plants in the South and West. The Sun Belt was also the Gun Belt.

New Deal liberalism succeeded in its goals of raising the incomes of farmers and industrial workers and raising the wage and living standards of the South. Between 1919 and 1940, the income of workers in agriculture was only 47 percent of the national average; by 1950–1955 it had risen to 76 percent.29 In 1930, per capita income in the South was only 55 percent of the national average; by 1960, it had climbed to 78 percent.30

THE MOTORIZED HOUSEHOLD

While the electric grid, the interstate highway system, and the pattern of suburban development that they enabled provided striking icons of the second industrial era, the motorized appliances that colonized the postwar American household had an equally transformative effect on how Americans lived. The aesthetic conservatism of most Americans ensured that postwar houses would be built in styles that evoked one or another historical tradition—colonial, Tudor, or ranch style, the last based loosely on the hacienda architecture of Old California. But while most Americans chose not to live in futuristic metal domes like the Dymaxion house designed by the engineer R. Buckminster Fuller, behind the traditional facade the new American house had utility systems and devices that would have been found only in science fiction a few decades earlier.

Nothing changed life more than the convenience of indoor plumbing. At the beginning of the twentieth century, water had to be lugged into most homes from wells or creeks, at the risk of spreading typhoid or cholera. Ninety-eight percent of white households and 92 percent of black households in 1970 had running water in the home, compared to only 24 percent in 1890 (the disparity between blacks and whites in 1970 was the result of black rural poverty).31

The self-cleaning oven and then the microwave, along with the dishwasher, the washing machine, and the dryer, eliminated much of the drudgery of daily life. Between 1900 and 1975, the average time per week spent on meals and cleaning dropped from forty-four hours to ten, while the hours devoted to laundry fell from seven to one.32

In 1900, most families had to load stoves with wood or coal, and lamps burned kerosene or coal oil. By 1950, more than 95 percent of households had electric lighting and central heating.33 Sixty-three percent of households had air-conditioning by 1987.34

Thanks to labor-saving appliances, minimum-wage laws, and the restriction of mass immigration before its resumption in the 1960s, domestic servants became as rare and anachronistic as vaudevillians and country peddlers. In 1910, there were twenty million households and two million domestic servants. By 1970, only 1 percent of forty-six million households had a live-in servant.35

Postwar Americans enjoyed more personal space. In 1970, fewer than 8 percent of households had more than one person per room, compared to more than half in 1900.36 And they enjoyed more personal time. From an average work schedule for nonfarm workers of ten hours a day, six days a week, the workweek had declined to forty hours, thanks to greater productivity and New Deal labor laws.37

From the global level to the interior of the household, the second industrial revolution was reaching its maturity. Inhabiting a new structure built on the foundations of electricity and oil, ordinary Americans experienced a generation of unprecedented progress and prosperity.

THE VIRTUAL NIRA

The institutional as well as the physical underpinnings of the American economy were rebuilt in the New Deal era between the 1930s and the 1970s. According to conventional histories of the New Deal and the postwar era, the associationalist arrangements of the NIRA were forgotten following the Supreme Court’s destruction of the NIRA and AAA in the midthirties. Liberals, it is said, reconciled themselves to a combination of free enterprise and Keynesian demand management.

Nothing could be further from the truth. Keynesian demand-management policies were pursued inconsistently under presidents Truman, John F. Kennedy, and Johnson and hardly at all under Eisenhower. Nor was the postwar economy based on free markets, as those are usually defined. The major sectors of the economy were either organized as government-backed cartels or dominated by a few oligopolistic corporations. Unions were concentrated in the same sectors.

Following the demise of the NIRA in 1935 and the AAA in 1936, the Roosevelt administration and Congress quickly created “mini-NIRAs” and “mini-AAAs” in a number of sectors of the American economy. As we saw in the previous chapter, an NIRA-like cartel was set up in the oil industry. Another miniature NIRA was created in the bituminous coal industry by the Bituminous Coal Conservation Act (the Guffey Act) of 1937, which created an NIRA-like commission to supervise labor standards and prices. The Agricultural Adjustment Act (AAA) was ruled unconstitutional by the Supreme Court in United States v. Butler (1936), a year after the Court struck down the NIRA. But aspects of the AAA were reincarnated with the help of the Soil Conservation and Domestic Allotment Act of 1936, the Agricultural Marketing Agreement Act of 1937, and the Agricultural Adjustment Act of 1938.

With the exceptions of bituminous coal, oil, and highly regulated and subsidized agriculture, most of the government-sponsored cartels were found in essential infrastructure industries where price volatility and ruinous competition could not be tolerated. As we have seen, the Public Utility Holding Company Act of 1935 replaced private electric-power consortiums with regional, publicly regulated utility companies. The Civil Aeronautics Act of 1938 created a price-and-entry cartel, labor standards, and a supervisory commission, the Civil Aeronautics Board (CAB). The Interstate Commerce Commission (ICC) cartelized the trucking industry with price-and-entry regulation. In telecommunications, the seven-member Federal Communications Commission (FCC) allocated radio and later television broadcast licenses, regulated rates, and oversaw the regulated monopoly of AT&T. Following World War II, 15 percent of the US economy was regulated by these agencies and others, including the Securities and Exchange Commission (banking and finance); the Federal Power Commission (natural gas, hydro dams, and nuclear energy); the Department of Agriculture’s Farm Bureau (agribusiness); and the Federal Maritime Commission (shipping).38

Far from being an embarrassing aberration, the NIRA was the unacknowledged blueprint for the prosperity of the Golden Age between World War II and the 1970s. Its approach was resurrected in the collective bargaining guarantees of the NLRA and the wage-and-hour standards of the Fair Labor Standards Act, in the price supports and subsidies of the agricultural sector, in the price-and-entry regulation of the infrastructure industries, and in the de facto tripartite bargaining among business, labor, and government in the oligopolistic industrial sector. Together these created what might be called a “Virtual NIRA” that structured much of the American economy from World War II until it was partially dismantled between the 1970s and 1990s.

THE AGE OF OLIGOPOLY

In concentrated manufacturing sectors like automobiles, steel, and rubber, employer-union agreements and the power of the dominant oligopolies to set prices made formal mini-NIRAs unnecessary. As in the depressions of the nineteenth century, the Great Depression produced greater concentration of industry as surviving companies devoured victims. Between 1947 and 1968, the share of value added in manufacturing of the two hundred largest industrial corporations in the United States rose from 30 percent to 41 percent, while their share of total corporate manufacturing assets increased from 47.2 percent to 60.9 percent.39 In 1976, the combined gross income of the three biggest industrial corporations—Exxon, General Motors, and Ford—exceeded the total income of all US farms, including government subsidies.40 In the same year, two companies—AT&T and General Motors—employed 2 percent of the US civilian labor force.41

Television was dominated by the Big Three (CBS, NBC, and ABC) and the automobile industry by another Big Three (General Motors, Ford, and Chrysler). In steel, there were United States Steel, Republic, and Bethlehem; in chemicals, DuPont, Allied Chemical, and Union Carbide; in food processing, General Foods, General Mills, and Quaker Oats; and in jet engines only two major companies, General Electric and Pratt & Whitney. Observing the scene at the time, Adolf Berle and John Kenneth Galbraith agreed that most of the midcentury American economy was largely planned, by the public sector in the regulated industries and by private management in the industries dominated by corporate oligopolies.42

Many firms in the industrial sector benefited from large-scale privatization of government property after World War II. A significant portion of America’s post-1945 industry consisted of privatized government factories. During the war, the Defense Plant Corporation, a branch of the RFC, funded the construction of numerous industrial plants.43 At the time it was dissolved on June 30, 1945, the corporation owned 10 to 12 percent of US industrial capacity. This included 96 percent of the capacity of the American synthetic rubber industry, 90 percent of capacity in the magnesium industry, and 58 percent of capacity in the aluminum industry, as well as significant portions of the capacity in the iron and steel, gasoline, machine tool, and radio industries.44 The War Assets Administration, succeeded by the General Services Administration, oversaw the sale of these assets to the private sector at a fraction of their actual cost, providing a massive transfer of public resources to private industry. Among the beneficiaries of the government sell-off was the rubber industry. During the war, the federal government spent $700 million building fifty-one factories to produce the ingredients for synthetic rubber. These government factories were sold to private industry by the middle of the 1950s.45

America’s oligopolistic corporations were both stable and prosperous. Between 1954 and 1976, with the exception of one two-year period, fewer than five of the one hundred largest industrial corporations lost money in any given year.46

Private R&D in the United States was dominated by a small number of large, oligopolistic companies. In 1974, three-fourths of all industrial R&D was performed by 126 companies with more than twenty-five thousand employees. The four companies with the largest R&D efforts were responsible for 19 percent of all industrial R&D.47

The Brandeisian strain of liberalism did not die out completely. During World War II the need for government-business cooperation prevailed, but following the war the Truman administration adopted a vigorous antitrust policy. By 1949, nearly half of the one hundred largest industrial companies, including Alcoa, DuPont, and US Rubber, were confronted with antitrust prosecutions.48

The Celler-Kefauver Act of 1950 ordained that, to avoid antitrust prosecution, businesses could not engage in horizontal mergers with firms in the same or closely related lines of business. The consequences of this law for American business and the American economy will be discussed in the next chapter. Here it is enough to observe that trust-busting provided a minor counterpoint to the main theme of big business and industrial concentration in the 1950s and 1960s.

FROM FINANCE CAPITALISM TO MANAGERIAL CAPITALISM

The Glass-Steagall Act and the creation of the Securities and Exchange Commission (SEC) brought an end to the era of American finance capitalism symbolized by J. P. Morgan. Investment bankers were separated from commercial bankers and forbidden to be on the boards of directors of corporations.

The dominance of productive industry over finance during this period is shown by the reduced dependence of corporations on Wall Street. According to a congressional report in 1962, following World War II, industry was financed primarily from within. Borrowing from banks and issuing securities accounted for only a quarter of capital: “As compared with the 1920s, corporate financing in recent years featured a higher reliance on internally generated funds, a modest rise in the importance of long-term borrowing, and a sharp reduction in stock flotations.”49 The post-1945 boom, by allowing many firms to finance themselves to a greater degree from retained earnings, reduced the influence of the financial sector over corporate America even more.

Freedom from autocratic founders and financiers provided the managers of big companies with stability of tenure. In 1952, three-quarters of eight hundred senior executives in three hundred industrial, railroad, and utility companies had been with the same corporations for more than twenty years.50 There was no market for executives, of the kind that developed at the end of the twentieth century. In the concentrated sectors, lifetime employment was the norm for managers and employees alike. Promotion was typically from within the company. Because executive salaries were restrained, the perquisites of office—the key to the executive bathroom or the corner office—were the goals of competition. On retirement, managers and many if not all employees of large companies could look forward to a defined-benefit pension plan.

Many of the executives who worked their way up to the top of companies began as engineers. In the early 1900s, Thorstein Veblen had called for a “soviet of engineers.” James Burnham, Leon Trotsky’s deputy in the United States and later a founder of the conservative movement, argued that in all industrial societies a “managerial elite” was displacing capitalists. In 1932, in The Modern Corporation and Private Property, Adolf Berle and Gardiner Means had called for managers to become a “neutral technocracy.” In his 1963 book The American Economic Republic, Berle argued that most economic power was now exercised by corporate managers, the managers of pensions and mutual funds, government officials, and scientific experts.51 In his 1966 Reith Lectures for the BBC, which became his 1967 book The New Industrial State, John Kenneth Galbraith made a similar argument that a “technostructure” of managerial firms had replaced the market with private planning in the concentrated, capital-intensive sectors of the economy.52

“WHAT WAS GOOD FOR OUR COUNTRY WAS GOOD FOR GENERAL MOTORS”

When President Eisenhower nominated Charles Erwin “Engine Charlie” Wilson, the president of GM, to be secretary of defense in 1953, Wilson was asked at a Senate confirmation hearing about conflicts of interest. Wilson replied: “I cannot conceive of one because for years I thought what was good for our country was good for General Motors, and vice versa. The difference did not exist. Our company is too big. It goes with the welfare of the country.”53 In the popular press and history books, Wilson was often accused of having said something different: “What is good for General Motors is good for America.” But in his actual statement he put the good of the country first.

Wilson was Veblen’s engineer and Burnham’s manager rolled into one. An electrical engineer by training, he worked for Westinghouse and then General Motors, where he became president in January 1941. Wilson’s career symbolizes the collaboration of the military and the private sector in promoting technological innovation in the twentieth century. During World War I, Wilson helped Westinghouse develop radio generators and electric motors for the US military. During World War II, as head of General Motors, he directed the company’s war production.

Wilson was typical of the “organization men” who dominated the American economy during the Golden Age of high growth and widely shared prosperity from 1945 to 1973. What has come to be known as the “stakeholder” conception of the corporation was widely shared during the Golden Age. In 1951, the chairman of Standard Oil of New Jersey explained: “The job of management is to maintain an equitable and working balance among the claims of the various directly affected interest groups . . . stockholders, employees, customers, and the public at large. Business managers are gaining professional status partly because they see in their work the basic responsibilities that other professional men have long recognized in theirs.”54 Thomas Murphy of GM explained: “The UAW may have introduced the sit-down strike to America, but in its relationship with GM management it has also helped introduce . . . mutually beneficial cooperation. . . . What comes to my mind is the progress we have made, by working together, in such directions as providing greater safety and health protection, in decreasing alcoholism and drug addiction, in improving the quality of life.”55

Founded in 1942, the Committee for Economic Development (CED) was a spin-off of the Business Advisory Council of the Department of Commerce. Dominated by representatives of large and medium-size firms, the CED carried on a version of the associationalist tradition, in opposition to the hard-edged antistatism of the National Association of Manufacturers and its small-business constituency. William Benton, the founder and vice president, explained: “The historic attitude of business has been to use government if it could, and abuse it if it couldn’t. Philosophically, business was committed to the doctrine that, ‘that government is best which governs least.’ ” The CED attitude was that “government has a positive and permanent role in achieving the common objectives of high employment and production and high and rising standards of living for people in all walks of life. . . . This is our present answer to the European brands of socialism. Long may it thrive.”56

The role of government in the era of American stakeholder capitalism was not limited to serving as an umpire enforcing neutral rules of competition and transparency in a free market. The government was part of a tripartite system that also included management and labor, which, if less formal than the aborted NIRA system of the 1930s, was similar in spirit. Unofficial and fragmented as it was, interest-group pluralism was the successor to NIRA corporatism and the equivalent of more formal tripartite government-business-labor bargaining systems in other Western democracies. What Galbraith later called “countervailing power” was described by the journalist John Chamberlain as the essence of the “broker state” created by the New Deal: “The labor union, the consumers’ or producers’ co-operative, the ‘institute,’ the syndicate—these are the important things in a democracy. If their power is evenly spread, if there are economic checks and balances to parallel the political checks and balances, then society will be democratic. For democracy is what results when you have a state of tension in society that permits no one group to dare to bid for total power.”57

THE TREATY OF DETROIT

Following World War II, the uneasy peace between government and business was joined by a shaky rapprochement between business and organized labor. During the New Deal era, organized labor reached a degree of influence that it did not possess before or afterward. After the Supreme Court struck down the NIRA, some of the prolabor provisions that it contained were included in the National Labor Relations Act, signed in July 1935 by Roosevelt, and the 1938 Fair Labor Standards Act. The Taft-Hartley Act of 1947 shifted the balance of power back toward management, while leaving the structure of labor relations created during the Roosevelt years intact.

In 1950, Walter Reuther’s United Auto Workers (UAW) signed a five-year agreement with General Motors, agreeing to forgo strike activity in return for inflation-adjusted wages, health and retirement packages, and other benefits. The contract, with its generous benefits for autoworkers, became known as “the Treaty of Detroit.” Ford and Chrysler, the other members of Detroit’s Big Three, signed nearly identical agreements, and many lesser companies followed their lead.

Even in the absence of the sector-wide bargaining that the NIRA-code authorities had sought to promote, contracts in the concentrated industrial sector served as informal models for wage-and-benefit policies in other industries until the 1970s. The percentage of Americans who belonged to labor unions, only 2.7 percent in 1900, had risen to 12.1 percent in 1920, only to fall to 7.4 percent in 1930. By the mid-1950s, roughly one in three American workers belonged to a union.

Because US economic growth slowed in the late twentieth century, pensions, health insurance, and other company-based benefits imposed crippling costs on America’s automakers and other companies. For this reason, it is often argued that unions crippled US manufacturing industries. But the corporations, not the unions, preferred employer-based benefits systems. They spurned UAW leader Walter Reuther’s proposal that the car companies “go down to Washington to fight with us” for universal federal health care and retirement benefits. In the 1950s and 1960s, the Big Three and other corporations were confident that they could always pass on the cost of benefits to consumers. In addition, they preferred to increase benefits rather than wages. For example, in 1961, GM held its wage increase to 2.5 percent by agreeing to increase its future pension obligations by 12 percent.58

BANKING AS A PUBLIC UTILITY

The revolutionary changes in the economy and society that occurred during and after World War II were accompanied by a degree of stability in the financial sector without precedent in American history. In the mature New Deal order between the 1940s and the 1970s, finance went from being the master of American industry to being its servant. The Glass-Steagall Act and other New Deal–era reforms made commercial banking a boring, safe sector dominated by small banks serving local and state businesses. A scaled-down investment-banking sector played an essential but limited role in raising capital for large corporations.

Investment banks were partnerships, not the corporations with limited liability that they became at the end of the century (before they metamorphosed into “bank holding companies” in order to be rescued by the Federal Reserve following the crash of 2008). Before the 1970s, the SEC required investment banks to be partnerships, so that the prospect of personal liability for firm losses would encourage the partners to be risk averse and prudent.

Between World War II and the 1980s, American banking was safe and dull. Commercial banks took deposits and made loans. Commercial banks were also the main source of credit to most companies except for the largest, which could raise money in the bond market. Savings and loans (S&Ls) facilitated mortgage lending.

Another reform intended to steer finance away from speculation into productive investment was Regulation Q, passed as part of the Glass-Steagall Act of 1933 and reaffirmed by the Banking Act of 1935. Regulation Q banned the payment of interest on checking accounts at commercial banks. It also allowed the Federal Reserve to set interest-rate ceilings on savings accounts and time deposits such as certificates of deposit. Like the other elements of the New Deal system, the regulation sought to limit competition among banks for customers, which might drive banks into risky practices. Another goal of Regulation Q was to encourage community banks to lend to productive local businesses and enterprises, rather than hold large balances with big metropolitan banks that might use the money for unproductive speculation.59 Ruinous competition among banks was also checked by the 1927 McFadden Act, which preceded the New Deal and imposed restraints on interstate and intrastate branch banking. Banks were stable, protected public utilities, not dynamic firms in a competitive marketplace. Banking had become a boring business, symbolized by the “3–6–3” rule that summed up the banker’s job: “Borrow at 3 percent, lend at 6 percent, golf at 3 p.m.”

In addition to regulating finance, the federal government acted as a banker in its own right. Although the Reconstruction Finance Corporation was effectively abolished in 1953, its offspring continued to provide the country with public-investment banks for particular purposes. The Farm Credit System; the Federal Home Loan Banks; Fannie Mae and the FHA; the Export-Import Bank; and other public-development banks helped to provide inexpensive credit on fair terms to farmers, small businesses, and the new home-owning majority.

DEMOGRAPHIC CHANGE

The 1920 census was the first in which urban Americans outnumbered rural Americans. By 1950, two-thirds of native-born white Americans lived in urban areas, whereas two-thirds had lived in rural areas in 1900.60

In 1950, 88.7 percent of whites worked in nonfarm occupations, compared to only 11.3 percent on farms. The white workforce was divided between blue-collar manual jobs (46.2 percent) and white-collar jobs, including sales and clerical jobs along with managerial and professional jobs (45 percent). Only 8.8 percent of whites worked as domestic servants or in other menial services. Blacks, however, were heavily represented in those services (17.1 percent), on farms (18.8 percent), and in manual labor (52.1 percent), and were underrepresented in white-collar jobs (11.9 percent).61 Jobs remained highly segregated by gender, with women largely limited to the occupations of secretary, teacher, and nurse. Clerical jobs accounted for three-tenths of the nonfarm jobs for women in 1950.62

Mid-twentieth-century Americans were more educated than previous generations had been. Before 1910, the average white American had only an elementary school education. As a result of the establishment of high schools, which occurred along with the prohibition of child labor, by 1950, both native-born and second-generation white men who were twenty-five or older had an identical 9.8 years of education, while blacks had 6.4 years.63

Medical advances in combating polio, influenza, typhoid fever, and tuberculosis led to reductions in mortality. In 1950, black life expectancy was eight years less than that of whites.64

What came to be known as “the Pill” was the new medical technology with the greatest impact on society. Fertility was affected both by reductions in child mortality and by access to contraception and abortion. The combination of falling fertility and restricted immigration produced a slowdown in population growth from a level of 2 percent a year in 1905–1910 to a low of 0.7 percent in the 1930s.65 The baby boom that followed World War II was a temporary blip in the long decline of native fertility, which fell below replacement levels among native-born whites in the United States, as in Europe.

THE CIVIL RIGHTS REVOLUTION

By the 1950s, the United States was no longer a nation of immigrants but a largely closed economy, with relatively little trade or immigration, whose population was overwhelmingly native born. The reduction in European immigration caused by World War I, and then by legal restrictions in 1921 and 1924, reduced the average annual rate of immigration to the United States between 1915 and 1950 to only one-fifth of its earlier rate.66 The foreign-born declined as a percentage of the US population from 13.4 percent in 1900 to only 6.7 percent in 1950.67

In the sixth edition of his famous Economics textbook, Paul Samuelson wrote: “After World War I, laws were passed severely limiting immigration. Only a trickle of immigrants has been admitted since then. . . . By keeping labor supply down, immigration policy tends to keep wages high.”68 It is doubtful that organized labor could have enjoyed as much success in the mid-twentieth century without the low levels of immigration that existed following the restriction of European immigration in the 1920s.69

In 1950 the nonwhite population, at 10 percent, was lower than in 1900, when it had been 11.6 percent, and was overwhelmingly made up of black Americans, with only 0.5 percent of the population composed of members of the “other” category—chiefly Chinese, Japanese, and American Indians.70 The fact that the new, postethnic white population felt secure in its majority may have contributed to the acceptance by most white Americans of the dismantling of America’s version of white supremacy.

Led by Martin Luther King Jr. and his allies, the civil rights movement reached completion with the passage of the Civil Rights Act of 1964, the Voting Rights Act of 1965, the antiracist immigration reform of 1965, and the Fair Housing Act of 1968. During the civil rights era, Latinos as well as blacks struggled for economic rights. The battlefields were literal fields. Cesar Chavez, the leader of the United Farm Workers, used nonviolent protest tactics and boycotts like those of black civil rights champions to pressure California into outlawing el cortito, or el brazo Diablo, “the devil’s arm,” a short-handled hoe that inflicted pain on farmworkers and symbolized their oppression. In the thirty-five years after the short-handled hoe was banned, back injuries among California’s farmworkers declined by 34 percent, according to the California Rural Legal Assistance program (CRLA).71

Another target of reform in the civil rights era was the exploitative Bracero (“arm worker”) program. Begun in 1942 to supply US farmers with migrant Mexican workers during the wartime labor shortage, the program had turned into a system of indentured servitude, permitting American agribusiness to use foreign nationals in serflike conditions rather than hire Americans and legal immigrants to work in decent conditions for decent wages. At the time of the Bracero program’s creation, a Texas farmer said: “We used to own slaves, but now we rent them from the government.”72 Lee G. Williams, a Labor Department officer who supervised it, described the Bracero program as “legalized slavery.”73 Chavez explained: “The jobs belonged to local workers. . . . Braceros didn’t make any money, and they were exploited viciously, forced to work under conditions the local people would not tolerate.”74 Chavez and Dolores Huerta, the cofounder of the United Farm Workers, joined with labor and liberals to persuade Congress to abolish this form of indentured servitude in 1964. In testimony against other agricultural guest-worker programs before the US Senate on April 16, 1969, Chavez observed: “In abolishing the bracero program, Congress has but scotched the snake, not killed it. The program lives on in the annual parade of illegal and green carders across the United States-Mexico border to work in the fields.”75

THE WAR ON POVERTY

Ronald Reagan quipped that Lyndon Johnson declared war on poverty and poverty won. Conservatives succeeded in portraying the War on Poverty as a dismal failure. But the evidence indicates otherwise. Between the presidencies of Lyndon Johnson and George W. Bush, black poverty declined from a little more than 40 percent to between 22 and 24 percent. More than half of that reduction came between 1966 and 1969 alone, when black poverty plummeted from 40.9 percent to 30.9 percent. Poverty among non-Hispanic whites dropped from 14.7 percent in 1962 to 6.1 percent in 2006.76 The overall poverty rate would have been even lower in the early twenty-first century if not for the mass immigration that followed the 1965 immigration reforms. Because US immigration was dominated by poor people from Mexico and Latin America, Latinos accounted for all the growth in poverty after 1990.77

In 1978, the economist Martin Anderson, who became a leading adviser in the Reagan administration, conceded that, despite problems with fraud and inefficiency, welfare programs had eliminated most poverty in the United States: “The ‘dismal failure’ of welfare is a myth. . . . But if we step back and judge the vast array of welfare programs, on which we spend billions of dollars every year, by two basic criteria—the completeness of coverage for those who really need help, and the adequacy of help they do receive—the picture changes dramatically. Judged by these standards our welfare system has been a brilliant success. The war on poverty is over for all practical purposes.”78

Like his mentor Roosevelt, LBJ preferred work to welfare and sought to combat poverty by means of training programs and jobs programs like the Job Corps and Volunteers in Service to America (VISTA). Describing the Economic Opportunity Act of 1964, which promoted jobs and training for the poor, Johnson said: “This is not in any sense a cynical proposal to exploit the poor with a promise of a handout or a dole. We know—we learned long ago—that answer is no answer. . . . We are not content to accept the endless growth of relief rolls or welfare rolls.” When the bill was being drafted, Johnson ordered one aide, Lester Thurow, to remove any cash-support programs and told another aide, Bill Moyers, “You tell [Sargent] Shriver, no doles.”79

UTILITY CAPITALISM AND THE MASS MIDDLE CLASS

On November 8, 1954, in a letter to his brother Edgar, President Dwight Eisenhower reacted angrily to the criticism that his administration was continuing the policies of his immediate predecessors Franklin Roosevelt and Harry Truman: “Now it is true that I believe this country is following a dangerous trend when it permits too great a degree of centralization of governmental functions. I oppose this—in some instances the fight is a rather desperate one. But to attain any success it is quite clear that the Federal government cannot avoid or escape responsibilities which the mass of the people firmly believe should be undertaken by it. . . . Should any political party attempt to abolish social security, unemployment insurance, and eliminate labor laws and farm programs, you would not hear of that party again in our political history. There is a tiny splinter group, of course, that believes you can do these things. Among them are H. L. Hunt (you possibly know his background), a few other Texas oil millionaires, and an occasional politician or business man from other areas. Their number is negligible and they are stupid.”80

At the beginning of the Depression, the federal government played almost no role in protecting Americans from economic hardship; by the 1950s, the federal program of Social Security and the federal-state unemployment and welfare programs had created a modern safety net for an urban, industrial society. Before the Depression, corporations had all but extinguished unionization in the United States; in the mid-1950s, following the New Deal, one in three American workers was unionized. When the stock market crashed in 1929, many American companies were controlled by powerful investment banks; following World War II, managers were powerful and the once-powerful financial sector was reduced to the status of a tightly regulated utility. The “class market” for automobiles, radios, refrigerators, and other inventions of the second industrial revolution became the mass market of the Eisenhower era.

The New Deal liberal system of trickle-up, demand-side economics succeeded in creating a mass middle class that was also a mass market for the products of American factories and farms. Thanks to the New Deal, working Americans were guaranteed a minimum income by minimum-wage laws and unemployment insurance, while retirees were guaranteed a minimum income in old age by Social Security. Union membership added an additional wage premium for Americans in organized industries. These income guarantees benefited American businesses in two ways. By removing the possibility that competitors would use starvation wages to their advantage, they permitted all businesses to compete on the basis of price and quality rather than success in exploiting labor. And they solved the pre–New Deal problem of the maldistribution of income and underconsumption by enabling sufficient levels of mass consumption by adequately paid workers and retirees.

The American economy between the 1940s and the 1970s was a version of the associationalist economy envisioned by the progressives of the 1900s, and embodied successively in the economic mobilization agencies of World War I, the voluntary associationalism promoted by Herbert Hoover as commerce secretary in the 1920s, and Franklin Roosevelt’s NIRA and the little NIRAs that re-created it piecemeal. The historians Jonathan Hughes and Louis P. Cain emphasize the extent to which the New Deal re-created the institutions of World War I: “The WIB would reappear in 1933 as the National Recovery Administration (NRA). The United States Grain Corporation would resurface in the 1930s as the Commodity Credit Corporation. The planning activities of the Food Administration would reappear in the two Agricultural Adjustment Acts. The Emergency Fleet Corporation came back as the National Maritime Administration. The Federal Housing Administration of the 1930s had been born first as the wartime United States Housing Corporation. The Fuel Administration under the Lever Act reemerged in the 1930s as the Bituminous Coal Division in the Interior Department.”81 Thus there is a direct line of descent from the economic mobilization of World War I to the NIRA and beyond to the highly regulated and cartelized economy of the United States in the mid-twentieth-century Golden Age of American capitalism.82

In the Schechter case of 1935, in which the Supreme Court struck down the NIRA, the “sick chicken” killed the Blue Eagle. But the Blue Eagle was reborn from the ashes like a phoenix. Before the deregulation efforts of Jimmy Carter and the union-busting campaign of Reagan in the 1970s and 1980s, the United States was governed by a virtual NIRA system characterized by oligopoly and unionization in major mass-production industries and regulated cartels in major energy, transportation, and utility industries. The vision of Fordism was realized, as high wages for workers translated into high aggregate demand for the products of American factories in a national economy little affected by foreign trade or investment. The New Deal turned a number of major industries, including finance, into regulated utilities.

Beginning in the 1990s, neoliberal Democrats and Republicans reversed this process. Correlation does not prove causation, but the historical record is suggestive. The American middle class enjoyed its zenith under a system of highly regulated, partly cartelized capitalism, and suffered under the less regulated capitalism that preceded it and followed it.