WHEN LYNDON JOHNSON SUCCEEDED to the presidency on the assassination of John F. Kennedy, he proved to be a very different president. Nearly a decade older than Kennedy, Johnson was fully a son of the New Deal, with a deep faith that government could solve social and economic problems. He also possessed perhaps the most formidable legislative skills of any American president. These skills had made him the most effective majority leader in the history of the Senate, and he was determined to use them to achieve what he saw as the completion of the New Deal of his political hero, FDR. “In your time,” he told an audience at the University of Michigan on May 22, 1964, “we have the opportunity to move not only toward the rich society and the powerful society, but upward to the Great Society.”
With the help of an overwhelming electoral victory in November of that year, Johnson prodded Congress to pass bill after bill: the Economic Opportunity Act (1964), the Urban Mass Transportation Act (1964), Medicare and Medicaid (1965), the Older Americans Act (1965), the Appalachian Regional Development Act (1965), Head Start (1965), the Demonstration Cities and Metropolitan Development Act (1966), and the Higher Education Act (1965). Along with many other, smaller programs that involved the federal government in areas of national life it had never before touched, these caused a breathtaking rise in federal expenditures. Nondefense government spending rose by a third in just three years, from $75 billion in 1965 to $100 billion in 1968. Two years later it was $127 billion. Meanwhile, the Vietnam War escalated quickly. In 1965 the defense budget had been $50 billion; in 1968 it was $82 billion.
Had the economy been underperforming, as it had been in the 1930s, all this new spending would have been stimulative. But the economy in the mid-1960s was near full employment, so the inevitable result was that inflation began to rise. A vicious circle quickly developed. Rising inflation caused interest rates to rise, as lenders wanted protection from it. But the Federal Reserve, operating on a Keynesian model, feared that higher interest rates would choke off economic growth, and so it expanded the money supply to keep rates low. An increased money supply, relative to the goods and services the money could buy, ineluctably caused still more inflation.
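The mechanism can be sketched with the standard equation of exchange (a textbook illustration added here, not a formula from the original text): with the money supply M, its velocity V, the price level P, and real output Y already near its full-employment ceiling, faster growth in M must show up largely as growth in P.

```latex
\[
  MV = PY
  \quad\Longrightarrow\quad
  \%\Delta P \;\approx\; \%\Delta M + \%\Delta V - \%\Delta Y
\]
% When Y cannot grow much further and V is roughly stable, every extra
% percentage point of money growth translates, sooner or later, into
% roughly a percentage point of inflation.
```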
The American economy began to deteriorate sharply. Unemployment, which had been only 3.5 percent in 1968, rose to 4.9 percent in 1970 and to 5.9 percent in 1971. According to Keynesian theory, rising inflation and rising unemployment could not happen at the same time, and a new word entered the American vocabulary around 1970 to denote the unprecedented situation: stagflation.
The stock market, which had been rising steadily since the early 1950s, crossed the 1,000 mark on the Dow, intraday, for the first time in 1966. Then it stalled at that point. Four times in the next six years the Dow crossed 1,000 intraday before the market finally closed above 1,000, reaching a high of 1,051.70 on January 11, 1973. But it soon sank back in the worst bear market since the early 1930s. By the end of 1974 it was down to 577.60. The by-then-raging inflation masked how deep the fall of the stock market actually had been. Measured in constant dollars, the Dow-Jones Industrial Average was below its level of the early 1950s, when the great bull market had begun.
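The constant-dollar comparison rests on a simple deflation (an illustrative formula added here, not a calculation from the original text): the nominal index is scaled by the ratio of price levels between the base year and the year in question.

```latex
\[
  \text{Dow}^{\text{real}}_{t}
  \;=\;
  \text{Dow}^{\text{nominal}}_{t}\times\frac{\text{CPI}_{\text{base year}}}{\text{CPI}_{t}}
\]
% e.g., if the price level roughly doubles while the nominal index merely
% holds its ground, the real value of the index is roughly cut in half.
```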
With this inflation, unprecedented in the peacetime history of the American economy, and the easy-money policy of the Federal Reserve, the dollar became overvalued in terms of other currencies. This made American goods seem expensive to foreigners and foreign goods seem cheap to Americans. The trade balance, which had been strongly in favor of the United States in the early years after the Second World War, had inevitably shrunk as foreign economies recovered; in 1959 it had shown a small deficit. It began to worsen rapidly in the late 1960s, fell into deficit once more in 1971, and kept deteriorating thereafter.
Because the dollar was the currency of world trade and, under the Bretton Woods Agreement, was convertible into gold at a fixed price of $35 an ounce, dollars accumulated in foreign central banks and financial institutions, often circulating abroad without ever returning to the United States. As American inflation increased, these “eurodollar” holdings began to seem precarious, and gold began to flow out of the United States in quantity for the first time since the early 1930s. The international currency market, a growing force in the world economy, began betting against the dollar.
On August 15, 1971, President Nixon acted decisively, if not necessarily wisely, to solve the increasing economic problems that confronted the country. First, he renounced the Bretton Woods Agreement and severed the link between the dollar and gold. The dollar would now float in value, and the gold standard, after 150 years, was dead. Second, he froze all wages, rents, and prices for a period of ninety days, to be followed by strict wage and price controls.
Wage and price controls have a long history, almost all of it bad. In a free market it is prices that signal, in their uncountable millions, where resources should be allocated and where opportunity lies, what is becoming scarce and what plentiful, allowing people to adjust their economic behavior accordingly. When prices are fixed, however, shortages and surpluses inevitably and quickly develop. That is why there is a permanent shortage of housing wherever there is rent control. Price controls also transfer power from free markets—in other words, the people—to politicians. Politicians, of course, are always tempted to use this power to benefit favored groups, while the disfavored continue to pursue their self-interests through black markets.
Price controls were first tried on a grand scale by Diocletian, who became emperor of Rome in the late third century. Earlier emperors had progressively debased the coinage with base metals, setting off a rampant inflation. Diocletian attempted serious reform of both the coinage and taxation, but lacked enough precious metal to create an adequate money supply. So he was forced to issue base-metal coins as well, with a wholly artificial value that he sought to enforce by law.
It didn’t work, of course; Gresham’s law makes it impossible. So Diocletian, unable to contain inflation by setting the price of money, tried to contain it by setting the price of everything else. His edict to that effect, which survives, is an invaluable historical window into the economy of the late Roman world, listing legal prices for all sorts of goods and services. But, as with all subsequent attempts to control by law prices that would otherwise be set by millions of people each pursuing his or her self-interest, it was a dismal failure, despite liberal use of the death penalty as a means of enforcement. Goods simply went into hiding, were bartered, or were traded illegally by parties who had a mutual self-interest in not informing the state.
Nixon’s price and wage controls fared no better. Within two years they were abandoned, and inflation, now unchecked by any link to gold, continued unabated around the globe. As a result, interest rates soared as lenders demanded protection against the rapidly falling value of the dollar and other currencies. William Zeckendorf, a famously risk-taking New York City real estate developer, often said, “I’d rather be alive at twenty percent than dead at the prime rate.” Four years after his death in 1976, however, the prime rate itself stood at 20 percent. The inflation in 1980 reached 13.5 percent, by far the largest peacetime inflation in the nation’s history.
Meanwhile, the American trade balance continued to deteriorate. As foreign countries recovered from the war and rebuilt their economic infrastructure, they often became the low-cost producers with their new plants. As transportation costs and tariffs fell relentlessly in the postwar years, these countries were more and more able to compete successfully in the American market with American companies.
This was also true with some raw materials, especially petroleum. The petroleum industry had been born in the United States, and the country remained a net exporter of oil until the 1950s. But by the 1970s, as rich American fields were increasingly depleted and new ones became ever more expensive to exploit, cheaper foreign oil began to flow into the country in larger and larger amounts. Naturally, it wasn’t long before the oil-exporting countries sought to take advantage of this situation, forming a cartel called OPEC (Organization of Petroleum Exporting Countries) to raise prices.
As a result of the 1973 Yom Kippur War between Israel and its Arab neighbors, the Arab oil-exporting countries embargoed oil shipments to the United States. Long lines formed at gas stations in what had always been the quintessential “land of plenty,” while prices for oil products rose steeply. It came as a profound shock to most Americans and to the American economy, as the cost of petroleum affects the price of nearly every other product.
It also came as a shock to what had been the linchpin of the American economy for sixty years, the automobile industry. The industry had had the American market almost entirely to itself since the war. It had evolved into a loose cartel, sustained in part by the antitrust laws, which discouraged the Big Three automobile companies—General Motors, Ford, and Chrysler—from aggressively seeking market share from one another.
With no need to take the risks and expenses of innovation, the industry had stagnated technologically. The last major technological advance had been automatic transmission, first introduced in 1948. Instead the automobile companies concentrated on appearance, size, and power. American cars in the postwar years became larger and larger and often sported such nonfunctional features as tail fins and much chrome. The evolution of American automobiles in these years is strikingly analogous to the tendency of living things isolated from competition on lush islands to evolve into giant and often grotesque forms. Just as markets are ecosystems, ecosystems are markets.
With the oil embargo of 1973, the isolation of the American automobile market ended abruptly. With gas in short supply, far more fuel-efficient European cars saw a sharp rise in demand. The Volkswagen Beetle, which had had only a niche market among college students and families with second cars, became an icon of the new automotive era and would have a longer production run than even the Ford Model T. Japanese cars also began to invade the American market, revealing starkly how poorly many American cars were manufactured and how inefficient American manufacturing had often become.
With the long period needed to redesign and retool, the American automobile industry would struggle for more than a decade to regain its footing. By the time it did, the automobile business had become one of the first heavy manufacturing industries to be thoroughly globalized. No car today is manufactured entirely in one country, and, increasingly, such words as “American,” “German,” and “Japanese” refer only to the location of corporate headquarters and the largest concentration of stockholders.
Many American factories closed as they became obsolete, and the term rust belt entered the American lexicon. But new factories, many of them built by foreign companies, were also opening, often in other parts of the country, especially the South and West. With more modern plants and methods, these new factories could frequently produce the same quantity of goods with far less labor.
Because the media have a natural tendency to concentrate on bad news, this was often perceived as a decline in American economic power, with the rust belt as its symbol. In fact, it was the beginning of a profound restructuring. Manufacturing had been declining as a percentage of the economy since the end of the Second World War, when half the jobs in the country had been manufacturing jobs. By the mid-1970s more than two-thirds of the jobs in the American economy were in services.
But this restructuring, which is continuing after more than two decades, produced wrenching change and much individual and local economic pain. The official poverty rate, which had fallen from 22 percent in 1959 to 11 percent in 1973, rose to 15 percent by 1983. The amount of steel produced in the country remained constant at about 100 million tons a year, but the number of steel workers declined from 2.4 million in 1974 to less than a million in 1998.
Also undergoing wrenching change were government fiscal and regulatory policies. These policies, largely in place since the New Deal, were characterized by a progressive income tax with high marginal rates on large incomes and by considerable regulation of major sectors of the economy, intended both to prevent the formation of monopolies and to limit “excess competition.” They had become the accepted wisdom for how to manage a modern economy, but it was becoming increasingly clear that policies developed in the 1930s simply did not work in the very different economic universe of the 1970s.
THE FOUNDING FATHERS had perceived the executive as the major threat to financial prudence, not Congress. After all, Parliament had come into existence in the Middle Ages precisely to be a check on royal extravagance and to limit the taxing power of the king. It remained such a check in the late eighteenth century, as only men of property—taxpayers, in other words—held the franchise. The Founding Fathers expected Congress, also largely chosen by property owners in its earliest years, to fulfill the same function. But the coming of true democracy with universal male suffrage in the age of Andrew Jackson began to change Congress’s attitude toward spending.
While Congress as a whole has a collective obligation to control spending, each individual member has a personal self-interest in obtaining the most government spending in his or her own district or state. The phrase pork barrel entered the American political lexicon as early as 1904. With the coming of the New Deal and safety net programs such as Social Security, there also developed a pressure to vote for popular new benefits.
Presidents have always been at a disadvantage in budget negotiations with Congress, which has the sole power to appropriate money from the Treasury. While presidents can veto money bills just as they can any other bill, the veto is a blunt instrument at best, since the entire bill must be accepted or rejected, not just the objectionable spending. Often the only means the president—the sole official in Washington, other than the powerless vice president, elected by the entire country—had to limit such spending was impoundment, a practice begun by Thomas Jefferson and used by every president since: the president simply refused to spend the appropriated money.
As inflation began to heat up in the later Johnson years, Johnson attempted to control spending by impounding more and more money. In 1966 he impounded no less than $5.3 billion out of a total budget of $134 billion, including $1.86 billion from such popular programs as highways and education. While the Democratic-controlled Congress complained loudly, it did not relish a showdown with a Democratic president. Nixon did not fare so well. When he vetoed the Federal Water Pollution Control Act in 1972 because it was, in his view, too expensive, Congress passed the bill over his veto. Nixon then impounded $6 billion of the money the bill had appropriated. Congress saw this, not surprisingly, as a direct threat to its power of the purse.
As Nixon’s political leverage began to vanish in the Watergate scandal, Congress passed the Congressional Budget and Impoundment Control Act of 1974—commonly called the Budget Control Act, and perhaps the most misnamed major piece of legislation in American history. It stripped the president of the power of impoundment, which had never had statutory authority, and created the Congressional Budget Office, giving Congress much the same budgetary apparatus that the Office of Management and Budget gave the president, and thus the ability to dispute his estimates with estimates more congenial to its own purposes.
With the Budget Control Act, the federal budget went out of control. The deficit in fiscal 1975 was $53 billion, the largest in dollar terms since the middle of the Second World War, and it increased the national debt by nearly 10 percent in a single year. By the end of the decade the national debt was two and a half times as large as it had been in 1970, although, thanks to the accelerating inflation, it continued to fall as a percentage of GNP.
What did not fall was the percentage of GNP that passed through the government’s fiscal machine every year. Because, under a progressive income tax, higher incomes are taxed at higher rates, inflation had been pushing people into higher and higher brackets. Thus income taxes increased in real terms while incomes in real terms were often stagnant. To many in Washington, this was a highly satisfactory situation, as government revenues increased without Congress having to take the onus of voting to raise taxes.
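Bracket creep can be illustrated with a toy calculation (the bracket schedule and incomes below are hypothetical, chosen only to show the mechanism, not the actual tax tables of the 1970s):

```python
# Toy illustration of "bracket creep": with a progressive schedule and no
# indexing, a purely inflationary raise leaves more income above the fixed
# nominal thresholds, so the real tax burden rises even though real income
# does not. The schedule here is hypothetical, not the 1970s tax table.

BRACKETS = [(0, 0.15), (20_000, 0.25), (30_000, 0.40)]  # (threshold, marginal rate)

def tax(income: float) -> float:
    """Tax owed under the hypothetical progressive schedule above."""
    owed = 0.0
    for i, (threshold, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > threshold:
            owed += (min(income, upper) - threshold) * rate
    return owed

inflation = 0.10                              # 10 percent inflation...
income_before = 25_000
income_after = income_before * (1 + inflation)  # ...matched by a 10 percent nominal raise

tax_before = tax(income_before)
real_tax_after = tax(income_after) / (1 + inflation)  # deflate back to first-year dollars

print(f"Tax before: {tax_before:,.0f}  ({tax_before / income_before:.1%} of income)")
print(f"Real tax after an inflation-only raise: {real_tax_after:,.0f} "
      f"({real_tax_after / income_before:.1%} of real income)")
```

Even though the taxpayer's real income is unchanged, the average tax rate rises, which is exactly why Washington found the arrangement so congenial.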
The Democratic Party, which had dominated the country’s politics since 1932, had become increasingly out of touch with the electorate and failed to heed the increasingly clear signals of popular discontent as the American economy floundered in the 1970s. In 1978 the people of California began a “tax revolt” by passing Proposition 13, a referendum that sharply limited how much local property taxes could be raised. This sparked tax revolts elsewhere and calls for reform of the increasingly complex and arbitrary federal tax code.
To help end the stagflation plaguing the American economy, Congressman Jack Kemp and Senator William Roth proposed cutting the marginal rates on personal income taxes, just as President Kennedy had proposed more than a decade earlier with great eventual success, and indexing tax brackets to inflation so that people would not be pushed into higher brackets when their incomes were not rising in real terms.
The Kemp-Roth proposal was ridiculed by the Democrats. President Jimmy Carter, running for reelection in 1980, tried to tie his opponent, Ronald Reagan, to it by calling it the Reagan-Kemp-Roth proposal, a move that Reagan shrewdly welcomed.
Gas shortages reemerged in the late 1970s while inflation only increased. Stock prices, which had recovered from their disastrous 1974 low, began to decline again. American industry was having more and more trouble competing with other countries. New York City, which had carried the redistributionist model of social welfare much further than other cities and which was heavily dependent on taxes on financial services, went broke. With the city unable to borrow, its quality of life dropped alarmingly, with unkempt parks; graffiti-ridden, unreliable subways and buses; and a relentlessly rising crime rate.
And the American military, starved for funds by the now adamantly antiwar Democrats, who were shell-shocked by the results of the Vietnam War, was unable to respond effectively when the American embassy in Tehran was seized in 1979 and more than fifty hostages were held for over four hundred days.
The Soviet Union, whose ambitions to world domination had been contained for three decades by American economic and military power, was flexing its muscles as it had never done before. In late 1979 it invaded Afghanistan to secure a shaky puppet regime. There seemed little the United States could do in response. The most powerful nation in the world seemed to be becoming a helpless giant. Many wondered if the American century was coming to a premature end.
As a result, for the first time since Herbert Hoover, an elected incumbent president was turned out of office in a landslide. The American people voted decisively for change, and they got it. Ronald Reagan would prove to have the most consequential presidency of the twentieth century, save only for that of Franklin Roosevelt, a man and a president he greatly admired.
WHILE RONALD REAGAN is often given the credit for deregulation and lower taxes, both were in fact already under way when he took office, although he did much to continue and accelerate the restructuring of the American political economy. Much of the federal regulatory apparatus, begun as early as 1887 with the creation of the Interstate Commerce Commission and greatly expanded under the New Deal, had evolved into cartels that protected the interests of the regulated industries more than the interests of the economy as a whole.
The Civil Aeronautics Board (CAB), which regulated routes and fares for interstate air travel, kept fares on these routes far higher than those on comparable intrastate routes, which lay outside its jurisdiction. In 1978 Congress took away its power to set rates and routes, despite the ferocious opposition of both the airline companies and the airline unions (when both management and labor oppose a change in regulation, it is a sure sign that a cartel is in operation). The airline business, regulated since its infancy, underwent a painful period of adjustment as airlines began to compete on fares. A hub-and-spoke route system soon evolved, and fares changed frequently as price wars broke out among the airlines. Many old airlines, such as Pan American, Eastern, and Braniff, went bankrupt, and many new ones, such as Southwest Airlines, entered the business. Air fares fell drastically on average, and air travel increased rapidly.
The Motor Carrier Act of 1980 freed the trucking business to compete, and the Staggers Rail Act of the same year did the same for the railroads. The railroad business, in decline for most of the century, began to revive, and pointless inefficiencies—most trucks, for instance, had been required to return to their place of origin empty—were quickly drained out of the transportation business. In 1980 transportation amounted to about 15 percent of GDP; by the 1990s it had dropped to 10 percent. Since transportation is what economists call a transaction cost—an expense, like advertising and packaging, that is necessary but adds nothing to the intrinsic value of the product—this was a pure gain for the economy as a whole.
The most important deregulation of the 1970s was on Wall Street. The New York Stock Exchange had begun as an agreement among brokers to set minimum prices for stock trading, and commissions had been fixed ever since. But on May 1, 1975, under orders from the SEC, most commissions were allowed to be set by competition for the first time in 183 years.
Fixed commissions, which were calculated as a percentage of the share price, had been under pressure for years as the number of large trades increased substantially. In 1965 there had been an average of only 35 trades a day involving more than ten thousand shares; by 1975 the average was 135 such trades a day. (Today the average is more than 5,000.) These trades cost little more to execute than hundred-share trades, and the big institutions, such as mutual funds, that traded in large blocks had been demanding change. Opposing it were the smaller brokerage houses that could not match the efficiency of the major ones.
With the end of fixed commissions, the price of stock trading plunged by about 40 percent overnight and has been falling ever since. As a result, Wall Street underwent a great consolidation as the smaller firms, unable to compete, merged with larger firms. Meanwhile new firms, discount brokers, opened, offering minimal services and minimal prices as well. The most important result of Mayday, as it was inevitably called, however, was the growth in volume. In the next seventeen years it increased 800 percent and has continued to grow exponentially since. Wall Street had had its first billion-share year in 1929; by the end of the twentieth century billion-share days were the norm. As the cost of stock ownership plunged, the percentage of Americans owning stocks directly increased rapidly as well, and the role of capitalist was played by an ever-wider part of the population, with growing consequences for American politics.
Even the tax code began to change in the late 1970s. In 1969 the outgoing secretary of the treasury in the Johnson administration had testified before Congress that in 1967 there had been 155 tax returns showing incomes of more than $200,000—and 21 showing incomes of more than $1 million—that had no income tax liability at all, owing to various provisions of the tax code such as the exemption of municipal bond interest from taxation. Congress responded by passing laws requiring minimum tax payments and then an entire supplementary tax code known as the Alternative Minimum Tax. This had the effect of pushing up rates on high incomes and could raise the tax on capital gains (the amount by which an asset’s sale price exceeds its purchase price) to as high as 50 percent.
This discouraged risk taking by lowering the potential reward without lowering the risk involved. And investing in new technological possibilities is always very risky, for far more such ventures fail than succeed. It is an inescapable law of economics that if the rewards of success do not match the risks of failure, new ideas will not be tried.
It had also lowered total receipts from capital gains taxes—evidence that excessively high tax rates can cause tax receipts to fall rather than rise, the phenomenon described by the so-called Laffer curve, named for the economist Arthur Laffer. In 1968, when capital gains had been taxed at no more than 25 percent, receipts from the tax amounted to $33 billion. By 1977 receipts, in inflation-adjusted terms, were down to $24 billion. And while there had been three hundred start-up technology companies in 1968, there were none at all in 1976. For an economy that had held the technological lead for more than a century, this was an ominous trend indeed.
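The Laffer argument mentioned above can be put in a simple form (a standard textbook sketch, not a formula from the original text): revenue is the product of the rate and the taxable base, and the base itself shrinks as the rate rises, so revenue must peak somewhere between the extremes.

```latex
\[
  R(t) \;=\; t \cdot B(t), \qquad B'(t) < 0, \qquad R(0) = R(1) = 0
\]
% Revenue R is zero at a 0 percent rate (nothing is collected) and at a
% 100 percent rate (the taxed activity disappears), so it peaks at some
% intermediate rate; above that peak, raising the rate lowers receipts.
```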
Congressman William Steiger, a Republican from Wisconsin, decided to fight to change that. The Republican Party in the late 1970s seemed to many to be headed for political oblivion. Stained by the Watergate scandal and largely out of power for more than four decades, it was perceived as the party of the past. In the election of 1976, Republicans had won only 143 seats in the House to the Democrats’ 292.
In fact, the Republican Party was beginning to crackle with new ideas to address the new economic realities. The Democrats would largely cling to the New Deal model that had served them so well for forty years but that, as the great political commentator Walter Lippmann had noticed as early as 1964, was becoming outdated. As a result, only one Democratic candidate for president since that year, Jimmy Carter in 1976, has won a majority of the popular vote (and Carter won only 50.46 percent). In 1994 the Republicans would take complete control of Congress for the first time in four decades and hold it thereafter.
Steiger, who sat on the tax-writing House Ways and Means Committee, proved himself a persuasive politician and lined up his fellow Republicans in support of lowering the capital gains tax. Soon he had converted many Democrats as well, giving him a two-to-one majority on the committee. The Democratic establishment fought the proposal hard. President Jimmy Carter threatened to veto the Tax Reform Act of 1978, and the New York Times argued for eliminating all distinctions between capital gains and regular income, which would have raised the capital gains tax to as high as 77 percent.
Nevertheless, the bill passed Congress, and President Carter, despite his threat, signed it. The effect was immediate. In 1977 the venture capital industry had raised only $39 million; in 1981 the sum was $1.3 billion. Reagan built on this reform by getting Congress to enact Kemp-Roth in 1981, sharply reducing the marginal rates on high incomes—inevitably the source of most new capital. In 1986 Reagan struck a remarkable deal with Congressman Dan Rostenkowski, Democrat of Illinois and chairman of the House Ways and Means Committee. Together they agreed to further cuts in the marginal rates, the highest being reduced to a mere 28 percent, the lowest since the 1920s. In exchange, thousands of deductions and loopholes were closed, greatly simplifying the tax code and further improving the investment climate.
BY THE TIME RONALD REAGAN took office, even inflation was finally being brought under control, thanks to the Federal Reserve and its new chairman, Paul Volcker. Volcker, appointed by Jimmy Carter in the summer of 1979, changed the Fed’s old policy of managing interest rates to one of reining in the money supply, which had been growing very quickly and fueling the inflation. As a result, interest rates soared to their highest levels in U.S. history over the next few years. Even the federal government, by definition the best credit risk in the country, had to pay 15.8 percent to sell twenty-year bonds.
The inevitable result of Volcker’s policy, which was bravely endorsed by the new Reagan administration, was a deep recession, the worst since the 1930s. For the first time since the Great Depression, the unemployment rate rose above 10 percent, while the stock market fell below 800 on the Dow. The monetary medicine was bitter indeed, but with a broad safety net now in place—including such programs as unemployment insurance and the widespread provision of layoff benefits in union contracts—there was nothing like the profound, widespread distress of the 1930s.
And the benefits were not long in coming. Inflation began to break. It had raged at 13.5 percent in 1980; the next year it was 10 percent; in 1982 it was 6.2 percent, the lowest since the early 1970s; in 1983 it was 4.1 percent. It averaged less than that for the rest of the decade.
With inflation under control, interest rates began to decline, although not nearly as quickly, as lenders still demanded protection against a feared resurgence. With rates falling, borrowing and investment picked up, and the recession came to an end. The end was heralded, as usual, by a resurgent stock market, which began rebounding with a classic buyers’ panic in August 1982. By the end of that year the Dow had crossed 1,000 for the last time. The greatest bull market in world history was under way.
Part of that bull market was a new wave of mergers and acquisitions, the fourth to work its way through the American economy and in many ways strikingly similar to the first one in the 1890s. Low stock prices in terms of corporate assets, falling interest rates, and new capital-raising techniques such as “junk bonds”—bonds that paid high interest rates and financed risky, often untried ideas, such as CNN, the first all-news cable network—fueled the movement. By the end of the decade, more than one-third of the Fortune 500 companies would be taken over or merged. Just as in the 1890s, some of these mergers produced greatly improved economic performance and leaner, more flexible organizations. Others were misbegotten and failed. And some were tainted by fraud and shady dealing. But there is no doubt that the American economy was far stronger at the end of this merger wave than it had been before it.
By 1987 the Dow-Jones Industrial Average had reached 2,500, more than three times as high as it had been a mere five years earlier, and the underlying economy was seen as basically sound. Nevertheless, in October of that year the market suffered the worst crash since 1929 and the worst one-day decline in percentage terms—22.6 percent—in its history. The volume, 604 million shares, was then utterly unprecedented, more than twice the previous record.
Many thought that this signaled the start of a new Great Depression. In fact, the market recovered 104 points the next day (on even higher volume, 608 million shares) and reached a new high on the Dow within fifteen months. The reason, principally, was that the Federal Reserve acted immediately and decisively to stem the panic and to protect the economic institutions of the country from harm. It “flooded the street with money,” in the words of Benjamin Strong, as it pumped massive liquidity into the economic system.
For the first time since Alexander Hamilton had stemmed the panic of 1792, federal monetary authorities had performed as they should in a moment of financial crisis. As a result, there was little long-term damage to the system as a whole, and the 1987 crash is today hardly remembered at all. The ghost of Thomas Jefferson’s hatred of getting and spending, it seemed, had at last been laid to rest. Unfortunately, that ghost was to have one more turn upon the stage of the American economy.
FRANKLIN ROOSEVELT had been reluctant to accept the idea of deposit insurance, for he feared the “moral hazard” it ineluctably created. “We do not wish to make the United States government liable for the mistakes and errors of individual banks,” he said, “and put a premium on unsound banking in the future.” But politics is usually a choice between imperfect means to desirable ends, and the Federal Deposit Insurance Corporation functioned well in the banking cartel that developed under the New Deal.
Commercial banks, savings banks, and savings and loan associations divided the world of American deposit banking among them. Commercial banks became full-service banks, offering savings and checking accounts to individuals while concentrating on business loans. Savings banks and S&Ls offered savings accounts at a slightly higher interest than commercial banks (interest rates were set by law), while concentrating on real estate loans. Even this market was allocated, for savings banks specialized in commercial real estate while the S&Ls lent almost exclusively on single-family houses. New charters were limited to prevent “excess competition.” While the number of S&Ls remained steady at around six thousand after the debacle of the 1930s, their collective assets rose from $8.7 billion to $110.4 billion between 1945 and 1965.
It was a cozy, low-stress business, what someone called 3–6–3 banking, because S&Ls paid 3 percent on deposits, charged 6 percent on loans, and management hit the golf course by 3 PM. But as the booming 1950s and early 1960s gave way to the gathering inflation of the late 1960s and 1970s, the business model of the S&Ls began to fall apart. Unregulated interest rates soared while the regulated banking rates stayed the same. Wall Street brokerage houses and mutual funds began offering money market funds that paid a far higher rate of interest than savings accounts.
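The arithmetic of the squeeze is simple enough to sketch (the figures below are hypothetical round numbers, used only to show why fixed long-term loan yields plus rising deposit costs destroyed the thrifts' margins):

```python
# A hypothetical S&L balance sheet, to show how the 3-6-3 model broke down.
# The figures are illustrative round numbers, not data from the text.

assets = 100_000_000          # long-term mortgages yielding a fixed 6 percent
loan_yield = 0.06             # locked in when the mortgages were written

def annual_margin(deposit_rate: float) -> float:
    """Net interest income if the S&L must pay deposit_rate to keep its funding."""
    return assets * (loan_yield - deposit_rate)

for deposit_rate in (0.03, 0.06, 0.10, 0.15):
    margin = annual_margin(deposit_rate)
    print(f"Deposit cost {deposit_rate:.0%}: net interest income {margin:,.0f}")

# At 3 percent the old model earns a comfortable spread; once money market
# funds force deposit costs past the fixed 6 percent loan yield, the same
# balance sheet loses money on every dollar it holds.
```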
People increasingly withdrew money from savings banks and savings and loan associations and moved it to the new money market funds, a movement referred to by the sonorous economic term “disintermediation.” Commercial banks, most of whose deposit base was in noninterest-bearing checking accounts, could cope. The other banks could not. Faced with a rapidly declining deposit base and with long-term real estate loans paying low interest, they went to the federal government for help.
Congress was anxious to oblige. As Senator David Pryor explained, “You got to remember that each community has a savings and loan; some have two; some have four, and each of them has seven or eight board members. They own the Chevy dealership and the shoe store.” In other words, they were exactly the people whose support members of Congress needed. As so often happens in a democracy, short-term politics trumped economic reality, and what followed was a near-textbook case of how not to deregulate an industry.
The savings banks and S&Ls should have been forced to merge with stronger institutions or to become commercial banks themselves, with the same capital and reserve requirements. Instead the ceiling on interest rates was removed, allowing banks to pay market rates on deposits, and the federal guarantee on bank deposits was raised to $100,000 from $40,000.
But Wall Street had found a way around even that generous limit with a device called brokered deposits, bundled deposits that exactly matched the federal guarantee. It was a simple means to allow people with large liquid holdings to have as much of their money federally insured as they wished. It was also what came to be called “hot money,” money that followed the highest interest rates.
With S&Ls paying higher and higher interest rates on deposits, while still stuck with their low-interest real estate loans, the industry rapidly went broke. The S&Ls had a collective net worth of $32.2 billion in 1980. Two years later it was $3.7 billion. Bank regulators, under pressure from Congress, came up with new quick fixes. Reserve requirements were lowered and accounting rules were relaxed. This had the effect of making the books look better without addressing the problem. It was a bit like a doctor declaring a temperature of 102 degrees to be normal so that a sick patient could be declared healthy.
And they changed the rules on who could own S&Ls. Instead of local people only, almost anyone could now own a thrift and could even count noncash assets, such as land, the most illiquid of all assets, as reserves. Latter-day Willie Suttons, sensing opportunity, moved into the industry.
In 1982 Congress allowed the S&Ls to write nonresidential mortgages and make consumer loans, just like commercial banks, but without anything like the capital and reserve requirements or accounting strictures of commercial banks.
A disaster was now unavoidable. Congress and the banking regulators had permitted the creation of an economic oxymoron—the high-yield, no-risk investment known as the brokered deposit—and then let people with little banking experience and often dubious respect for the law try to make it profitable. They quickly destroyed the thrift industry, and when the dust had settled, the federal government had to pay out some $200 billion to depositors in failed thrifts.
It was the greatest financial scandal in American history. But, as with all scandals, it pointed the way to reform by overcoming entrenched resistance to it. In 1994 the Riegle-Neal Act finally freed the American banking industry from the last of its Jeffersonian shackles, allowing banks to branch across state lines and to become much larger, providing the protection of diversification and setting off a wave of bank mergers that continues to this day. In 1999 the distinction between investment banks and deposit banks created in the 1930s by Glass-Steagall was repealed as well, along with much of the distinction between brokers, bankers, and insurance firms. At last, the United States had a banking system that matched the American economy in both scale and scope.
BECAUSE OF THE RESTRUCTURING of the American economy in the 1980s, the dollar strengthened markedly against other currencies. It had been worth only 1.8 German marks in 1980; by 1985 it was worth 3 marks. The value against the French franc more than doubled. And foreign investment in the United States rose sharply. By the late 1980s foreigners owned about $400 billion more in assets in the United States than American citizens owned abroad, reversing the situation that had been the case since the First World War.
While some commentators lamented this as a sign of American weakness, it was, in fact, the opposite. Foreign capital poured into the country precisely because the American economy came to be increasingly seen, once again, as one of great opportunity. Foreign immigration also increased sharply in the 1980s, as the poor, like the rich, sought to prosper in the empire of wealth.
The reform of taxes and regulation that marked the true end of what Arthur Schlesinger Jr. had, two decades earlier, termed the Age of Roosevelt could not have been better timed. The world economy was undergoing its most profound transformation since the coming of the Industrial Revolution two centuries earlier, perhaps since the coming of agriculture ten millennia ago. And because the United States was the first major country to undergo the wrenching changes required to move beyond a political economy based partly on redistribution, it was positioned to exploit first, and most fully, the boundless possibilities of a new political economy based on opportunity.