EPILOGUE
America After 9/11
On September 11, 2001, Deputy Secretary of State Richard Armitage declared, “History begins today.” He was meeting with the head of the Pakistani intelligence service, who happened to be in Washington when al Qaeda terrorists flew hijacked airplanes into the twin towers of New York’s World Trade Center and the Pentagon; a fourth hijacked plane crashed in Pennsylvania after passengers tried to seize control. Armitage warned that the United States held no brief for the history that had led Pakistan to support the Taliban government in Afghanistan, which was allowing Osama bin Laden to operate there. But Armitage’s assertion had a broader meaning, widely shared in Washington and the nation: a radical disjuncture had taken place. President George W. Bush said that on September 11 “night fell on a different world.”
The 9/11 attacks and the reaction to them seemed to start a new phase of American history. Within less than two years, the United States began two wars, declared it had the right to take preemptive military action against countries it deemed threatening, and hardened its campaign against terrorism to include torturing prisoners captured abroad. At home, the government undertook a sweeping security program that included electronic surveillance without warrants, intrusive searches at airports, and color-coded alerts that raised and lowered national anxiety. In the process, Washington accrued new powers, often with little public debate or even public knowledge. The enhanced powers of the state were not restricted to security; in 2008, when the economy nosedived, the government lent hundreds of billions of dollars to banks and financial firms with much secrecy and little oversight.
The United States felt different in 2011 than it had ten years earlier. But much that seemed novel after 9/11 had roots in the previous fifty-six years, in the country’s quest for international power and stability, in struggles over the meaning of democracy, in debate over the proper role of government, in the push to expand mass consumption, in the shift from industry to finance, and in changing ideas of national greatness. The troubled first decade of the twenty-first century represented the legacy of postwar America as much as a break from it.
September 11, the War on Terrorism, and the Costs of Empire
The 9/11 attacks came out of history, not from outside it. For over a half century, the United States had pursued an expansive international political, economic, military, and cultural presence. Across the world it figured large, directly and indirectly, in the lives of billions of people, just as life within the United States depended more than ever on what happened outside it. Before World War II, it would have been difficult to imagine that developments in so distant a place as Afghanistan mattered to many Americans, but a half century of imperial reach brought echoes of events halfway around the world back home, sometimes with deadly effect.
September 11 stunned America. There had been no comparable foreign-launched attack on the United States proper since 1814. The scale of death and suffering in New York, Washington, and Pennsylvania exceeded the toll of any accident or violent incident since World War II. Millions of Americans watched the 110-story World Trade Center towers collapse live on television, an experience without precedent, as the solidity of their society seemed to disintegrate before their eyes.
During the years after World War II, most Americans had believed their country to be benign in its global activities, helping others as much as promoting its own interests, leaving the country ill-prepared for the murderous hostility it faced on 9/11. In his justifications for terrorism, bin Laden, while denouncing “Crusaders,” Jews, and Western involvement in the Middle East, pointed to specific historical events and political grievances, including the establishment of U.S. military bases in Saudi Arabia during and after the first Gulf War, Western control of Middle Eastern resources, and American backing for Israel and corrupt Arab regimes. Washington spurned any effort to review the events, circumstances, and policies that led to 9/11, as if doing so would somehow excuse the attacks or shift blame away from the perpetrators. The attacks, the president repeatedly said, were an act of evil, the work of “evildoers,” a framing that cast them as a form of moral and spiritual pathology rather than an outgrowth of political and military conflict. Bush’s rhetoric echoed the Manichean formulations used during World War II and the Cold War that counterposed free and slave worlds, good and evil, rights and repression. Like Harry Truman before him, Bush described a battle not over national interests or geostrategy but over a “way of life.”
Though the huge military-security apparatus that the United States had built up since the outbreak of World War II had failed to protect it on 9/11, six decades of militarism conditioned American leaders to see the problem of terrorism in martial terms. The Bush administration responded to the attacks by launching a “war on terror” directed at “every terrorist of global reach” and “any nation that continues to harbor or support terrorism.” Just hours after the attacks, Secretary of Defense Donald Rumsfeld, according to the notes of one of his aides, said the United States had to “go massive—sweep it all up—things related and not” to bin Laden.
Many senior figures in the Bush administration had held government posts during the latter part of the Cold War and had sought to build up military strength to counter the Soviet bloc. Once the Soviet Union collapsed, the major restraint on the use of military power disappeared. Easy victories against overmatched opponents like Grenada, Panama, Serbia, and Iraq (the first time) fueled overconfidence in the ability of the United States to reshape other nations and bred a recklessness that American leaders had generally avoided during the Cold War.
At first it seemed that the United States would be able to leverage its technological and economic superiority to project military power and impose order on its own terms. Right after 9/11, the United States demanded that the Afghan government turn over al Qaeda leaders and close the group’s training camps. When the Taliban government refused, the United States launched a military offensive against it, using a combination of airpower, special forces, intelligence operatives, and massive cash outlays to buy the cooperation of warlords and dissident political factions. Within two months, the Taliban regime collapsed.
But the victory turned out to be far from complete. The new Afghan government, headed by exiled anti-Taliban leader Hamid Karzai, proved limited in its ability to exert its authority, allowing for the eventual resurgence of the Taliban. Renewed violence took an increasing toll on American forces, with 1,446 deaths as of the end of 2010. And while the United States succeeded in shutting down al Qaeda operations in Afghanistan and killing many members of the group, bin Laden managed to escape into Pakistan when American armed forces failed to mobilize sufficient troops to surround him during a fierce battle in the Tora Bora mountains in late 2001.
Even as the World Trade Center and Pentagon smoldered, Rumsfeld, his deputy, Paul Wolfowitz, and other Bush administration leaders pressed for a response that encompassed more than going after terrorists. September 11, they believed, presented the opportunity for a broader reordering of the world, especially the Middle East, along lines congenial to American interests and values. Getting rid of Iraqi leader Saddam Hussein, they argued, would be a start toward reconstructing the Middle East and a demonstration to terrorists and hostile states of the reach of American power and its willingness to use it.
The Bush administration put forth two reasons for invading Iraq. Most importantly, it claimed that Iraq had defied UN resolutions demanding it shut down its programs to develop chemical, biological, and nuclear weapons of mass destruction (WMD). Bush and Vice President Dick Cheney also claimed that Saddam had ties to al Qaeda and promoted terrorism. Both assertions were false. In July 2002, the head of British foreign intelligence reported after meetings in Washington, “Bush wanted to remove Saddam, through military action, justified by the conjunction of terrorism and WMD. But the intelligence and facts were being fixed around the policy.” That summer a senior Bush adviser told a reporter, “We’re an empire now, and when we act, we create our own reality.”
During the Bush administration, the United States behaved more frankly as an imperial power than at any time since the early twentieth century. In a June 2002 speech at West Point, Bush said the Cold War strategies of deterrence and containment did not suffice against new kinds of threats. In the face of terrorism and “unbalanced dictators with weapons,” the United States needed “to be ready for preemptive action.”
The Iraq War implemented the doctrine. The United States attacked Iraq because it seemed that it could, at relatively low cost: a supply-side war. Kenneth Adelman, a onetime aide to Rumsfeld and Jeane Kirkpatrick, captured the mood of conservatives when he wrote in the Washington Post that “liberating Iraq would be a cakewalk.”
Post-9/11 fear, patriotism, and deference to authority facilitated the drive to war. Much of the media swallowed the administration’s false claims whole, with liberal newspapers like the New York Times and the Washington Post beating the drums of war. Large demonstrations against an invasion of Iraq, at home and around the world, had no impact on the actions of the United States and its allies. Some Republicans and Democrats opposed invading Iraq, but the president easily won congressional authorization for military action.
Within three weeks of the initial attack on Iraq in mid-March 2003, American forces captured Baghdad and the Iraqi army melted away. The war, though, had just begun.
Bush administration leaders thought that removing Saddam and his circle would be all that was needed to create a new, democratic, free-market Iraq. They believed Iraq to be a sophisticated society, with oil reserves that could pay for reconstruction and a population that would be grateful to the United States for ridding it of a brutal dictator. They pointed to post–World War II Germany and Japan as examples of how democratic societies could emerge under U.S. tutelage once autocratic regimes had been decapitated.
Nothing could have been more wrong. In Germany and Japan, not a single U.S. soldier was killed during their occupations. Over four thousand Americans died during the occupation of Iraq. Bush and his closest advisers paid little attention to postwar planning, having rejected before taking office the idea that the United States should get itself involved in “nation building.” Instead, they assumed that the Iraqi military and police would maintain postwar order, while a new political leadership would quickly emerge and the economy would blossom with privatization and increased oil production.
None of this happened. Instead, as soon as U.S. troops reached Baghdad, looting began. It continued for weeks as American soldiers, too few under Rumsfeld’s lean war plan to impose order, stood by. By the time the looting ended, innumerable industrial facilities and almost every government building, from the national museum to ministry headquarters to ammunition dumps, had been stripped of everything of value, making it impossible to quickly restore any semblance of normality and putting tons of arms and explosives into circulation. Within months, serious military resistance to the occupation ramped up from Saddam loyalists, anti-occupation nationalists and Shiite groups, and foreign terrorists who flocked to Iraq, eager to take on the Americans. Sectarian violence, primarily between Shia and Sunni, took a ghastly toll.
As the Bush administration belatedly realized it would have to actually administer Iraq, it assembled an occupation bureaucracy stunning in its incomprehension and incompetence. Since World War II, the United States had developed far-reaching global interests while remaining parochial in its domestic culture, leaving it ill-equipped for old-fashioned, on-the-ground imperialism. The head of the Coalition Provisional Authority (CPA), L. Paul Bremer, a longtime diplomat and associate of Henry Kissinger’s who styled himself an imperial consul in the mold of MacArthur in Japan, quickly made a series of disastrous decisions, including banning Baath Party members from government service and dissolving the Iraqi army. Those orders left tens of thousands of armed men without pay, many of whom joined the anti-occupation insurgency.
The CPA staff had little expertise or experience in postwar reconstruction or the Middle East. As in Vietnam, the United States tried to run another country from a small island of America, the “Green Zone” in the heart of Baghdad, complete with swimming pool, sports bars, American food, a fleet of SUVs, and almost no Americans who could speak Arabic. Recent college graduates with connections to the Republican Party or conservative think tanks were given huge authority over a country they knew nothing about. As what started as a brief conventional war morphed into a long, brutal counterinsurgency struggle, more and more Iraqis turned against the American-led coalition.
After years of fighting and a “surge” of U.S. troops sent to Iraq in 2007, violence in the country diminished and the government set up after national elections took increasing responsibility for basic state functions. But Iraq remained violent, unstable, and underdeveloped. Five years after the invasion, electricity production only modestly exceeded the preinvasion level, itself depressed by years of international sanctions that had made infrastructure development and maintenance difficult. Oil production remained below what it had been under Saddam.
The cost of the war was horrendous. For the United States, it was the bloodiest conflict since Vietnam. An estimated 100,000 Iraqi civilians died.
A vast expansion of the national security apparatus accompanied the war on terror. One of the great changes in American life after World War II had been the development of a national security state, which in the name of fighting communism and other dangers engaged in a wide range of often secret activities. The end of the Cold War brought a modest cutback in covert surveillance and policing, but 9/11 reversed that trend, as the security sector grew larger, more intrusive, and more willing to push legal and constitutional limits.
The shock, fear, and national unity after 9/11 enabled the Bush administration to accrue extraordinary powers. In late October 2001, Congress, with massive majorities, passed the USA Patriot Act, which expanded authority for domestic wiretapping, allowed the government to track Internet and financial activity, and eased the requirements for search warrants. Congress appropriated $40 billion for domestic security and the fight against al Qaeda, followed by more than $80 billion over the next two years. Bush established the Office of Homeland Security, later made a cabinet department, which took over border security and immigration control and included the Transportation Security Administration, set up to replace the private contractors providing air travel security. By the end of the decade, the counterterrorism apparatus had grown staggeringly large, involving, according to the Washington Post, over a thousand government agencies and nearly two thousand private companies working at ten thousand locations, with an estimated 854,000 people holding top-secret security clearances.
Some steps the Bush administration took secretly, without congressional approval, including setting up a program of warrantless electronic domestic surveillance. Bush directives denied captured foreign terror suspects access to any court, allowed them to be held indefinitely without charges, and permitted the use of “enhanced interrogation techniques” that in some cases amounted to torture. In February 2002, Bush announced that the United States would not consider the Geneva Conventions on the treatment of prisoners of war, to which it was party, applicable to fighters captured in the war on terror.
In choosing to conduct the antiterror campaign as it did, the Bush administration was promoting an agenda that predated 9/11. Many of its top members bemoaned what they saw as a weakening of the executive branch, especially the presidency, in the wake of Vietnam and Watergate. Reversing a historical pattern in which liberals had been the main backers of a strong presidency, conservatives embraced the idea of a potent executive branch. Dick Cheney—more powerful than any previous vice president—aggressively promoted far-reaching presidential action unrestrained by congressional or court oversight, international agreements, or public disclosure.
The U.S. government used all the power it gave itself, and then some, to try to extract information from captives accused of terrorism. Some suspects were handed over to allies that the United States knew would use rough treatment or torture, a practice known as rendition. Others were whisked off to secret CIA prisons set up around the world, where they were subjected to isolation, violence, and torture (often conducted by contractors, part of a general policy of hiring civilian companies to do work once done by military, intelligence, and police organizations). For less important captives, the United States erected a large prison complex, with open-air cages, at its Guantánamo Bay base in Cuba.
Though such post-9/11 practices, when eventually revealed, shocked many Americans as departures from long-standing notions of law, morality, and national values, they had historical roots. The most notorious torture technique used after 9/11, waterboarding, the simulated drowning of prisoners, was a variation of a torture method used by the U.S. Army on Filipino independence fighters early in the twentieth century. During the Cold War, the federal government had been explicit in saying that it would do whatever was necessary to defend the United States, and it did many unsavory things, from attempting assassinations to supporting death squads to using mind-altering drugs and other coercive techniques in interrogations. But the revelations of torture at CIA detention centers, the leaking of photographs of American soldiers abusing Iraqi prisoners at the Abu Ghraib prison, and the shooting of Iraqi civilians by contractors working for the American government undermined the claims by the United States of moral superiority, on which the justification for its wars partly rested, and presented a disturbing image of what kind of society it had become.
Americans selectively looked to the past to guide themselves forward. In his diary, Bush called 9/11 “the Pearl Harbor of the 21st Century” and took to calling himself a “war president,” putting himself in the same category as Lincoln and FDR. The media called the site of the collapsed World Trade Center “ground zero,” an adoption of the World War II term for the point directly below the nuclear bombs exploded at the Trinity test site and in Japan. In his 2002 State of the Union address, Bush used the phrase “axis of evil” to describe Iraq, Iran, and North Korea, recalling Germany, Japan, and Italy, the Axis powers of the Second World War. Later, in announcing the end of major combat in Iraq on the aircraft carrier USS Abraham Lincoln, the White House drew on the dramaturgy of the Japanese surrender aboard the USS Missouri in Tokyo Bay, with a speechwriter consulting General MacArthur’s address in preparing Bush’s remarks. (Bush also looked to World War II in rejecting anything like the internment of Japanese Americans and seeking to minimize public and government discrimination against Muslims.) But analogies to the past obscured the changes that had taken place in the United States and its circumstances. Al Qaeda, a small if deadly organization, never represented the existential threat to the United States once posed by Germany and Japan and later by the Soviet Union, while North Korea, Iraq, and Iran had far fewer resources and much more modest records of aggression than the original Axis powers.
To the extent that global military power had once served Americans well, its worth became less clear as its cost in lives, dollars, and international standing mounted. The Bush administration wars, especially Iraq, dissipated the worldwide goodwill toward the United States evident after 9/11. The United States proved itself to be a can’t-do imperial power, an incompetent, blustering, sometimes brutal nation, whose hubris and carelessness caused enormous harm to its allies, those it claimed to be helping, and itself. Yet empire had become so interwoven in the fabric of American life that even as public support for the Iraq and Afghanistan wars diminished, issues of war and peace and foreign policy remained subsidiary notes in national politics and discussion.
Reaganism Redux
During the dark days after 9/11, few Americans anticipated the kinds of hardship their country would go through when the economy plunged into a severe recession in 2008. The quarter century after World War II had been a period of exceptionally robust growth and shared benefits, laying the basis for a democratic revolution in politics and culture. The next quarter century saw slower growth, stagnant income for most Americans, growing inequality, and a shift of power to the private sector. But only on the fringes of political discourse did suggestions arise that the American system of political economy might not be sustainable. The free fall after 2008 changed that, making it painfully obvious that the economy no longer served most Americans as well as it once had.
In domestic policy, the Bush administration in many respects amounted to a rerun of the Reagan years. Like Reagan, Bush believed the country would prosper if the government reduced its role in the economy, giving free rein to market forces. Like Reagan, he gave top priority to tax-cutting, especially for the well-off, and loosening of government regulations. Though the bitter partisan battling that characterized the Clinton years continued and fights over so-called social issues like abortion rights, embryonic stem cell research, and gay marriage raged, few significant changes in public policy occurred. Instead, the major transformations in daily life largely resulted from developments in the economy, especially the financial sector, continuing the pattern since the 1970s that increasingly put the power to shape society in private hands.
When Bush took office, the federal government was well on its way toward paying off the entire national debt for the first time since 1835, having run budget surpluses for four consecutive years. Bush wanted to distribute excess federal funds through tax cuts, arguing, “The surplus is not the government’s money; the surplus is the people’s money,” a Reaganite formulation that saw government not, as Abraham Lincoln did, as “of the people, by the people, for the people,” but as standing in opposition to them. The recession that began in March 2001, following the collapse of the stock market bubble, led the newly inaugurated president to shift his argument for lower taxes to the need for an economic stimulus. Congress gave him most of what he wanted, a $1.3 trillion tax cut that reduced both the top and bottom income tax rates and increased the amount exempt from estate taxes (“death taxes,” Bush called them).
The 9/11 attacks hurt an economy already in trouble, leading to a 20 percent drop in stock prices and a jump in unemployment. Bush sought to boost the weak economy with another tax cut, heavily skewed to the rich. With the federal government already having gone from running a surplus to running a deficit and the cost of the war on terror soaring, the second round of tax cuts faced more opposition than the first. But with the vice president casting the deciding vote in a deadlocked Senate, in May 2003 Congress accelerated the implementation of the previous tax cuts, lowered the top rate for capital gains and dividend taxes, and increased depreciation allowances for small businesses. Most taxpayers ended up with only slightly smaller tax bills, but the wealthy got very substantial reductions.
The Bush administration seemed at ease with the rising tide of red ink created by its tax policy. Cheney reportedly said, “Reagan proved deficits don’t matter.” Bush took no steps to increase revenue even as the national debt ballooned over the course of his presidency from $3.3 trillion, representing 33 percent of GDP, to $5.8 trillion, 41 percent of GDP.
While the United States had gone through periods of high deficits in the past without lasting damage, Bush broke all precedent in not seeking new taxes to finance the wars the country was fighting. Afghanistan and Iraq had immediate fiscal consequences, accounting for about a third of the deficits run up between 2004 and 2006. Their long-run cumulative costs (including expenses that would continue for decades, such as medical care for injured soldiers) were staggering, by one estimate reaching roughly $2 trillion by 2011, $17,000 for every household in the country. One of the president’s few successful domestic initiatives, a prescription drug plan for senior citizens approved by Congress in 2003, also proved costly.
The drug plan and the 2001 education law, No Child Left Behind, reflected Bush’s willingness to support some expansion of the welfare state, especially if market mechanisms were employed. But more broadly, especially in economic regulation, the Bush administration displayed hostility to government action. Like Reagan, Bush used funding cuts, administrative action, and staffing decisions to undermine or diminish regulatory standards and enforcement. Though as a candidate he had acknowledged the problem of global warming and pledged to address it, once in office he reversed course, rejecting the UN-sponsored Kyoto Protocol on global warming and challenging the scientific consensus that human activity was altering the climate.
Bush administration skepticism of government and expertise—a well-established conservative outlook—contributed to its catastrophic bungling of rescue and relief efforts during Hurricane Katrina in 2005 (much as it did to the failed occupation of Iraq). Bush’s first head of the Federal Emergency Management Agency (FEMA), his longtime aide Joe M. Allbaugh, told Congress that the agency had become “an oversized entitlement program.” Bush put political allies with no expertise in handling emergencies into top agency posts. Placed within the Department of Homeland Security in 2003, FEMA saw resources and attention diverted to counterterrorism.
When Hurricane Katrina bore down on New Orleans, top FEMA officials received up-to-the-minute reports on the horrendous damage it was wreaking. But as levees gave way, flooding much of the city, FEMA did little to help overwhelmed local and state agencies, having failed to pre-position necessary supplies or mobilize federal assets. Television networks broadcast images of people stranded on roofs, floating on mattresses, and clinging to trees, while conditions at a domed football stadium being used as a mass shelter became hellish and looting broke out across the city. Yet the federal government remained inert and oblivious, the president congratulating FEMA director Michael Brown for a job well done when in reality the agency had barely swung into action.
An estimated eleven hundred people died in New Orleans as a result of Katrina. It is impossible to say how many could have been saved if the full resources of the U.S. government had been mobilized in a timely, effective manner, but surely the death toll would have been lower and the suffering less. Live television coverage of a major American city being essentially abandoned to its terrible fate shocked the nation and the rest of the world. Driven by an ideological aversion to government, cronyism, and a view of national security narrowly focused on terrorism, the Bush administration failed to protect the lives and safety of Americans when a predictable and predicted crisis arrived, a damning measure of how inept the world’s only superpower had become.
Bubble and Bust
Borrowed money fueled the economy during the Bush years, inflating an unprecedented bubble in housing prices. For a while, housing-driven growth masked fundamental economic problems, including stagnant income, growing inequality, declining manufacturing, and huge trade imbalances. When the housing bubble burst in 2007 and 2008, it brought down the financial sector, which had profited enormously from selling dubious mortgage-related securities, plunging the United States into its worst economic crisis since the Great Depression.
Government policy helped create the debt-based post-9/11 bubble economy. The Federal Reserve responded to the 2001 recession by lowering interest rates to historically low levels and keeping them there, even after the economy began to recover. In 2003 and 2004, the Fed set interest rates below the rate of inflation, an extraordinary inducement to private borrowing.
Over time, borrowing had become ever more important to keeping the economy going, because earnings for all but the wealthy had stagnated since the 1970s and personal savings had declined. Consumer spending and the stimulus it provided to the economy depended on borrowed money. Federal deficit spending provided further stimulus.
The Bush-era borrowing spree was possible because foreigners were willing to lend massive amounts of money to the government and consumers. By 2008, nearly a third of mortgage debt was owed to foreigners, as was two-thirds of federal borrowing. Some money flowed in from abroad because investors believed they could make good profits, for example by buying securitized mortgages. But with interest rates low, money came for other reasons as well. Foreign governments, institutions, and individuals saw Treasury bonds as extremely safe investments, a good place to park money even if the return would be negligible. Countries that exported goods to the United States, most importantly China, bought its debt to keep the dollar strong, which made it cheap for Americans to buy foreign-produced goods.
For a while it seemed like sheer magic, a system in which everyone won: the U.S. government could cut taxes even as it fought expensive wars; consumers could borrow vast amounts of money at low interest rates to buy houses, cars, and other goods; foreign governments could keep their money safe and their factories humming; and the financial sector could profit handsomely from all the lending and borrowing, with its share of GDP rising to 8.3 percent in 2007, from 7.0 percent in 1998. But the growing mountain of debt eventually toppled, because real economic growth occurred at too slow a rate to support it.
Even before the collapse, between 2000 and 2007 the United States lost three and a half million manufacturing jobs. Though some economic sectors shielded from international competition did well, including construction (which benefited from the housing boom), retail (which had lots of cheap imported goods to sell), and hospitality, overall economic growth was anemic. The GDP rose an average of 2.3 percent a year between 2000 and 2007, well below the rate in the previous decade, the decade before that, and the decade before that. The mighty economic engine, once based on making, growing, and processing things, which had propelled the United States to world greatness and transformed life at home, was slowly winding down.
The housing boom accounted for much of the growth that did occur. Housing prices had begun to rise faster than inflation during the late 1990s and kept rising as the Federal Reserve drove down interest rates and new lending practices expanded the pool of buyers. Because median family income remained flat, even after the economy recovered from the recession of the early 2000s, the number of families eligible for mortgages did not rise. So financial institutions, with the backing of the government, lowered the qualifications and documentation needed for loans. The policy reflected the financial sector’s quest for robust profits in a slow-growing economy and the long-standing national belief in homeownership as a social good. Subprime mortgages that did not require borrowers to meet traditional criteria accounted for a quarter of all home loans by 2006 and helped push up the homeownership rate to a historic peak. In 2004, 69 percent of American families owned their own home.
Mortgage lenders were willing to make riskier loans because increasingly they only briefly retained them, selling them to other financial institutions, which bundled them into large pools (supposedly to lower the risk) and then sold securities backed by the income stream from those mortgage packages. To further reduce the risk, many buyers of such securitized instruments also bought credit default swaps, a lightly regulated form of insurance.
Home prices rose on average 51 percent between 2000 and 2005. In some areas, especially in the South and Southwest, the increases were staggering. The average price of a home in Los Angeles went from $161,000 in 1995 to $228,000 in 2000 and $585,000 in 2006. Such valuations led to a wave of speculation—in 2004 nearly a quarter of homes were bought as investments, not for owner occupation—further fueling price hikes. Soaring home prices allowed millions of Americans to get cash for other purchases by remortgaging their homes or taking out home equity loans. As long as house prices kept increasing, homeowners remained confident that they could pay back their growing debt when they eventually sold their houses.
The arrangement held only as long as house prices kept rising, and at some point, as happens with every bubble, they began to fall: slowly at first, starting in mid-2006, then at a stomach-wrenching rate, down 10 percent during the second half of 2007 and 20 percent in 2008. As prices dropped, owners were unable to keep pulling money out of their homes, pushing down the increase in personal consumer spending to near zero in 2008. By the start of 2009, a sixth of houses with mortgages were worth less than their owners had borrowed, leading many families to simply walk away from their loans. More than one-fifth of homes being sold were in foreclosure, further driving down prices and effectively shutting down new construction.
The housing crisis spread to the financial sector as mortgage defaults led to actual or feared defaults on mortgage-backed securities, causing their value to plummet. Hedge funds holding such securities began to totter as investors tried to withdraw their money, forcing large asset sales that depressed prices even more. Banks facing huge potential losses became reluctant to lend money for any purpose at all.
By the second half of 2007, the economy was sinking toward recession, with the Federal Reserve trying, unsuccessfully, to shore up the financial system. In March 2008, the giant brokerage firm Bear Stearns had to be bailed out. When in September 2008 Lehman Brothers went bankrupt, leading to something close to a worldwide credit halt, it seemed possible that the entire financial system would melt down. Federal Reserve chairman Ben Bernanke told congressional leaders and Bush administration policymakers, “We are headed for the worst financial crisis in the nation’s history. . . . We’re talking about a matter of days.”
As the Bush administration drew to a close, Congress, scared into action, allocated $700 billion for a financial rescue program to be shaped by the Treasury, with few guidelines and little oversight. By pumping massive streams of taxpayer money into the banks whose poor judgment and lack of due diligence had driven the country to the edge of catastrophe, the federal government managed to keep the financial system afloat. But with the collapse of the housing market and the credit crunch, the country slipped into the most serious recession in over three-quarters of a century.
The 2008 crisis had been a long time in the making. In its immediate aftermath, journalists, politicians, and pundits found plenty of proximate causes: Alan Greenspan’s refusal in the early 2000s to recognize that a housing bubble was developing from his low-interest policies; the fraudulent practices that riddled the mortgage industry; the hubris of the titans of finance, wielding ever more arcane mathematical models to claim to have created a risk-free world of pure profit; the incestuous ties between regulators and those they regulated. But the collapse had deep roots in the very nature of the political economy that had developed since the 1970s. The decline of manufacturing, deregulation, the financialization of the economy (the financial sector accounted for more than 30 percent of all corporate profits in 2004), globalization, and the grab of an ever greater proportion of national income by those already rich (by 2005 income inequality was at the level of the 1920s) had hollowed out the U.S. economy.
Though differences in economic policy had existed between administrations since the 1970s, there had been a broad continuity of vision. The Reagan administration had encouraged the corporate revolution that restructured the economy. The Clinton administration had been at least as aggressive in promoting free trade and the growth and deregulation of the financial sector. Even after the financial plunge and the election of Barack Obama, the basic contours of public policy continued along lines established in the 1980s and 1990s.
Obama’s election was a measure of how much the United States had changed since World War II. It was literally unimaginable to most Americans in 1945 or 1965 or even 1985 that an African American would be elected president of the United States. But his ascension also made clear—in spite of his campaign theme of “change”—the stubborn persistence of old ideas and old forces. In the decades after World War II, politics had become more formally democratic, but the political influence of business had so grown since the 1970s that meaningful democracy had become stunted. Deregulation, free rein for the financial sector, free-trade globalization, promotion of private homeownership, and the use of federal money to maintain an empire while physical and social infrastructure deteriorated were policies supported for a quarter century or more by both parties. The considered responses to the 2008 economic crisis all centered on preserving existing hierarchies of wealth and power. Obama signaled his restorationist economic agenda when he picked as secretary of the treasury Timothy Geithner, who as head of the Federal Reserve Bank of New York had been deeply involved in bailing out the banks, and as his chief economic adviser Larry Summers, who in the Clinton administration had helped spearhead the financial deregulation that made possible the massive economic problems of the decade that followed.
As the first decade of the twenty-first century ended, the American empire was in decline. Wars fought for years, at great sacrifice but without clear purpose or victory, had sapped the country’s finances and global standing. The economy no longer provided enough jobs or opportunities to maintain the living standards that had once been widely shared. In manufacturing, infrastructure, and education, the United States no longer led other industrial and industrializing countries, and in some respects lagged far behind. A small-minded, fractious political system proved unable to seize the moment or chart the future. The challenge of reinvention once again faced the nation.