Chapter Nine
From the Evil Empire to the Axis of Evil

What was Operation Desert Storm?

Milestones in the Gulf War

How do you “downsize” a president?

Can a man called Bubba become president?

Who took out a Contract with America?

What is “is”?

What is “irrational exuberance”?

Why the Federal Reserve Matters: A Glossary of Financial Terms and “Fedspeak”

Is that chad dimpled, pregnant, or hanging?

Where is Fox Mulder when we need him?

America in 2000: A Statistical Snapshot

9/11: What really happened?

War in Afghanistan: Who? What? When? Where? Why?

Milestones in the War in Afghanistan

Why was America in Iraq?

Milestones in the War in Iraq

Should same-sex marriage be legal for all Americans? And what does same-sex marriage have to do with interracial marriage?

How do you keep a “bubble” from bursting?

How did America elect its first black president?

To many Americans, the Reagan years had brought about a clean break with the long post-Vietnam, post-Watergate mood of the country. This was true despite the fact that budget deficits were ballooning toward nosebleed territory. Wall Street was tottering through another periodic scandal—this time it was over manipulating “junk bonds.” A banking crisis was costing taxpayers billions. Crack cocaine had become epidemic, bringing with it a deadly wave of urban crime. And the specter of AIDS had completely reshaped the American landscape. Still, on the surface at least, the Reagan years seemed to have restored a semblance of confidence in the country. And the chief beneficiary of that confidence was Reagan’s vice president, George Bush.

What was Operation Desert Storm?

George Bush looked like a sure two-termer early on.

In his first two years as president, Bush had witnessed the stunning unraveling of Communism in Europe. In a reverse of the domino theory, which had held that Communism would topple one country after another, in places such as Vietnam, if the U.S. failed to intervene, the Berlin Wall crumbled, East and West Germany united, once-captive nations embraced democracy, and, astonishingly, the Soviet Union, the longtime adversary that Ronald Reagan had called the Evil Empire, simply and bloodlessly disintegrated. Not with a bang but a whimper.

Soviet leader Mikhail Gorbachev (b. 1931) had attempted to restructure the Soviet economy (perestroika) and loosen political restraints (glasnost). But he had let the genie out of the bottle. The Cold War was over. George Bush was the president on hand to usher in, he thought, a New World Order.

But even as the Evil Empire unraveled and a half century of Cold War tension and conflict wound down, Bush’s presidential high point was to come in a part of the world that has confounded every American president since Truman: the Middle East. In Bush’s case, the crisis came from Iraqi dictator Saddam Hussein’s August 1990 invasion of neighboring oil-rich Kuwait. Mobilizing the United Nations against Saddam Hussein, Bush first ordered Operation Desert Shield, a defensive move to protect the vast oil fields of Saudi Arabia. Although there was considerable rhetoric about protecting freedom and liberty, it was difficult to make a case that the U.S. was going to war to defend democracy when it came to either Kuwait or Saudi Arabia, monarchies in which political parties are illegal and women are still treated as property. A quick move by Iraq’s army into the Saudi kingdom would have given Iraq control of more than 40 percent of the world’s oil reserves, a frightening prospect given Saddam Hussein’s proven willingness to measure up to his chief role model, Joseph Stalin.

Leading a coalition of thirty-nine other nations and with United Nations approval, the United States spearheaded Operation Desert Storm, a devastating air war, followed by a 100-hour ground offensive. It was to be George Bush’s shining moment.

The fourteenth former vice president to become president, Bush was also the first sitting vice president elected to the presidency since Martin Van Buren in 1836. When someone gave Bush a portrait of Van Buren on Inauguration Day, the giver may have politely failed to mention that Van Buren served only a single term, turned out of office because of the terrible shape of the American economy. So you think history doesn’t repeat itself?

MILESTONES IN THE GULF WAR

1990

August 2 The UN Security Council issues a resolution condemning Iraq’s invasion.

August 6 The UN Security Council imposes an embargo that prohibits all trade with Iraq except for medical supplies and food in certain circumstances.

August 7 The United States announces that it is sending troops to the Persian Gulf to defend Saudi Arabia from possible attack by Iraq.

August 25 The UN Security Council authorizes the use of force to carry out the embargo against Iraq.

November 29 The council gives coalition members permission “to use all necessary means” to expel Iraq from Kuwait if Iraq does not withdraw by January 15, 1991. Iraq does not withdraw.

1991

January 17 The air war begins at 3 A.M. The coalition’s goal is to destroy Iraq’s ability to launch attacks; eliminate Iraq’s biological, chemical, and nuclear weapons facilities; and reduce Iraq’s ability to defend itself. By late February, the air war reduces the number of Iraqi troops in Kuwait and southern Iraq to about 183,000, mostly through casualties and desertions.

Iraq responds to the allied air assault by launching Scud missiles at populated areas in Israel and Saudi Arabia. Crude and inaccurate by modern Western military standards, the Soviet-built Scuds have enormous psychological value, striking fear in the targeted cities. Among the chief fears is the possibility that Saddam will arm these missiles with chemical or biological warheads, as he had done with chemical weapons in suppressing a rebellion by Iraq’s Kurdish minority. The attacks on Israel are designed to draw the Israelis into the war. However, Israel does not enter the war, thus making it much easier to keep the coalition together.

February 24 At about 4 A.M., coalition forces launch a major three-pronged ground attack. U.S. and French troops invade Iraq from Saudi Arabia, west of Iraqi fortifications in Kuwait. They move rapidly north into Iraq and toward the Euphrates River to cut off Iraqi supply lines and to prevent an Iraqi retreat. U.S. and British troops cross into Iraq from Saudi Arabia. They move north into Iraq and then sweep east to attack the Iraqi troops. Coalition troops, consisting of U.S. Marines and troops from Egypt, Kuwait, Saudi Arabia, and Syria, assault Iraqi forces at several points across southern Kuwait. The troops quickly break through Iraqi fortifications, and about 63,000 Iraqi soldiers surrender.

February 26 Saddam Hussein orders his troops to leave Kuwait. But by that time, the Iraqi forces have been surrounded.

February 28 The coalition ends all military operations at 8 A.M., about 100 hours after the ground attack began. American losses for the operation were 148 killed in action and 7 missing in action.

The Gulf War lasted forty-two days: thirty-eight days of intense air strikes and four days of ground fighting. The U.S.-led coalition routed Saddam’s army, overran Kuwait and southern Iraq, and liberated Kuwait. By stopping the offensive without assaulting Baghdad and possibly overthrowing Saddam Hussein, President Bush and his advisers had fulfilled the UN’s terms for the action against Iraq. (Ten years later, the consequences of that decision would fall on many of the same men. In the wake of the September 11 attacks on the United States and the ensuing war in Afghanistan, Colin Powell, the chairman of the Joint Chiefs of Staff during the Gulf War, was secretary of state under George W. Bush, forty-third president and son of the forty-first; Dick Cheney, the secretary of defense during the Gulf War, was Bush’s vice president.)

The Persian Gulf War devastated Iraq. As many as 100,000 Iraqi soldiers were killed, and a great number of civilians also died. Iraqi roads, bridges, factories, and oil industry facilities were demolished. Water purification and sewage treatment facilities could not operate without electric power. A continuing trade embargo caused serious economic problems. In March 1991, Kurdish and Shiite Muslim uprisings broke out, encouraged by President Bush, who had promised American support that never came. But by April, Iraqi troops had put down most of the rebellions with brutal efficiency.

In April, Iraq accepted the terms of a formal cease-fire agreement and the UN Security Council officially declared an end to the war. In the cease-fire agreement, Iraq agreed to the destruction of all its biological and chemical weapons, its facilities for producing such weapons, and any facilities or materials it might have for producing nuclear weapons. After the formal cease-fire, the UN continued the embargo to pressure Iraq to carry out these terms.

AMERICAN VOICES

GEORGE BUSH, accepting the 1988 Republican presidential nomination:

The Congress will push me to raise taxes, and I’ll say no, and they’ll push, and I’ll say no, and they’ll push again. And all I can say to them is, “Read My Lips: No New Taxes.”

How do you “downsize” a president?

With this swift and relatively low-casualty victory over Iraq, Bush’s ratings soared like a Patriot missile, a defensive weapon that had been hailed during the war for stopping Iraq’s Scud missiles. Before the Gulf War, Bush had already notched one win after U.S. troops swept into Panama and captured dictator Manuel Noriega, who had been indicted in a U.S. court on drug charges. Coupled with the victory in Kuwait, American prestige seemed unmatched in Bush’s New World Order.

But every missile that goes up must come down. Just as the Patriot missile’s reliability, accuracy, and performance were questioned after the war, the glow of the Reagan–Bush foreign policy coups vanished, dulled by a dizzying slide from postwar euphoria. Unemployment surged as the Federal Reserve’s high interest rates, designed to wring inflation out of the American economy, mired the country in a recession. The new corporate policy of “downsizing”—in plainer words, “layoffs”—was pushing unemployment higher. And it was a different kind of unemployment. The downsized now included the white-collar workers who thought they had bought into corporate security, as well as the blue-collar factory workers more accustomed to the traditional cycle of layoffs and rehirings. Bush was viewed as out of touch with average Americans, and publicity stunts such as shopping for socks at a local mall only made him seem more disconnected.

Americans were cranky, and George Bush wasn’t the only target. Congress faced intense scrutiny as a series of scandals rocked the House. Speaker Jim Wright and Democratic Whip Tony Coelho resigned following ethics investigations, and scandals in the House post office and over members’ cushy check-bouncing privileges at the House bank sounded like fraud to most Americans. The House clearly enraged the public. To the average American, Congress fiddled while America burned.

Early in his presidency, Bush had to deal with the worst crisis in the banking industry since the Great Depression of the 1930s. Between 1980 and 1990, more than 1,000 savings and loan institutions failed and hundreds more neared bankruptcy, a crisis resulting from defaults on loans, poor regulation, and fraud and mismanagement in the industry. Soon after entering office, Bush proposed legislation to rescue and restructure the industry. The savings and loan bailout, which closed or propped up failing thrifts, eventually cost taxpayers more than $400 billion; one of the failed thrifts involved Bush’s third son, Neil, a board member of Silverado Savings, a Colorado savings and loan.

Mounting budget deficits and deep anxiety over health care magnified these problems. The grim national mood darkened with the bitter debate over legal abortion, which President Bush opposed in a reversal of his own earlier position. “Gender gap” tensions rose higher when law professor Anita Hill charged that Clarence Thomas, Bush’s choice to succeed civil rights legend Thurgood Marshall (1908–93) on the Supreme Court, had sexually harassed her when she was his assistant. For two days in October 1991, the nation was once again transfixed by Senate hearings in which the two testified about the charges. Thomas denied Hill’s account and charged that a “lynch mob” was out to get him. Thomas was approved by the Senate 52–48, but the ugly episode, in which both Thomas and Hill became the targets of vicious character attacks, had further darkened the mood of the country along party, gender, and racial lines.

The depth of that racial animosity boiled over in April 1992 rioting that swept Los Angeles after four policemen were acquitted in the vicious 1991 beating of Rodney King. In two days of the worst American riots in a generation, President Bush had to order Marines and Army troops to keep peace in the city. When the violence subsided, fifty-two people had been killed and more than 600 buildings set afire, many of them burned to the ground.

But for American voters, George Bush’s gravest sin seemed to be his broken tax promise. Once Bush agreed to new taxes to reduce the deficit in 1990, all the glory of Desert Storm counted for little more than so much desert sand. Since colonial days, Americans have hated taxes and often reserved a special hell for politicians who raise them, especially after pledging not to. With the recession millstone around his neck, George Bush plunged in the polls—his 90 percent approval ratings sharply downsized—as the 1992 election approached.

AMERICAN VOICES

Sign in BILL CLINTON’S campaign headquarters, attributed to chief campaign strategist James Carville:

IT’S THE ECONOMY, STUPID.

Can a man called Bubba become president?

Although it won’t go down as one of the great presidential pronouncements like “Ask not what your country can do for you—ask what you can do for your country,” when candidate Bill Clinton told America, “I tried it once, but I didn’t inhale,” it certainly was memorable.

During the primary battles of 1992, Bill Clinton was asked by reporters about smoking marijuana in his college days, and his reply left many Americans choking with laughter. A Rhodes scholar who attended Oxford, where he avoided the Vietnam-era draft, Arkansas governor Bill Clinton (b. 1946) dodged many uncomfortable questions during the 1992 campaign. But Americans were less interested in pot smoking, draft dodging, and womanizing than in solving America’s problems. Running as an “agent of change” who promised reforms, Bubba Clinton and his vice presidential running mate, Senator Al Gore of Tennessee, became the first “baby boomers” to win the White House, following a raucous election most notable for the third-party candidacy of H. Ross Perot.

A man whose political fame would be built on assailing big government and excessive government spending, Ross Perot (b. 1930) had built his Electronic Data Systems into a billion-dollar firm with large and very profitable government contracts. With his deep pockets, Perot ran as an independent with a campaign aimed at overhauling government. His folksy style and can-do approach appealed to millions of American voters who were completely disenchanted with the two major political parties, whose differences seemed marginal and who seemed most concerned with fund-raising and retaining control. But when he abruptly canceled his unorthodox campaign, Perot was dismissed as a wealthy kook. Then, only weeks before Election Day, Perot stunned the political world by rejoining the fray.

In a series of three-way televised debates, the most indelible image was that of President George Bush checking his wristwatch at the Richmond, Virginia, debate as if his limo was double-parked with the engine and the meter running. When advised to fire up his campaign, Bush turned to name calling, deriding Clinton and Gore as “bozos.” Gore, an environmentalist and author of a book about the risks of global warming, was dismissed as “ozone man.” Then a week before the election, the Iran-Contra special prosecutor announced a grand jury indictment of Caspar Weinberger, and questions about Bush’s role in the Iran-Contra case were pushed back into the headlines.

Garnering nearly 20 million votes (19 percent), Perot drew disaffected voters from Bush and probably skewed the race, allowing the Clinton-Gore ticket to win with 43 percent of the vote to Bush’s 37 percent. In later years, Bush would claim that the indictment of Weinberger and the failure of Federal Reserve Chairman Alan Greenspan (p. 557) to cut interest rates quickly enough doomed his presidency. But Ross Perot’s independent candidacy, like several other consequential third-party runs in American history, had probably been the big difference, tipping the balance in a very closely divided and unhappy America.

Must Read: Shadow: Five Presidents and the Legacy of Watergate by Bob Woodward; First in His Class: The Biography of Bill Clinton by David Maraniss.

AMERICAN VOICES

Retired Arizona SENATOR BARRY GOLDWATER (1909–98), a longtime leader of the conservative movement in the Republican Party, on the question of homosexuals in the military:

You don’t need to be “straight” to fight and die for your country. You just need to shoot straight.

Who took out a Contract with America?

It may be called a “honeymoon,” but Bill Clinton must have been wondering when the fun would start. Before he had unzipped his suitcases in the White House, the brief honeymoon period was over. Having pledged to overturn the ban on homosexuals in the military, Clinton found himself walking into a Pentagon Chainsaw Massacre. Accepting a compromise “don’t ask, don’t tell” policy, Clinton retreated from his promise, hinting at the policy and personnel reversals that would plague his first two years.

Two potential choices for attorney general were shot down in what was called Nanny-gate: the employment of illegal aliens as childcare workers and the nonpayment of taxes on those workers. Another Justice Department nominee, a widely respected black woman from the academic world named Lani Guinier, was attacked as a “quota queen.” Clinton withdrew her name as head of the civil rights division rather than stand up to a Republican firestorm. The accuracy of the charges against Guinier was never challenged, but a new style of “scorched earth” attacks in the guise of congressional advice and consent had been unleashed. The rough-and-tumble horse trading and attacks on presidential appointees had long been a cloakroom matter in Washington, mostly out of public view. Appointments had traditionally been treated as a president’s prerogative and, while there have been notable rejections of presidential appointees, most presidents get to make the appointments they want. But under the new rules, aided and abetted by twenty-four-hour news networks eager for blood in the water, the process had turned vicious. The personal tone of the assaults had been ratcheted up in the wake of the Bork and Thomas nominations. Republicans were looking for payback.

These early setbacks to the Clinton agenda overshadowed a slowly recovering economy and, more surprisingly, a shrinking deficit. Betting that Americans wanted to be done with excessive deficits, Clinton had gambled in 1993 on a tax package that included tough deficit-reduction measures. Passage of a free trade pact with Mexico and Canada (known as NAFTA), a major anticrime package, and new handgun controls—known as the Brady Bill in honor of James Brady, the White House press secretary who had been severely wounded and permanently disabled in the assassination attempt on Ronald Reagan—were also victories.

But Clinton’s gaffes were eclipsing his successes. Some miscues were trivial, such as the criticism over a $200 haircut. Others cut deeper. A standoff with a religious cult led by David Koresh in Waco, Texas, turned disastrous when an FBI assault on the compound led to a deadly fire (see p. 578). Bedeviled by continuing reports of his womanizing, dismissed during the primary campaigns as “bimbo eruptions,” Clinton was also being dogged by one woman’s sexual harassment suit, based on events alleged to have taken place when Clinton was governor of Arkansas. The suit attracted only passing attention when it first surfaced—dismissed as another “bimbo eruption.” But these stories would eventually become connected to an ongoing investigation of the Clintons’ Arkansas investments and real estate deals, known as Whitewater. When White House aide Vincent Foster, a longtime Clinton friend from Arkansas, committed suicide, his death was tied into Whitewater as well, and official Washington went into a full-blown scandal investigation mode.

The policy stumbles, personal embarrassments, and major missteps culminated in the 1994 defeat of Clinton’s legislative keystone, the overhaul of the health-care system. Clinton’s first mistake may have been his choice of his wife, Hillary Rodham Clinton, to head the commission he established to examine American health-care policies. A controversial figure in her own right, the first lady stepped on congressional toes on the way to proposing a far-reaching plan that would cover all Americans. But the Clintons saw their prize project wither, blocked by Congress and an intense lobbying effort by the health-insurance industry. This sharp rebuff of Bill Clinton’s policy centerpiece was an omen of 1994’s midterm elections.

In a political earthquake, Republicans swept control of the House of Representatives for the first time in forty years. They were led by Georgia representative Newt Gingrich, who trumpeted a conservative list of promises called the Contract with America. This “contract” promised a laundry list of favorite Republican positions, including a balanced budget amendment, increased defense spending, term limits for congressional seats, tax cuts, and a reform of the welfare system. They were joined by a Republican majority in the Senate, setting the stage for a struggle between the White House and Congress as America moved toward the final presidential election of the twentieth century.

A brilliant politician with finely tuned skills in the new world of “talk show” politics, Clinton was able to co-opt some of the Contract with America as he deftly moved to the political center, guided more often by his influential consultant, pollster Dick Morris, than by his political advisers. In November 1995, Clinton regained the upper hand when he battled Gingrich and the Republican Congress over the budget, a stalemate that actually led to a shutdown of the United States government. For a few days, nonessential federal workers were sent home because there was no funding to pay them. Most Americans barely noticed that the government was out of business. Clinton also co-opted one of the centerpieces of the Contract with America by championing a major overhaul of the welfare system despite opposition from traditional Democratic Party allies.

It was also during that budget confrontation with Newt Gingrich and the Republican House in mid-November 1995 that President Clinton took notice of a White House intern named Monica Lewinsky.

Must Read: All Too Human: A Political Education by George Stephanopoulos.

AMERICAN VOICES

FIRST LADY HILLARY RODHAM CLINTON on the Today show (January 27, 1998):

This started out as an investigation of a failed land deal. I told everybody in 1992, “We lost money.” . . . Well it was true. It’s taken years but it was true. We get a politically motivated prosecutor who is allied with the right-wing opponents of my husband. . . . I do believe that this is a battle. I mean, look at the very people who are involved in this, they have popped up in other settings. . . . [T]his vast right-wing conspiracy that has been conspiring against my husband since the day he announced for president.

What is “is”?

In the media-driven world that American politics has become, when the “photo op” has become the chief means of communicating policy, presidents and their staffs have grown increasingly sensitive to the “image.” And for the public, it is those images—not always those of the president’s choosing—that become fixed in history. Today, whenever we see an image of Richard Nixon, he is either toasting the Chinese leadership during the historic visit that marked the high point of his presidency, or flashing a V sign as he boards the helicopter that took him from the White House in disgrace. Ronald Reagan is almost always smiling broadly as he rides a horse on his California ranch, or honoring the war dead at a D-Day memorial in Normandy. Jimmy Carter is immortalized in a cardigan sweater telling Americans to turn their thermostats down during an energy crisis.

Then there is the indelible image of Bill Clinton, wagging his finger as he indignantly told America on January 26, 1998, “I did not have sexual relations with that woman, Miss Lewinsky. I never told anybody to lie, not a single time, never. These allegations are false.”

In fact, they were not false. At least not all of them. And on December 19, 1998, Bill Clinton became the second White House occupant in U.S. history to be impeached by the House of Representatives. (The first was Lincoln’s successor, Andrew Johnson; see p. 249.) Four articles of impeachment were sent to the House by the Republican-controlled Judiciary Committee, but only two of them were adopted by the full House.

The long, tawdry history of the Clinton impeachment case dates to scandals during Clinton’s years as attorney general and governor in Arkansas, and later as president. But they were also tied into the culture and history of Washington. The veneer of gentility in American politics had been stripped away, mostly since the 1970s Watergate era. The backroom political wrangling that had once been kept behind closed doors gave way to the no-holds-barred, pit-bull style of the eighties and nineties—with the full complicity of a new twenty-four-hour-a-day news media. In the new age of continuous cable and Internet news, the press no longer operated by the “old school” rules of the Washington press corps—rules under which FDR’s wheelchair was never shown in photographs and reporters looked the other way when it came to JFK’s numerous dalliances. The viciousness of the Robert Bork and Clarence Thomas hearings was all part of a new power game in which winning was the only thing.

Aided and abetted by his wife, Hillary, who sat by his side during a famous 60 Minutes television interview in which the couple confessed that they had marital problems, Clinton had been able to dodge stories of his womanizing during the 1992 campaign. But in 1994, a new story came along. An Arkansas woman named Paula Corbin Jones sued Clinton for sexually harassing her while he was governor in 1991. This story broke as the Clintons were being actively investigated by a special prosecutor, Independent Counsel Kenneth Starr, who was examining a tangle of Arkansas real estate deals known as Whitewater, as well as two separate cases involving misuse of FBI files by the White House and missing billing records from Hillary Clinton’s Little Rock law firm. Starr’s investigations were essentially going nowhere and Clinton’s law team was successfully delaying the Jones suit until after the 1996 election—in which Clinton rather easily defeated veteran Republican senator Robert Dole, with Reform Party candidate Ross Perot playing a substantially diminished role.

In spite of these high-profile investigations, and in the midst of his reelection campaign, Clinton had become involved with Monica S. Lewinsky. She was, in 1995, a twenty-one-year-old White House intern. Between their first encounter and March 1997, Lewinsky frequently and repeatedly engaged in what most people would call “sex” with Clinton in the Oval Office. As it turned out, Bill Clinton defined the word differently.

The Starr investigation and the Jones suit were proceeding when Jones’s lawyers were informed of the Clinton-Lewinsky affair by Linda Tripp, a former White House staffer who worked in the Pentagon, where Lewinsky had been transferred. Tripp had surreptitiously taped the young woman discussing her involvement with the president. Questioned under oath about the relationship by the Jones law team, Clinton denied it. Starr received word of this denial and began an investigation into possible perjury and obstruction-of-justice charges against Clinton. In the course of videotaped testimony to prosecutors, Clinton was once asked, “Is that correct?” Already notorious for his ability to wiggle around words, Clinton responded, “It depends on what the meaning of the word ‘is’ is.”

In September 1998, just prior to the midterm congressional elections, Independent Counsel Starr released a report on his investigation to Congress. (He chose not to announce that the Clintons had been exonerated in the investigations of the FBI files and the law firm billing files.) Disgusted by the revelations in the Starr report of oral sex taking place during official calls to congressmen and media reports of Lewinsky’s semen-stained dress, the American public seemed to be split three ways: those who hated Clinton, those who defended him, and a large middle ground that seemed to want the whole question to go away and the government to get on with its business. Pressing Clinton at every turn, the Republicans smelled blood in the water. But they apparently misread the American mood.

In a stunning reversal of American election tradition, the Democrats gained five seats in the House. (The party controlling the White House historically loses seats in the sixth year of a presidency.) Postelection analysis pointed to voters who were tired of the Republican obsession with the scandal. House Speaker Newt Gingrich, the leader of the conservative Republican “revolution,” had gambled $10 million on last-minute advertising that attacked Clinton, and was largely blamed for the party’s dismal showing. Within a week of the election, Gingrich announced his resignation from the House.

The essence of the case for impeachment of the president presented by Starr boiled down to a seamy affair about which the president had clearly lied while under oath and had possibly asked others to lie for him. These were the charges that the House took up on December 19, 1998, when they impeached the president for perjury and obstruction of justice.

In an almost farcical incident that captures the tenor of the times, Clinton’s impeachment vote was nearly overshadowed when, on the same day, Louisiana congressman Robert Livingston, the Speaker-designate who was to have replaced Newt Gingrich, dramatically announced that he would resign from the House after Hustler magazine publisher Larry Flynt uncovered reports that Livingston had had at least four extramarital affairs. The revolution was eating its own.

Apart from the case of Richard Nixon, who certainly would have been impeached had he not resigned, Congress had dealt with fifteen impeachment proceedings since the first case in 1799. Twelve involved judges, one was a cabinet member, one a senator, and only one a president, Andrew Johnson. Of these fifteen impeachments, seven men had been removed from office (all of them federal judges); two cases were dismissed; and six ended in acquittal. Although these earlier impeachment cases provided a limited set of historical precedents, most of the questions regarding impeachment are fairly clear. During the Watergate hearings, a history of impeachment had been prepared for the House Judiciary Committee. One of the young lawyers who worked on it was a recent Yale Law graduate named Hillary Rodham.

During the framing of the Constitution in 1787, the rules regarding impeachment were vigorously debated by men who recognized the gravity of removing an official from office, especially an elective office. The working draft of the document called for removing the president only for bribery and for treason. After heated discussion over the question of how easy such a removal should be, Virginia’s George Mason offered a compromise phrase that dated from old English law: “High Crimes and Misdemeanors.” And it is that phrase that has caused the most controversy. To most people, the modern sense of the word “misdemeanor” means a petty crime. But many historians hold that when the Constitution was composed, “high misdemeanors” referred specifically to offenses against the state or community, as opposed to crimes against people or property.

Alexander Hamilton, who would soon after be dragged publicly through a scandal owing to his own sexual misbehavior, explained this view in the Federalist Papers. To Hamilton, an impeachable offense had to be “of a nature which may with peculiar propriety be denominated POLITICAL, as they relate chiefly to injuries done immediately to the society itself.” That was the reason that the 1974 Judiciary Committee rejected an article of impeachment against Richard Nixon for cheating on his personal income taxes. Even so, many impeachment proceedings, and certainly the one brought against Andrew Johnson, were politically motivated. As Gerald Ford said in 1970, as a member of the House, “An impeachable offense is whatever a majority of the House of Representatives considers it to be at a given moment in history.”

On January 7, 1999, the Senate impeachment trial was formally opened by Chief Justice William H. Rehnquist. Even with a Republican majority in the Senate, a strict party-line vote would fall well short of the two-thirds needed to convict Clinton. Five weeks later, the Senate voted to acquit President Clinton on both articles of impeachment. On the first count of perjury, ten Republican senators joined the Democrats in a 55–45 vote of not guilty. On the second count of obstruction of justice, five Republicans joined the Democrats for an even 50–50 vote. Clinton was left to serve out his term. On Clinton’s next-to-last day in office, his legal team cut a deal that spared Clinton from criminal charges. He admitted that he had made false statements under oath and accepted a five-year suspension of his Arkansas law license. He had also agreed to a settlement payment to Paula Jones, which, had it come earlier, would probably have prevented the whole Lewinsky affair from becoming public.

After Clinton left office, a final report by Independent Counsel Robert Ray, who succeeded Kenneth Starr, concluded in March 2002 that prosecutors had insufficient evidence to show that either the president or Hillary Rodham Clinton had committed any crimes. However, the report stated, “President Clinton’s offenses had a significant adverse impact on the community, substantially affecting the public’s view of the integrity of our legal systems.” Clinton’s defenders dismissed the report as they had dismissed most of the charges brought against him—as the result of a partisan assault on a president deeply despised by many conservative Republicans. The report also marked the end of an era: Congress had allowed the law that created the role of the special prosecutor, a vestige of Watergate days, to lapse.

Did this long-running soap opera affect history? More to the point, did it affect policy? Many of Clinton’s critics argued that the latter was certainly true. Twice during the Lewinsky scandal, Clinton launched air strikes: first, in August 1998, against suspected terrorist bases in Sudan and Afghanistan, thought to be the home of a then relatively obscure but very wealthy Saudi believed to be funding terrorist activity—Osama bin Laden; then, in December 1998, against Iraq. Republicans openly questioned these actions as a Wag the Dog scenario, so named for an eerily prescient film in which a war is fabricated to improve a president’s fading political standing. The first attack came in response to the bombing of two American embassies in Africa; the second came just as the House moved toward its impeachment vote. Of the second strike, the Senate majority leader, Republican Trent Lott, stated openly, “Both the timing and the policy are open to question.” Republican representative Gerald Solomon was even more critical. “Never underestimate a desperate president,” he said in a press release. “What option is left for getting impeachment off the front page or maybe even postponed.” (In retrospect, Clinton would be criticized for not having attacked bin Laden more aggressively.)

This suspicion about Clinton’s motives carried over into the biggest military undertaking of his two terms, the leadership of a NATO alliance attack on Yugoslavia in March 1999 to stop the “ethnic cleansing” policies directed against ethnic Albanians in the province of Kosovo. In the wake of the breakup of the Soviet Union, the former Communist states of Eastern Europe had also come unglued. Some made the transition peacefully: Poland shed Communism without bloodshed, and Czechoslovakia split in two by mutual agreement. But in Yugoslavia, that was not the case. Yugoslavia broke apart, and hostilities ensued among the republics along ethnic and religious lines. Croatia, Slovenia, and Macedonia declared independence in 1991, followed by Bosnia-Herzegovina in 1992. Serbia and Montenegro remained as the Federal Republic of Yugoslavia.

Bitter fighting followed, especially in Bosnia, where Serbs reportedly engaged in “ethnic cleansing” of the Muslim population. A peace plan, the Dayton Accord, was brokered by the United States and signed by Bosnia, Serbia, and Croatia (in December 1995) with NATO responsible for policing its implementation. But in the spring of 1999, led by the United States, NATO conducted bombing strikes aimed at stopping Yugoslavia’s campaign to drive ethnic Albanians out of the Kosovo region. Of this foreign policy decision, journalist Bob Woodward wrote, “When President Clinton led the NATO alliance to attack Yugoslavia to stop the ethnic cleansing in Kosovo, he voiced the right humanitarian motives. Yet there was a careless ad hoc quality to his decision making. The clearest lesson of Vietnam and the Gulf War seemed to have been ignored. When going to war, state clear political objectives and ensure that enough military force is committed to guarantee success. Yet . . . lingering in the background were the unavoidable suggestions that Clinton’s actions were influenced by his need for personal atonement and his political desire to do something big and bold so historians would concentrate less on his impeachment.”

Ultimately, the campaign against Yugoslavia over Kosovo proved largely a success, with American air power practically destroying Yugoslavia’s ability to function. A peace accord was reached in June 1999 under which NATO peacekeeping troops entered Kosovo. President Slobodan Milosevic was ousted the following year. Later arrested, he was extradited in 2001 to The Hague, where a UN tribunal tried him for war crimes and crimes against humanity. In a landmark verdict, the same UN tribunal had found Bosnian Serb general Radislav Krstic guilty of genocide in August 2001 for his role in the mass killing of more than 5,000 Muslims at Srebrenica in 1995.

Perhaps time and some future memoir will shed more light on the question of Clinton’s legacy and the role his impeachment will play in it. Until then, journalist Jeffrey Toobin’s assessment in A Vast Conspiracy seems appropriate. “To be sure he will be remembered as the target of an unwise and unfair impeachment proceeding. But just as certainly, history will haunt Clinton for his own role in this political apocalypse, and for that, despite his best efforts, this president can blame only himself.”

Must Read: A Vast Conspiracy: The Real Story of the Sex Scandal That Nearly Brought Down a President by Jeffrey Toobin.

AMERICAN VOICES

FEDERAL RESERVE CHAIRMAN ALAN GREENSPAN in testimony before the Senate Banking Committee (February 26, 1997):

Caution seems especially warranted with regard to the sharp rise in equity prices during the last two years. These gains have obviously raised questions of sustainability.

From the beginning of 1995 to the time Greenspan spoke, the Dow Jones Industrial Average had increased 80 percent.

What is “irrational exuberance”?

He saved the world at least three or four times. He cost George Bush I his reelection in 1992. He made everybody in America rich by causing the markets to soar. When he used the phrase “irrational exuberance” to describe a stock market that he feared might be too high in 1996, he sent tremors through the global economy. He caused a recession and crashed the market in 2001. He can jump tall buildings in a single bound. He is not a bird or a plane. He is the chairman of the Federal Reserve.

Who is Alan Greenspan, the “Fed chairman”? And how did he get to be the most powerful man in the world?

Until fairly recent times, few Americans ever heard of the Federal Reserve Board or cared who its chairman was. But in the person of Alan Greenspan, the Fed chairman became one of the most powerful people in the world. By the mid-1990s, his every pronouncement and television appearance was watched with the same reverence once given the Delphic oracle in ancient Greece. One financial news network even created a “briefcase index,” a humorous attempt to determine if the thickness of the chairman’s attaché case might provide some clue to the actions he was about to take—actions that could make or break fortunes and entire national economies with a word. While “Greenspan watching” became a national pastime during the 1990s, many people had a very basic question: Who was this man and what did he actually do?

The Federal Reserve System, or the Fed, was created as the central bank of the United States in 1913 with passage of the Federal Reserve Act. An independent government regulatory agency, the Fed is supposed to preserve and protect a flexible but stable economy. To do that, it has power over the nation’s currency and conducts the nation’s monetary policy (simply put, the actions taken to influence how much money is theoretically in the economy at a given time, also known as the money supply). It also regulates banks. Since 1913, the act has been modified, and in 1978, the Full Employment and Balanced Growth Act instructed the Federal Reserve to seek stable prices—in other words, to fight inflation—and maximum sustainable growth for the economy, while also seeking to maximize employment.

And there’s the rub, as they say. Economics (also known for good reason as “the dismal science”) traditionally holds that growth is good, but too much growth is not good. An economy that is growing too fast will inevitably lead to inflation—which in its simplest terms is too much money chasing too few goods, or demand outstripping supply. Job growth is also good, but too much job growth is not; in theory, when too many people work, labor costs rise and lead to inflation. In the classic economic textbook view, a certain amount of unemployment is necessary, even desirable, because it limits rising labor costs and dampens consumer demand, helping to keep prices in check. Until the economic boom of the mid-1990s, economic orthodoxy said that the unemployment rate could not fall below 6 percent without provoking dangerous levels of inflation.

What, then, is “sustainable” growth? And what is “maximum employment”? These are two of the key questions that the Federal Reserve has to wrestle with in setting its policies—policies that can ultimately determine the cost of a home mortgage or car loan, the profits that support corporate survival and the stock market, or even, potentially, who is the next president.

Officially, the Federal Reserve system includes the board of governors in Washington, D.C., and the twelve district Federal Reserve banks and their outlying branches. The seven governors of the Federal Reserve system are nominated by the president and confirmed by the Senate to serve fourteen-year terms, nearly lifetime appointments that are supposed to guarantee that short-term political considerations will not enter into the Fed’s deliberations and decision making. But like the Supreme Court, the Fed is keenly aware of which way the political winds are blowing. The chairman and the vice chairman of the board of governors are named by the president and confirmed by the Senate. They serve a term of four years, with no limit on the number of terms they may serve. (Greenspan was appointed to his fourth term in 2000. He retired in 2006 and was replaced by Ben Bernanke.)

There are twelve reserve banks, in Atlanta, Boston, Chicago, Cleveland, Dallas, Minneapolis, Kansas City, New York, Philadelphia, Richmond, St. Louis, and San Francisco. These banks oversee the banking industry, regulate the coin and paper currency in circulation, clear the majority of all banks’ paper checks, and facilitate wire transfers of payments.

Why the Federal Reserve Matters: A Glossary of Financial Terms and “Fedspeak”

Board of Governors: The board of governors of the Federal Reserve system is made up of seven governors, each appointed by the president and confirmed by the Senate, to fourteen-year terms. The chairman is a member of the board of governors, and his position as head of the Federal Reserve is based on a separate four-year appointment by the president. The board controls the discount rate (see p. 560), and each member of the board is a member of the Federal Open Market Committee (see p. 561).

Bonds, Bills, and Notes: A bond is a note or obligation requiring the borrower—usually a government or corporation—to pay the lender—an individual or institutional investor—the amount of the loan (“face value”) at the end of a fixed period of time, usually along with interest payments in the meantime. It is one of the chief ways that governments and businesses raise cash to stay in business. Government and corporate bonds trade in an open bond market, and the prices of bonds move in the opposite direction of interest rates.

The United States Treasury issues several types of securities: Treasury bonds usually have maturities of between ten and thirty years, although thirty-year bonds were discontinued; Treasury notes are securities with a maturity of more than one year but less than ten years; and Treasury bills are securities with maturities ranging from thirteen weeks to one year. Together, these Treasury-issued securities are generally referred to as bonds and are considered safe investments with little or no risk to principal (the original investment). Corporations also issue bonds, which generally offer higher interest returns but higher degrees of risk. There are many types of corporate bonds, and they are rated in terms of their risk. The riskiest bonds, which usually offer the lowest credit quality but the highest potential returns, are also known as junk bonds.
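
The inverse movement of bond prices and interest rates is easiest to see with a simplified, hypothetical one-year security that makes a single payment at maturity (an illustration only, not a description of any particular Treasury issue):

\[
\text{price} = \frac{\text{face value}}{1 + r}, \qquad
\frac{\$1{,}000}{1.05} \approx \$952 \ \text{at } r = 5\%, \qquad
\frac{\$1{,}000}{1.06} \approx \$943 \ \text{at } r = 6\%.
\]

When prevailing rates rise, the fixed payment an existing bond promises is worth less today, so its price falls; when rates fall, its price rises.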

Budget Deficit/Surplus: When the government budgets more spending annually than it takes in from taxes, tariffs, and the other fees it collects, the shortfall is a deficit. Accumulated deficits make up the national debt. Large deficits tend to lead to higher interest rates, as the government competes for capital with private business and consumers. But some economists hold that deficit spending is necessary, often to stimulate the economy and create jobs, or in a crisis such as a war. A surplus is created when the government takes in more than it spends. The surplus can then be saved against future budget expenses, used to reduce the debt, or returned to taxpayers in the form of tax cuts.

Central Bank: A national bank that operates to control and stabilize the currency and credit conditions in a country’s economy, usually through the control of interest rates (monetary policy). The Federal Reserve is the central bank of the United States.

Consumer Price Index (CPI): A measure of the average change over time in the prices paid by consumers for certain goods and services. The CPI, announced on a monthly basis, is considered the broadest measure of the country’s rate of inflation (see p. 561). Another statistical gauge, the Producer Price Index, or PPI, measures costs that producers—factories, manufacturers, farmers, etc.—pay to make and deliver their goods. Higher producer prices often lead to higher consumer prices as the business passes along its higher costs. During the 1990s, many economists argued that the CPI overstates the real inflation rate by as much as 1 percent because of difficulties in gauging prices on a national scale and out-of-date comparisons.
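
A worked example may help (the index values here are hypothetical): the inflation rate over a year is simply the percentage change in the CPI.

\[
\text{inflation rate} = \frac{\text{CPI}_{\text{this year}} - \text{CPI}_{\text{last year}}}{\text{CPI}_{\text{last year}}} \times 100\%, \qquad
\frac{166.6 - 163.0}{163.0} \times 100\% \approx 2.2\%.
\]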

Discount Rate: The interest rate that the twelve Federal Reserve banks charge private commercial banks, savings and loan associations, savings banks, and credit unions to borrow reserves from the Fed. This rate is controlled by the board of governors. In recent times, however, it has played a smaller role in setting interest-rate policy than the fed funds rate (see below), which the Fed steers through its open market operations.

Easing or Tightening: To ease, or make credit more available, the Fed pumps money into the nation’s banking system by buying U.S. Treasury bonds. This causes the Fed funds rate to go down, making it easier for consumers and businesses to borrow money. Normally used to fight a slow-growth economy or recession (negative growth), easing encourages the economy to grow as consumers and companies buy more “stuff.”

To tighten credit, the Fed sells U.S. Treasury bonds, which withdraws money from the banking system. With the money supply decreased, banks become less willing to lend, making borrowing more difficult for businesses and consumers. When short-term interest rates rise, the economy’s growth usually slows. Tightening credit is the Fed’s main tool for fighting inflation.

Federal Open Market Committee (FOMC): The twelve-member voting body that meets eight times a year (about every six weeks) to assess the state of the economy and set guidelines for the Federal Reserve regarding the sale and purchase of government securities in the open market. Chaired by the Federal Reserve chairman, FOMC meetings include the seven governors and the presidents of all twelve Federal Reserve banks, but only twelve people vote. They are the seven Fed governors, the president of the New York Federal Reserve Bank, and four of the other eleven Federal Reserve Bank presidents, who serve one-year voting terms on a rotating basis.

Fed Funds Rate: The rate that banks charge one another for overnight loans—a key “short-term” interest rate—measured as the average rate at which federal funds actually trade in a day. The Fed influences this rate by easing or tightening through the purchase or sale of U.S. Treasury bonds, under the control of the Federal Reserve’s Federal Open Market Committee, or FOMC (see above). The funds rate affects overall credit conditions in the country and is the Fed’s main weapon against both inflation and a slow-growing or shrinking economy (a recession).

Fiscal Policy: The government’s plan for spending and taxing. (It differs from monetary policy, which the Fed controls; see p. 562.)

Gross Domestic Product (GDP): The broadest measure of the output of a nation’s economy—the total value of all goods and services that a nation produces. (It replaced a slightly different yardstick, the Gross National Product, as the government’s standard measure in the early 1990s.) Measuring the GDP determines whether the economy is growing or contracting and how fast it is moving in either direction.
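
As a rough illustration of how these figures are reported (the numbers are hypothetical, and compounding a quarterly change into an annual rate is the standard U.S. convention):

\[
g_{\text{annualized}} = \left[\left(\frac{\text{GDP}_{\text{this quarter}}}{\text{GDP}_{\text{last quarter}}}\right)^{4} - 1\right] \times 100\%, \qquad
\left[(1.008)^{4} - 1\right] \times 100\% \approx 3.2\%.
\]

So an economy that grows 0.8 percent in a single quarter is said to be growing at an annual rate of roughly 3.2 percent.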

Inflation: Simply put, too many dollars chasing too few goods, usually the result of demand outstripping supply. The result is rising prices, as measured by the CPI (see p. 560). While price increases are generally necessary to sustain business profits and wage growth, a rapid rise in prices of all goods and services is considered a danger to the economy. Fighting inflation and keeping a lid on sharp price rises are some of the key goals of the Fed.

Monetary Policy: The central bank’s actions to influence interest rates and the supply of money available in the economy.

Productivity: Simply, the statistical measure of the average hourly output of workers. Ideally, productivity gains allow efficiencies that both reduce consumer costs and increase profits, which can then be shared with workers, raising the standard of living. Productivity is viewed as a key to restraining inflation, since the cost of goods falls as they become cheaper to produce.
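
In symbols (a standard textbook formulation, not a definition drawn from this glossary):

\[
\text{productivity} = \frac{\text{output}}{\text{hours worked}}, \qquad
\text{unit labor cost} = \frac{\text{hourly compensation}}{\text{productivity}}.
\]

When productivity rises faster than wages, the labor cost of each unit of output falls, which is one reason the Fed watches these figures so closely.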

During the 1990s, Alan Greenspan came to accept the theory that rapid and lasting gains in American productivity—the result of technological advances and a better educated and trained workforce, among many other factors—were the key to sustained economic expansion with little inflation.

Recession: A nationwide decline in overall business activity characterized by a drop in buying, selling, and production, and a rise in unemployment. Many economists consider a nation’s economy in a recession if the output of goods and services falls for six consecutive months, or two quarters. When a recession grows worse and lasts longer, it becomes a depression.
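
The informal “two consecutive quarters of decline” rule lends itself to a simple check. A minimal sketch in Python, using made-up quarterly GDP figures rather than actual data:

```python
def in_recession(quarterly_gdp):
    """Return True if output falls for two consecutive quarters anywhere
    in the series -- the informal rule of thumb described above."""
    consecutive_declines = 0
    for previous, current in zip(quarterly_gdp, quarterly_gdp[1:]):
        consecutive_declines = consecutive_declines + 1 if current < previous else 0
        if consecutive_declines >= 2:
            return True
    return False

# Hypothetical quarterly GDP figures, in billions of dollars (not real data).
print(in_recession([9800, 9850, 9820, 9790, 9810]))  # True: two straight declines
print(in_recession([9800, 9850, 9900, 9950]))        # False: steady growth
```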

Stocks: Certificates representing partial ownership in a corporation and a claim on the firm’s earnings and assets. Stocks of profitable corporations normally yield payments of dividends, a portion of the corporate earnings distributed to shareholders. At its most basic, the value of stock often rises and falls as a company meets, or fails to meet, earnings expectations.

During Alan Greenspan’s tenure at the Fed, which began in 1987, the American economy did some remarkable things. For ten years beginning in 1991, the economy grew steadily, sometimes rapidly, with relatively low inflation. Once upon a time, they called it the “Goldilocks economy.” Jobs were being created at a record clip, pushing unemployment ranks down to unprecedented peacetime levels. Inflation was tame. Corporations were making money. People had jobs and record numbers were investing in the stock market, primarily through their company’s retirement plans. Everything seemed to be “just right,” as Goldilocks would say.

New technologies, relative peace in the world with the end of the Cold War, the growth of global markets, and freer trade were all credited for powering this economic engine. But to many people, the guiding hand behind this remarkable economic turnaround was the sometimes inscrutable Alan Greenspan.

Chairman of the board since 1987, Alan Greenspan was appointed by President Reagan to replace Paul Volcker, a Democrat. An imposingly tall, cigar-smoking, powerful personality, Volcker had been responsible for setting interest rates extremely high in an attempt to battle the excessive inflation of the late 1970s. Volcker’s chairmanship had lasted from 1979 to 1987, and his bitter anti-inflation remedy of high interest rates had worked over the long haul, reducing the inflation rate from 13.3 percent to just 1.1 percent. However, the policy was also responsible, in large measure, for Jimmy Carter’s inability to get the economy moving, one of the chief reasons he was so handily trounced by Ronald Reagan in 1980. Reagan’s political team intended to name a Fed chairman who, they hoped, wouldn’t similarly cripple his presidency.

They thought they had found him in a Republican economist who had made a fortune in economic forecasting and was a free-market true believer. Born in 1926 in New York City, Alan Greenspan was a Depression-era child whose mother, a furniture store salesperson, and father, a self-educated stock market analyst, divorced when he was three. He attended George Washington High School in upper Manhattan where he was, coincidentally, three years behind future secretary of state Henry Kissinger. From there, he went on to study clarinet and piano at what later became the Juilliard School of Music, dropping out after two years to play in a big band. Always a lover of statistics and numbers, he enrolled at NYU while still in the band to study economics, graduating summa cum laude in 1948, and getting his master’s degree in economics in 1950. In 1952, he married a painter who introduced him to the influential writer Ayn Rand, the Russian-born novelist whose best known works are The Fountainhead (1943) and Atlas Shrugged (1957). Devoutly anti-Socialist and antireligious, Rand set forth a moral and economic philosophy, called Objectivism, based on individualism and self-interest. Greenspan was a committed disciple of Rand’s philosophy, which eschewed government interference and celebrated the individual over the greater society.

During the 1950s, Greenspan and a partner formed an economic consulting team that forecast changes in the economy and made a considerable fortune during the postwar boom years. During the 1970s, he had served as an economic adviser to President Ford. He was also given high marks for chairing a bipartisan commission charged with shoring up the Social Security system. Following that, he was working in the private sector when the call came from Ronald Reagan.

Two months after he took office in August 1987, Greenspan faced a Wall Street crisis of near 1929 proportions. On one day in October, the stock market lost 508 points, more than 22 percent of its value, in the Crash of ’87. The nation’s largest financial institutions, which were facing huge investment losses along with the customers who owed them money, were in danger of cracking and crumbling. It was a potential repeat of the Great Crash of 1929, when the stock market collapse had begun a run on banks that led to the downfall of the American banking system. Nearly sixty years later, if one or more of the companies in the very elaborate network of banks and brokerage houses that made up the world’s financial system had failed to make payments, or even delayed payments, a devastating domino effect might once again have been triggered. The next morning, Greenspan issued a terse announcement: “The Federal Reserve Bank, consistent with its responsibilities as the nation’s central bank, affirmed today its readiness to serve as a source of liquidity to support the financial system.”

The statement, meant to soothe roiled nerves on Wall Street, would have been meaningless without action behind it. Like two Dutch boys putting their fingers in the leaking dikes, Greenspan and the New York Fed Bank president worked the phones to executives of the nation’s largest banks, getting them to extend credit to some of their insolvent debtors, and promising that the Fed would back them up. It was done behind the scenes. It broke some rules, but it was what the Fed had failed to do in 1929. And this time, the dikes held.

Greenspan’s 1987 gambit made him a hero on Wall Street. Five weeks after the stock market crisis, a Wall Street Journal headline read: “Passing a Test: Fed’s New Chairman Wins a Lot of Praise on Handling the Crash.” And he parlayed his success into new power. Before the Greenspan era, the FOMC voted as a body on each change in interest rates. In the aftermath of the 1987 crash, Greenspan persuaded his colleagues to give him new powers by changing the process. While the committee would still vote on an overall policy direction—toward either raising or cutting interest rates—the timing and size of any rate change were left to the chairman. As Bob Woodward writes of this shift, the FOMC was “basically ceding operational control to Greenspan.”

The powerful post of chairman became even more powerful. In other hands, this much power might have been dangerous. But the consensus is that Greenspan’s swift, decisive actions and behind-the-scenes reassurances had prevented a greater meltdown. During the next few years, he headed off several other potential global financial catastrophes, including the huge banking losses of the 1990s, the collapse of the Mexican peso in 1994, the “Asian contagion” of 1997–98, when several newly important Asian economies basically failed, the collapse of the Russian ruble in 1998, and the failure of Long-Term Capital Management, an investment firm whose collapse might have brought down a number of large banks, possibly setting off another domino effect with devastating consequences for the financial markets.

Greenspan did that by building consensus among his fellow central bankers, with a deft mastery of minute details of the working of the American economy. He established credibility as an inflation fighter, which reassured Wall Street. But he also began to realize that the old economic rules had changed. Perhaps most important, he rolled up his sleeves—literally—to work with statisticians and realized that the changes in technology and other efficiencies in productivity meant that the economy could grow and unemployment fall without appreciable inflation.

Ironically, the Republican central banker and inflation fighter was viewed by some as one of the reasons behind a Republican president’s defeat. Republicans, including George Bush himself, blamed Greenspan for Bush’s loss in the 1992 election. In a 1998 Wall Street Journal interview, Bush said, “I think that if interest rates had been lowered more dramatically that I would have been reelected President, because the recovery that we were in would have been more visible. I reappointed him, and he disappointed me.” While some economists agree that Fed rate cuts came too late and too slowly during the recession of 1990–91, the causes of Bush’s defeat were probably more complex than Greenspan’s interest rate decisions and more likely hinged on the Perot candidacy. (See “Can a man called Bubba become president?” p. 545.)

The flip side of that irony lies in the other key to Greenspan’s success—his alliance with President Bill Clinton in engineering what many economists believe was the underpinning of the financial boom of the 1990s. The two struck a bargain to keep interest rates low if the president would work with Congress to reduce the federal deficit. Greenspan’s deal with Democrat Clinton, and Clinton’s ability to broker that balanced budget plan with Congress to deliver on deficit reduction, may ultimately be considered Clinton’s single most important accomplishment as president—perhaps removing some of the sting of being only the second president to be impeached in American history.

The other question economists and others argued about was whether Greenspan had been right back in 1996. Was there “irrational exuberance”? And was that the cause of the massive meltdown that crippled Wall Street from 2000 forward? In a short space of time, the NASDAQ stock average, which had rocketed to 5,000 on the strength of a flood of money being invested in new technology companies, the Internet, and “dot.com” startups, fell back below 2,000. The Dow Jones Industrial Average, which topped out at more than 11,000 points in 2000, tumbled back below the 9,500 range—later moving down to 7,500—as America entered its first recession in a decade and people relearned that stocks can move down as well as up. Technology hysteria, “bubble” mentality, crowd psychology, and stock analysts and brokers who touted everything as a “buy,” thereby inflating stock prices, all served to overheat the financial markets. So did revelations of a growing number of possible frauds committed against shareholders by corporations and accountants who “cooked the books” to disguise losses and inflate profits.

In many ways, the 2000 stock market meltdown was comparable to the great losses of 1929, except this crash affected selected industries and did not take the entire banking community down with it as thoroughly. Decades of federal regulation governing stock markets and banking, and a more diverse economy that was in far better fundamental shape than the 1929 economy had been, served to prevent a global financial catastrophe on the order of the Great Crash of 1929.

Must read: Maestro: Greenspan’s Fed and the American Boom by Bob Woodward.

Is that chad dimpled, pregnant, or hanging?

“Those who cast the votes decide nothing. Those who count the votes decide everything.”

Joseph Stalin, to whom that quote is attributed, may well be smiling in the Socialist Paradise, if there is one. The Soviet dictator would have certainly appreciated the American presidential race of 2000 and the strange goings-on in the state of Florida. A smile might also cross the face of William “Boss” Tweed, the notorious nineteenth-century “fixer” of all things political in New York City. On Election Day in 1871, Tweed said, “As long as I count the votes, what are you going to do about it?”

If it hadn’t been so important, it would have been completely comical. Citizens of the richest, most powerful, most technologically advanced country in the world—on the verge of launching the International Space Station into orbit 300 miles above Earth—were reduced to holding paper ballot cards up to the light to see if a hole had been punched through the paper or not. As the world watched the state of Florida in electoral disarray, America was learning a whole new vocabulary of “chads”—the small bits of paper produced when a hole is poked through a punch card. For all of America’s fears of going into the year 2000 with the much-dreaded possibility of a Y2K computer bug that would crash the world’s entire information and technology apparatus, it was the simple act of pushing a sharp stick through a piece of paper that actually wreaked the most havoc.

It had been a very long time since America had seen a presidential election so close: the Nixon-Kennedy race of 1960, to be exact. It had been even longer since Americans had seen an election in which the candidate with the most popular votes lost. That dates back to the controversial race of 1888, when Grover Cleveland won the popular vote and Benjamin Harrison won the Electoral College vote and became president. But these are the strange but true vagaries of American presidential politics.

Election Day 2000 pitted the sitting vice president against the son of the man who had been defeated eight years earlier in the election of 1992. On the face of it, the incumbent vice president, enjoying the fruits of the longest and most successful era in American economic history, had the cards in his favor. The economy, though flagging, was still strong, with employment high and inflation low. Prosperity is usually good for an incumbent. There was relative peace in the world. The American effort in Bosnia had been a success, with little cost in American lives. To be sure, there were problems, but Al Gore thought that his biggest problem was embracing everything good about the Clinton years while distancing himself from the scandals that had nearly destroyed Bill Clinton’s presidency.

On the other side was the two-term governor of Texas, George W. Bush. Dismissed as an intellectual lightweight, he campaigned as a “compassionate conservative” who would restore honor and dignity to a White House besmirched by Clinton’s reprehensible behavior. Both men had survived bruising primary battles. Gore had fought off an insurgency from former New Jersey senator Bill Bradley, a Rhodes scholar and former professional basketball player for the New York Knicks who appealed to the more liberal wing of the Democratic Party. Bush had confronted a strong popular campaign waged by Arizona senator John McCain, a conservative Republican who was a much admired war hero, survivor of years of captivity in a North Vietnamese prison in Hanoi. The primary campaigns had left both candidates battered, and in a fairly unmemorable campaign, neither distinguished himself. Interestingly, one of the most historic aspects of the campaign—the first Jewish nominee for vice president in the person of Connecticut senator Joseph Lieberman—barely made a ripple.

There were also two significant independent candidates adding intrigue to the race. Consumer advocate Ralph Nader represented the liberal environmentalist Green Party. Patrick Buchanan, the fiery archconservative former Republican candidate, had wrested control of Ross Perot’s once crucial Reform Party. But neither of these two men was included in any of the debates between the two main party candidates, and it seemed that they would have little impact on the national vote. But as Tip O’Neill, the famed Massachusetts congressman and speaker of the house, once said, “All politics is local.” And Nader and Buchanan would prove to have an impact on a few important local races.

On the eve of the election, surveys had the race too close to call. America seemed to be split in two. And for once the polls were right. As the voting booths closed across the country on election night, the closeness of the election was apparent, but Gore seemed to have a slim edge. Late on election night, network television projections declared Al Gore the victor in the state of Florida, giving him the electoral votes needed to win the presidency. The balloons of victory were soon floating in the air. But they were about to burst. In a stunning turnaround, the network projections were retracted. Florida was proving too close to call.

In Florida, earlier on Election Day, state Democratic officials had already begun to receive reports of confused voters in several election districts who believed that they may have miscast their votes. Many of them were Jewish senior citizens, traditionally Democratic Party loyalists, who feared that they might have cast votes for Reform Party candidate Patrick Buchanan, considered by many Jews to be an anti-Semite, largely because of remarks he had once made about Hitler. Others thought that they had pushed out two holes in the paper punch cards. Would their votes still count? At the same time, Ralph Nader, on the Green Party ticket, was drawing about 100,000 votes in Florida. It was not a huge number, but his votes were presumably drawn from more liberal and reformist independent voters who might have been more likely to choose Gore if Nader had not been in the race.

In the early hours of the morning, television networks completed the turnabout and projected George Bush the winner of Florida’s electoral votes, giving him the presidency. Gore telephoned Bush to concede. Told that the race in Florida was too close to call, Gore then retracted his concession and the projection of Bush as the winner was also retracted. Governed by Jeb Bush, the Republican candidate’s brother, Florida had been considered a key state in the election. Nobody could have imagined just how key—and chaotic—it was going to be.

On the day after Election Day, Gore had a lead in the national popular vote and was ahead in electoral votes, with 255 to 246 for Bush; 270 electoral votes are needed to win the presidency. While two other states were also still undecided, the balance of the election hung on the outcome in Florida, with its 25 electoral votes.

Before the day was out, the first suits contesting ballots in some Florida counties, like the first scattered gunshots in a battle, were filed by both sides. While an incomplete count of Florida’s votes gave Bush a statewide lead of 1,784 votes, prompting a mandatory statewide mechanical recount, voters in heavily Democratic Palm Beach County filed a suit challenging the results there. They were specifically challenging the “butterfly ballot,” a paper ballot with candidates’ names shown in two facing columns. Voters were expected to punch out a hole that corresponded to their candidate. But the ballot had confused many voters, especially the “early bird special” vote, Florida’s large, politically active retiree community. Many voters worried that they had mistakenly voted for Buchanan; some, realizing they had made a mistake, punched a second hole instead of requesting a new ballot. Ballots with two votes cast would be invalidated as “overvotes.”

Over the course of the next month, Florida became an armed camp of dueling lawyers and “spin doctors” all trying to sway the courts and public opinion. Battling through local, county, state, and federal courts, the two sides fought over which votes should be counted, who should count them, and whether it was too late to count them under Florida law. Finally, on December 1, the U.S. Supreme Court heard arguments on an appeal by the Republican candidate George Bush. On December 4, the U.S. Supreme Court vacated a Florida Supreme Court decision extending the deadline for certification of votes, and returned the case to the state court for clarification. When the Florida Supreme Court ordered a manual recount of all ballots in which a vote for president was not recorded by a machine and restored 383 votes from partial recounts in two Florida counties, the Bush legal team again appealed to the U.S. Supreme Court.

On December 9, in a five-to-four decision, the Supreme Court halted the manual recounts pending a hearing of Bush’s appeal, and on December 11, the Court heard oral arguments from the two sides. Finally, in high drama at 10 P.M., on December 12, the U.S. Supreme Court issued two unsigned rulings that reversed the Florida Supreme Court’s order to proceed with the recount. Technically speaking, the Court had sent the decision back to the Florida Supreme Court for review, while noting at the same time that there was no time for a recount because of constitutional deadlines.

On December 13, in a televised address, Al Gore again conceded to Bush. George W. Bush addressed the nation as president-elect. For the first time since John Quincy Adams was elected, America had another set of father-son presidents. More significantly, for the first time in American history, the Supreme Court had played a decisive role in the outcome of a presidential election.

The final official results showed that Gore had won the national popular vote 51,003,894 (48.41 percent of the popular vote) against Bush’s 50,495,211 (47.89 percent of the popular vote), a difference of 508,683 votes, or approximately one half of one percent of the popular vote. Ralph Nader, the Green Party candidate, took 2,834,410 votes, less than 3 percent of the national vote. However, in Florida Nader garnered 97,488 votes. Many political analysts agreed that Nader had played the spoiler’s role, drawing support from Gore in Florida, as well as a few other close states.

Patrick Buchanan, the archconservative Republican turned controversial Reform Party candidate, polled only 446,743 votes, a mere 0.42 percent of the national presidential vote. In 1992, Ross Perot had won 19 percent of the popular vote as an independent; four years later, running under the Reform Party banner, he still managed to win more than 8 million votes. But Buchanan, a former speechwriter for Richard Nixon, had splintered the Reform Party, rendering it insignificant. It was clear, however, that Buchanan’s candidacy had hurt Al Gore because of the confusion over the ballot in Florida, where Buchanan tallied 17,484 votes.

While the debacle in Florida captured the world spotlight for thirty-six days, the election was not only about the Sunshine State. The post-election drama deservedly centered on Florida, with its dueling lawyers and press conferences, and much of the analysis focused on how likely it was that Gore would have won the state without the Nader-Buchanan factor and the large number of disqualified ballots. But the official electoral tally showed Bush winning the electoral vote with 271 votes to Gore’s 266. (Gore should have had 267, but one elector from Washington, D.C., abstained.)

Mostly overlooked was the fact that the sitting vice president, beneficiary of the greatest economic boom in modern times and with the country enjoying relative peace and prosperity, could have won the Oval Office simply by carrying one of a handful of other states that had voted for Clinton-Gore four years earlier:

Tennessee Gore’s home state, where he and his father had both been elected to the U.S. Senate, had eleven electoral votes and had been carried by Bill Clinton in 1992 and 1996. In 2000, it went to Bush.

Arkansas With six electoral votes, the home state of President Bill Clinton had also voted Democratic in 1992 and 1996. But it went to George Bush in 2000 by 50,000 votes.

West Virginia Another state won easily by Bill Clinton in 1992 and 1996, West Virginia gave Bush five electoral votes by a margin of some 40,000 votes. One key to the Bush win there was the lavish campaign spending by the coal industry, one of the chief beneficiaries of President Bush’s environmental policies. These included new rules that removed restrictions on “mountaintop removal,” a form of coal mining in which entire tops of mountains were sheared off to mine coal. This technique, which dumped vast amounts of displaced earth and rock into rivers and streams, had been prohibited under previous Environmental Protection Agency rules.

Ohio Of the states won by Clinton in 1992 and 1996 that Gore could not hold in 2000, the largest Electoral College vote lay here with twenty-one electoral votes. Once considered a solid Democratic stronghold with a powerful union vote, Ohio went to Bush. (Although Nader took more than 100,000 votes in Ohio, his candidacy was a smaller factor there than elsewhere.)

New Hampshire The closest race of all was in this small New England state with its four electoral votes. Had they gone to Gore, he would have won the election. The only New England (and northeastern) state carried by George Bush, New Hampshire had been carried by Bill Clinton in both 1992 and 1996. But in 2000, Bush won the “Live Free or Die” state by 7,211 votes, out of some 600,000 presidential votes cast. Here the Green Party’s Ralph Nader may have played the most crucial spoiler role in this independent-minded and often quirky state. Nader won more than 22,000 votes, presumably drawing from liberal and reform-minded voters who might have otherwise voted in the Democratic camp.

Much of the commentary in this extraordinary and bizarre election also focused on the unprecedented role of the Supreme Court. Had the Rehnquist Court, which often divided between five conservative justices (William Rehnquist, Sandra Day O’Connor, Antonin Scalia, Clarence Thomas, and Anthony Kennedy) and four more liberal justices (Stephen Breyer, David Souter, Ruth Bader Ginsburg, and John Paul Stevens), acted properly in its decision? Or had the Court overreached its legitimate bounds in essentially deciding the election? The answer to that question, predictably, seemed to depend on the political preference of the person who answered. At one end of the spectrum were those who found the Court’s opinion perfectly acceptable, usually Republican Bush supporters. One constant in their view: the Supreme Court had to overturn a flawed and politically biased ruling made by the Democratic majority in the Florida State Supreme Court—even if that flew in the face of the Republican Party’s recent devotion to states’ rights. That was the view expressed by one Republican legal scholar who argued that the decision was poorly reasoned and badly written but, in the end, fundamentally the correct one.

In the highly partisan postdecision atmosphere, it was difficult to find a conservative voice that disagreed with the Supreme Court. But, writing in the Weekly Standard (December 25, 2000), John J. DiIulio Jr. was one of the few who did: “To any conservative who truly respects federalism, the majority’s opinion is hard to respect. . . . The arguments that ended the battle and ‘gave’ Bush the presidency are constitutionally disingenuous at best. They will come back to haunt conservatives and confuse, if they do not cripple, the principled conservative case for limited government, legislative supremacy, and universal civic deference to legitimate, duly constituted state and local public authority.” He concluded, “There was a time when conservatives would rather have lost a close, hotly contested presidential election, even against a person and a party from whom many feared the worst, than advance judicial imperialism, diminish respect for federalism, or pander to mass misunderstanding and mistrust of duly elected legislative officials. . . . Desirable result aside, it is bad constitutional law.”

On the other end of the spectrum were those—generally Democrats and Gore supporters—who thought the decision a judicial outrage. Harvard’s Alan Dershowitz called it the “single most corrupt decision in Supreme Court history.” It was difficult to find a Gore supporter who thought that the Court had done the right thing. Perhaps the most outraged was famed attorney Vincent Bugliosi, who argued, in a best-selling book called The Betrayal of America, that the majority was not only mistaken, but actually criminal. “Considering the criminal intention behind the decision, legal scholars and historians should place this ruling above the Dred Scott case (Scott v. Sandford) and Plessy v. Ferguson in egregious sins of the Court. The right of every American to have his or her vote counted, and for Americans (not five unelected Justices) to choose their President was callously and I say criminally jettisoned by the Court’s majority to further its own political ideology.”

Bugliosi’s sense of outrage seemed not to be shared by most Americans—and it should be noted that close to half of all eligible Americans did not vote to begin with—who were apparently relieved that the Supreme Court had put the country out of its electoral misery. The prevailing attitude seemed to be that any decision was better than no decision at all and still more bickering between lawyers in Florida.

For weeks and months after the election, the close votes, the political stratagems of the two candidates, and the Supreme Court’s behavior were debated and deliberated. But the real scandal of the election came in discovering how many votes routinely don’t count in American elections because of voting machine problems and other Election Day irregularities. In most elections, these uncounted votes rarely affected the outcome, so the media paid little attention to the uncounted ballots and the double-punched “overvotes” rejected in the official tally. But in 2000, when every vote truly became precious, Americans learned how disposable their votes actually could be. And it should come as no surprise that most of the votes that never get counted come from the poorest districts, often with large minority populations, where the least money is traditionally spent to modernize election machines and ensure that every vote, supposedly an American’s most precious birthright, is counted.

This disqualification of millions of votes, coupled with the fact that the popular vote had been upstaged by the Electoral College—that eighteenth-century vestige of a fear of too much democracy—at least briefly rekindled the call to be rid of the Electoral College once and for all. If the Electoral College had ever been useful as a means of guaranteeing that a president be elected by a geographically diverse population and not just by voters in a few large states, many commentators argued, that rationale no longer existed.

The intense public interest in the close election of 2000, the debate over the Supreme Court’s remarkable intrusion into presidential politics, and the push for election reform, including the end of the Electoral College, were big stories in America—for a very short time. As the nation busied itself with “business as usual,” complacency about the election seemed to set in—except perhaps among a handful of Democratic true believers convinced that their man was the “real” president.

But in a broader historical sweep, the strange election of 2000 was more or less forgotten a year later, eclipsed by the events of September 11, 2001.

Must Read: Too Close to Call: The Thirty-Six-Day Battle to Decide the 2000 Election by Jeffrey Toobin; Bush v. Gore: The Court Cases and the Commentary edited by E. J. Dionne Jr. and William Kristol.

Where is Fox Mulder when we need him?

Remember the child’s rhyme about the little girl with the curl right in the middle of her forehead? “When she was good, she was very, very good. When she was bad she was horrid.”

That sort of summarizes the history of the FBI during the past twenty years. It was, to put it another way, the best of times and the worst of times for the G-men.

To an older generation of television-watching Americans, the FBI was always perfect. Every week, an FBI agent played by Efrem Zimbalist Jr. solved a crime, supposedly taken from FBI case files, within an hour. But by the time The X-Files appeared, a new disillusioned generation, fed on tales of cover-ups, conspiracy, and questionable competence, saw the FBI as a bureau of disinformation campaigns and malevolent leadership. The Truth may be “out there,” as Agent Fox Mulder of The X-Files told us each week, but it probably lies somewhere between these television views of the FBI.

The recent history of the FBI—and related intelligence and law enforcement agencies like the Immigration and Naturalization Service (INS), the Drug Enforcement Administration (DEA), the Bureau of Alcohol, Tobacco, and Firearms (BATF), and the CIA—is a record of stunning successes combined with embarrassing and dismal failures.

Ruby Ridge: The confrontation between federal agents and the Weaver family in Ruby Ridge, Idaho, became one of the most controversial and widely discussed examples of abuse of federal power, feeding a whole subculture of Americans who distrust their government. Ruby Ridge began with an abortive arrest by U.S. marshals of a self-proclaimed Christian white separatist named Randall Weaver. The Weaver family lived in a remote mountain cabin near Ruby Ridge in northern Idaho. Their racist and anti-Semitic views would be anathema to the vast majority of Americans. Randall Weaver had failed to appear for a hearing on a charge of selling unregistered firearms—two sawed-off shotguns—to a BATF undercover agent in 1989. This failure was due to a typographical error in the letter advising Weaver of the date for his court appearance. The court nonetheless issued a bench warrant for his arrest. (A jury later determined Weaver had been entrapped, although the Justice Department investigation of the case found that there was no illegal entrapment.)

On August 21, 1992, during a check of Weaver’s property—which, it was later determined, constituted an illegal search by the marshals—one of Weaver’s dogs began to bark and was shot by a federal agent. It was later revealed that one of the marshals had thrown stones at the dogs to see what it would take to agitate them. A gun battle ensued, and Weaver’s fourteen-year-old son, Sammy, was shot in the back while running away. In the gunfire that followed, a marshal was also shot and killed.

The next day, when Randy Weaver stepped out of his cabin to retrieve his son’s body, an FBI sniper shot him. As Weaver and two companions tried to get back in the cabin, Weaver’s wife, Vicki, stood in the cabin doorway holding an infant. The sniper fired again, killing Mrs. Weaver instantly. According to FBI guidelines, deadly force is allowed only when necessary to protect someone against immediate danger. These limits are implicit in the Constitution and have been established through Supreme Court rulings.

At a subsequent trial, the government claimed that Weaver and another man were shot because they had threatened to shoot at an FBI helicopter. The judge threw out that charge for insufficient evidence. An Idaho jury found Weaver and a companion innocent on most of the serious charges against them. Weaver’s attorney, Gerry Spence, said afterward, “A jury today has said that you can’t kill somebody just because you wear badges, and then cover up those homicides by prosecuting the innocent.”

A later investigation of the fiasco by the Justice Department, one of the most intensive internal reviews of an FBI investigation ever, concluded in a 1994 report that the FBI’s hostage rescue team overreacted to the threat of violence and instituted a shoot-on-sight policy that violated bureau guidelines and Fourth Amendment restrictions on police power. The FBI disciplined twelve agents and employees, including Larry Potts—the head of the criminal division at the time who was later promoted.

All these investigations and recommendations came long after the fact. Too long after to be of any help when the FBI confronted its next deadly siege, near a Texas town called Waco.

Must Read: From Freedom to Slavery: The Rebirth of Tyranny in America by Gerry Spence.

Waco: On February 28, 1993, the Treasury Department’s Bureau of Alcohol, Tobacco, and Firearms staged a disastrous raid on a ramshackle compound in Mount Carmel, Texas, ten miles east of Waco. More than 100 agents were serving arrest warrants for gun violations on members of the Branch Davidian religious sect and its leader, David Koresh. A high school dropout, Koresh, like many cult leaders, was a charismatic type whose rambling discourses on the Bible and the coming apocalypse enthralled his followers. And like other cult leaders, Koresh had begun to use his religious leadership as a sexual tool. He preached that women members of the group had to have sex with him to become true disciples, and he had allegedly fathered as many as a dozen children by different “wives,” some of them girls as young as twelve years old.

He allegedly used traditional mind-manipulation techniques on his followers. A spartan environment, surrender of possessions, and slavish loyalty were used to create a world in which the sect members were utterly dependent on him. Discipline was central to Koresh’s methods, and often inhumane. Children were routinely paddled to the point of bleeding. Adults were forced to stand in sewage pits. Nonetheless, while their practices were considered bizarre, immoral, or even illegal, many sect members held jobs outside the compound.

But Koresh had also embarked on an ambitious plan to acquire weapons. It was for these weapons, including assault rifles, hundreds of thousands of rounds of ammunition, and parts for making machine guns, that the BATF staged its raid. In the ill-planned BATF raid, the question of who fired first was never clearly answered. Survivors said that they did not shoot until they were fired upon by the federal agents. Four BATF agents were killed and several Branch Davidians lay dead.

That same day President Clinton ordered the FBI to take over the case. The initial FBI strategy was to make life hellish for the Branch Davidians by gradually shrinking the perimeter around the compound, shining searchlights at the house for twenty-four hours a day, and then playing ear-splitting noises. Koresh used these ploys only to strengthen his apocalyptic predictions and to make himself seem more messianic to his followers. After negotiations during a fifty-one-day standoff went nowhere, and believing that the children inside the compound were at risk, Attorney General Janet Reno approved a plan to end the siege with an assault, and President Clinton endorsed the plan on April 18. The FBI planned to step up pressure by using CS gas, a riot-control agent considered more potent than ordinary tear gas. The decision was made despite the fact that, as James Bovard noted in his book Lost Rights, “A few months earlier, the U.S. government had signed an international treaty banning the use of CS in warfare, effectively recognizing that its effects were so harsh that its use on enemy soldiers was immoral. But the international treaty did not prohibit the U.S. government from using CS against American citizens.”

Despite fears of a mass suicide, as had happened in Guyana when 900 followers of cult leader Jim Jones had taken their lives, the FBI launched its raid at about 6 A.M. A modified tank began battering holes near the compound entrance and spraying a mist of CS gas. The Branch Davidians began firing at the tanks. By 9 A.M., a tank had smashed in the front door of the compound and the FBI thought the standoff was over. The FBI planned to continue the pressure of gas and tanks closing in until the cult members surrendered. But at around noon, wisps of smoke appeared, and the building was soon in flames, whipped by high winds blowing off the Texas prairie. Agents entering the building to try to rescue cult members found children in a concrete pit filled with water, rats, and excrement. There were no fire trucks on the scene, as the FBI thought that they would be endangered if a gunfight broke out.

When the conflagration was over, eighty Davidians, including twenty-seven children, were identified as having died in the fire. Seven, including Koresh, had gunshot wounds in their heads. Almost immediately, claims began to be circulated that the FBI had deliberately set the fire.

Later investigations, bolstered by evidence from listening devices that had been secretly sent into the compound in which Davidians were heard saying “Spread the fuel,” indicated that internal fires had been set by the Davidians. Kerosene and gasoline were detected on the clothes of some of the surviving cult members, but there remained a possibility that some of the fires may have started in the assault itself. In the aftermath, two BATF members were fired but later reinstated at a lower rank. No FBI officials were disciplined. Eight of the surviving Branch Davidians were convicted on charges ranging from weapons violations to voluntary manslaughter.

The country had initially been supportive of the FBI and its handling of the case, but over time the two cases combined had a powerful impact on the public perception of the FBI and other agencies. According to Ronald Kessler’s The Bureau, “By 1999, a majority of the public believed that the FBI had murdered innocent people at Waco . . . and Ruby Ridge.”

Oklahoma City: Waco would have aftershocks. On April 19, 1995, the second anniversary of the Waco fire, a truck bomb exploded in front of the Alfred P. Murrah Federal Building in downtown Oklahoma City, blowing out the entire front of the high-rise office building and killing 168 people, 19 of them children. Suspicion was cast immediately on Arab terrorists in the wake of the World Trade Center bombing that had taken place in 1993. But within a few hours, the FBI had a piece of the truck used in the bombing and had traced it to a Kansas truck rental shop.

Within days, Timothy McVeigh was identified as the key suspect in the bombing. A decorated Gulf War veteran, McVeigh was already in jail awaiting a hearing. He had been arrested by an Oklahoma state trooper on the day of the bombing for driving without license plates and for carrying a concealed weapon. A fellow conspirator, Terry Nichols, had also been identified and captured and agreed to testify against his partner. McVeigh was convicted and then scheduled for execution. But even when the FBI had done things right, things now went wrong. Just days before McVeigh’s scheduled execution in May 2001, the FBI revealed to the convicted bomber’s attorneys that it had failed to turn over 3,000 pages of documents relating to the case. McVeigh’s scheduled execution was delayed by the new attorney general, John Ashcroft, while the condemned man’s attorneys reviewed the documents. The papers had no impact, other than to delay the execution and give the bureau another black eye. McVeigh was executed in June 2001, the first federal execution in America since 1963.

The Unabomber: Between 1978 and 1995, the United States was plagued by a wave of mail bombings. Most of the bombs, which were packed with hundreds of nails, cut-up razor blades, and metal fragments, were made to look like ordinary parcels, which exploded when their victims opened them. Because the earliest of these bombs targeted professors of science and engineering as well as airline executives, the FBI dubbed the perpetrator the Unabomber (UNA = “Universities and Airlines”). The Unabomber struck sixteen times, killing three people, injuring twenty-three others, and causing millions to live in fear.

In 1995, the media were sent a 35,000-word manifesto written by the Unabomber describing his targets as modern industry and technology. According to this document, he believed that the only way to restore humanity’s self-esteem was to destroy the institutions that fostered technology and innovation. The FBI ultimately spent seventeen years and $50 million tracking down the Unabomber in the longest, most expensive manhunt in the bureau’s history.

The case broke only when the Unabomber’s own brother recognized the political rhetoric in the manifesto. A genius-level former professor, Ted Kaczynski, now identified as the Unabomber, had taken to living in the mountains of Montana in a shack without electricity or plumbing. Assisted by the brother, who received a reward that he then distributed to the families of victims, the FBI arrested Kaczynski on April 3, 1996. He pleaded guilty to the bombings and expressed no remorse for his actions. In 1998, he was sentenced to four consecutive life terms in a Colorado penitentiary.

Olympic Park: When a pipe bomb exploded on July 27 in the midst of the 1996 Atlanta Olympics, FBI agents were immediately suspicious of a security guard who had alerted police to a mysterious abandoned backpack twenty-three minutes before the bomb exploded. The guard, Richard Jewell, had helped evacuate the area after the bombing, which killed two people. Three days later, a local newspaper reported that Jewell was a suspect; he was eventually cleared, the innocent victim of a bungled investigation and an irresponsible media feeding frenzy that had essentially presumed his guilt. The FBI eventually focused its investigation on Eric Robert Rudolph, a fugitive charged with abortion clinic bombings in Birmingham, Alabama, and Atlanta. (The subject of an intensive manhunt, Rudolph was captured in 2003.)

Los Alamos: Bombs of another sort figured in still another FBI fiasco, the pursuit of a spy supposedly selling atomic bomb secrets to the Chinese. When a Chinese defector to Taiwan gave up bundles of classified documents, suspicion fell immediately on Wen Ho Lee, a Chinese-American scientist who had worked at the Los Alamos nuclear laboratory since 1978. The Department of Energy initially investigated the matter but turned the case over to the FBI. In fact, no one was even certain that the secrets had ever been stolen. But in 1999, an indictment was brought against Wen Ho Lee, charging him with copying atomic bomb secrets. Denied bail and held in extremely harsh conditions, often in shackles, as he awaited trial, Wen Ho Lee professed innocence. The case against him eventually fell apart. In the end, the government settled for a guilty plea to the lesser charge of making copies and mishandling national security information, and Lee was sentenced to time already served. Whether Wen Ho Lee—or anyone, for that matter—had sold documents to the Chinese was never a certainty. But the FBI had again bungled a high-profile investigation. A federal prosecutor who investigated the case said, “This was a paradigm of how not to manage and work an important counterintelligence case.” (In late 2002, Wen Ho Lee reported that he was unable to find employment either at a university or in a laboratory.)

Walker/Ames/Hanssen: The Wen Ho Lee case was one in a string of highly visible espionage cases that left the FBI’s counterspying abilities open to serious question. One was the Walker case. Without detection for seventeen years, Navy communications specialist John Walker and his son Michael had spied for the Soviets, selling them important U.S. Navy secrets. Walker began spying in 1968, undetected by FBI surveillance of the Soviet embassy in Washington, and brought his son into the operation in 1983. He also brought his brother Arthur Walker, a retired Navy commander who was working for a defense contractor, into the ring. John Walker was caught only when his marriage failed and his embittered ex-wife alerted the FBI about his activities. Walker was arrested in May 1985, along with his son, brother, and a fourth accomplice. After cooperating with the government, John Walker got a life term, his son a twenty-five-year term, Arthur Walker received three concurrent life sentences and was fined $250,000, and the fourth accomplice was sentenced to a 365-year term and a fine of $410,000.

The second major counterintelligence failure of the period was a CIA-centered fiasco that spilled over to the FBI. Aldrich Ames, the son of a CIA operative, joined the CIA in 1962. By the mid-1980s, he was spying for the Soviets, but it took years for the CIA or FBI to catch on to him, despite the fact that Ames was living way above his means. With a government salary of about $70,000, Ames drove a Jaguar and lived in a house worth half a million dollars. Over the years, the FBI had observed Ames meeting with the Soviets but was stonewalled by the CIA, which never pressed the investigation when asked. As it turned out, Ames had received $2.5 million from the Soviets since 1985 and had betrayed more than a hundred CIA operations. He was finally caught in 1994, pleaded guilty, and was sentenced to life without parole.

The Walker and Ames cases were thought to be massive intelligence failures. But they paled against the damage done by Robert Hanssen, an FBI agent who worked for the Soviets and had access to the most sensitive government secrets. A former Chicago policeman, Hanssen joined the FBI in 1976. Just three years later, he began selling classified documents to the Soviets. His wife discovered his spying and made him confess to a Roman Catholic priest who told him to give his spying profits to Mother Teresa, but Hanssen continued his damaging betrayal. Through the 1980s and 1990s, Hanssen sold secrets to the Soviet KGB, exposing double agents who were working for the United States.

In February 2001, Hanssen was arrested and charged with spying for the Russians. He pleaded guilty to selling 6,000 pages of documents along with computer disks to the Soviet Union and later the Russians over a period of twenty-one years. Ames’s spying had led directly to more deaths, but Hanssen had turned over far more damaging information. The FBI and CIA needed Hanssen’s cooperation, so he was given life in prison without possibility of parole.

In summarizing the Hanssen case, Ronald Kessler writes in The Bureau, “Hanssen felt confident that the FBI would not catch him—not even if he broke into an FBI official’s computer . . . [that] he could put erotic stories on the Internet using his real name and heavily mortgage his house without raising any suspicion. So complacent was the FBI that, when a computer repair technician found hacker software on Hanssen’s computer, no one asked about it. Nor were five-year background checks done on a regular basis, as is required for agents with Hanssen’s level of clearances.”

All of these failures—which must also be measured against the FBI’s noteworthy but usually overlooked successes in counterintelligence—would have been merely embarrassing to the agencies, disturbing to American taxpayers, and perhaps even shocking—were it not for the events of September 11, 2001. Less than a year after the September 11 terror attacks, reports began to surface of the significant information that various FBI field agents had collected about Islamic terrorists who were actively plotting an attack using airplanes. These revelations came after administration officials and the head of the FBI had pointedly told Americans that there had been no warning of the attacks.

Clearly, many FBI agents were on to the possibility of a terror strike. A memo written by one FBI field agent in Phoenix warned about Middle Eastern men with possible connections to terror groups enrolling in flight schools. His memo went to New York and Washington FBI offices, but no further action was taken. Even when a man was arrested in August 2001, before the terror attacks, because of his suspicious behavior at a flight training school, the agents’ reports were apparently not taken seriously up the chain of command. Documents dating back to the 1993 World Trade Center bombing also had never been translated from Arabic. Similar hints of potential hijackings had come to the CIA, but the two agencies have always been known for their fierce turf protection. Congressional requirements that the CIA not be involved in domestic spying—spawned after CIA abuses were revealed in the 1970s—also created a roadblock to intelligence sharing that might have proven valuable. The level of the FBI’s awareness, as well as its inability to piece together important clues to the impending terror attacks, is still unclear. In June 2002, President Bush proposed a massive reorganization of America’s intelligence gathering and domestic security branches into a new cabinet-level department, which was created late in 2002.

AMERICAN VOICES

GERRY SPENCE, attorney for Randy Weaver,
in From Freedom to Slavery (1995):

These are dangerous times. When we are afraid, we want to be protected, and since we cannot protect ourselves against such horrors as mass murder by bombers, we are tempted to run to the government, a government that is always willing to trade the promise of protection for our freedom, which left, as always, the question: How much freedom are we willing to relinquish for such a bald promise?

Written in the aftermath of the Oklahoma City bombing in 1995, Spence’s words were eerily prescient. Years later, in the wake of the September 11 terrorist attacks, the trade-off Spence described, freedom surrendered for the promise of protection, would also characterize the Bush administration’s reaction to the new terrorist threat.

America in 2000: A Statistical Snapshot

The U.S. Constitution requires a census to be taken every ten years for the purpose of apportioning seats in Congress. The first U.S. census was taken in 1790, shortly after Washington became president. It took eighteen months to complete and counted 3.9 million people.

As the nation grew, so did the scope of the census. Questions about the economy—factories, agriculture, mining, and fisheries—were added over the decades. In 1850, census takers asked the first questions about social issues. In 1940, the Census Bureau began using statistical sampling techniques. Computers came along in 1950.

Census 2000 showed that the resident population of the United States on April 1, 2000, was 281,421,906, an increase of 13.2 percent over 1990. Women held a slight advantage over men (143,368,000 women to 138,054,000 men). This increase of 32.7 million people was the largest total census-to-census population increase in U.S. history, exceeding even the “baby boom” jump of 28 million between 1950 and 1960. The fastest-growing region in the U.S. was the West; the fastest-growing states were Nevada, Arizona, Colorado, and Idaho. The largest state, California, recorded the largest numeric increase, with 4.1 million people added to the state’s population. Reflecting the movements west and south, the nation’s population center, a statistical measurement of where the “middle” of the country is, moved 12.1 miles south and 32.5 miles west, to a point near Edgar Springs, Missouri.

The Census Bureau also keeps a running estimate of the U.S. population. Taking the 2000 census as a starting point, the clock assumed one birth every 8 seconds; one death every 14 seconds; a net gain of one international immigrant every 34 seconds and one returning U.S. citizen every 3,202 seconds. That results in an overall net gain of one person every 11 seconds.
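For readers who want to see how those separate rates roll up into a single net figure, here is a minimal back-of-the-envelope sketch in Python. The interval values are simply the rounded figures quoted above, not official Census Bureau inputs; with these rounded numbers the arithmetic lands at roughly one person every twelve seconds, close to the bureau’s published eleven, the small gap presumably reflecting the rounding of the component intervals.

# Rough check of the "population clock" arithmetic, using the rounded
# intervals quoted above (not official Census Bureau inputs).
birth_interval = 8          # seconds per birth
death_interval = 14         # seconds per death
migrant_interval = 34       # seconds per net international immigrant
returning_interval = 3202   # seconds per returning U.S. citizen

# Convert each interval into a rate (people per second), then combine:
# births and arrivals add people, deaths subtract them.
net_rate = (1 / birth_interval
            - 1 / death_interval
            + 1 / migrant_interval
            + 1 / returning_interval)

print(f"Net gain: one person roughly every {1 / net_rate:.0f} seconds")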

In political terms, the 2000 census meant changes in the makeup of Congress as 12 of the 435 House seats were reallocated to account for the population shifts, with most of the new House seats moving to the South and West from the North and Midwest.

Winners (State and number of new House seats): Arizona 2; California 1; Colorado 1; Florida 2; Georgia 2; Nevada 1; North Carolina 1; Texas 2.

Losers: Connecticut 1; Illinois 1; Indiana 1; Michigan 1; Mississippi 1; New York 2; Ohio 1; Oklahoma 1; Pennsylvania 2; Wisconsin 1.

Since the Electoral College is based on the number of seats in Congress, these changes would also affect the 2004 presidential race.
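Because a state’s electoral votes are its House seats plus its two Senate seats, each reallocated seat shifts exactly one electoral vote. As a quick check that the lists above balance, here is a minimal Python sketch; the numbers are simply those in the winners and losers lists, not official apportionment data, and both sums come to twelve, matching the twelve reallocated House seats.

# Seat changes from the 2000 reapportionment as listed above. A state's
# electoral votes equal its House seats plus two senators, so each seat
# gained or lost also shifts one electoral vote.
gains = {"Arizona": 2, "California": 1, "Colorado": 1, "Florida": 2,
         "Georgia": 2, "Nevada": 1, "North Carolina": 1, "Texas": 2}
losses = {"Connecticut": 1, "Illinois": 1, "Indiana": 1, "Michigan": 1,
          "Mississippi": 1, "New York": 2, "Ohio": 1, "Oklahoma": 1,
          "Pennsylvania": 2, "Wisconsin": 1}

print(sum(gains.values()), "House seats (and electoral votes) gained")   # 12
print(sum(losses.values()), "House seats (and electoral votes) lost")    # 12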

The American dream had also changed. The well-packaged and expertly marketed 1950s vision of Dad, Mom, two kids, dog, house, and two-car garage was a thing of the past—a romantic notion that barely ever existed in America. The new American household was smaller. Married-couple households declined to just a little over half of all households. People living alone, the second most common living arrangement, rose to more than a quarter of all households.

Single-mother households remained at 12 percent of all households, while households headed by single fathers increased to 4 percent. And 5 percent of all households were unmarried-partner households. Leave It to Beaver was on its way to becoming an endangered species in America.

The nation was also getting older. The median age in the U.S. in 2000 was 35.3 years, the highest it has ever been. That increase reflected the aging of the baby boom generation, those born between 1946 and 1964.

Although people thought of the 1990s as the decade of economic prosperity, eleven states experienced increased poverty. In terms of weekly wages, census data showed that most gains were made by those already earning the most, with the lower wage earners making much smaller gains. In other words, the rising tide lifted all the boats, but some boats were lifted a little higher. Or as one of the pigs in George Orwell’s Animal Farm put it, “All animals are equal but some animals are more equal than others.”

Most shocking of all was the fact that child poverty remains one of America’s most stunning failures. Overall, the nation’s official child poverty rate fell to 16 percent, which is still above the lows of the late 1960s and 1970s when it was around 14 percent. Even with reduced childhood poverty, the United States lags behind most other wealthy nations. America’s poorest children have a lower standard of living than those in the bottom 10 percent of any other nation except Britain.

And in a country whose political leadership routinely says, “No child will be left behind,” American infant mortality rates ranked 33rd in the world, only slightly better than Cuba’s. Eighteen percent of women in America received no prenatal care. Fourteen percent of children had no medical care. The vaunted Welfare Reform Act of 1996, which promised to move people from “welfare to work,” succeeded largely in putting people in jobs that left them officially poor or close to the poverty line. And just to be clear on what the government considers poverty, in 1999, the threshold for a family of four was $16,954, approximately what an average Wal-Mart employee receives annually for a forty-hour workweek.

The 2000 census depicted a more racially diverse America. For the first time, respondents were allowed to select one or more race categories, and nearly 7 million people (2.4 percent) took advantage of the opportunity. Of the other 275 million who reported only one race, 75.1 percent reported white; 12.3 percent black or African American; 0.9 percent American Indian or Alaska native; 3.6 percent Asian. A separate question collected information on Hispanic or Latino origins. Hispanics, who may be of any race, totaled 35.3 million, or about 13 percent of the total population.

Yet widespread patterns of segregation of whites and minorities continue in America, in spite of the improvements in income and education for minority Americans. And home ownership remained a sore point with minorities, who are much less likely to own their homes—generally considered the golden ticket to the American dream.

Was there any good news for America? Well, the 2000 census showed that Americans who relied on outhouses and bathtubs in the kitchen had fallen below the one million mark for the first time in American history.

More significantly, ten years after the Los Angeles rioting that followed the acquittal of four white policemen in the savage beating of a black man named Rodney King—one of the worst racially motivated riots in American history—the progress is worth noting. Under the George W. Bush administration, the two highest-ranking members of the American foreign policy team were black. Perhaps even more astonishing, they were black Republicans! One of them, National Security Adviser Condoleezza Rice, is a black woman. The other, Colin Powell, is the child of Jamaican immigrants who, having benefited from affirmative action policies, as he is quick to point out, rose through the ranks of the American military to become the highest-ranking officer in the Pentagon under George Bush and then the first black secretary of state. There is no doubt that Colin Powell would have been a credible presidential candidate had he chosen—or if he ever chooses—to run. The institution that cultivated Colin Powell, the American military, is generally credited with being the most successfully integrated institution in American society.

The growing number of black corporate leaders included the heads of American Express and AOL Time Warner, the world’s largest media company. The daughter of sharecroppers was named to head Brown University, one of the most prestigious jobs in the academic world. The most powerful, influential, and widely admired woman in American media—and perhaps in all America—is Oprah Winfrey. At the 2002 Academy Awards, history was made when two black performers, Denzel Washington and Halle Berry, received Oscars. Two more of the most widely admired Americans are athletes who need only be mentioned as Tiger and Michael. In other words, America has moved beyond tokenism in some very visible and meaningful ways.

Of course, social mobility, corporate power, sports and entertainment achievements, and political office are only part of the story. Much of America still resides in very separate black and white worlds. Poverty and unemployment still affect minorities in much greater numbers than white America. The question of how those disparities can be addressed is still a troubling one for America. Affirmative action policies, which have been used to address the inequities of the past, have come under increasing fire as unfair “quotas” that remedy past discrimination by discriminating against deserving whites, whether in college admissions or business practices. At the other end of the spectrum, there are serious black scholars who feel that the United States should still pay “reparations” to the descendants of American slaves, just as Japanese Americans imprisoned by the government during World War II were compensated, and families of Holocaust victims have received reparation payments.

Native Americans also contend that they deserve more from the government for the treatment their people have received over the centuries.

AMERICAN VOICES

Bin Laden Preparing to Hijack U.S. Aircraft and Other Attacks.

—Presidential daily briefing provided to President Clinton in 1998.

Bin Laden Determined to Strike in U.S.

—Headline of a presidential daily briefing provided to George W. Bush on August 6, 2001.

At 8:46 on the morning of September 11, 2001, the United States became a nation transformed.

—“Executive Summary,” The 9/11 Report (July 2004).

9/11: What really happened?

In the spring of 2010, Iran’s president Mahmoud Ahmadinejad, the former mayor of Teheran, who was elected president of Iran in 2005 and whose animosity toward America and the Western world in general has been well documented, announced on Iranian state television that “September 11 was a big lie and a pretext for the war on terror and a prelude to invading Afghanistan.” Over the years, President Ahmadinejad had also called the Holocaust a “lie,” and many of his Iranian compatriots as well as other Muslims believed him.

But in the case of 9/11, a surprising number of Americans agreed with the Iranian leader. As the ninth anniversary of the terror strikes against the United States approached, many Americans continued to believe that they didn’t really know what took place on 9/11.

Almost since the day they happened, the events of 9/11 have spawned a wide range of conspiracy theories, spread through books and on the Internet. Some of the most extreme conspiracy theorists suggested that the United States government, perhaps in collusion with the owner of the “Twin Towers” of the World Trade Center, staged the 9/11 attacks. For a variety of suggested reasons, they contend Americans were behind the awful destruction of September 11, 2001, one of the most significant events in recent American history, a disaster that directly led America into two wars and into an era of fighting global terrorism that profoundly affected life and politics in America as well as the rest of the world.

A national commission, established by Congress only after intense pressure from the families of September 11 victims, attempted to answer many of the questions that Americans had about the attacks: about Al Qaeda, the organization that had planned and carried them out; about the response of the government and especially the intelligence community to the threat of terror; and, finally, about the aftermath of the most deadly attack on American soil since Pearl Harbor.

The coordinated attacks came on the morning of September 11, 2001, when a group of nineteen hijackers, all of them Arab men, seized four commercial airliners and crashed two of them into the World Trade Center in lower Manhattan (site of an earlier Islamic terrorist bombing in 1993), and a third into the Pentagon in Arlington, Virginia. The fourth plane was presumed to be heading toward another target in Washington, D.C., possibly the White House, when the passengers and crew attempted to retake the plane and it crashed into a field near Shanksville, in rural Pennsylvania, killing all on board. The total number of victims in the attacks is officially listed as 2,973; others argue the toll should include rescue workers at the World Trade Center site who later died of ailments resulting from their work there—in essence, “collateral damage” from the hijackings.

Within minutes of the hijackings and crashes, the FBI opened what would become the largest criminal investigation in American history. Within seventy-two hours, the identities of the hijackers were known. All were from Arab nations: Saudi Arabia (fifteen of the men were Saudis), United Arab Emirates, Lebanon, and Egypt. The specific planning of the attacks was attributed to a man well known to American intelligence, Khalid Sheikh Mohammed (KSM), who grew up in Kuwait, attended college in the United States—where he had earned a degree in mechanical engineering in 1986—and eventually entered the anti-Soviet jihad (“holy war”) in Afghanistan. (KSM was captured in Pakistan, transferred to the controversial American “detainee” prison at Guantánamo Bay, Cuba, and at this writing was awaiting trial. He has, according to official accounts, confessed to planning the 9/11 attacks along with a number of terrorist acts that were either foiled or carried out.)

Thomas Kean, the former Republican governor of New Jersey, and the former Democratic representative Lee Hamilton led the work of the 9/11 Commission. In reconstructing the long series of events leading up to 9/11, as well as the United States government’s response to the attacks, the commission won high praise for both its depth and the clarity with which its final report was written. The commission report laid out the trail of Islamic terror threats and activities against the United States and its interests: the February 1993 bombing of the World Trade Center that killed six; a plot to blow up the Holland Tunnel, connecting lower Manhattan to New Jersey; the October 1993 downing of an American helicopter in Mogadishu, Somalia—the event depicted in the book and film Black Hawk Down—that was accomplished with the help of Al Qaeda; a series of murderous bombings in Saudi Arabia in 1995 and 1996; the August 1998 attacks on the U.S. embassies in Kenya and Tanzania that killed 224 people; and, in October 2000, just a few weeks before the presidential election, the assault on the USS Cole, a U.S. Navy destroyer, in Yemen, killing seventeen American sailors.

At the center of these acts was a man known to the American intelligence community as “Usama Bin Ladin” (as the Commission Report called him) or, more widely, Osama bin Laden. The wealthy son of a Saudi family, he had moved from financing terror to forming and leading the group that became known as Al Qaeda, Arabic for “the base,” in the late 1980s. Its stated mission was to rid Muslim countries of Western influence and replace their governments with fundamentalist Islamic regimes. Al Qaeda, according to Jayshree Bajoria and Greg Bruno of the Council on Foreign Relations, grew out of a group that was fighting the Russians who had invaded Afghanistan in 1979. “In the 1980s, the Services Office—run by bin Laden and the Palestinian religious scholar Abdullah Azzam—recruited, trained, and financed thousands of foreign mujahadeen, or holy warriors, from more than fifty countries. Bin Laden wanted these fighters to continue the ‘holy war’ beyond Afghanistan. He formed al-Qaeda around 1988.”

Ten years later, in February 1998, bin Laden issued a fatwa, or holy decree, declaring that it was the will of God that every Muslim should kill Americans because of America’s “occupation” of Islam’s holy places in Saudi Arabia (particularly since American combat troops were stationed in the kingdom during the first Gulf War against Iraq) and for aggression against Muslims, especially in supporting Israel.

At various times during the Clinton administration, numerous attempts were made to kill or capture bin Laden and other senior al Qaeda leaders with both air strikes and cruise missiles.

When the 9/11 Commission released its sobering report in July 2004, it was only after tremendous resistance from the Bush White House and the intelligence community. The commission’s senior counsel, John Farmer, a former federal prosecutor, attorney general of New Jersey, and dean of Rutgers Law School, would later write in his 2009 book The Ground Truth, “The Bush administration . . . demonstrated that leadership is undermined by dissembling. The false and misleading account the administration peddled about its response to the 9/11 attacks may have worked for a time. . . . At the end of the day, however, the Administration’s story raised more questions than it answered, and led directly to the formation of the 9/11 Commission.”

One of the commission’s first conclusions was that the attacks “were a shock but they should not have come as a surprise.” The nineteen-month investigation laid bare the failures of the Central Intelligence Agency (CIA), the Federal Bureau of Investigation (FBI), the Pentagon, the National Security Council (NSC), and virtually every other government agency responsible for defending the nation.

Apart from the unsuccessful attempts to kill bin Laden, the 9/11 Commission laid out a series of grievous intelligence failings, which kept different arms of the American intelligence community from sharing information that could have led to the capture of the hijackers. Some of these were turf battles between agencies, others simple human error. In the end, the results were disastrous. “Across the government, there were failures of imagination, policy, capabilities, and management. . . . Terrorism was not the overriding national security concern for the U.S. Government under either the Clinton or the pre-9/11 Bush administrations,” the report said in a damning assessment.

In public, the Bush administration maintained that it was not time for “finger-pointing,” and balked at any independent investigation of the run-up to the 9/11 catastrophes and the nation’s response to the emergency.

The official foot-dragging at the White House, in particular, only seemed to heighten the belief that a conspiracy to conceal the truth was at work. The editors of the magazine Popular Mechanics would write in March 2005, in a comprehensive article examining the “conspiracy theories”:

The collapse of both World Trade Center towers—and the smaller WTC 7 a few hours later—initially surprised even some experts. But subsequent studies have shown that the WTC’s structural integrity was destroyed by intense fire as well as the severe damage inflicted by the planes. That explanation hasn’t swayed conspiracy theorists.

Responding to the compelling questions that surrounded the 9/11 tragedies, the notably apolitical Popular Mechanics had set out to investigate some of the claims made by the advocates of conspiracy theories. The editors wrote:

Healthy skepticism, it seems, has curdled into paranoia. Wild conspiracy tales are peddled daily on the Internet, talk radio and in other media. Blurry photos, quotes taken out of context and sketchy eyewitness accounts have inspired a slew of elaborate theories. . . . In the end we were able to debunk each of these assertions [made by conspiracy theorists] with hard evidence and a healthy dose of common sense. We learned that a few theories are based on something as innocent as a reporting error on that chaotic day. Only by confronting such poisonous claims with irrefutable facts can we understand what really happened on a day that is forever seared into world history.

The article in Popular Mechanics specifically addressed and rebutted many of the major questions and controversies raised about 9/11 and the destruction of the World Trade Center in particular. Offering extensive support for their statement, the magazine’s editors concluded that “The widely accepted account that hijackers commandeered and crashed the four 9/11 planes is supported by reams of evidence, from the cockpit recordings to forensics.”

Among the chief claims of the conspiracy theorists was the contention that the impact of the planes could not have caused the towers to fall and that the buildings had also been wired with explosives. Popular Mechanics instead offered a plausible scientific explanation; summarizing, it said that the plane debris had “sliced through the utility shafts at the North Tower’s core, creating a conduit for burning jet fuel—and fiery destruction throughout the building.” The burning fuel was not hot enough to melt steel but was hot enough to weaken the structural strength of the building’s steel frames. The fuel ignited other combustible materials, including furniture, rugs, curtains, and paper. As Forman Williams, a professor of engineering at the University of California, told Popular Mechanics, “The jet fuel was the ignition source. It burned for maybe 10 minutes, and the towers were still standing in 10 minutes. It was the rest of the stuff burning afterward that was responsible for the heat transfer that eventually brought them down.”

The Popular Mechanics article and multiple investigations of the catastrophic events of 9/11 by government agencies and by writers—such as Lawrence Wright in his award-winning account of Al Qaeda, The Looming Tower, and James Bamford in Body of Secrets, his analysis of the role of the National Security Agency (NSA) in failing to prevent the attacks—have all confirmed the voluminous facts that the 9/11 Commission reported. These conclusions are supported by thousands of phone intercepts made by the NSA, the precise record of communications between the planes and ground control, and the heart-wrenching records of telephone calls made by those who died on board the planes and those who were able to communicate before the towers collapsed.

However, in his book The Ground Truth, John Farmer, the 9/11 Commission’s senior counsel, also noted that as of June 2009, there was still no plan to have “interoperability” of emergency communications, one of the key recommendations of the commission. While some cities have acted to improve emergency communications, which failed disastrously at the World Trade Center (and would also contribute to the fatally inadequate response to Hurricane Katrina in the summer of 2005), Farmer wrote, “National interoperable communications will not be achieved . . . for the foreseeable future.”

Must Reads: The 9/11 Report; The Ground Truth by John Farmer; 102 Minutes by Jim Dwyer and Kevin Flynn; The Looming Tower: Al Qaeda and the Road to 9/11 by Lawrence Wright; Body of Secrets: Anatomy of the Ultra-Secret National Security Agency by James Bamford.

War in Afghanistan: Who? What? When? Where? Why?

To most Americans, Afghanistan is another of those countries that are as obscure and remote as the dark side of the moon. Slightly smaller than Texas, with 29 million people (2010 est.), Afghanistan is a landlocked country, a landscape of stark mountains and harsh desert in central Asia. Its name typically suggests either a crocheted blanket or a breed of dog.

The nation of Afghanistan piqued some Americans’ curiosity when the Soviet Union invaded it in 1979. However, that was mostly because the invasion prompted an American-led boycott of the 1980 summer Olympics in Moscow. With the assistance of the CIA, Afghan rebels kept up a protracted war against the Soviet Union and the Afghan regime it had installed. The conflict became, as many Americans noted with some relish, “the Soviets’ Vietnam.”

In 1988, a UN-mediated agreement paved the way for the Soviet Union’s withdrawal, leading to a civil war for control of the country. Eventually, the Taliban, a radical fundamentalist Islamic insurgent group, seized power in 1996. They immediately began to enforce a rigid Islamic code in the country. The Taliban were in control of Afghanistan at the time of the attacks on 9/11 and had turned parts of the country into a training ground for terrorist groups, including bin Laden’s Al Qaeda. Initially the Taliban were a mixture of mujahideen, or Islamic “holy warriors,” who fought against the Soviet invasion of the 1980s, with the backing of the United States’ CIA, and a group of Pashtun tribesmen who spent time in Pakistani religious schools, or madrassas, and received assistance from Pakistan.

According to the Council on Foreign Relations:

The Taliban practiced Wahhabism, an orthodox form of Sunni Islam similar to that practiced in Saudi Arabia. With the help of government defections, the Taliban emerged as a force in Afghan politics in 1994 in the midst of a civil war between forces in northern and southern Afghanistan. They gained an initial territorial foothold in the southern city of Kandahar, and over the next two years expanded their influence through a mixture of force, negotiation, and payoffs. In 1996, the Taliban captured Kabul, the Afghan capital, and took control of the national government.

Taliban rule was characterized by a strict form of Islamic law, requiring women to wear head-to-toe veils, banning television, and jailing men whose beards were deemed too short. One act in particular, the destruction of the giant Buddha statues in Bamiyan, seemed to symbolize the intolerance of the regime. The feared Ministry for the Promotion of Virtue and Prevention of Vice authorized the use of force to uphold bans on un-Islamic activities.

The Taliban had also allowed Osama bin Laden, who had fought by their side against the Soviets, to freely use Afghan territory to train Al Qaeda recruits. On August 20, 1998, U.S. cruise missiles struck bin Laden’s training camps near Khost, in eastern Afghanistan. In 1999, the United Nations imposed sanctions on the Taliban for their refusal to turn bin Laden over to the United States for prosecution. Throughout this period, the CIA maintained its support of an Afghan opposition group, the Northern Alliance, which it hoped would overthrow the Taliban regime.

MILESTONES IN THE WAR IN AFGHANISTAN

2001

September 9 Ahmed Shah Massoud, a leader of the anti-Taliban Northern Alliance, is assassinated in a bombing.

September 11 Following the 9/11 attacks, President George W. Bush demands that the Taliban turn over Al Qaeda leaders or face destruction.

September 18 President Bush signs a joint resolution authorizing the use of force against those responsible for the 9/11 attacks.

October 7 The United States, with British support, commences the bombing of Afghanistan, beginning Operation Enduring Freedom. Canada, France, Australia, and Germany pledge future support to the war effort.

October 26 The USA PATRIOT Act (Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism) is signed into law. The law increases the ability of law enforcement agencies to search telephone and e-mail communications and medical, financial, and other records. It also eases restrictions on foreign intelligence gathering within the United States; expands the authority of the Treasury Department to regulate financial transactions; and broadens the power of law enforcement to detain and deport immigrants suspected of activities related to terrorism. The act was supported by wide margins in both houses of Congress.

November 9 The Northern Alliance, with CIA support, breaks through Taliban positions at Mazar-e-Sharif. Days later, on November 13, coalition forces take the Afghan capital, Kabul.

November 14 The UN Security Council calls for a “central role” for the UN in establishing a transitional government in Afghanistan.

December 3–17 The battle of Tora Bora leaves an estimated 200 Al Qaeda fighters dead, but Osama bin Laden is nowhere to be found.

December 9 Taliban forces collapse, and the Taliban leaders, along with Al Qaeda leaders, escape into the mountainous area between Afghanistan and Pakistan.

December 22 Hamid Karzai is sworn in as head of an interim power-sharing government.

2002

February The new Afghan national army begins training with assistance from U.S. forces.

March Operation Anaconda, the first major ground assault by combined U.S. and Afghan forces, takes place. But U.S. military planners are increasingly focusing on a potential invasion of Iraq.

July Hamid Karzai, head of the transitional government, escapes an assassination attempt in his hometown, Kandahar.

November The U.S. Congress passes legislation calling for $2.3 billion in reconstruction funds and an additional $1 billion to expand the NATO-led International Security Force.

2003

February Afghanistan remains the world’s largest producer of the opium poppy, according to the United Nations.

March 20 The United States begins the invasion of Iraq (see “Milestones in the War in Iraq,” p. 605).

April The North Atlantic Treaty Organization (NATO) agrees to take over command of security forces in Afghanistan; it is NATO’s first operational commitment outside Europe in its history, as it was initially formed to defend Western Europe against the Soviet threat.

May 1 Secretary of Defense Donald Rumsfeld declares an end to “major combat” in Afghanistan. The announcement coincides with President George Bush’s “Mission Accomplished” speech declaring an end to major combat operations in Iraq (see “Milestones in the War in Iraq,” p. 605).

2004

January An assembly agrees on a constitution for Afghanistan.

April 22 Pat Tillman, a former star player for the Arizona Cardinals in the National Football League, who left the pros to join the army, is killed in Afghanistan. The army initially characterizes his death as the result of a firefight with hostile forces, and promotes it as an act of battlefield heroism. However, his death will later be ruled a result of a “friendly-fire” incident in which he was killed by three shots to the head.

AMERICAN VOICES

JON KRAKAUER, Where Men Win Glory: The Odyssey of Pat Tillman:

Although it wasn’t Tillman’s intention, when he left the Cardinals to join the Army he was transformed overnight into an icon of post 9/11 patriotism. Seizing the opportunity to capitalize on his celebrity, the Bush Administration endeavored to use his name and image to promote what it had christened the Global War on Terror. Tillman abhorred this role. As soon as he decided to enlist, he stopped talking to the press altogether, although his silence did nothing to squelch America’s fascination with the football star who had traded the bright lights and riches of the NFL for boot camp and a bad haircut. . . . Unencumbered by biographical insight, people felt emboldened to invent all manner of personae for Tillman after his passing.

October 9 Hamid Karzai becomes the first democratically elected leader of Afghanistan.

October 29 Osama bin Laden releases a videotape three weeks after Afghanistan’s election and just a few days before the U.S. presidential election in which George Bush wins reelection.

2005

September The first Afghan parliamentary and provincial elections in more than three decades are held.

2006

Summer In a resurgence of violence, Taliban forces step up attacks and suicide bombings. NATO troops take over military operations in the south. There is severe fighting in the region as the Taliban presence remains strong.

2007

May The Taliban’s most senior military commander, Mullah Dadullah, is killed during fighting with U.S. and Afghan forces.

November Taliban forces kill six Americans, bringing the U.S. death toll in Afghanistan for 2007 to 100, the highest for any year since the beginning of the war.

2008

June Taliban fighters free 1,200 prisoners, including 400 Taliban prisoners of war, in an assault on a Kandahar jail.

July A bomb attack on the Indian embassy in Kabul kills more than forty people; days later insurgents kill nine American soldiers in a single attack.

2009

January When President Bush leaves office, there are approximately 60,000 international troops—roughly half of them from the United States—on the ground in Afghanistan.

February 17 President Obama orders 17,000 additional troops sent to Afghanistan.

November President Karzai wins another term following a disputed election with widespread claims of election fraud.

December 1 Replicating the strategy that had apparently succeeded in pacifying Iraq’s most violent areas, President Obama announces an “Afghan surge”; another 30,000 American troops would deploy in 2010. Saying that the deployment is not an open-ended commitment, Obama declares that a troop withdrawal would begin in July 2011.

2010

June 23 The U.S. commander in Afghanistan, General Stanley A. McChrystal, is relieved of command. He and some of his subordinates had been quoted making disdainful remarks about top Obama administration officials to a reporter from Rolling Stone magazine. President Obama names General David Petraeus, the architect of the “Iraq surge” strategy, to replace McChrystal.

July A six-year archive of classified military documents about the war in Afghanistan is released online by WikiLeaks, an organization dedicated to whistle-blowing and exposing corruption by governments and corporations. The documents, released to major newspapers around the world, show that American forces are struggling against a revived Taliban insurgency in Afghanistan and fuel growing skepticism in Congress and among Americans about the war effort.

October 7 On the ninth anniversary of the invasion of Afghanistan, American military fatalities in the war there had reached 1,321. In addition, 811 members of NATO and other coalition forces had been killed in Afghanistan. As the war entered its tenth year, public support for the war was slipping in both the United States and Western Europe. The Netherlands had withdrawn its troops and the Canadians would follow. President Obama confirmed that his troop withdrawal would still begin in July 2011.

AMERICAN VOICES

PRESIDENT GEORGE W. BUSH, from the State of the Union address, January 2002:

Our . . . goal is to prevent regimes that sponsor terror from threatening America or our friends and allies with weapons of mass destruction. Some of these regimes have been pretty quiet since September 11. But we know their true nature. North Korea is a regime arming with missiles and weapons of mass destruction, while starving its citizens. Iran aggressively pursues these weapons and exports terror, while an unelected few repress the Iranian people’s hope for freedom. Iraq continues to flaunt its hostility toward America and to support terror. . . . States like these, and their terrorist allies, constitute an axis of evil, aiming to threaten the peace of the world.

This address, given a few months after the 9/11 terrorist attacks took place and the war against Afghanistan began, may go down as Bush’s signature speech. The phrase “axis of evil,” meant to conjure up the image of the Axis in World War II—Germany, Japan, and Italy—described something more symbolic than an actual alliance among the countries Bush mentioned. Iraq and Iran were uneasy neighbors who had fought a devastating war against each other a decade earlier. North Korea was an economic shambles. But the Bush administration viewed all three as sources of funding, training, and support for terrorists.

Why was America in Iraq?

Following George W. Bush’s election in 2000, the United States and the United Kingdom continued to bomb Iraqi air defense systems as part of the “no-fly zones” established by the United Nations after the 1991 Gulf War. In that war, Bush’s father, the first President Bush, had led a multinational coalition against Saddam Hussein’s Iraq following its invasion of neighboring Kuwait (see “What was Operation Desert Storm?,” pp. 539–40). Although there were also UN-mandated sanctions against Iraq, these were widely disregarded.

Then, according to Richard Clarke, the National Security Council’s counterterrorism coordinator, in the wake of 9/11 and the invasion of Afghanistan, Secretary of Defense Donald Rumsfeld and other Bush administration officials expressed an immediate interest in attacking Iraq. In a controversial book, Against All Enemies, Clarke contended that the Bush administration was obsessed with Saddam Hussein and became distracted from the war in Afghanistan and the hunt for Osama bin Laden. Clarke’s account was later corroborated by several high-ranking members of the military and was borne out by a CBS News report. The Bush administration’s fixation on Saddam Hussein’s Iraq was also documented by the filmmaker Charles Ferguson, who recorded in No End in Sight, “One day after the September 11 attacks, senior Pentagon officials including Defense Secretary Donald Rumsfeld and Undersecretary Douglas Feith spoke to senior military officers about Iraq and the need to remove Saddam. Eighteen months later, they got their wish.”

One year after the 9/11 attacks, President Bush spoke at the United Nations and demanded that Iraq eliminate its weapons of mass destruction (WMD), refrain from supporting terrorism, and end the repression of its people. These three reasons became the ostensible justification for America’s going to war in Iraq—in the first preemptive war in American history—to achieve “regime change,” that is, to eliminate Saddam Hussein as Iraq’s leader. But unlike his father, who had mustered a large coalition of nations in 1991, President Bush decided to go to war despite the objections of most of America’s Western allies. The principal exception was Great Britain, although Spain and Italy also joined in the U.S. invasion of Iraq.

With National Security Adviser Condoleezza Rice raising fears of “a mushroom cloud” and the administration constantly connecting Iraq to Al Qaeda and the 9/11 attacks—in spite of substantial evidence to the contrary—the United States went to war, unprovoked, against Iraq in March 2003.

In Fiasco (2006), a widely praised, stinging assessment of that war three years after it began, the Washington Post’s senior correspondent Thomas E. Ricks wrote, even as the war dragged on:

President George W. Bush’s decision to invade Iraq in 2003 ultimately may come to be seen as one of the most profligate actions in the history of American foreign policy. The consequences of his choices won’t be clear for decades, but it is already abundantly apparent in mid-2006 that the U.S. government went to war in Iraq with scant solid international support and on the basis of incorrect information—about weapons of mass destruction and a supposed nexus between Saddam Hussein and al Qaeda’s terrorism—and then occupied the country negligently. Thousands of U.S. troops and an untold number of Iraqis have died. Hundreds of billions of dollars have been spent, many of them squandered. Democracy may yet come to Iraq and the region, but so too may civil war or a regional conflagration, which in turn could lead to a spiraling of oil prices and a global economic shock. . . . Spooked by its own false conclusions about the threat, the Bush administration hurried its diplomacy, short-circuited its war planning, and assembled an agonizingly incompetent occupation.

MILESTONES IN THE WAR IN IRAQ

2001

October The State Department begins planning a post–Saddam Hussein transition in Iraq, including discussions with Iraqi expatriates.

November In the aftermath of 9/11, a small group of senior U.S. military planners is instructed to begin planning for a possible invasion of Iraq.

2002

July 23 The “Downing Street memo”—prepared by a senior British intelligence official and subsequently leaked—states, “Bush wanted to remove Saddam, through military action, justified by the conjunction of terrorism and WMD [weapons of mass destruction]. But the intelligence and the facts were being fixed around the policy.” (The Sunday Times of London publishes the leaked memo in May 2005.)

September 8 In a speech, National Security Adviser Condoleezza Rice describes the threat posed by Iraq: “The problem here is that there will always be some uncertainty about how quickly [Iraq] can acquire nuclear weapons. But we don’t want the smoking gun to be a mushroom cloud.”

November 27 After the UN Security Council declares Iraq in breach of a disarmament resolution, Iraq allows weapons inspectors to return to the country in search of banned weapons.

2003

January 28 In his State of the Union Address, President Bush declares, “The British government has learned that Saddam Hussein recently sought significant quantities of uranium from Africa.”

February The army chief of staff, General Eric Shinseki, testifies to a Senate committee that “something in the order of several hundred thousand soldiers” would be required to stabilize Iraq. Shinseki is publicly rebuked by both Secretary of Defense Rumsfeld and Deputy Secretary of Defense Paul Wolfowitz.

February 5 Secretary of State Colin Powell addresses the UN Security Council to present the American case for war. The Security Council rejects an explicit authorization for the use of force.

March 18 United Nations weapons inspectors leave Iraq.

March 19 Operation Iraqi Freedom commences with air strikes. Most Iraqi troops melt into the civilian population; organized Iraqi resistance, including the feared Republican Guard, is quickly overwhelmed, and there is no evidence of any chemical or biological weapons being used by Iraqi forces.

April 9 U.S. Marines topple a statue of Saddam Hussein in a major square in Baghdad.

April Order in Iraq begins to break down amid widespread looting. The national archives and museum and all government ministries, except the ministry of oil, are looted. Asked at a press conference about the looting, Secretary of Defense Donald Rumsfeld says, “Stuff happens” and “Freedom is untidy.”

April 1 Special Forces rescue Private First Class Jessica Lynch, who had been captured by Iraqi forces on March 23. The Pentagon, desperate for positive news, inflates her story into heroic proportions. Lynch is given an honorable medical discharge later that year. She, along with members of the media, will eventually accuse the Pentagon of embellishing her story as part of a propaganda effort.

May 1 Aboard the USS Abraham Lincoln, under a banner reading “Mission Accomplished,” President Bush delivers an address in which he declares the “end of major combat operations” in Iraq.

May 6 L. Paul Bremer is appointed to supervise the occupation of Iraq, replacing General Jay Garner, as chaos sweeps Iraq. On May 23, Bremer disbands the entire Iraqi military and intelligence services. He also announces that some 50,000 members of Saddam Hussein’s Baath Party are prohibited from any government work.

July 6 Former ambassador Joseph Wilson publishes an article on the New York Times op-ed page in which he states that Saddam Hussein could not obtain uranium from sources in Africa, contradicting the claim made by President Bush in the State of the Union Address a few months earlier, in January. At this time, Wilson’s wife, Valerie Plame, is a CIA operative; her classified identity will soon (July 14) be revealed to the press. The leak of her name and identity will eventually be traced to Lewis “Scooter” Libby, chief of staff to Vice President Dick Cheney. Libby will be convicted (in March 2007) of four counts of obstruction of justice, perjury, and making false statements to investigators and sentenced to thirty months in prison; but President Bush will commute his sentence in July 2007.

August 19 A bombing attack on UN headquarters in Baghdad kills more than twenty people, including the UN special representative, Sergio Vieira de Mello. The UN closes its Iraq mission.

December 13 Saddam Hussein is captured near his hometown, Tikrit.

2004

January 4 The Bush administration concedes that its prewar arguments about extensive stockpiles of WMD appear to have been mistaken.

March 8 An interim constitution for Iraq is signed.

March 31 Four American employees of a private security contractor, Blackwater, are killed in the city of Fallujah and their bodies are hung from a bridge. The incident leads to a bloody siege of the city, which has become the center of an insurgency by Sunni Iraqis, many of them former loyalists of Saddam Hussein.

April 30 Photographs of Iraqi prisoners being abused by U.S. soldiers at Abu Ghraib prison are first published. Seven soldiers are later convicted of torturing and humiliating Iraqi prisoners, but no senior officers are charged or punished.

November President Bush is reelected for a second term, defeating Senator John Kerry of Massachusetts.

2005

January 30 Iraqi national elections are held but are largely boycotted by Sunni political parties.

March A presidential commission concludes that “not one bit” of prewar intelligence about chemical, biological, and other weapons in Iraq was accurate.

August 6 In neighboring Iran, the hard-line Islamist mayor of Teheran, Mahmoud Ahmadinejad, is sworn in as president.

2006

February 22 A mosque in Samarra, one of the holiest sites for Shiite Muslims, is bombed. The attack leads to a dramatic escalation of sectarian violence, with Shiites attacking Sunnis in large-scale killings.

May 20 Nouri al-Maliki takes over as prime minister of Iraq.

November 7 In U.S. midterm elections, the Republicans lose control of the House of Representatives, largely as a result of growing opposition to the war. Secretary of Defense Donald Rumsfeld steps down and is replaced by Robert Gates, a moderate who had been privately critical of the war.

December 6 The Iraq Study Group, formed earlier to analyze the situation in Iraq, concludes that it “is grave and deteriorating” and “U.S. forces seem to be caught in a mission that has no foreseeable end.”

December 30 Following his conviction for “crimes against humanity,” and with a separate trial on genocide charges still under way, Saddam Hussein is hanged.

2007

January 10 President Bush announces a “surge” of additional troops to Iraq, even as the presidential campaign to replace him gets under way. General David Petraeus is assigned to implement the new strategy.

2008

November 4 Barack Obama, a critic of the Iraq war who had opposed it from the outset and campaigned to end it, is elected president. Obama asks Secretary of Defense Gates to remain in the post.

November 27 Iraq’s government approves an agreement calling for the withdrawal of U.S. forces by December 31, 2011. By September 2009, the U.S. troop presence in Iraq will decline to 130,000. (There will still be tens of thousands of private American contractors doing Defense Department work in Iraq at that time.)

2009

June 30 American troops begin to withdraw from Baghdad and other Iraqi cities as part of the drawdown.

2010

March Iraqi parliamentary elections are held, and despite bombings, there is 62 percent turnout.

August 31 President Obama declares an end to the seven-year combat mission in Iraq. However, some 50,000 American forces remain in the country to provide security and training for Iraqi forces. More than 4,400 American soldiers and more than 70,000 Iraqis lost their lives in the conflict, according to figures cited by the New York Times.

October 1 After months of political haggling since the March elections, Prime Minister Nouri Kamal al-Maliki wins the backing of the party of the anti-American cleric Moktada al-Sadr, the exiled religious leader whose forces had once battled American and coalition forces in several cities, clearing the way for another term. The deadline for withdrawing the rest of the American forces remains December 2011.

(Sources: No End in Sight: Iraq’s Descent into Chaos, Council on Foreign Relations; The World Almanac and Book of Facts 2010.)

AMERICAN VOICES

CHARLES H. FERGUSON, documentary filmmaker, No End in Sight: Iraq’s Descent into Chaos (2008):

Yet never in my wildest nightmares did I imagine that the occupation of Iraq would be conducted with such arrogance, stupidity, and incompetence as it was. Despite all my training and experience, which included lessons in skepticism, I would have laughed if someone had told me before the war, look, it’s going to be like this: They won’t start any planning for the occupation at all until two months before the war, and then they’ll start completely from scratch. They’ll exclude the State Department and CIA people who know the most about the country. They won’t have any telephones or e-mail for months after they arrive in Iraq. Our troops will stand by as nearly every major building in the country is looted, destroyed, and burned. They will spend the first month preparing to install an Iraqi government, restart the administration of the country, and recall the Iraqi army for use in security and reconstruction. And then, with no consultation or warning, in a one-week period, a newly appointed head of the U.S. occupation will reverse all those decisions, crippling the administration of the country and throwing half a million men into the street, destitute. As an insurgency builds, they will deny its existence and refuse to negotiate, even when the leaders of the insurgency signal a desire for compromise. They will airlift $12 billion in hundred-dollar bills into the country, with no accounting controls, and three-quarters of it will remain permanently unaccounted for.

Must Reads: The Assassin’s Gate by George Packer; Cobra II by Michael R. Gordon and General Bernard E. Trainor; Fiasco: The American Military Adventure in Iraq by Thomas E. Ricks; The Dark Side by Jane Mayer; Chain of Command: The Road from 9/11 to Abu Ghraib by Seymour M. Hersh; The Forever War by Dexter Filkins.

AMERICAN VOICES

MILDRED LOVING, June 2007:

Surrounded as I am now by wonderful children and grandchildren, not a day goes by that I don’t think of Richard and our love, our right to marry, and how much it meant to me to have that freedom to marry the person precious to me, even if others thought he was the “wrong kind of person” for me to marry. I believe all Americans, no matter their race, no matter their sex, no matter their sexual orientation, should have that same freedom to marry. Government has no business imposing some people’s religious beliefs over others. I am still not a political person, but I am proud that Richard’s and my name is on a court case that can help reinforce the love, the commitment, the fairness, and the family that so many people, black or white, young or old, gay or straight seek in life. I support the freedom to marry for all. That’s what Loving, and loving, are all about.

Mildred Loving issued this statement on the fortieth anniversary of the Supreme Court ruling in the case of Loving v. Virginia, the decision that made it unconstitutional for states to prohibit interracial marriages.

Should same-sex marriage be legal for all Americans? And what does same-sex marriage have to do with interracial marriage?

It surprises most people to learn that when Barack Obama was born—yes, in Hawaii, in 1961—his white mother and black father could not have been married in the state of Virginia, home state of eight American presidents. Interracial marriage did not gain full constitutional protection until June 12, 1967, when the Supreme Court issued its ruling in the case of Loving v. Virginia, striking down all state laws prohibiting “miscegenation,” or interracial marriage, in America.

Mildred and Richard Loving brought the case. He was white and she was black, and they had married legally in Washington, D.C., in June 1958. But when the couple traveled to Virginia, Richard Loving was arrested for the crime of being married to a black woman. The Lovings pleaded guilty to the charge and were sentenced to one year in jail. The trial judge suspended the sentence on the condition that the couple leave the state and not return together for twenty-five years. At the time, the trial judge wrote in his opinion:

Almighty God created the races white, black, yellow, malay and red, and he placed them on separate continents. And, that but for the interference with his arrangement, there would be no cause for such marriage. The fact he separated the races shows that he did not intend for the races to mix.

The Lovings returned to Washington, D.C., but later filed a motion to vacate their convictions. They lost the early rounds, but their case eventually wound its way to the Supreme Court.

On June 12, 1967, the Supreme Court ruled that anti-miscegenation laws, such as those in Virginia, violated the Fourteenth Amendment’s “due process” clause (“nor shall any state deprive any person of life, liberty, or property, without due process of law”) and its “equal protection” clause (“nor deny to any person within its jurisdiction the equal protection of the laws”).

In declaring the unanimous opinion, Chief Justice Earl Warren wrote: “Marriage is one of the ‘basic civil rights of man,’ fundamental to our very existence and survival.”

With the passage of time, fewer and fewer Americans had ever heard of the Loving case, and interracial marriage no longer seemed controversial. But interest in the case was reawakened in light of the controversy that accompanied a spate of court decisions and legislation legalizing civil unions and marriages between same-sex partners. The question of same-sex unions and same-sex partners obtaining basic legal protections surged to the forefront of American society in the 1990s as some states were moving to permit some form of same-sex union. While not a major national political issue in 1996, the question was significant enough to spur Congress to pass the Defense of Marriage Act, signed into law by President Clinton, in September of that year. The law essentially defined marriage as a legal union between one man and one woman and said that no state, or the federal government, needed to treat a relationship between persons of the same sex as a marriage, even if that relationship was recognized as a marriage in another state.

The controversy over “gay marriage” accelerated in 2000, when Vermont became the first state to permit civil unions, on April 26. In 2004, the issue came to greater prominence during the presidential race between George Bush and his Democratic opponent, Senator John Kerry of Massachusetts. It had been elevated to the national stage in 2003, when the Massachusetts Supreme Judicial Court extended the right to marry to same-sex couples, a decision that took effect in May 2004. Although Bush and Kerry agreed in defining marriage as a union between a man and a woman, conservatives used the issue to assault “liberal” judges who were “legislating from the bench.”

By 2008, the controversy had grown as a court in California, the nation’s most populous state, also issued a decision that effectively legalized same-sex marriage there. A rush of same-sex marriages occurred in California at the same time that a constitutional amendment known as Proposition 8 was put on the ballot. The initiative was designed to restore the definition of marriage as opposite-sex only. On Election Day 2008, Proposition 8 passed, but the courts later upheld the roughly 18,000 same-sex marriages that had already taken place in the state, creating two classes of “marriage.”

Since 2008, same-sex marriage has also become legal in Connecticut by legislation (2008), in Iowa by court decision (2009), in Vermont by legislation (2009), and in New Hampshire by legislation (2010). Maine briefly legalized same-sex marriage, only to have it defeated in a public referendum in November 2009. In addition, New Jersey created a same-sex legal union, and at least ten other states offer some form of legal unions providing some of the rights of legal marriage to same-sex and otherwise unmarried couples. In March 2010, Washington, D.C., legalized same-sex marriages by an act of the city council.

Under the Defense of Marriage Act, the federal government does not recognize these marriages or unions. This is significant in regard to federally guaranteed benefits extended to spouses, such as disability or Social Security. However, some states that do not offer same-sex marriage do recognize these out-of-state marriages, leading to a crazy quilt of differing protections. All of this means that the question of “equal protection under the law,” the constitutional guarantee affirmed in Loving v. Virginia, may ultimately place the issue in the hands of the United States Supreme Court.

On August 4, 2010, a federal judge ruled that California’s Proposition 8, which overturned the state’s same-sex marriage law, was unconstitutional. In his decision, Judge Vaughn R. Walker said that the ban discriminated against gay men and women. “Excluding same-sex couples from marriage is simply not rationally related to a legitimate state interest,” Judge Walker wrote.

The argument for overturning Proposition 8 was based in part on the precedent of the Loving case. More surprisingly, a team that included the attorneys David Boies and Ted Olson argued the suit for the plaintiffs. (The state of California did not participate in the case.) These two men were best known as the opposing lawyers in Bush v. Gore, the arguments over the election of 2000 that culminated in the historic Supreme Court decision that made George Bush president. Boies had represented Vice President Al Gore, and Olson had represented George W. Bush and then, following Bush’s victory, served as solicitor general under him.

A staunch conservative Republican whose wife—the attorney and Fox News commentator Barbara Olson—had died aboard the plane that crashed into the Pentagon on 9/11, Ted Olson once said, “Creating a second class of citizens is discrimination, plain and simple.” He also told the New York Times that “his support of same-sex marriage stemmed from longstanding personal and legal conviction. He sees nothing inconsistent with that stance and his devotion to conservative legal causes: the same antipathy toward ‘government discrimination.’ ”

The decision in the California case seemed destined for the Supreme Court.

AMERICAN VOICES

PRESIDENT GEORGE W. BUSH, commenting on the relief efforts in the wake of Hurricane Katrina, led by the director of the Federal Emergency Management Agency (FEMA), Michael Brown, Mobile, Alabama, September 2, 2005:

Brownie, you’re doing a heckuva job. The FEMA Director’s working 24 hours a day.

The most costly storm in American history made its first landfall in southeast Florida on August 25, 2005, as a Category 1 hurricane. After weakening slightly, Katrina moved over the Gulf of Mexico and began to strengthen over the Gulf’s warm, shallow waters, eventually being upgraded to a Category 5 hurricane on August 28. That day, Mayor Ray Nagin of New Orleans ordered a mandatory evacuation of the city. Thousands of residents either chose to stay or were unable to leave because of inadequate preparations for transporting so many people. Responding to a request for 700 buses by the Louisiana National Guard, FEMA sent 100. More than 30,000 people took refuge in the city’s football stadium, the Superdome. They had roughly thirty-six hours’ worth of food.

Katrina was at Category 3 strength when it made landfall again on Monday, August 29, in Louisiana. Despite weakening, Katrina caused massive storm surges, measuring as high as twenty-five to twenty-eight feet, throughout southeast Louisiana, Mississippi, and Alabama. Combined with strong winds and heavy rainfall, the storm surges contributed to the failure of the New Orleans levee system on August 30. The levees and floodwalls lining the canals that ring New Orleans, a city that mostly lies below sea level, were topped, as predicted by two federal agencies—the National Weather Service and the National Hurricane Center—and entire neighborhoods were inundated. This scenario had actually been forecast by a FEMA exercise thirteen months before Katrina struck, according to MSNBC.

By Wednesday, August 31, New Orleans had fallen into chaos and thousands of residents, mostly poor and African American, were trapped in the Superdome, where people were dying. Lacking sufficient supplies and sanitary facilities, the evacuation center at the Superdome soon became a hellish scene of crime and horror. When asked where FEMA’s director, Mike Brown, was as the situation unraveled, his secretary told reporters that Brown was pressed for time, since they were caught in traffic and Brown had a lunchtime restaurant reservation he did not want to miss.

While many communities along the Gulf Coast were severely damaged by the storm, New Orleans was the center of the calamity. About 80 percent of the city was eventually flooded, displacing more than 1 million people from their homes. An exact death count was never established, but the fatalities from Katrina were estimated at more than 1,833, with property damage in excess of $125 billion.

The reaction of the government to the Katrina disaster was considered grossly disorganized and inefficient, at every level from Ray Nagin, the mayor of New Orleans, and Louisiana’s first female governor, Kathleen Blanco, to the entire federal response. But President Bush and the performance of FEMA and its director Michael Brown in particular came in for the most criticism. President Bush was criticized for failing to take the situation seriously and then for making what was considered only a cursory flyover of the area a few days after the hurricane struck. Then after claiming that nobody had predicted the failure of the levees, the president was contradicted by videotapes, which showed him being given specific warnings of the impending disaster at a briefing before Katrina made its devastating landfall. Michael Brown, however, paid the price. He was recalled to Washington and resigned ten days after the infamous press conference in which the president had praised him.

One year after the Katrina disaster, Brown stated on MSNBC that at the insistence of the White House, he had lied about what had happened. “The lie was that we were working as a team and that everything was working smoothly. And how we would go out, and I beat myself up daily for allowing this to have happened, to sit there and go on television and talk about how things are working well, when you know they are not behind the scenes, is just wrong. . . . That’s exactly why I think that’s the biggest mistake that I made, was not leveling with the American people, and saying, you know what, this is a catastrophic event, and it’s not working at the state, local, or the federal level.”

AMERICAN VOICES

HENRY PAULSON, then Secretary of the Treasury in the Bush administration, in On the Brink:

It was Thursday morning, September 4, 2008, and we were in the Oval Office of the White House discussing the fate of Fannie Mae and Freddie Mac, the troubled housing finance giants. For the good of the country, I had proposed that we seize control of the companies, fire their bosses, and prepare to provide up to $100 billion of capital support for each. If we did not act immediately, Fannie and Freddie would, I feared, take down the financial system, and the global economy, with them.

AMERICAN VOICES

Excerpt from an exchange between Representative HENRY WAXMAN (Democrat, California) and the former chairman of the Federal Reserve, ALAN GREENSPAN, during congressional testimony over the collapse of the financial system, October 2008:

Rep. Henry Waxman: You found a flaw in the reality . . .

Alan Greenspan: Flaw in the model that I perceived in the critical functioning structure that defines how the world works, so to speak.

Rep. Henry Waxman: In other words, you found that your view of the world, your ideology, was not right, it was not working?

Alan Greenspan: That is—precisely. No, that’s precisely the reason I was shocked, because I had been going for forty years or more with very considerable evidence that it was working exceptionally well.

How do you keep a “bubble” from bursting?

The catastrophic flooding of New Orleans and much of the Gulf Coast after Hurricane Katrina gave the word “underwater” a deadly realism and terrifying finality. Images of people clinging to rooftops as floodwaters surged around them made certain of that. But the word “underwater” acquired a whole new meaning to Americans when a financial perfect storm threatened to inundate the American economy, and with it the rest of the world’s financial systems.

In the language of economics, real estate, and banking, “underwater” describes a house that is worth less than the amount it is mortgaged for. During an incredible boom time in America, such an idea was far removed from the minds of most people, including many of the world’s economists and central bankers, such as Alan Greenspan. For more than a decade, skyrocketing housing values continued to propel the American economy, aided by increasingly elaborate mortgage products that promised to put millions of Americans in reach of a “McMansion” of their own. Housing, for obvious reasons, is a major driver of the economy and for a time in America, there was only one way for house prices to go—up.
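
To put purely hypothetical numbers on the idea: a family that still owes $250,000 on its mortgage while its house would now fetch only $200,000 on the market is “underwater” by $50,000; selling the home would not even pay off the loan, let alone recover the down payment.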

But if mortgage debt was rain, and the banks that held those mortgages and the investment houses that bought and sold mortgage-backed securities were the canals holding in the water, the levees were about to break. Soon, millions of Americans found themselves with houses that were “underwater”—and banks began to foreclose on those houses in an effort to avoid their own bankruptcies. The vastly inflated “bubble” in housing prices had been pricked. And as the bubble deflated, it threatened to bring America back to the hopeless days of the Great Depression.

The “subprime” crisis. The credit crisis. The Great Recession. It was an economic downturn unlike anything most living Americans had ever seen. And for a while, the nation teetered on the edge of another Great Depression. Would the financial system, now intricately linked across borders, actually hold? Would the financial levees hold? Or would a financial flood of debt come washing over the country, inundating homes and businesses just as the waters had drowned New Orleans in 2005?

As the nation prepared to elect a new president in 2008, the deluge certainly seemed to be coming. For months, the country had teetered at the edge of economic disaster. Slumping housing sales. Plummeting retail sales. Personal and corporate bankruptcy filings on the rise. Employers slashing payrolls at the fastest pace in decades. One of the most visible of American retailers, Circuit City—home of the electronic playthings that had become as American as two cars in the garage and a chicken in every pot—shut its doors.

The immediate crisis had begun in early 2007 with the deepening troubles of a number of Wall Street investment firms whose risky bets on mortgage-backed securities—a financial invention of the previous decades that had proved extremely profitable while housing prices were going up—came crashing down as the bubble in American housing prices burst and the economy slid into recession.

Many of those mortgages were underwritten and held by two government-sponsored enterprises, known widely as Fannie Mae (the Federal National Mortgage Association) and Freddie Mac (the Federal Home Loan Mortgage Corporation). Fannie Mae was created in the Depression era to purchase mortgages, allowing lenders to free up money to make more loans. In 1968, Fannie Mae was converted into a private, shareholder-owned corporation, and in 1970, Freddie Mac was created to compete with Fannie Mae and foster an efficient secondary mortgage market. In the 1980s, Fannie Mae began issuing mortgage-backed securities of its own. Over the years, the two corporations were also politically pressured to ease loan requirements for low- and middle-income borrowers. This meant greater risks for corporations that answered to shareholders expecting a profit yet remained exposed to political pressure. By 1999, Steven Holmes of the New York Times warned, “Fannie Mae is taking on significantly more risk, which may not pose any difficulties during flush economic times. But the government-subsidized corporation may run into trouble in an economic downturn, prompting a government rescue similar to that of the savings and loan industry in the 1980s.” A series of accounting scandals then hit Fannie Mae as well, and the troubled company was taking on water.

By 2007, Holmes’s prediction was coming true. As more subprime loans went into default, the two mortgage giants teetered. Although their obligations were not legally backed by the federal government, most investors believed that the government would not let them fail. In 2008, the two companies were placed in “conservatorship” under a new federal regulator, the Federal Housing Finance Agency. Shareholders, many of whom had believed that Fannie and Freddie were “guaranteed” by the federal government, now held nearly worthless shares in the two mortgage giants, which together held or guaranteed more than half of all U.S. mortgages in 2008.

The Bush administration, already badly damaged by the political fallout from the failing war in Iraq and the president’s response to Hurricane Katrina, suddenly had to deal with a new levee about to burst. The entire American banking system seemed unable to hold back a flood tide set in motion largely by underwater mortgages and the inability of a growing number of Americans to meet their monthly payments, especially on adjustable-rate mortgages, whose payments increased as interest rates climbed. Around the time Barack Obama was elected president, the insurance giant AIG announced that it had lost $24.5 billion over the summer of 2008, and the Bush administration announced a plan to save it. In November, the ailing financial giant Citigroup, once the standard-setter for global bankers, reached a rescue agreement with the Treasury, the Federal Reserve, and the Federal Deposit Insurance Corporation (FDIC), a deal that made the government a major shareholder in the bank. Also in November, Bush’s secretary of the treasury, Henry Paulson, announced that the $700 billion bank bailout plan known as the Troubled Asset Relief Program (TARP) would be used to stabilize and stimulate credit markets.

Then two of Detroit’s “big three” automakers—General Motors and Chrysler, the backbone of American manufacturing for a century—asked Congress for a taxpayer-financed rescue.

It was a series of extraordinary events, a cataclysm of failing financial dominoes that had not been seen in America since the time of the Great Depression—and the fear of another Great Depression was not exaggerated. In his book The Big Short, the financial writer Michael Lewis wrote, “Every major firm on Wall Street was either bankrupt or fatally intertwined with a bankrupt system.”

In Too Big to Fail, his exhaustive history of the crisis, the financial columnist Andrew Ross Sorkin described it this way:

In the span of a few months, the shape of Wall Street and the global financial system changed almost beyond recognition. Each of the former Big Five investment banks failed, was sold, or was converted into a bank holding company. Two mortgage-lending giants and the world’s largest insurer were placed under government control. And in early October [2008], with a stroke of the president’s pen, the Treasury—and by extension, American taxpayers—became part-owners in what were once the nation’s proudest financial institutions, a rescue that would have seemed unthinkable only months earlier.

Once in place in January 2009, the Obama administration essentially had to continue a jury-rigged rescue plan begun by the Treasury Department and the Federal Reserve under the outgoing Bush administration, while also battling a long, deep recession that was pushing unemployment higher. The job losses added to the crisis, as the unemployed were unable to pay their mortgages and consumer spending ground to a near halt. In February 2009, Congress passed the American Recovery and Reinvestment Act (the “stimulus”), intended to boost the nation’s struggling economy with a combination of tax cuts and spending. (No Republicans in the House and only three Republican senators supported the legislation.)

A year later, the levees seemed to have held, but at enormous cost to taxpayers. The notion that Americans were bailing out wealthy bankers and investment firms while millions of average Americans lost their homes and their jobs helped fuel rage against Washington and Wall Street. Some of this rage emerged in the Tea Party movement, which attracted Americans angry over the size of government and its intrusion into the free market. And as Andrew Ross Sorkin put it:

That of course raises a more pointed question. Once the crisis was unavoidable, did the government’s response mitigate it or make it worse? To be sure, if the government had stood aside and done nothing as a parade of financial giants filed for bankruptcy, the result would have been a market cataclysm far worse than the one that actually took place. On the other hand, it cannot be denied that federal officials—including Paulson, Bernanke and Geithner—contributed to the market turmoil through a series of inconsistent decisions. They offered a safety net to Bear Stearns and backstopped Fannie Mae and Freddie Mac but allowed Lehman to fall into Chapter 11, only to rescue AIG soon after. What was the pattern? What were the rules? There didn’t appear to be any, and when investors grew confused—wondering whether a given firm might be saved, allowed to fail, or even nationalized—they not surprisingly began to panic.

AMERICAN VOICES

CONGRESSMAN BARNEY FRANK, discussing the financial crisis on 60 Minutes, December 11, 2008:

The problem in politics is this: You don’t get any credit for disaster averted. Going to the voters and saying, “Boy, things really suck, but you know what? If it wasn’t for me, they would suck worse.” That is not a platform on which anybody has ever gotten elected in the history of the world.

How did America elect its first black president?

In mid-October 2008, a few weeks before the presidential election pitting the first-term Democratic senator Barack Obama of Illinois against Arizona’s veteran Republican senator, the Vietnam War hero John McCain, NBC News and the Wall Street Journal released a poll taken among registered voters. Of the respondents, 2 percent said race made them more likely to vote for Barack Obama, 4 percent said it made them less likely to vote for him, 2 percent said they were not sure how it swayed them, and 92 percent said race was not a major factor.

Those numbers may reflect people saying what they thought the pollsters wanted to hear. But even so, the fact that more than nine out of ten people would claim that race was not a factor was still extraordinary. In a country where racial politics, racial relations, the history of slavery, and the civil rights movement have been such an emotionally scarring part of the national debate for such a long time, the idea that a very large majority said race was not a factor in their vote was nothing short of a shock.

What surprised a good many veteran political observers just as much was that this young, little-known senator from Illinois had burst onto the scene and defeated the Democratic powerhouse Hillary Clinton in a primary race that the senator from New York and former first lady thought was her prize—a step to becoming America’s first female president.

Just as they would be in the presidential campaign, the policies of the outgoing president George W. Bush and Americans’ desire for change were key issues throughout the primary campaign. Bush was unpopular. Polls consistently showed that only 20 to 30 percent of the American public approved of his job performance.

On that count, Barack Obama held a distinct advantage. As the war in Iraq became increasingly unpopular, it was Obama who could campaign as the Democrat who had opposed the war. Still an Illinois state senator in 2002, he had publicly opposed the invasion before it began and had not been in Washington to vote on the measure authorizing the war; Hillary Clinton had voted for it. Apart from that crucial difference, there were few serious policy disagreements between the two Democrats. (Days after his election, Obama offered the post of secretary of state to Senator Clinton, his bitter primary rival, who accepted the job.)

Obama’s primary victory was fueled largely by that discontent, along with a desire for “change.”

During the general election campaign, both major-party candidates ran on platforms of change and reform in Washington. But after the onset of the 2008 economic crisis, domestic policy and the economy emerged as the dominant themes of the campaign's final months.

But the unpopular war in Iraq was a key issue during the campaign before the economic crisis hit. When John McCain said the United States could be in Iraq for as long as the next 50 to 100 years, his remark proved costly. Running against George Bush as much as against John McCain, Obama linked McCain, a stalwart supporter of the war in Iraq, to the unpopular President Bush; this was not difficult, since McCain himself had said that he voted with Bush 90 percent of the time.

McCain attempted to depict Obama as inexperienced, but the desire for “change” seemed to matter more to Americans than “experience.” There had been an awful lot of experience behind the Bush team, after all, and many Americans thought that team had made a mess of things. Of what value, then, was experience?

McCain also undercut his own argument for experience with the historic selection of the first woman to run for vice president on the Republican ticket, Governor Sarah Palin of Alaska, who quickly became a lightning rod of controversy in the election. Palin had been governor only since 2006, and before that had been a council member and mayor of the small Alaskan town of Wasilla. When several media interviews suggested that Palin lacked basic knowledge on certain key issues, serious doubts were raised about her qualifications to be vice president or president.

But the war and experience—Obama’s or Palin’s—soon took a backseat to the economic crisis engulfing the country. The campaign played out against the recession and the credit crisis. McCain’s prospects suffered as he made some costly misjudgments about the economy and was once again linked to the Republican administration that was being blamed for the crisis. It didn’t help when he told an interviewer that he didn’t know how many houses he and his wife owned. (The answer was seven.) His out-of-touch image took another hit when, on September 15, the day of the Lehman Brothers bankruptcy, McCain declared that “the fundamentals of our economy are strong.”

The first African-American to be elected president of the United States, Barack Obama won the election with 52.93 percent of the popular vote to McCain’s 45.65 percent. No third-party candidate had a serious impact on the race this time, unlike in 2000, when Ralph Nader was widely thought to be a “spoiler” who hurt Al Gore’s prospects. In the electoral vote, Obama won 365 votes to McCain’s 173, a fairly convincing if not sweeping victory.

“He had secured a victory,” wrote John Heilemann and Mark Halperin in their book Game Change:

that was as dazzling as it was historic. His 53 percent of the popular vote was the largest majority secured by a Democrat since Lyndon Johnson. He swept the blue [traditionally Democratic] states, captured the battlegrounds of Pennsylvania, Ohio and Florida, and picked up red [traditionally Republican] states across the country: Colorado, Indiana, North Carolina, Virginia. He dominated among black voters (95–4), Hispanic voters (66–32), and young voters (66–32). His share of the white vote, 43 percent, was higher than what Gore or Kerry had attained—and among whites age eighteen to twenty-nine, he trounced McCain, 68–31.

In both the primary campaign against Hillary Clinton and the general election, Obama had effectively used the Internet, especially in raising campaign contributions and appealing to younger voters, a generation much less influenced by political traditions of race, geography, or gender. And in that sense, his election was transforming.

Markos Moulitsas Zuniga, founder of an influential Internet site called Daily Kos, a liberal-progressive blog, saw Obama’s win as representing “the future of our nation—young and multicultural. And the exit polling suggests that Republicans are headed for some rough waters ahead if they don’t recognize this. . . . The white vote kept McCain peripherally competitive. But Republicans are tentatively holding onto a shrinking portion of the electorate, while Democrats enjoy massive advantages with the fastest growing demographics.”

AMERICAN VOICES

PRESIDENT-ELECT BARACK OBAMA, speaking at Chicago’s Grant Park following his victory, on election night, November 4, 2008:

If there is anyone out there who still doubts that America is a place where all things are possible; who still wonders if the dream of our founders is alive in our time; who still questions the power of our democracy, tonight is your answer.

President Obama arrived in the White House to be greeted by two wars and a financial crisis unequaled since the Great Depression. The severe economic downturn had officially begun in December 2007, a fact not declared until much later, and ended in June 2009. This crisis, which some have called the Great Recession, played out while the Obama administration also pledged to tackle major reforms of the financial industry, salvage a teetering American automobile industry confronting the once unthinkable prospect of the demise of General Motors, close the notorious and controversial prison facility maintained by the U.S. military at the naval base at Guantánamo Bay, Cuba—and overhaul the nation’s health-care policies.

In a little over a year, the Obama administration—usually in the face of united Republican opposition—had completed what the financial journalist David Leonhardt of the New York Times called “16 months of activity that rival any other since the New Deal in scope or ambition. Like the Reagan Revolution or Lyndon Johnson’s Great Society, the new progressive period has the makings of a generational shift in how Washington operates.”

In fairly rapid succession, the Obama administration crafted an economic stimulus bill that also set out to reform and remake the nation’s educational system. In May 2009, Obama nominated Sonia Sotomayor to replace the retiring Supreme Court justice David Souter. Born in the Bronx, New York, of Puerto Rican descent, she had been appointed to the federal district court by George H. W. Bush and elevated to the court of appeals by Bill Clinton. After being nominated by Obama, Sotomayor was confirmed by the Senate in August 2009, becoming the first Hispanic justice (and the third woman) in the history of the Supreme Court. In August 2010, Obama made a second appointment to the Supreme Court when his solicitor general, Elena Kagan, the former dean of Harvard Law School, was confirmed by the Senate. When Kagan took her seat in October 2010, it marked the first time that three women had served on the Supreme Court concurrently. Because both women replaced so-called liberal members of the Supreme Court, they were not expected to change the balance of what had become one of the most “conservative” Courts in recent history.

After months of contentious and brutal debate, Obama also signed a health-care reform bill—several months after the death of its greatest champion in the Senate, Edward Kennedy of Massachusetts—that created the largest expansion of the nation’s social safety net in half a century. The bill was intended to expand insurance coverage largely for middle-class and poor families and paid for some of it by taxing households making more than $250,000 a year. Obama then set out to revise the nation’s financial rules, specifically to address the excesses that had led to both the dot-com crash and the meltdown that in turn produced the Great Recession.

The crucial midterm elections of 2010 arrived with the nation still mired in a sluggish economy, with few new jobs being created, the wars in Iraq and Afghanistan continuing, and a disgruntled electorate unhappy with both Congress and President Obama. As popular discontent grew over taxes, the role of government, and uncertainty about the health-care reform that had been passed, there was a significant power shift in Washington. Winning sixty-three seats in the House, in what the president himself called a “shellacking,” the Republicans regained control of the lower chamber of Congress. The Republicans also gained six Senate seats, and while the Democrats retained their majority there, the Obama administration would move into its second half with the president’s ability to shape the country’s agenda sharply curtailed.

AMERICAN VOICES

JUDGE LEWIS A. KAPLAN, of the United States District Court in Manhattan, October 6, 2010:

The court has not reached this conclusion lightly. It is acutely aware of the perilous nature of the world in which we live. But the Constitution is the rock upon which our nation rests. We must follow it not when it is convenient, but when fear and danger beckon in a different direction.

Judge Kaplan wrote this as part of a decision that barred prosecutors from using a crucial witness in the trial of a former Guantánamo detainee. The witness excluded by the judge had been identified and located after the accused man had been interrogated in a secret overseas jail run by the CIA. The accused terrorist’s lawyers said he had been tortured there.

To the American Nazi party, Hustler magazine, and other odious figures in Supreme Court history, add the Rev. Fred Phelps Sr. and the members of the Westboro Baptist Church in Topeka, Kan. Their antigay protests at the funeral of a soldier slain in Iraq were deeply repugnant but protected by the First Amendment. . . . One friend of the court brief called the protestors’ message “uncommonly contemptible.” True, but it is in the interest of the nation that strong language about large issues be protected, even when it is hard to do so.

—New York Times editorial, October 7, 2010

In a case that was being heard by the United States Supreme Court, the New York Times and other press organizations filed a brief in support of a Kansas church group that held protests at the funerals of fallen American soldiers to express their opposition to homosexuality. The Kansas church members believed that “God hates homosexuality and hates and punishes America for its tolerance of homosexuality, particularly in the United States military,” according to the Times editorial. The Court had to decide if the First Amendment protected these protests.

Both of these cases—an accused terrorist being protected by the U.S. Constitution’s guarantees of a fair trial and the church group that intruded on the grief of soldiers’ families and claimed the protection of free speech—are part of the messy ripples of history.

The great historian Edward Gibbon once called history “little more than the register of the crimes, follies, and misfortunes of mankind.” Voltaire called it a trick played by the living upon the dead. According to Thomas Carlyle, history is “a distillation of rumor.” And Henry Ford said history is “more or less bunk.”

This American history is a little bit of these and then some. But one thing history is not is boring. History is alive and human—and changing all the time. We need to rewrite it. And we need to learn from it. America has survived a lot. Revolution. A civil war. Two world wars. Depressions and recessions. Presidents and politicians, bad and good. A cold war that took the world to the brink of mass destruction on more than one occasion. And then a terrorist attack that shook the very foundations of the country’s security and sense of trust.

As the country has taken its first uncertain steps into the new century, remembering America’s history becomes all the more important. What’s past, after all, is prologue.