An Alternative Post-9/11 History
Terrific dangers and troubles that we once called “foreign” now constantly live among us. If American lives must end, and American treasure be spilled, in countries that we barely know, then that is the price that change has demanded of conviction and of our enduring covenant.
—President Lyndon B. Johnson, January 20, 1965
Philadelphia is passionate about its sports teams. The city’s fan bases are regularly ranked among the most loyal in all of sports, and Philadelphians are so fanatical that the National Football League’s Eagles briefly put a courtroom in the team’s stadium to deal with unruly fans. When the team won its first Super Bowl in 2018, fans celebrated by tearing down Philadelphia’s light posts, flipping cars, and setting fires in the streets.1 This is a city whose fans once famously heckled and threw snowballs at Santa Claus. So on September 20, 2001, the 19,117 fans at a Philadelphia Flyers game did something quite in character for the City of Brotherly Love: they booed. Why they did so is the surprising part.
That night, nine days after September 11, President George W. Bush spoke to a joint session of Congress, and during the second intermission of an exhibition game between the Flyers and the rival New York Rangers, Bush’s remarks were played in the arena. As the players returned to the ice, the big-screen operator at the team’s home stadium, First Union Center, turned off the speech in preparation for the third period. Under a torrent of boos, the speech was quickly turned back on, and for the next thirty-three minutes, the fans and players watched in silence. Afterward, both teams shook hands. The remainder of the game was canceled and declared a 2–2 tie.2
It might have been the first time in the city’s history that fans chose to hear the words of a politician over rooting for their team.
This unusual display of “country before team” is an important reminder that presidents maintain a powerful bully pulpit, particularly when it comes to explaining global events to the American people. In total, eighty-two million citizens watched President Bush that evening—the largest-ever audience for a presidential address. The speech was delivered at the recommendation of the Speaker of the House, Dennis Hastert, who suggested to Bush that he needed to address the nation after the catastrophe, just as President Franklin Roosevelt had done after Pearl Harbor. Like Roosevelt’s famous declaration that December 7, 1941, would be a date that lived in “infamy,” Bush’s speech had a seismic impact on the nation. His remarks that night turned out to be among the most consequential—and, in time, most calamitous—of the twenty-first century and served as a powerful framing event for all Americans in understanding the post-9/11 world.3
On September 20, 2001, Americans were still grieving for those who had been killed. They were scared, unsure of what had happened and what might come next. As the fire at Ground Zero continued to burn and the search for bodies had only just begun, they looked to their president for answers.4 Bush followed firmly in the footsteps of Harry Truman, Lyndon Johnson, and countless other politicians—he scared the hell out of them. “Our war on terror begins with al-Qaeda, but it does not end there. It will not end until every terrorist group of global reach has been found, stopped, and defeated,” Bush defiantly declared. He announced the start of a global war on terrorism and prophetically promised a “lengthy campaign.” Bush claimed that “freedom and fear are at war,” and like Truman, he cast the struggle against jihadist terrorists who lived in caves in Afghanistan as an existential fight between good and evil. “The advance of human freedom, the great achievement of our time and the great hope of every time, now depends on us,” he declared. “Our nation, this generation, will lift the dark threat of violence from our people and our future. We will rally the world to this cause by our efforts, by our courage. We will not tire, we will not falter, and we will not fail.”5 After the speech, Bush told his chief speechwriter, “I have never felt more comfortable in my life.”6
Bush’s speech had its desired effect. U.S. media outlets were sympathetic to Bush’s call for a strong military response, and the public was primed to lend support. According to one study, 83 percent of those who viewed the president’s speech to Congress felt it made them “more confident in this country’s ability to deal with this crisis.” They were also “significantly more likely than those who did not watch it to support Bush’s handling of the crisis, the use of force against Afghanistan, and the use of force against other states that may have supported al-Qaeda.”7
Four months later, in January 2002, Bush delivered his first State of the Union Address as a wartime president. He told the nation that North Korea, Iraq, and Iran formed an “axis of evil” that threatened the United States, even though none of those countries had any connection, even an indirect one, to the 9/11 attacks. He also raised the specter of terrorists getting their hands on weapons of mass destruction and declared, “the United States of America will not permit the world’s most dangerous regimes to threaten us with the world’s most destructive weapons.”8
In May 2002, Bush again took to the bully pulpit in a commencement speech at the U.S. Military Academy at West Point. There he told the graduating cadets, “the gravest danger to freedom lies at the perilous crossroads of radicalism and technology.” He offered a new doctrine of preemptive war that called for the United States to “take the battle to the enemy, disrupt his plans, and confront the worst threats before they emerge.”9
That fall, Bush married all of these themes together as he marshaled public support for invading Iraq to protect the United States, both from the menace of that country’s alleged stockpiles of weapons of mass destruction and from Saddam Hussein’s purported connection to al-Qaeda-affiliated terrorists. In March 2003, the United States invaded Iraq and initiated a war that lasted nearly eight years, took the lives of forty-six hundred Americans, and reached at least $1 trillion in direct costs.
This history ought to be well-known to the American people. The global war on terror, the wars in Afghanistan and Iraq—and then Libya and Syria—the drone strikes that became ubiquitous under the next two presidents, and a newfound domestic vigilance to “see something” and “say something” defined the first decade and a half of twenty-first-century America.
But imagine for a moment that a different speech had been broadcast that evening to the Flyers fans in Philadelphia. Bush could still have urged Americans to grieve those who were lost on 9/11 and pledged vengeance against those who directly attacked the United States. But what if he had also reminded his audience that the threat posed by terrorist groups like al-Qaeda was serious but manageable?
Imagine if Bush had proclaimed, “The one thing the terrorists want more than anything else is for this great nation to overreact to this terrible act of violence. They want us to play into their hands, as Osama bin Laden himself has stated. America is stronger than that, and the world is overwhelmingly on our side. We will patch up the nation’s wounds, punish those who are responsible, and strengthen our defenses to ensure that this never happens again. But above all, we will act smartly and with restraint, secure in the knowledge that the international community—and peace-loving people everywhere—are overwhelmingly on the side of the United States.” If Bush had delivered these words, America—and the world—would likely be a very different and much safer, healthier, and more prosperous place today.
The strategic mistakes made in the prosecution of the war on terror and invasion of Iraq have been examined and reexamined, and there is no need to fully rehash them here. Less scrutinized, however, are the indirect costs of the conflicts and, in particular, the tremendous opportunity costs. Money and attention spent on fighting phantom threats in the Middle East could have been better utilized not only to keep citizens safe from the domestic threats and systemic risks that actually harm them but also to further promote U.S. national security interests abroad.
In this chapter, we envision a plausible alternative post-9/11 history and detail the small but crucial steps that the Bush administration did right (implementing policies that better protected the U.S. homeland at little cost). We also look at the great many things that it got wrong (initiating the occupation of Afghanistan after the toppling of the Taliban and invading and occupying Iraq). The response to 9/11 offers perhaps the most enduring lesson about the catastrophic consequences—and lost opportunities—that come from foreign-threat inflation.
The Smart Response to 9/11
One of the great ironies of 9/11 is that the initial response to the worst terrorist attack on American soil was smart, modest, and appropriate. Though the attacks seemed to be a transformative event in the nation’s history, they were neither a “strategic surprise” nor, despite the 9/11 Commission’s later conclusion, a “failure of imagination.” Before 9/11, political leaders were well aware of the growing problem of transnational terrorism, including the weaponization of civilian airliners.10 Scholars and experts had repeatedly warned about the threat from al-Qaeda and had called for commonsense and inexpensive counterterrorism and homeland security policies, such as better airport security, improved information sharing between intelligence and law enforcement agencies, and stronger border security.11
A tragic combination of political inertia, disinterest, and bureaucratic turf protection had made the United States needlessly vulnerable. But policy makers learned from their mistakes. Much of the negligence and policy shortcomings that predated 9/11 were addressed through new laws and regulations, increased government spending, and an expansion of America’s homeland security and intelligence infrastructure.12 These improvements ranged from the obvious and immediate to the complex and ongoing.
For example, prior to 9/11, the Federal Aviation Administration (FAA) mandated that aircrews keep cockpit doors closed and locked while in flight, but this regulation had never been rigorously observed. The 9/11 Commission could never determine exactly how hijackers accessed the cockpits of the four planes seized that day, whether they took the keys from flight attendants, forced them to open the door, or somehow lured a pilot outside.13 Had the cockpit doors remained closed, the airliners would never have been turned into flying missiles. The Transportation Security Administration (TSA), which was created in the wake of 9/11, updated the requirements for cockpit security, replacing flimsy doors and simple latches with hardened, bulletproof doors that had electronic locking devices. Congress provided $97 million to defray the costs for airlines—a negligible price relative to the outsized impact on airline security.14
The 9/11 Commission also proposed legislation to “set standards for the issuance of birth certificates and sources of identification, such as driver’s licenses.”15 This led to the Real ID Act of 2005, which required states to include certain information on licenses (such as full legal name, date of birth, and signature), demand more identification material, and ensure that licenses are more securely produced. Since the implementation of the act, about 90 percent of driver’s licenses comply with these new standards and are now far more difficult to forge.16 This means that the airlines and the TSA have a far better chance of accurately detecting individuals attempting to evade no-fly lists or at least ensuring that they receive secondary screening.
The most consequential homeland security improvement originated in the 9/11 Commission’s recommendation that intelligence agencies better share intelligence information and “break down stovepipes” that had previously prevented such cooperation. For example, when the CIA learned in January 2000 that an individual with al-Qaeda connections, Khalid al-Mihdhar, had been issued a U.S. visa, it kept that information from the FBI and the State Department. Two months later, when it received evidence that another al-Qaeda member, Nawaf al-Hazmi, also had a U.S. visa and had purchased a plane ticket to Los Angeles, it again failed to inform the FBI.17 Both men were among the nineteen 9/11 hijackers.
Improved information sharing was addressed with the 2004 Intelligence Reform and Terrorism Prevention Act, which established the Director of National Intelligence (DNI) to ensure better cooperation among intelligence agencies. The act also established the National Counterterrorism Center (NCTC), which is the government’s analytical hub for all intelligence related to foreign terrorism. Today, the FBI and CIA better share intelligence about potential terror plots, and there is a central clearinghouse for all terror-related information.18
This focus, not surprisingly, led to a doubling of federal counterterrorism spending after 9/11.19 While this spending remains disproportionate to the terrorist threat actually facing the United States, it is a relatively small fraction of taxpayer dollars, particularly in comparison to the trillions spent invading, occupying, and rebuilding Iraq and Afghanistan.20
These post-9/11 measures have made Americans vastly safer from terrorism today. If the homeland security apparatus currently in place had existed on September 11, 2001, it is highly unlikely that Americans would have any reason to remember that date.21
But preventing terrorist attacks is about more than dollars and cents. One of the most important factors in keeping the United States safe from Islamic terrorism is the difficulty jihadist groups have had in recruiting Muslim Americans to their cause. Despite the Islamophobic sentiments expressed by some politicians and media figures, Muslims in America overwhelmingly reject terrorist violence (few, for example, have joined the Islamic State caliphate in Syria and Iraq, particularly compared to Muslim populations in western Europe).22 Many Muslim Americans have worked with law enforcement in identifying individuals vulnerable to recruitment by extremist groups or potential perpetrators of lone-wolf attacks. Indeed, the decision by President Bush to show solidarity with the American Muslim community—including visiting a mosque in Washington, DC, not long after the attacks—helped to prevent the kind of radicalization that is all too familiar to European authorities.23
The Immediate Costs of Threat Inflation: $3.8 Trillion
While the Bush administration and Congress made several smart homeland security choices after 9/11, their decisions overseas tell a different story. The so-called global war on terrorism became the dominant, and at times exclusive, frame for Bush’s post-9/11 foreign policy. In the days and weeks after the attacks, Bush was greeted each morning with the Threat Matrix, a collection of potential terrorist attacks that included even the most unlikely and least verifiable threats amassed by the intelligence community. The director of central intelligence George Tenet later wrote that it was impossible to read what crossed his desk “and be anything other than scared to death.”24 The theater of post-9/11 security measures affected policy makers as well. According to Condoleezza Rice, then the national security adviser, “large, menacing men swathed in black and armed with assault rifles and shotguns suddenly showed up everywhere” around the White House; “it had a huge impact on our psyches.”25
With the Bush administration already committed to an aggressive policy of preventive war against states that allegedly harbored or assisted terrorists, there would be little political or media pushback in inflating the terrorist threat. “The people who did this act on America, and who may be planning further acts, are evil people,” Bush said just two weeks after 9/11. “They don’t represent an ideology. They don’t represent a legitimate political group of people. They’re flat evil. That’s all they can think about, is evil.”26 Two weeks later, he declared even more directly, “Our war is against evil.”27
Counterterrorism quickly became the defining language of official Washington, a catch-all justification for presidential priorities, from a new volunteerism initiative to a push for increased domestic energy production. “We will prevail in the war, and we will defeat this recession,” Bush said in his 2002 State of the Union Address, as he conflated the war abroad with his economic policies at home. Fear that a congressional inquiry into the events of 9/11 would weaken America’s fight against the terrorists was even used as a rationale to shut it down.28
Fighting terrorism also became the basis on which America’s bilateral and multilateral relations were judged. “Either you are with us, or you are with the terrorists” was the binary choice that Bush presented to the world’s 191 other countries in his speech to Congress.29
Viewing the world through such a lens led to policies that significantly undermined America’s reputation and credibility. In January 2002, the United States began dispatching terrorist suspects from Afghanistan and Pakistan to a detention center at the Guantanamo Bay Naval Base in Cuba. They were housed there without Geneva Convention protections and under a new and dubious legal designation: “enemy combatant.”30 The first twenty prisoners to arrive were stripped naked, placed in diapers and orange jumpsuits, blindfolded, and shackled to the floor of a military transport aircraft, because “these are people that would gnaw through hydraulic lines in the back of a C-17 to bring it down,” according to Gen. Richard Myers, then chairman of the Joint Chiefs of Staff.31 In the years after these prisoners were flown to Guantanamo, terrorists repeatedly used the imagery of the orange-clad suspects, held in six-by-eight-foot open-air, wire-mesh cages, for recruiting purposes. As of May 2018, Guantanamo remains open with forty detainees held, five of whom were recommended for release by a high-level government panel, as well as twenty-six indefinite detainees who will likely never be charged, prosecuted, or let go.32 President Donald Trump issued an executive order in January 2018 that directed the military to send even more prisoners to Guantanamo, ensuring that nearly seventeen years after it opened, it is no closer to being shuttered.33
Fear of terrorism also drove the White House to adopt policies that not only were illegal but also violated basic American values. Torture, and in particular waterboarding, of terrorist suspects became frighteningly routine. The CIA operated a secret rendition, interrogation, and torture program with black sites in Afghanistan, Lithuania, Poland, Romania, and Thailand. These unethical and illegal efforts brought marginal national security gain. There is little evidence that information obtained through torture prevented any attacks against the United States.34 The use of such measures instead had a damaging cascade effect when horrifying pictures emerged of U.S. soldiers in Iraq using the same torture methods against detainees at Abu Ghraib, a notorious prison complex west of Baghdad where Saddam Hussein had imprisoned and tortured his political enemies.35
The diplomatic and reputational costs of these policies have been tremendous. A Carr Center for Human Rights Policy at Harvard University study found that the use of torture “incited extremism in the Middle East, hindered cooperation with U.S. allies, exposed American officials to legal repercussions, undermined U.S. diplomacy, and offered a convenient justification for other governments to commit human rights abuses.”36 Concerns over the use of torture jeopardized relationships that were critical for the war on terrorism, stalling military cooperation, extradition treaties, and access to foreign airports, air bases, and airspace.37 For example, in 2003, the Netherlands planned to send troops to Afghanistan to support the U.S.-led mission, until public opposition to torture delayed parliamentary authorization for the deployment for three years.38 The Finnish parliament deferred the ratification of an extradition treaty between the United States and European Union, and both Ireland and Great Britain enacted strict requirements for landing U.S. military planes on their soil.39
There were also more direct financial costs for America’s obsessive focus on terrorism. More than forty countries were recruited to play a role in the Iraq invasion and occupation, and many received compensation. In addition, the United States began subsidizing counterterrorism efforts around the globe. Pakistan alone has received more than $34 billion in economic and security assistance since 2002.40 This is the same country that remained home to Osama bin Laden for nearly a decade and provided sanctuary to Taliban insurgents fighting U.S. troops in Afghanistan. It is not hard to imagine the many ways $34 billion, or a fraction of that total, could have been spent on more modest counterterrorism efforts.
Finally, there are larger foreign policy opportunity costs. The war on terrorism undermined other long-standing U.S. foreign policy objectives, such as democracy promotion—an issue that moved to the forefront of Bush’s second-term foreign policy agenda. But Bush quickly found those efforts thwarted by his war on terror prerogatives. In Ethiopia, during the run-up to national elections that would be dominated by allegations of electoral fraud, prime minister Meles Zenawi’s government expelled several major U.S. democracy-promotion organizations. The Bush administration complained but ultimately offered little protest. After all, Ethiopia played a crucial role in the U.S. fight against Islamist rebels in Somalia. American leaders pushed for legislative elections in Gaza in 2006, but out of fear of upsetting counterterrorism allies in Israel and Egypt, they rejected the outcome when Hamas, a U.S.-designated “foreign terrorist organization,” prevailed. Similar episodes played out with Kazakhstan and Azerbaijan, two countries led by authoritarian strongmen but critical players in the U.S. war in Afghanistan.41
Few post-9/11 foreign policy decisions, however, would prove more disastrous than U.S. involvement in Afghanistan. By the fall of 2001, the initial U.S. objective there was achieved: the Taliban had been removed from power in Kabul, and al-Qaeda training camps in southern Afghanistan had been eliminated. However, after Osama bin Laden and his senior aides were allowed to escape to Pakistan and the Taliban became a structured insurgent force, U.S. attention waned, particularly in the face of herculean economic, political, and developmental hurdles involved in rebuilding Afghanistan. The United States refused to work with Afghan political leaders to broker a diplomatic settlement with the Taliban. After all, the war on terrorism had been predicated on destroying terrorist organizations, not making deals with them.42 With focus moving to the next front in the war on terror, Iraq, critical military assets—including elite special operations forces and Predator drones—were diverted to the Persian Gulf. As a result, the political and security situation inside Afghanistan began to fester, and the Taliban insurgency reemerged.43
By 2009, the Taliban had made significant inroads in reestablishing itself as a viable insurgent group. While the Taliban still posed little threat to the United States—and had few operational ties to al-Qaeda—U.S. leaders were practically powerless in resisting the call to do something to stabilize Afghanistan. President Barack Obama sent nearly fifty thousand additional U.S. troops into the country in order to break the Taliban’s momentum—an effort that did nothing to bring the war closer to conclusion, as Afghanistan remains mired in civil war today. The human toll of the war has been more than twenty-three hundred U.S. troops and at least thirty-two thousand Afghan civilians.44 As of this book’s publication, the war in Afghanistan—and the Taliban insurgency—continues, making the seventeen-year war the longest in American history. Amazingly, its price tag now exceeds that of the war in Iraq.
Invading Iraq
While the war on terrorism dominated the attention of the Bush administration in the days and weeks after 9/11, it was the ill-fated decision to invade Iraq in March 2003 that created a true foreign policy catastrophe. The U.S. invasion and occupation have led—directly and indirectly—to more than one million deaths.45 Another 7.6 million people have been displaced by the war.46 A decade and a half after Saddam Hussein was toppled, Iraq suffered the highest number of terror attacks and terror fatalities in the world, and a host of terrorist organizations, militias, and proxy forces continued to destabilize the country.
Close to 4,500 U.S. active-duty, national guard, and reserve service members died fighting in Iraq.47 Another 110 U.S. private citizens were killed in Iraq while supporting the war effort.48 In addition to the ultimate sacrifice of U.S. troops are the unknown yet essential contributions of private contractors in America’s wars. For every U.S. troop in-country, there was at least one military contractor providing support, with the total number peaking at 164,000 in 2008.49 More than 1,500 of these contractors lost their lives—approximately one-third of whom were American citizens.50
Beyond the human toll, the financial costs of the Iraq War have been stratospherically high. Before the 2003 invasion, secretary of defense Donald Rumsfeld pegged the total bill for the war at “something under $50 billion.” When the White House economic adviser Lawrence Lindsey projected the war’s cost at 1 or 2 percent of gross domestic product—$100 to $200 billion—he was summarily pushed out of his job.51 Andrew Natsios, the director of USAID, said that the costs of reconstruction would top out at around $1.7 billion, and deputy secretary of defense Paul Wolfowitz claimed that Iraq could “finance its own reconstruction, and relatively soon.”52
In fact, the high end of Lindsey’s estimate was off by a factor of four. The United States spent more than $819 billion on the invasion, occupation, and reconstruction of Iraq between 2003 and 2017.53 This number includes only direct war spending, which peaked at $12 billion a month in 2008. Reconstruction efforts alone—building roads, repairing water systems, and providing assistance to small businesses—reached more than $170 billion, one hundred times higher than Natsios’s estimate.54 By comparison, the first Gulf War cost $61 billion and was largely subsidized by Germany, Japan, and Persian Gulf states.55 According to one independent government investigation, that $61 billion total is actually nearly equal to the amount of money spent between 2003 and 2011 on reconstruction efforts in Iraq.56
The direct costs to American taxpayers for U.S. involvement in Iraq and Afghanistan have—so far—reached $3.8 trillion.57 That works out to approximately $20,000 paid by each individual taxpayer.58 This staggering figure includes all war-related costs, veterans costs, and interest payments related to war borrowing made through November 2017. The sum is more than the United States spent on any of its previous wars, except World War II ($4.5 trillion in current dollars)—and the meter continues to run.59
The Long-Term Costs, Now and Forever: $7 Trillion
As enormous as these numbers are, they do not do full justice to the outsized costs of America’s military response to 9/11. The tally for America’s war on terror did not simply stop when active combat ended or when public attention moved on to something else. It continues to mount today as interest accrues on wartime borrowing, as veterans still require treatment, and as their families shoulder the burden of their care. Neta Crawford, codirector of the Costs of War Project at Brown University, likens these additional expenses to an iceberg: “Some of it is visible, but most goes unseen. With climate change, people think that all icebergs melt, but this one doesn’t melt; in fact, it only gets bigger.”60 Beyond the immediate toll for U.S. troops and their families, as well as for Iraq, Afghanistan, and their neighbors, these long-term expenditures will impact every U.S. government program for the rest of the twenty-first century.
Unlike in previous U.S. conflicts, the Bush administration and Congress chose to finance the wars in Iraq and Afghanistan not through short-term tax increases but rather by borrowing money.61 Going to war in Iraq and Afghanistan on a credit card added about $2 trillion to the national debt. In fact, at least one-third of the federal debt accrued after 2003 is directly attributable to just these two conflicts.62 American taxpayers—and their children and grandchildren—will be servicing this debt for decades to come. Indeed, they are paying for it already. The increased debt burden from the war contributed to an increase in interest rates, which, by 2010, were estimated to be 0.35 percentage points higher as a direct result of deficit-financed war spending. This higher interest rate in turn raised fixed mortgage rates, costing the average American homeowner an extra $600 a year in mortgage payments.63
Based on current estimates, when the interest accrued on wartime borrowing and future borrowing is combined, the total costs of Iraq and Afghanistan could reach $7 trillion through 2056.64
However, it is the direct costs in government spending that may be the most enduring legacy of the war on terrorism. Hundreds of billions of dollars will continue to be needed to care for those who have returned from war. In all, more than 2.7 million active-duty, national guard, and reserve members served in Afghanistan and Iraq between 2001 and 2017.65 By way of comparison, fewer American troops and reservists were deployed during the Vietnam War (2.6 million) and the Korean War (1.8 million)—and more than half of all post-9/11 troops were deployed more than once.66 These multiple deployments have been shown to significantly increase the likelihood of troops suffering from combat-related trauma.67 In fact, the average wounded veteran from Iraq and Afghanistan has an astounding 7.3 recognized disabilities.68
All of this comes with a significant and underappreciated cost. In 2001, the United States was spending $1.76 billion on disability benefits for veterans of the Gulf War, a conflict that took 148 American lives and left fewer than one thousand service members wounded.69 Now consider for a moment the amount of spending that will be required for veterans of the wars in Iraq and Afghanistan, which left 6,800 killed and more than 52,000 wounded.70 Already the annual disability compensation given to vets of the war on terrorism is $15 billion. These payments will rise exponentially over the next several decades.71
The families of service members killed on active duty or from service-related injuries and those receiving VA disability benefits at the time of death are also eligible for pensions of at least $1,200 a month.72 It bears noting that eighty-eight dependents still receive benefits from the Spanish-American War, which ended in 1898. And astonishingly, 151 years after the surrender at Appomattox, the VA still pays benefits to an eighty-seven-year-old daughter of a Civil War veteran. As of August 2017, Irene Triplett was receiving $73.13 every month from her father’s pension.73 Because life expectancy in the United States has nearly doubled since the Civil War, there will be many Irene Tripletts receiving monthly tax-free pension checks into the twenty-second century and beyond.
These costs are already piling up. In 2001, the annual appropriation for the Military Personnel and Defense Health programs was $91 billion. By 2015, that number had risen to $160 billion.74 These costs are, in many ways, qualitatively different from those of previous conflicts because of the nature of modern wartime injuries. While the most common malady among returning vets is hearing loss, the defining physical and mental injuries of the wars in Iraq and Afghanistan have been traumatic brain injury (TBI) and posttraumatic stress disorder (PTSD).75 Between 15 and 23 percent of returning service members have a TBI, while one in five has suffered from PTSD.76 Such injuries create lifelong infirmities—and enduring costs. In 2014 and 2015, the federal government spent $116 million on TBI care for Iraq and Afghanistan veterans, and the VA estimates that TBI care will cost another $500 million for post-9/11 veterans between 2016 and 2025.77 While $3.3 billion was spent in 2012 on veterans suffering from PTSD, demand will only increase, because both traumas can go undiagnosed for years, with symptoms manifesting decades later.78 Then there are the indirect expenses. Through 2010, military veterans had lost $2.2 billion in earnings because of traumas and disabilities.79 Moreover, PTSD can be transferred to family members (a phenomenon called “secondary traumatic stress”), the long-term consequences of which are unknown.80
It is nearly impossible to quantify accurately the impact that the war on terrorism has had on state and federal budgets or the U.S. economy. Nearly 60 percent of veterans had family responsibilities, and more than one million children had a parent deployed overseas.81 Children whose parents were sent to Iraq or Afghanistan—compared to children of civilian parents—perform worse in school, experience anxiety and clinical depression at higher rates, and attempt suicide more often. They are also frequently forced to take on parental or caregiving responsibilities.82 In fact, 1.1 million Americans serve as military caregivers for veterans of the post-9/11 wars, at an estimated cost of $5.9 billion in lost productivity.
In total, Linda Bilmes of Harvard University estimates that through 2056, the disability, medical, and administrative costs of caring for post-9/11 veterans will total more than $1 trillion—and that is without taking into account lost productivity or the personal and emotional responsibilities borne by families, neighbors, and local communities. In short, the financial burden of America’s overreaction to 9/11 will be carried for decades to come by multiple generations of Americans—and not just those who served.83
What Might Have Been, Part I
What if the reaction to 9/11 had been more modest? What if the Bush administration had focused on strengthening homeland security while eschewing a strategy of preemptive war? Perhaps the greatest tragedy of the post-9/11 period is the millions of deaths that could have been prevented with the money spent on America’s wars.
Take, for example, one of the more unusual elements of the global war on terror: a public health initiative that President Bush championed even as he was making the case for war in Iraq. In his January 2003 State of the Union address, in which he uttered the notorious (and inaccurate) claim that “the British Government has learned that Saddam Hussein recently sought significant quantities of uranium from Africa,” Bush unveiled the President’s Emergency Plan for AIDS Relief (PEPFAR). The need for action could not have been more pressing. In 2002, three million people died from AIDS, five million more were newly stricken by the disease, and forty-two million were living with it in total—effectively a death sentence given the lack of available antiretroviral treatments. That year, only fifty thousand of the 29.4 million HIV-positive people living in sub-Saharan Africa had received these life-extending medicines.84
Bush asked Congress to authorize $15 billion over five years for PEPFAR—nearly $10 billion of which was new money—to combat the disease in fourteen of the most afflicted countries in Africa and the Caribbean. Bush’s request represented a more than 1,400 percent increase in U.S. funding for international HIV/AIDS programs.85 As he later recounted, “I hoped it would serve as a medical version of the Marshall Plan.”86 It more than achieved that goal.
Within just the first four years after Bush unveiled the program, PEPFAR’s activities and spending on HIV care, prevention, and treatment averted 1.2 million deaths in twelve countries: Botswana, Cote d’Ivoire, Ethiopia, Kenya, Mozambique, Namibia, Nigeria, Rwanda, South Africa, Tanzania, Uganda, and Zambia.87 A four-year Institute of Medicine review concluded that PEPFAR had been “globally transformative” and had “saved and improved the lives of millions.”88
What is most remarkable about PEPFAR is its relatively modest price tag of $15 billion. Contrast that with the trillions of dollars spent in Iraq and Afghanistan, for far more dubious benefit. Just a fraction of that money could have done a world of good elsewhere. Providing universal health care, for example, is one of the best public health strategies for immediately helping the world’s poor. Studies have shown that, at a cost of approximately $6.7 billion, expanding existing and proven health programs could have reduced the number of deaths from pneumonia and diarrhea.89 An expenditure of $9.6 billion on maternal nutrition and mineral and vitamin supplementation (less than two months of what it cost to fight the Iraq War) could have averted the deaths of nine hundred thousand children under five years old across the developing world.90 Similarly, insecticide-treated bed nets (ITNs) have been shown to reduce deaths from malaria infections (which killed more than 740,000 people each year between 2000 and 2005) by up to 44 percent.91 At the time, each ITN cost less than three dollars and lasted three years.92 For a few million dollars beyond what the United States was already contributing to the effort, a hundred thousand lives or more could have been saved.
Noteworthy progress could also have been made in improving childhood nutrition. In 2000, the World Health Organization warned that 3.4 million children died each year from being underweight.93 This is a public health challenge on which even a small financial contribution could have made a significant difference. For example, a 2002 World Bank program in Senegal recruited health workers to counsel and monitor mothers of children under the age of three. For just $23 million over four years, the program spared more than two hundred thousand participating children from growing up underweight and dying prematurely.94 Similar results occurred in Colombia, where, from 2001 to 2005, fifteen-dollar monthly grants to households increased food consumption and improved nutritional quality for young children.95 Although extraordinary gains have been made in recent years on a host of public health issues, if just a small portion of the money spent promoting democracy and fighting terrorism in the Middle East had instead been spent helping people live happier, healthier, and longer lives, it could have made a world of difference.
These global investments are not simply worthwhile acts of charity and compassion. Rather, when people’s quality of life improves—when they are less worried about feeding their families and when their children go to school—stability increases, economic performance improves, and the potential for conflict shrinks. This makes America safer, since stable and peaceful countries are less likely to serve as safe havens for transnational terrorists or to be at war with allies that U.S. troops would be obligated to protect. Marine general—and later secretary of defense—James Mattis expressed this sentiment best to Congress in 2013: “If you don’t fund the State Department fully, then I need to buy more ammunition.”96 Few people, unfortunately, were listening to Mattis, including his future commander in chief, Donald Trump, who, in his first two budgets as president, proposed slashing spending for the State Department and USAID by more than 25 percent a year.97
What Might Have Been, Part II
Beyond the global effects, the $3.8 trillion spent in Iraq and Afghanistan could have transformed the lives of the American people. As we detailed earlier, the greatest threats to Americans come neither from foreign terrorist masterminds nor from malicious hackers. Rather, Americans are killed in vastly greater numbers by disease, guns, and narcotics, and they are harmed by failing infrastructure, poor schools, and growing income inequality. If the United States had redirected even a fraction of the hundreds of billions spent “fighting terrorism” toward reducing the challenges that Americans face at home, not only would Americans be safer, but their quality of life would be dramatically better too.
Start with the number-one killer of Americans: noncommunicable diseases (NCDs). In 2004, 2.14 million Americans died from NCDs like cancer, heart disease, and respiratory disease.98 One of the most effective ways to reduce the prevalence of NCDs, and to catch other diseases and ailments early, is ensuring affordable access to health care. Yet the number of Americans who lacked health coverage rose from 38.7 million in 2000 to 45.8 million in 2004.99 At the time, public health groups estimated that expanding coverage to all of the uninsured would cost an additional $48 billion—or four months of funding the war in Iraq.100 Doing so would have prevented 17,000 premature deaths annually—or one life saved for every 830 newly insured Americans.101
We know that U.S. deaths from gun violence would have declined had Congress and state legislatures passed more restrictive gun laws. But money spent on expanding mental health coverage, ensuring that more people are diagnosed and treated for severe depression, would also have substantially reduced gun deaths. In 2007, a legislative proposal garnered wide support from mental health providers and organizations precisely because of its promise to do just that. The Community Mental Health Services Improvement Act would have funded grants for mental health providers to expand and improve coverage, including “tele-mental health” services for underserved areas. The proposal required approximately $100 million a year in funding, but Congress never even brought it to a vote. Hiring additional police officers would also have led to more illegal-gun seizures and a reduction in overall gun crime.102 The money spent on a single day of the Iraq War in 2008 could have put nine thousand more state and local police officers on the streets of high-gun-violence communities.103
While the United States spent $170 billion rebuilding Iraq’s and Afghanistan’s roads, water systems, and energy systems, its own crumbling critical infrastructure was in dire need of repairs and upgrades. The American Society of Civil Engineers (ASCE) releases a report card every four years on the status of U.S. infrastructure. In 2001, the ASCE gave U.S. infrastructure an overall grade of D+ and estimated that $1.3 trillion was needed over the following five years to bring the country’s infrastructure to an “acceptable level.”104 Needless to say, no such funds were allocated, and in its next report, in 2003, the ASCE put the figure at $1.6 trillion.105 Its 2009 report raised the number to $2.2 trillion.106 As a consequence of this underinvestment, between 2002 and 2015 the United States fell eleven spots in the international ranking of overall infrastructure quality, from fifth to sixteenth.107
For one concrete example of how infrastructure spending in the middle of the first decade of the twenty-first century would have improved Americans’ well-being, look no further than the water you and your children drink. In February 2001, the Environmental Protection Agency (EPA) estimated that maintaining, upgrading, or replacing aging water and wastewater infrastructure over the ensuing twenty years would cost approximately $151 billion.108 That included $83.2 billion to repair or replace aging water lines, the deterioration of which poses significant health risks, including elevated lead and copper levels in the bloodstreams of children. That same year, the EPA estimated that $19.4 billion was needed immediately for water treatment to protect Americans from developing chronic health effects, including cancer and birth defects, after unnecessary exposure to nitrates and other chemical contaminants.109 Needless to say, with the United States focused on wars in the Middle East, such funding was not forthcoming. Unsurprisingly, by 2009 the EPA reported that the price tag for fixing America’s water supply had more than doubled, from $151 billion in 2001 to $334.8 billion.110
A congressional report at the time, War at Any Price? The Total Economic Costs of the War beyond the Federal Budget, offered a menu of alternatives to the Iraq conflict, which was running a tab of $435 million every single day.111 The money spent surging 30,000 U.S. troops to Iraq in 2007 could instead have added 5,500 teachers to U.S. classrooms, enrolled 57,500 low-income children in the Head Start program, or supported 150,000 low-income students with Pell Grants, a lifeline for poor adults seeking to attend college.112
Finally, there are the enormous opportunity costs of the war on terrorism. Economists have repeatedly demonstrated that defense spending has, at best, a temporary and limited positive impact on the economy. Every $1 billion spent on defense creates 11,200 direct and indirect military-related jobs. The same amount creates 5,600 more jobs if spent on clean energy, 6,000 more if spent on health care, and 15,500 more if spent on education. By one estimate, if the money spent in Iraq and Afghanistan through 2014 had instead been channeled into clean-energy industries, health care, and education, two million more Americans would have been gainfully employed over that period.113 Yet these domestic investments to improve and protect Americans’ lives were never seriously considered while Americans were fighting and dying in Iraq.
At the end of the day, few people would look back on the war on terrorism as a raging success or even a good return on investment. In fact, the U.S. reaction to September 11 is perhaps the greatest “own goal”—the term for when a soccer or hockey team unintentionally scores on its own net—in American history.
Ironically, the United States should have listened more closely to the man responsible for 9/11. Osama bin Laden often said that one of his key goals in attacking America was to provoke an overreaction that would mire the United States and its allies in a war with the Muslim world. Bin Laden, who argued that the Soviet war in Afghanistan “had bled Russia for ten years, until it went bankrupt,” even claimed in a September 2007 video message that “the mistakes of Brezhnev [who had ordered Soviet troops into Afghanistan] are being repeated by Bush.”114
If al-Qaeda’s attacks on 9/11 were intended to fundamentally weaken the United States economically and politically, they succeeded beyond the terrorist organization’s wildest imagination.