ELEVEN
The Terrible Twos

MAN LEAPS FROM WINDOW, SAVED BY UNCOLLECTED TRASH
January 3, 2011, 12:07 PM ET
NEW YORK (Reuters)—A man who jumped out of a ninth-floor window in New York was alive on Monday after he landed in a giant heap of trash uncollected since the city’s huge snowstorm a week ago.

Two scenes from Capitol Hill—five years apart—pretty well sum up America’s reckless behavior in the last decade. The first took place on March 18, 2005, when a group of America’s greatest baseball players testified before Congress. It wasn’t pretty. Curt Schilling, Rafael Palmeiro, Mark McGwire, Sammy Sosa, and José Canseco appeared together at one table—sitting “biceps-to-biceps”—before the House Committee on Oversight and Government Reform for a hearing on steroids in baseball. They had come in response to repeated threats by Congress to pass legislation that would govern drug testing in baseball and other sports. ESPN.com described McGwire’s testimony:

In a room filled with humbled heroes, Mark McGwire hemmed and hawed the most. His voice choked with emotion, his eyes nearly filled with tears, time after time he refused to answer the question everyone wanted to know: Did he take illegal steroids when he hit a then-record 70 home runs in 1998—or at any other time? Asked by Rep. Elijah Cummings, D-Md., whether he was asserting his Fifth Amendment right not to incriminate himself, McGwire said: “I’m not here to talk about the past. I’m here to be positive about this subject.” Asked whether use of steroids was cheating, McGwire said: “That’s not for me to determine.”
José Canseco, whose best-selling book, Juiced, drew lawmakers’ attention, said anew that he used performance-enhancing drugs as a player. Baltimore Orioles teammates Sammy Sosa and Rafael Palmeiro said they haven’t … “Steroids were part of the game, and I don’t think anybody really wanted to take a stance on it,” Canseco said. “If Congress does nothing about this issue, it will go on forever.”

It was painful to watch these heroes of our national pastime confess by their evasions that their record-setting performances were not simply the product of hard work on the field and in the gym but were boosted by steroid injections in a dark corner of the locker room (and one of them, Palmeiro, later actually failed a drug test).
Almost five years later, on January 13, 2010, in another committee hearing room just down the hall, Congress was at it again, with another panel investigating steroid use—this time financial steroids. The scene was eerily similar, but instead of baseball stars sitting biceps-to-biceps, it was investment bankers sitting briefcase-to-briefcase. Their huge bonuses and paydays were the Wall Street equivalent of grand-slam home runs—home runs also hit, it was suspected, with artificial stimulants. Some of America’s biggest financial sluggers jammed together at one long witness table: Goldman Sachs CEO Lloyd Blankfein, JPMorgan Chase CEO Jamie Dimon, Bank of America CEO Brian Moynihan, and Morgan Stanley chairman John Mack. This was the first public hearing of the Financial Crisis Inquiry Commission.
Here is how Reuters described what happened:

Wall Street’s chiefs acknowledged taking on “too much risk” … but stopped short of an apology as they sparred with a commission looking into the origins of the financial crisis …
With U.S. unemployment near a 26-year high after the worst recession in decades, public fury is growing over the cost of U.S. taxpayer bailouts and huge bonuses for bankers, now that the banking industry has stabilized from the 2008 meltdown.
Phil Angelides, chairman of the commission and a former state treasurer of California, confronted the pugnacious, arm-waving Lloyd Blankfein, chief executive of Goldman Sachs, over his firm’s pre-meltdown practices.
Angelides compared Goldman’s practice of creating, then betting against, certain subprime mortgage-backed securities to “selling a car with faulty brakes and then buying an insurance policy on the buyer.”

Just as baseball players in the 1990s injected themselves with steroids to build muscle artificially for the purpose of hitting more home runs, our government injected steroids into the economy in the form of cheap credit so that Wall Street could do more gambling and Main Street could do more home buying and unskilled workers could do more home-building. The fastest-growing job sectors during the steroid-injected bubble years of the early 2000s were construction, housing, real estate, homeland security, financial services, health care, and public employment—all of them fueled by low interest rates and deficit spending. New value-creating industries grew very little.
Warren Buffett likes to say that when the tide goes out, you see who isn’t wearing a bathing suit. The economic tide went out with the financial meltdown and deep recession at the end of the first decade of the twenty-first century, and it showed with brutal clarity who was swimming naked.
It was us.
Thanks to the peace dividend, the creation of the dot-com industry, the portable-computing and cell-phone industries, and the tax increase pushed through by President Bill Clinton, the first decade after the end of the Cold War was, on balance, positive for America. We almost erased the deficit, and employment grew steadily. The Clinton administration tried to pass an energy tax and nearly succeeded in doing so. Welfare was reformed, and corporate America seemed to be adjusting and adapting to the flat world, because it had no choice. Alas, the second decade after the Cold War ended—the first decade of the twenty-first century—was not so benign. There is really no other way to say it: By the standards of elementary prudence and our own history, we went nuts.
When America failed to see what a profound challenge the end of the Cold War posed, this could be chalked up to ignorance or inattention. We simply didn’t understand the world in which we were living. But when we decided to go to war on math and physics, we did so with eyes wide open. And when we did all of these things at once, we made a radical departure from the norms of American history. That is why we call this initial decade of the twenty-first century the “Terrible Twos.”
This term comes originally from child psychology. It refers to the developmental stage, beginning sometime after a child turns two, when the child becomes cranky, moody, and willful about almost everything. Pediatricians reassure anxious parents of such cantankerous toddlers that the behavior pattern is normal. They’ll grow out of it. American behavior in the Terrible Twos, by contrast, was anything but normal, and we have not yet grown out of it.
As a country, we lost the plot. We forgot who we were, how we had become the richest and most powerful country in the history of the world, where we wanted to go, and what we needed to do to get there. We failed to update our five-part formula for greatness—education, infrastructure, immigration, research and development, and appropriate regulation—just at a time when changes in the world, especially the expansion of globalization and the IT revolution, made adapting that formula to new circumstances as important as it had ever been. Then we fell into the pit of the Great Recession, while fighting two wars in the Middle East and being the first generation of Americans not only to fail to raise taxes to pay for a war but actually to cut them.
In short, we were the generation of Americans that threw out its umbrella just before the storm. In so doing, we broke with one of the main patterns of American history. “In the past we not only met challenges, we did it in ways that left people in our dust—so as to emphatically assert or reassert our leadership,” said Dov Seidman, the author of How. We did nothing of the sort in the Terrible Twos, and that has left us in very difficult circumstances. “Instead of being twenty years behind, we should be twenty years ahead right now—so we are actually forty years behind where we should be,” Seidman added.
And while the steroid-enhanced sluggers’ achievements remain, at least for now, part of baseball’s record books, the artificially created wealth of the Terrible Twos has evaporated. The numbers don’t lie. On January 2, 2010, as that radical decade was coming to an end, The Washington Post did the math—the real math. It ran an article by Neil Irwin entitled “Aughts Were a Lost Decade for U.S. Economy, Workers,” which is worth quoting at length:

For most of the past 70 years, the U.S. economy has grown at a steady clip, generating perpetually higher incomes and wealth for American households. But since 2000, the story is starkly different. The past decade was the worst for the U.S. economy in modern times, a sharp reversal from a long period of prosperity that is leading economists and policymakers to fundamentally rethink the underpinnings of the nation’s growth. It was, according to a wide range of data, a lost decade for American workers. The decade began in a moment of triumphalism—there was a current of thought among economists in 1999 that recessions were a thing of the past. By the end, there were two, bookends to a debt-driven expansion that was neither robust nor sustainable. There has been zero net job creation since December 1999.
No previous decade going back to the 1940s had job growth of less than 20 percent. Economic output rose at its slowest rate of any decade since the 1930s as well. Middle-income households made less in 2008, when adjusted for inflation, than they did in 1999—and the number is sure to have declined further during a difficult 2009. The Aughts were the first decade of falling median incomes since figures were first compiled in the 1960s. And the net worth of American households—the value of their houses, retirement funds and other assets minus debts—has also declined when adjusted for inflation, compared with sharp gains in every previous decade since data were initially collected in the 1950s.

The financial shenanigans that produced the meltdown triggered by the collapse of the investment bank Lehman Brothers on September 15, 2008, made a huge contribution to these dismal, shocking, and unprecedented figures. The titans of banking have a lot to answer for. But financial misdeeds were not the only cause. Just as responsible for the nation’s abysmal economic performance during the Terrible Twos, if not more responsible for it, was the nation’s collective failure to maintain and upgrade the American formula that had served us so well for so long. We let each one of the pillars of our formula erode significantly during the last decade, and that, in our view, is what made the Terrible Twos so terrible. Here is a scorecard.
On October 24, 2010, The Hartford Courant ran a cartoon by the paper’s resident cartoonist, Bob Englehart, featuring four versions of the famed recruiting poster—the one with Uncle Sam pointing outward. In the first poster Uncle Sam is saying, “I WANT YOU.” In the second poster, he has both hands up, flashing stop, under the caption “NO, WAIT. NOT YOU.” In the third poster he is pointing out again, under the caption “WELL, OK, YOU.” In the final poster he has both hands up again, warning stop, under the caption “NO, WAIT …”
We wonder if he drew that cartoon in anticipation of a study that made headlines on December 21, 2010, which found, according to an Associated Press report that day, that “nearly one-fourth of the students who try to join the U.S. Army fail its entrance exam, painting a grim picture of an education system that produces graduates who can’t answer basic math, science and reading questions.” The study, conducted by the Education Trust, a Washington, D.C.–based children’s advocacy group, “found that 23 percent of recent high school graduates don’t get the minimum score needed on the enlistment test to join any branch of the military. Questions are often basic, such as: ‘If 2 plus x equals 4, what is the value of x?’”
The AP story noted that this was the first time the U.S. Army had released such test data publicly. Tom Loveless, an education expert at the Brookings Institution think tank, was quoted saying the results echo those on other tests. In 2009, 26 percent of seniors performed below the basic reading level on the National Assessment of Educational Progress. Other tests, like the SAT, look at students who are going to college. “A lot of people make the charge that in this era of accountability and standardized testing, we’ve put too much emphasis on basic skills,” Loveless said. “This study really refutes that. We have a lot of kids that graduate from high school who have not mastered basic skills.”
In chapter 6, we cited the unimpressive showing of American fifteen-year-olds in the international PISA test, which measures student skills in reading, math, science, and critical thinking. But many other warning signs that America’s education system was underperforming at all levels showed up in the Terrible Twos.
In a speech to the Council on Foreign Relations (October 19, 2010), Arne Duncan, the secretary of education, issued his own report card on the status of American education. On a broad set of metrics of educational attainment, we didn’t do well.

Just one generation ago, the United States had the highest proportion of college graduates in the world. Today, in eight other nations, including South Korea, young adults are more likely to have college degrees than in the U.S. In South Korea, 58 percent of young adults have earned at least an associate’s degree. In America, just 42 percent of young adults have achieved the same milestone. In many other developed countries, the proportion of young adults with associate’s or bachelor’s degrees soared in the last 15 years. Here in the United States, we simply flat-lined. We stagnated, we lost our way—and others literally passed us by …

Just as troubling, about one in four high school students—25 percent—in the U.S. drops out or fails to graduate on time. That’s almost one million students leaving our schools for the streets each year. That is economically unsustainable and morally unacceptable. High school dropouts today are basically condemned to poverty and social failure.

One of the more unusual and sobering press conferences I participated in last year was the release of a report by a group of top retired generals and admirals that included General Wesley Clark and Major General James Kelley. They were deeply troubled, as I am, by the national security burden created by America’s underperforming education system. Here was the stunning figure cited in the generals’ report: 75 percent of young Americans between the ages of 17 and 24 are unable to enlist in the military today because they have failed to graduate from high school, have a criminal record, or are physically unfit. So, to borrow a phrase from the space race era—yes, Houston, we have a problem.

In a follow-up essay in Foreign Affairs (November–December 2010), Duncan added that young Americans today have college completion rates almost identical to those of their parents. In other words, we’ve made no improvement. The numbers tell the story.
“Currently,” Duncan wrote, “about one-fourth of ninth graders fail to graduate high school within four years. Among the OECD countries, only Mexico, Spain, Turkey, and New Zealand have higher dropout rates than the United States.” The numbers do not improve as American students move through the educational system.

College entrance exams suggest that merely one quarter of graduating high school seniors are ready for college, and 40 percent of incoming freshmen at community colleges have to take at least one remedial class during their first semester. In June, the Center on Education and the Workforce projected that by 2018, the U.S. economy will need about 22 million more college-educated workers, but that, at current graduation rates, it will be short by at least three million. With not enough Americans completing college, the center warned, the United States is “on a collision course with the future.”

American colleges and universities, Duncan added, still have one of the highest enrollment rates in the world—“nearly 70 percent of U.S. high school graduates enroll in college within one year of earning their diplomas. But only about 60 percent of students who enroll in four-year bachelor’s programs graduate within six years, and only about 20 percent of students who enroll in two-year community colleges graduate within three years.”
Much of what Duncan described is taking place in middle-class communities, but the picture that emerges from more challenged areas is breathtakingly bleak. A May 2011 study by the Detroit Regional Workforce Fund found that 47 percent of adult Detroit residents, or about 200,000 people, are functionally illiterate—which means that nearly half the adults in the city can’t perform simple tasks such as reading an instruction book, reading labels on packages or machinery, or filling out a job application. Depressingly, about 100,000 of those functionally illiterate adults have either a high school diploma or the GED equivalent. You can stimulate the Detroit economy all you want, but even if jobs come back, people who can’t read won’t be able to do them.
We as a country already pay staggering sums to fund remedial education for students who enter the workplace with high school and college degrees—degrees that were supposed to prepare them for jobs but did not. A 2004 study of 120 American corporations by the National Commission on Writing (a panel established by the College Board) concluded that a third of the employees in the nation’s blue-chip companies wrote poorly and that businesses were spending as much as $3.1 billion annually on remedial training. The New York Times’s education writer, Sam Dillon, reported (December 7, 2004) that

R. Craig Hogan, a former university professor who heads an online school for business writing [in Illinois], received an anguished e-mail message recently from a prospective student: “i need help,” said the message, which was devoid of punctuation. “i am writing a essay on writing i work for this company and my boss want me to help improve the workers writing skills can yall help me with some information thank you.” … “E-mail is a party to which English teachers have not been invited,” Dr. Hogan said.

On education, in short, we have not updated our formula for greatness the way we did when we made sure that every American had access to a tuition-free high school education. “We have not made an equivalent commitment to the twenty-first century—to say everyone should be able to get postsecondary schooling free of tuition,” said Lawrence Katz, the Harvard labor economist. Just when we needed to speed up, we stayed where we were. Katz quoted a telling, discouraging statistic: “American fifty-five-year-olds are still the most educated people in their cohort in the world. But American twenty-five-year-olds are in the middle of the pack. That,” he added, “is a new phenomenon.”
If all Americans could compare Berlin’s luxurious central train station today with the grimy, decrepit Penn Station in New York City, they would swear we were the ones who had lost World War II. When you ride from New York to Washington on the Amtrak Acela, America’s bad imitation of a Japanese bullet train, trying to have any kind of sustained cell-phone conversation is an adventure, to say the least. Your conversation can easily be aborted three or four times in a fifteen-minute span. Whenever we, the authors, have a cell-phone conversation from the Acela, one of us typically begins by saying, “Speak fast, I’m not calling from China. I’m on the Acela.” Our airports? Some of them would probably qualify as historic monuments. We would nominate both Los Angeles International and several terminals at John F. Kennedy in New York for this distinction. LAX’s dingy, cramped United Airlines domestic terminal feels like a faded 1970s movie star who once was considered hip but has had one too many face-lifts and simply can’t hide the wrinkles anymore. But in many ways, LAX, JFK, and Penn Station are us. We are the United States of Deferred Maintenance. (China, by contrast, is the People’s Republic of Deferred Gratification.)
In the Terrible Twos, our roads got more crowded, our bridges got creakier, our water systems got leakier, and the lines in our airports got longer. In 2009, the American Society of Civil Engineers (ASCE) issued a Report Card for America’s Infrastructure, and gave America an overall grade of D. The report also gave individual grades to fifteen infrastructure categories. None got higher than C+. “Decades of underfunding and inattention have endangered the nation’s infrastructure,” the engineers said, adding that since the ASCE’s last report card in 2005, there has been little change in the condition of America’s roads, bridges, drinking-water systems, and other public works, but the cost of repairing them (when they do get repaired) has risen. ASCE estimated in 2009 that America’s infrastructure needed $2.2 trillion in repairs—up from the $1.6 trillion price tag in 2005.
“In 2009, all signs point to an infrastructure that is poorly maintained, unable to meet current and future demands, and in some cases, unsafe,” the engineers said. A story on the Environment News Service (January 28, 2009) about the infrastructure study noted that the engineers gave “solid waste management the highest grade, a C+. The condition of the nation’s bridges receives the next highest grade, a C, while two categories, rail as well as public parks and recreation scored a C–. All other infrastructure categories were graded D or D–, including: aviation, dams, hazardous waste, inland waterways, levees, roads, schools, transit and wastewater.”
The condition of American infrastructure is even worse than the report suggests. “The U.S. government defines 18 of America’s infrastructures as ‘critical’ to the nation,” wrote Mark Gerencser in an article entitled “Re-imagining Infrastructure” in The American Interest (March–April 2011). “Of the 18 categories, three are basic, underlying ‘lifeline’ infrastructures: energy, transportation and water. As it happens, all three are beyond mature; they are nearing the end of their useful operating lives and are in desperate need of recapitalization and modernization to accommodate both new needs and the increased demands of our population growth.” The ASCE report quoted Pennsylvania governor Ed Rendell as saying, “The longer we wait the more expensive it will be … This is as urgent an imperative as health care.” We have already waited too long—we’ve let things slide for two full decades. The price for making a comeback, for becoming again the people and the country we used to be, is only mounting.
In March 2010, a large gala dinner was held at the National Building Museum in Washington, D.C.—black ties, long dresses. But this was no ordinary dinner. There were forty guests of honor. So here’s our brainteaser for readers: We will give you the names of most of the honorees, and you tell us what dinner they were attending. Ready?
Linda Zhou, Alice Wei Zhao, Lori Ying, Angela Yu-Yun Yeung, Lynnelle Lin Ye, Kevin Young Xu, Benjamen Chang Sun, Jane Yoonhae Suh, Katheryn Cheng Shi, Sunanda Sharma, Sarine Gayaneh Shahmirian, Arjun Ranganath Puranik, Raman Venkat Nelakanti, Akhil Mathew, Paul Masih Das, David Chienyun Liu, Elisa Bisi Lin, Yifan Li, Lanair Amaad Lett, Ruoyi Jiang, Otana Agape Jakpor, Peter Danming Hu, Yale Wang Fan, Yuval Yaacov Calev, Levent Alpoge, John Vincenzo Capodilupo, and Namrata Anand.
Sorry, wrong, it was not a dinner of the China-India Friendship Association. Give up? All these honorees were American high school students. They were the vast majority of the forty finalists in the 2010 Intel Science Talent Search, which, through a national contest, identifies and honors the top math and science high school students in America, based on their solutions to scientific problems. As the list of names makes clear, most finalists hailed from immigrant families, largely from Asia.
If you need any convincing about the virtues of immigration, attend the Intel science finals. We need to keep a constant flow of legal immigrants into our country, whether they wear blue collars or lab coats. It is a part of our formula that very few countries can copy. When all of these energetic, high-aspiring people are mixed together with a democratic system and free markets, magic happens. If we want to keep that magic, we need immigration reform that guarantees that we will always attract and retain, in a legal, orderly fashion, the world’s first-round aspirational and intellectual draft choices.
The overall winner of the 2010 Intel contest—a $100,000 award for the best project out of the forty—was Erika Alden DeBenedictis of New Mexico, who developed a software navigation system that would enable spacecraft to “travel through the solar system” more efficiently. To close the evening, Alice Wei Zhao of North High School in Sheboygan, Wisconsin, was chosen by her fellow finalists to speak for them. She told the audience: “Don’t sweat about the problems our generation will have to deal with. Believe me, our future is in good hands.”
We are sure she is right, as long as America doesn’t shut its doors—but that is exactly what it is doing. In the past, the country overcame its shortages in science and engineering talent by importing it. That practice is unfortunately becoming more difficult and less common.
A comment by Vivek Wadhwa, an Indian-born scholar of this subject, makes the point pithily: “America is suffering the first brain drain in its history and doesn’t know it.” Wadhwa, an entrepreneur himself, a senior research associate at the Labor & Worklife Program at Harvard Law School, and an executive in residence at Duke University, has overseen a number of studies on the connection between immigration and innovation. They all show that it is vital to America’s future to nurture that connection and to strengthen our ability to attract talent, because so many other countries are now strengthening theirs.
“As the debate over the role of highly skilled immigrants intensifies in the U.S., we’re losing sight of an important fact: America is no longer the only land of opportunity for these foreign-born workers,” Wadhwa noted in Bloomberg BusinessWeek (March 5, 2009).

There’s another, increasingly promising, destination: home. New research shows that many immigrants have returned to their native countries—especially India and China—to enjoy what they see as a better quality of life, better career prospects, and the comfort of nearby family and friends. The trend has accelerated in the past few years, in part because these workers have also lost patience with the U.S. visa backlog. At the end of 2006, more than a million professionals and their families were in line for the yearly allotment of just 120,000 permanent-resident visas. The wait time for some has been longer than 10 years.

All this matters, Wadhwa writes, “because immigrants are critical to our long-term economic health. Although they represent just 12% of the U.S. population, they have started 52% of Silicon Valley’s tech companies and contributed to more than 25% of U.S. global patents. They make up 24% of science and engineering workers with bachelor’s degrees and 47% of those with Ph.D.s.” He and two colleagues conducted a survey of 1,203 Indian and Chinese immigrants to the United States who had returned to their home countries. The vast majority were young and highly skilled, and had earned advanced degrees. Asked why they had left, 84 percent of the Chinese and 69 percent of the Indians cited professional opportunities. For the vast majority, a longing for family and friends was also a crucial element. Asked if U.S. visa issues played a role in their decisions, a third of the Indians and a fifth of the Chinese answered in the affirmative. Most of the returnees, Wadhwa said, “seem to be thriving. With demand for their skills growing in their home countries, they’re finding corporate success. About 10% of the Indians polled had held senior management jobs in the U.S. That number rose to 44% after they returned home. Among the Chinese, the number rose from 9% in the U.S. to 36% in China.”
Some opponents of reforming the visa system to attract and keep more highly skilled non-Americans have charged that giving a job to a foreigner takes a job away from a U.S. citizen. In some cases, Wadhwa noted in a Bloomberg BusinessWeek article (May 4, 2009), that is true. Some companies have used H-1B visas to hire foreign labor to lower their labor costs. “But in the aggregate, the preponderance of evidence shows that the more foreigners are working in science and technology jobs in the U.S., the better off the U.S. economy is. Increasingly, the number of H-1B holders in a region correlates to increased filings of patents in that region. And for every 1% increase in immigrants with university degrees, the number of patents filed per capita goes up 6%.”
American immigration policy today is just “plain stupid,” concluded Peter Schuck of the Yale Law School and John Tyler, general counsel of the Ewing Marion Kauffman Foundation, which studies innovation. They noted in an essay in The Wall Street Journal (May 11, 2011) that of “more than one million permanent admissions to the U.S. in 2010, fewer than 15% were admitted specifically for their employment skills. And most of those spots weren’t going to the high-skilled immigrants themselves, but to their dependents.” The H-1B program, which allows high-skilled immigrants to work in America on renewable three-year visas that can lead to permanent status, is tiny. “The current number of available visas,” they added, “is only one-third what it was in 2003.”
It cannot be said often enough: Well-paying jobs don’t come from bailouts. They come from start-ups, which come from smart, creative, inspired risk takers. There are only two ways to get more of these people: growing more at home by improving our schools, and importing more by recruiting talented immigrants. Surely we need to do both. “When you get this happy coincidence of high-IQ risk takers in government and a society that is biased toward high-IQ risk takers, you get above-average returns as a country,” argued Craig Mundie, the chief research and strategy officer of Microsoft. “What is common to Singapore, Israel, and America? They were all built by high-IQ risk takers and all thrived—but only in the U.S. did it happen on a large scale and with global diversity, so you had a really rich cross-section.”
In the Terrible Twos we combined cutbacks in higher education with limits on admitting talented immigrants to our shores. The combination is eating away at our capacity to produce and attract creative risk takers at a time when other countries are better and better able to keep their own at home.
If we don’t reverse this trend, over time “we could lose our most important competitive edge—the only edge from which sustainable advantage accrues—having the world’s biggest and most diverse pool” of high-IQ risk takers, said Mundie. “If we don’t have that competitive edge, our standard of living will eventually revert to the global mean.”
Unfortunately, in the Terrible Twos the American political system failed to enact legislation to reform the nation’s immigration system. President George W. Bush made a mighty effort but was blocked largely by members of his own party, who were so outraged by illegal immigration that they could not think straight about the vital importance of legal immigration. “The H-1B visa program—that is the key to making us the innovators of energy and computers,” said Senator Lindsey Graham, the South Carolina Republican, who has been critical of his own party’s obstinacy on this issue. “It has been for most of our life. If you wanted to get really smart and have a degree that would allow you to be a leader in the world, you came to America. Well, it’s hard as hell to get to America now. And once you get here, it’s hard to stay.”
Immigration reform that better secures the borders, establishes a legal pathway toward citizenship for the roughly twelve million illegal immigrants who are here, and enables, even recruits, high-skilled immigrants to become citizens is much more urgent than most of us realize. We need both the brainy risk takers and the brawny ones. Low-skilled immigrants may not be able to write software, but such people also contribute to the vibrancy of the American economy. As the Indian American entrepreneur Vivek Paul once remarked to Tom: “The very act of leaving behind your own society is an intense motivator. Whether you are a doctor or a gardener, you are intensely motivated to succeed.”
In the fall of 2010, Tom had a visit from Kishore Mahbubani, a Singaporean academic and retired diplomat. In the course of their conversation, Tom told him of the Obama administration’s plan to set up eight innovation hubs to work on the world’s eight biggest energy problems. It was precisely the kind of project that could expand the boundaries of basic science across the entire energy field and launch new industries. Tom explained that the program had not yet been fully funded because Congress, concerned about every dime America spends these days, was reluctant to appropriate the full request of $25 million for each energy breakthrough project, let alone for all eight at once. Only three projects were therefore moving ahead, Tom told him, and none of the three would get the full $25 million. Mahbubani interrupted him in mid-sentence.
“You mean billion,” he said.
“No,” Tom replied, “we’re talking about $25 million.”
“Billion,” Mahbubani repeated.
“No. Million,” Tom insisted.
Mahbubani was stunned. He could not believe that while his little city-state was investing more than $1 billion to make itself a biomedical science hub and attract the world’s best talent, America was debating about spending mere millions on game-changing energy research. That, alas, is us today: Think small and carry a big ego. This may seem to be a minor issue, but it is not.
Nations usually thrive or languish not because of one bad big decision but because of thousands of bad small ones—decisions in which priorities get lost and resources misallocated so that the country does not achieve its full potential. That is what happened to America in the Terrible Twos. A graph from The Washington Post (April 30, 2011) makes the point: At a time when the pace of change in the global economy and the rising economic importance of knowledge make increasing investment in research and development an urgent priority, our spending in this vital area is actually declining.
[Graph: U.S. spending on research and development is declining. Source: The Washington Post, April 30, 2011; used by permission.]
In 2005, both the Senate and the House encouraged the National Academies (of sciences, engineering, and medicine) and the National Research Council to conduct a study of America’s competitiveness in the global marketplace. They produced a report entitled Rising Above the Gathering Storm, which assessed America’s standing in each of the principal areas of innovation and competitiveness—knowledge capital, human capital, and the existence of a creative “ecosystem.” According to the National Academies website, “Numerous significant findings resulted … It was noted that federal government funding of R&D as a fraction of GDP has declined by 60 percent in 40 years. With regard to human capital, it was observed that over two-thirds of the engineers who receive PhD’s from United States universities are not United States citizens. And with regard to the Creative Ecosystem it was found that United States firms spend over twice as much on litigation as on research.”
The Gathering Storm report eventually led to a bill called the America COMPETES Act, which authorized investments in a broad range of basic research. It did so on the grounds that

a primary driver of the future economy and concomitant creation of jobs will be innovation, largely derived from advances in science and engineering … When scientists discovered how to decipher the human genome, it opened entire new opportunities in many fields including medicine. Similarly, when scientists and engineers discovered how to increase the capacity of integrated circuits by a factor of one million, as they have in the past forty years, it enabled entrepreneurs to replace tape recorders with iPods, maps with GPS, pay phones with cell phones, two-dimensional X-rays with three-dimensional CT scans, paperbacks with electronic books, slide rules with computers, and much, much more.

Most of the original funding for the expanded research recommended by the Gathering Storm report was approved only because of the stimulus legislation enacted after the financial meltdown in 2008—and most of that funding was for only a limited duration. So in 2010, the same group gathered and issued an update, entitled Rising Above the Gathering Storm, Revisited: Rapidly Approaching Category 5.
“So where does America stand relative to its position of five years ago when the Gathering Storm report was prepared?” the new report asked. “The unanimous view of the committee members participating in the preparation of this report is that our nation’s outlook has worsened. While progress has been made in certain areas … the latitude to fix the problems being confronted has been severely diminished by the growth of the national debt over this period from $8 trillion to $13 trillion.”
To drive home the point, the updated report began with a series of statistics, which included the following:
In 2009 United States consumers spent significantly more on potato chips than the government devoted to energy research and development—$7.1 billion versus $5.1 billion.
China is now second in the world in its publication of biomedical research articles, having recently surpassed Japan, the United Kingdom, Germany, Italy, France, Canada, and Spain.
In 2009, 51 percent of U.S. patents were awarded to non-U.S. companies. Only four of the top ten companies receiving U.S. patents last year were U.S. companies.
Federal funding of research in the physical sciences as a fraction of GDP fell by 54 percent in the twenty-five years after 1970. The decline in engineering funding was 51 percent.
Sixty-nine percent of U.S. public school students in the fifth through eighth grade are taught mathematics by a teacher without a degree or certificate in mathematics.
Ninety-three percent of U.S. public school students in the fifth through eighth grade are taught the physical sciences by a teacher without a degree or certificate in the physical sciences.
Thirty years ago, 10 percent of California’s general revenue fund went to higher education and 3 percent to prisons. Today nearly 11 percent goes to prisons and 8 percent to higher education.
The total annual federal investment in research in mathematics, the physical sciences, and engineering is now equal to the increase in U.S. health-care costs every nine weeks.
China’s Tsinghua and Peking Universities are the two largest suppliers of students who receive Ph.D.’s—in the United States.
And finally, our embarrassing favorite: 49 percent of U.S. adults do not know how long it takes for the Earth to revolve around the Sun.
An essential part of America’s traditional formula for prosperity is the appropriate regulation of commerce through quality government institutions. When conceived and administered properly, regulation has occupied a middle ground: neither so strong as to stifle innovation, entrepreneurship, and economic growth, nor too light to prevent the excesses and failures to which the free market is susceptible. In the Terrible Twos we managed the trick of going too far in both directions.
The thicket of federal regulations under which the private sector must operate continued to grow during the last decade. In 2007, the Code of Federal Regulations, which includes the text of existing regulations, totaled 145,816 pages, and it has since expanded. It is difficult to believe that every one of the listed regulations enhances the well-being of American citizens. As The Economist noted in a February 18, 2012, essay on regulation in America,

Governments of both parties keep adding stacks of rules, few of which are ever rescinded. Republicans write rules to thwart terrorists, which make flying in America an ordeal and prompt legions of brainy migrants to move to Canada instead. Democrats write rules to expand the welfare state. Barack Obama’s health-care reform of 2010 had many virtues, especially its attempt to make health insurance universal. But it does little to reduce the system’s staggering and increasing complexity. Every hour spent treating a patient in America creates at least 30 minutes of paperwork, and often a whole hour. Next year the number of federally mandated categories of illness and injury for which hospitals may claim reimbursement will rise from 18,000 to 140,000. There are nine codes relating to injuries caused by parrots, and three relating to burns from flaming water-skis.

Moreover, regulation can have unintended adverse consequences. In 2005, Congress, under pressure from the credit card and financial services industries, passed the Bankruptcy Abuse Prevention and Consumer Protection Act, a law making it much more onerous for a person or an estate to file for Chapter 7 bankruptcy and then start over with a clean slate. Under the new law, explained the website eFinanceDirectory.com, “you can no longer claim Chapter 7, and therefore dismiss all of your debts, unless you make less than your state’s median wage. Chapter 7 now stipulates that you must take a debt management class affiliated with the National Foundation for Consumer Credit at least 6 months BEFORE you’re eligible to apply for bankruptcy.” We are not in favor of encouraging recklessness, but we are in favor of encouraging risk-taking. And some experts speculate that one reason for the sharp drop-off in entrepreneurial start-ups during the Great Recession—a drop of 23 percent as opposed to the usual 5 percent in previous recessions, according to McKinsey’s research—is that fewer people are willing to take calculated risks and start new companies owing to this change in the bankruptcy laws. Ever since the dot-com boom, many small entrepreneurs have used their credit cards as their original source of venture capital. Now it is much riskier to do so.
In the Terrible Twos, however, other areas of the financial and energy sectors in the United States suffered from too little regulation. The catastrophic financial meltdown of 2008 occurred in the wake of considerable deregulation of the nation’s financial system, which was spurred by, among other things, the belief that the financial industry could largely regulate itself, and that the separations between traditional commercial banking, on the one hand, and investment banking and proprietary trading on a bank’s own behalf, on the other—separations put in place to prevent a recurrence of the Great Depression—were no longer necessary. This belief turned out to be wrong, and devastatingly so.
To be sure, the 2008 subprime meltdown was the product of many causes. A mountain of excess savings built up in Asia was looking for a higher return and flowed to subprime bonds—which paid significantly higher interest rates because they were made up of mortgages granted to people who were higher lending risks. The government directly relaxed mortgage standards to help more Americans buy homes. Banks and rating agencies relaxed their standards to get their share of the subprime housing bubble. Government failed to regulate exotic new financial instruments such as derivatives, under pressure from a financial industry that wanted free rein in this lucrative new area.
The University of California, Berkeley, economist Barry Eichengreen argues that the subprime crisis was partly a case of regulation and regulators not having kept up with the consolidation and internationalization of the commercial banking, investment banking, and brokerage industries. In other words, we did not update our formula in this area. Over the previous two decades, some of the key firewalls erected after the 1929 crash came down, along with regulations stipulating the amount of reserves banks had to keep on hand. The merging of the different financial industries was actually “sensible and well-motivated,” argues Eichengreen. It lowered the costs of stock trading for consumers, reduced borrowing costs, and created new financial products that, in theory, could promote growth in different markets. The problem was that this kind of global financial integration was a total misfit with the fragmented, outdated American financial regulatory system, so it was very hard for regulators to get the full picture of the level of risk and leverage different players in the market were taking on. “At the most basic level,” Eichengreen argued in an October 2008 paper entitled Origins and Responses to the Crisis, “the subprime crisis resulted from the tendency for financial normalization and innovation to run ahead of financial regulation.” It ran so far ahead that not only did the regulators not fully understand the level of risks being piled up by different financial houses, but even the CEOs of these firms did not understand what the rocket scientists turned bankers were concocting under them.
One of those new financial instruments—a derivative known as the credit-default swap, a form of private insurance that paid off if a subprime package of loans defaulted—was specifically kept out of the jurisdiction of government regulators through aggressive lobbying by the financial industry. We wound up with a trillion-dollar market in these swaps without either meaningful government oversight or transparency. Its implosion helped to create the worst financial crash since 1929.
That lack of oversight was a bipartisan effort. In 2000, Republicans in Congress passed legislation specifically exempting credit-default swaps from regulation—and President Bill Clinton signed it. There is a fine line between a regulatory environment that promotes the risk-taking that is necessary in a market economy and an environment that fosters destructive recklessness. In the Terrible Twos, we crossed that line, in part because some important people, chief among them Federal Reserve chairman Alan Greenspan, came to believe that the markets could be “self-regulating”—and that big financial institutions would police themselves because it would be in their self-interest to do so—and in part because the financial industry used its ever greater clout on Capitol Hill to ensure lax regulation in the new markets it had pioneered and to “capture” regulators. It did so in order to maximize risk-taking so as to create astronomical sums of personal wealth for its executives.
Better regulation and regulators might not have prevented the economic crisis in the last part of the Terrible Twos, but they surely would have made the crisis less severe. In the wake of the crisis, Congress passed and the president signed the Dodd-Frank Wall Street Reform and Consumer Protection Act, which imposed new regulations on the financial industry with the goal of making its operations safer. The banking industry, however, did everything it could to weaken that legislation, and the ultimate impact of the reforms remains to be seen.
The result, as the Columbia University economist Jagdish Bhagwati observed, was that a financial industry built to finance “creative destruction” (the formation of new companies and industries to replace old ones) ended up promoting “destructive creation” (the buying and selling of financial instruments with little intrinsic value), the collective implosion of which threatened the whole economy.
The fact that virtually none of the main culprits in bringing about this huge destruction of wealth has suffered any legal penalties suggests that our regulations need some updating. At the very least, we should heed what Warren Buffett told the Berkshire Hathaway annual shareholders meeting (April 30, 2010): “Any institution that requires society to come in and bail it out for society’s sake should have a system in place that leaves its CEO and his spouse dead broke.”
Striking the proper balance between the under- and overregulation of financial markets will be difficult. There is no magic formula for this, and we surely do not wish to stifle all innovation in this area. But finding that balance is crucial because, as we saw in 2008 and thereafter, a major failure in this sector of the economy can inflict massive and long-lasting damage on the economy as a whole.
A critical reason that America has failed to update its formula by reinvesting in education, infrastructure, and research and development, adjusting its immigration policy to promote economic growth, and implementing appropriate economic regulations is that all of these steps require collective action—America as a whole has to act—and lately we have lost our capacity for collective action. One reason for this damaging form of paralysis is the growth of inequality in America, itself the product, among other things, of the further flattening of the world. As we have tried to demonstrate, that flattening has created a global market for the goods and services of people skilled enough to take advantage of it. The earnings in this huge global market can be staggering for the “winners.” Consider what a basketball player such as LeBron James can earn in this era when the National Basketball Association sells its branded products from Stockholm to Shanghai—we are talking tens of millions of dollars—compared to the biggest star of the early 1950s, George Mikan of the then Minneapolis Lakers, whose earnings were limited to the United States and were measured in the tens of thousands of dollars.
It is harder to generate collective action when people are living in different worlds within the same country, argues the Nobel Prize–winning economist and Columbia University professor Joseph E. Stiglitz. Historically, Americans have tended to be less troubled by inequality than citizens of other countries. Both the myth and the reality of individual opportunity and upward mobility in America have been so powerful and so deeply ingrained that the socialist narrative of government-sponsored redistribution has never taken root. But the income gaps during the Terrible Twos grew so large, and could well grow larger still, that inequality now threatens to fracture the body politic in ways that could undermine our ability to do big hard things together.
According to Stiglitz, the top 1 percent of Americans now takes in roughly one-fourth of America’s total income every year. In terms of wealth rather than income, says Stiglitz, the top 1 percent now controls 40 percent of the total. This is new. Twenty-five years ago, the corresponding figures were 12 percent and 33 percent, he noted. Meanwhile, people in the top 1 percent have seen their incomes rise 18 percent over the past decade, while the incomes of those in the middle have actually fallen. For men with only high school degrees, the decline has been especially pronounced: 12 percent in the last twenty-five years alone, said Stiglitz.
Today the rich don’t need the benefits of collective action, said Stiglitz, because they can create their own “subsociety” with its own collective goods. “They have their country clubs, which are their own parks. They have their own private schools. They don’t have to go to public schools and they would not want to have their kids educated there. They have their own transportation system with private jets and chauffeured cars, so they don’t really care about the deterioration in public transport. They don’t care if there are long lines at the airport, because they are not in them.”
This kind of economic inequality very often leads to more political inequality and then more economic inequality, argued the MIT economist Daron Acemoglu and the Harvard political scientist James Robinson, co-authors of Why Nations Fail, in a March 11, 2012, essay on The Huffington Post:

The U.S. generated so much innovation and economic growth for the last 200 years because, by and large, it rewarded innovation and investment. This did not happen in a vacuum; it was supported by a particular set of political arrangements—inclusive political institutions—which prevented an elite or another narrow group from monopolizing political power and using it for their own benefit and at the expense of society. When politics gets thus hijacked, inequality of opportunity follows, for the hijackers will use their power to gain special treatment for their businesses and tilt the playing field in their favor and against their competitors … So here is the concern: economic inequality will lead to greater political inequality, and those who are further empowered politically will use this to gain a greater economic advantage by stacking the cards in their favor and increasing economic inequality yet further—a quintessential vicious circle. And we may be in the midst of it … Yale University political scientist Robert Dahl painted a picture of U.S. politics in the 1960s through the lenses of politics in New Haven as a system in which not only the wealthy but even the little man had voice. But that system is in decline. Money matters much more in politics today than it did in the 1960s, and we are currently witnessing its import rising. The wealthy have greater access to politicians and to media, and can communicate their point of view and interests—often masquerading as “national interest”—much more effectively than the rest of us. How else can we explain that what is on the political agenda for the last several decades has been cutting taxes on the wealthy while almost no attention is paid to problems afflicting the poor, such as our dysfunctional penal system condemning a huge number of Americans to languish in prisons for minor crimes? How else can we explain, as political scientist Larry Bartels has documented, that U.S. Senators’ votes represent the views of their rich constituents but not those of their poor ones?

The Occupy Wall Street movement, which spontaneously spread to public parks in cities all across America in 2011 and 2012, never generated the big numbers of, say, the civil rights or anti-Vietnam War movements. It also lacked a single unified agenda or leader. And yet it seemed to touch a chord, and the chord was the one cited by Acemoglu and Robinson: America’s income gap has become a political power gap. We see it as a warning sign. It is the first time in the postwar era that “fairness” and economic inequality have generated so much national attention and debate. It was not an accident, and we ignore this warning at our peril.
If during the Cold War we had let the key features of our formula for greatness—which are the major determinants of economic growth and therefore of power and influence in the world we are living in—deteriorate the way we did during the Terrible Twos, it would have been considered the equivalent of unilateral disarmament. Politicians would have accused one another of creating or tolerating an “education gap” or an “infrastructure gap,” like the “missile gap” of the 1950s. The charges and countercharges would have dominated national elections. In the Terrible Twos, something far worse happened.
We didn’t notice. Declining numbers in the important categories of national life became normal.
Then we made things even worse. Having underestimated the challenge posed to America by 11/9—November 9, 1989, the day the Berlin Wall fell—we compounded the error by overestimating the challenge of 9/11. We spent the rest of the decade focusing our national attention and resources on the losers from globalization—al-Qaeda, Iraq, Pakistan, and Afghanistan—when our major long-term challenge comes from the winners, most of them in Asia. We devoted ourselves to nation-building in Mesopotamia and the Hindu Kush when we should have been concentrating on nation-building at home.
Since the authors of this book both supported the war in Iraq, to date the more controversial and expensive of the two projects, we need to say what we got wrong and what we still believe. Both of us believed then and believe now that finding a way to bring democracy into the heart of the Arab world was a strategic and moral imperative. We knew it would be difficult and costly, and said so at the time, but even so, we underestimated just how difficult and how costly. We have nothing but regret for the excessive price that America and Iraq have had to pay in lives and treasure.
The losers from globalization—specifically al-Qaeda and Saddam Hussein—did pose significant security problems. We had to strike back against the al-Qaeda perpetrators of 9/11, not simply to deter another attack but also to disrupt what they might have been planning next. But Saddam Hussein was not part of 9/11. The Bush administration asserted that his regime had to be toppled because it had weapons of mass destruction. Neither of us shared that view. Michael believed that the weapons of mass destruction that Saddam Hussein was thought to possess—chemical weapons—did not pose a severe enough threat to justify an attack. What did justify removing him from power was the prospect that at some point in the future he would acquire the far more dangerous nuclear weapons.
Tom’s view was that the long-term threat to the United States from the Middle East came less from weapons of mass destruction than from people of mass destruction, produced in a region where autocratic regimes were stifling and enraging the people they governed. That was his view in 2001, not just in 2011, when uprisings around the Arab world occurred, triggered by deep frustration and anger at the long-ruling dictatorships and motivated as well, in many cases, by democratic yearnings. His hope was that America could collaborate with a free Iraq to create a decent, democratic model of development in a region that had none.
It was neither foolish nor irresponsible for President George W. Bush to want to use Iraq as a lever to pry open the closed and autocratic world of Arab politics. It was, however, both foolish and irresponsible to try to do so without a well-thought-out plan, without enough troops, and without an adequate understanding of the scale and complexity of what was required. Execution matters.
America’s initial policy in Iraq offers, alas, a metaphor for much of American public policy in the Terrible Twos: Our reach exceeded our grasp and ability to execute. We simply, casually, and wrongly assumed that things would work out. We willed the ends but not the means.
We cannot say what would have happened had we done things well in Iraq. We can say, though, that if a decent, democratizing Iraq does finally emerge one day it will be something well worth having, and the lives and treasure expended there will not have been in vain. To the contrary, in that case we will have supported something transformational, of great value to both Iraqis and the world. But even in that case, we will have overpaid for the benefits we get, although by how much will depend on how Iraq evolves. Especially given America’s other needs, the American intervention there has cost far too much in lives, money, and the government’s attention. The same is true for the wars in Afghanistan and Libya, enterprises with far less potential strategic benefit. In sum, America—and we very much include ourselves in this mistake—acted as if the world that was created on 9/11 was a whole new world. The events of 9/11 did reveal a serious security threat. They posed a real problem. But it was not, in retrospect, the equivalent of a life-threatening disease that required dropping everything and changing everything; it was a chronic disease that we had to keep under control but in a way that allowed us to get on with the rest of our lives. September 11 was diabetes; it wasn’t cancer. And the rest of our lives that we had to get on with involves addressing the four major challenges of the post–Cold War era by updating and upgrading our formula for greatness.
We overpaid not only for Iraq but for homeland security as well, because no politician wanted to be accused of negligence by some future investigatory commission. Moreover, we paid for all of this—Iraq, homeland security, Afghanistan, Libya—with borrowed money. We gave ourselves a tax cut rather than a tax hike, added a new entitlement—Medicare prescription drugs—and did it all on the eve of the biggest entitlement payout in American history, which will come with the retirement of the baby boomers.
The contrast with a previous era and a previous Republican presidency is striking. In the 1950s, the Eisenhower era, we used a major conflict—the Cold War—as a lever to upgrade our formula for success to ensure a prosperous future for the nation as a whole. In the Terrible Twos, the George W. Bush era, we used another conflict—the war with al-Qaeda, Saddam Hussein, and radical Islam—to avoid doing the things we had to do to assure a prosperous future. In the first period we sacrificed for and invested in the future. In the second we indulged and splurged in the present at the expense of the future.
Looking back on the last decade, we could not help but be struck by some of the lyrics to one of the songs in the 2009 movie Crazy Heart. Jeff Bridges won the Oscar for best actor for his portrayal of an alcoholic country singer trying to make a comeback. The song, entitled “Fallin’ & Flyin’,” makes an all too fitting anthem for the Terrible Twos.

I was goin’ where I shouldn’t go
Seein’ who I shouldn’t see
Doin’ what I shouldn’t do
And bein’ who I shouldn’t be
A little voice told me it’s all wrong
Another voice told me it’s all right
I used to think that I was strong
But lately I just lost the fight
 
Funny how fallin’ feels like flyin’
For a little while
Funny how fallin’ feels like flyin’
For a little while
 
I got tired of bein’ good
Started missin’ that ol’ feelin’ free
Stopped actin’ like I thought I should
And went on back to bein’ me
I never meant to hurt no one
I just had to have my way
If there’s such a thing as too much fun
This must be the price you pay

That was America in the Terrible Twos, and we have only begun to pay the price. How did it all happen? The short answer: Our political system got paralyzed and our values system got eroded.