A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade, regardless of its mechanical skill.
—Vannevar Bush, head of the US World War II scientific effort1
In June 1940, the future of the world hung in the balance. Germany had attacked the Netherlands, Belgium, and France just over a month earlier, and the Nazi victories were nothing short of stunning. Using military technology in new and inventive ways, the Germans demonstrated a form of warfare that combined quick movement, powerful weaponry, and dominance of the air. On paper, and according to conventional thinking, the combined British and French forces should have been able to stop the German advance, but within six weeks, the British were scrambling to evacuate their beleaguered forces from Dunkirk, and Paris fell.
America waited indecisively on the fringes of this fast-spreading conflict, with a competent but small navy, an air force that had fallen behind its potential adversaries, and an army that was so short of rifles that soldiers had to practice with brooms instead. In all of 1939, the United States built only six medium tanks.2
US military technology at the start of World War II was also seriously flawed. There were “grave defects with the depth-control mechanism and the exploder” in US torpedoes; many did not detonate when they hit targets.3 There was no consistently reliable way to track the presence of German U-boats in the Atlantic—thousands of sailors died as a result, and Britain came close to starvation.4 American armor was initially no match for what the Germans had in the field and under development.
A mere four years later, led by newly developed American technology, the Allies scored a decisive victory against both Germany and Japan. The United States had transformed warfare through the development and rapid deployment of advanced radar, proximity fuses, more effective armor, automated fire control mechanisms, amphibious vehicles, and high-performance aircraft—as well as by more effective ways to limit bacterial infections and control malaria.5 The German submarine fleet, once so close to victory in the Atlantic, was broken by the use of techniques, including radar detection, that seemed fantastical just a few years before. Japan’s surrender was forced by the detonation of two atomic bombs, based on essentially brand-new science.
How did this technological transformation happen—and so quickly? Start with June 12, 1940, and a visit to the White House by Vannevar Bush.
Vannevar Bush was an accomplished man. Previously vice president and dean of engineering at MIT, on the eve of World War II, he was running the Carnegie Institution for Science, a leading research organization in Washington, DC. Tough and experienced as an administrator, Bush was also a technology visionary and an entrepreneur with two successful start-ups under his belt, including as cofounder of Raytheon—an early technology company that grew up to become a substantial military contractor.
Bush represented American private enterprise, both academic and profit-making, at its best. Like many private-sector leaders of his generation, he also had a deeply rooted dislike of government involvement both in the economy and in science.
Bush had good reason to feel on edge waiting for his first White House meeting with President Franklin Delano Roosevelt. Despite the urgency of the moment, Bush did not have a new weapon or potential technology to unveil. Instead, his idea was prosaic and literally written on a single sheet of paper. In short, Vannevar Bush wanted to create a new government committee.
Washington, DC, has never been short of committees, and the summer of 1940 was no exception. But what Bush had in mind was no ordinary additional level of bureaucracy. The powerful people, with a clear mandate to develop weapons, would no longer just be admirals and generals or established industrial companies or even the private sector’s top research labs but rather Bush and a few university colleagues, none of whom had experienced combat. By any political standards, this was a breathtaking move—and by outsiders with very little political experience. Thirty years later, this was Bush’s assessment:
There were those who protested that the action of setting up N.D.R.C. [the National Defense Research Committee] was an end run, a grab by which a small company of scientists and engineers, acting outside established channels, got hold of the authority and the money for the program of developing new weapons. That, in fact, is exactly what it was.6
It worked. FDR was well aware that war was approaching—and was looking for good ideas that would not trigger congressional opposition. The president’s prior experience as assistant secretary of the navy encouraged him both to think about military technology and to be skeptical of admirals. Bush had prepared the ground well through key advisors, and FDR approved the idea inside of fifteen minutes. The National Defense Research Committee (NDRC) sprang into being.
Bush proved an inspired choice, abrasive enough to get the job done but also always focused on improving coordination and cooperation, even among people who did not like him. His friends were good at recruiting and managing talented scientists—other founding members of the NDRC included Karl Compton (president of MIT), James B. Conant (president of Harvard), Frank B. Jewett (president of Bell Labs and the National Academy of Sciences), and Richard C. Tolman (dean of the graduate school at Caltech).
Bush had long rubbed shoulders with all the smartest scientific people, who worked on everything from theories about the atom to far-fetched notions about how electricity passes through various materials. His new government committee idea was, in effect, simply about harnessing these individuals and their protégés in a productive manner to the coming effort of national defense.
This team, and the people they worked with, built—for its day—an enormous operation. At the peak of the activity, Bush directed the work of around thirty thousand people, of whom six thousand were scientists. Perhaps two-thirds of all physicists in the United States were employed in this operation.7 There had never been a greater concentration of scientific effort in the world.
In 1938, on the eve of world war, federal and state governments spent a combined 0.076 percent of national income on scientific research, a trivial amount. By 1944, the US government was spending nearly 0.5 percent of national income on science—a roughly sevenfold increase, most of it channeled through the organization Bush had led since 1940.8 The effects of this unprecedented surge were simply incredible and, for America’s enemies, ultimately devastating.
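As a rough check on that sevenfold figure, using only the two percentages just cited:

\[ \frac{0.5\ \text{percent}}{0.076\ \text{percent}} \approx 6.6, \]

or close to a sevenfold increase in the share of national income devoted to science.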
Then, in 1945, Vannevar Bush had what may be considered his most profound insight. The war had been won, in part, because scientists under his general direction had figured out how to apply the existing stock of knowledge for military purposes and because American industry proved very good at turning those ideas quickly into a large number of physical goods—weapons of war.
What was needed next, Bush argued, was a redirection to focus on winning the peace. In a simple yet forceful way, Bush asked: What are the scientists to do next? His answer: a lot more science—funded by the federal government. In a 1945 report titled Science: The Endless Frontier, prepared for President Roosevelt, Bush argued dramatically that more than narrowly defined national security was at stake. Invention, including in ways that could not be forecast, would save lives, increase the standard of living, and create jobs.
Government itself should not do science. Bush was scathing about bureaucracy in general and had the metaphorical scars to prove that military bureaucracy in particular was not conducive to scientific inquiry.
At the same time, based on Bush’s deep personal experience, the private sector—firms, rich individuals, and the best universities—by itself could not fund and carry out the innovative science that was required. Private business was very good at incremental change based on existing knowledge. But by the mid-twentieth century, the age of individual inventors providing breakthroughs by themselves was substantially over, and private sector science was being carried out in large-scale corporate labs. The executives running these labs were not generally inclined to fund the invention of new technologies that could undermine or even ruin their company’s existing business model.
The controversial yet deep insight of Bush’s wartime model was combining the traditionally quite separate world of corporate management with the quirky faculty of universities to find solutions for what the military needed—sometimes even before the military knew precisely what that was. In his report, Bush proposed that the United States combine that university–private sector partnership with ongoing large-scale government funding to produce a postwar innovation machine.
This is—eventually—exactly what the United States did. The idea of large-scale government funding for university-based science took a while to gain traction, and the precise structures created were not exactly what Bush had in mind.9 Nevertheless, in the decades that followed World War II, his broad vision was implemented to a substantial degree.
An essential part of this approach was a transformation of higher education, including a great expansion in the number of university-trained engineers and scientists—made possible through federal government support, beginning with the GI Bill of 1944. New sectors developed, millions of jobs were created, and these vacancies, including occupations that had not previously existed, were filled by people with recently acquired high levels of skill. For example, the government backed investments in the technology needed to develop jet aircraft, creating the basis for a large commercial sector, which in turn needed—and was able to hire—thousands of skilled mechanics and engineers.
This combination of new technology and a larger number of skilled people increased productivity and created the scientific and practical basis for almost everything that characterizes our modern economy. For the next two decades, wages rose for university-educated people and—a key point—also for those with only a high school diploma.
The catalyst for this effort was federal government funding at a scale never previously experienced, which generated some of the highest-return investments the world has ever seen.
From 1940 to 1964, federal funding for research and development increased twentyfold. At its peak in the mid-1960s, this spending amounted to around 2 percent of annual gross domestic product—roughly one out of every fifty dollars in the US economy was devoted to government funding of research and development (equivalent, relative to GDP, to almost $400 billion today). The impact on our economy, on Americans, and on the world was simply transformational.
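To see where the present-day equivalent comes from, assume US GDP of roughly $20 trillion (our round-number approximation, not a figure from the sources cited above):

\[ 0.02 \times \$20\ \text{trillion} = \$400\ \text{billion per year.} \]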
There is a good chance you are alive today because of this work. Penicillin was a pre–World War II British invention—a brilliant albeit accidental discovery. But it was the Americans, under Bush’s leadership, who figured out how to scale up production and distribute millions of high-quality doses around the world.10 This effort sparked interest in other potentially important soil microorganisms, leading indirectly to the development of streptomycin (effective against tuberculosis) and other antibiotics.11 Cortisone and other steroids were created.12 An ambitious worldwide anti-malaria campaign was launched.13 Childhood vaccination, the decline in maternal mortality, and the control of infectious disease more broadly all sprang directly from this work. The leading American pharmaceutical companies of today owe their expansion and subsequent fortune to this public push, which began under Bush’s auspices, to improve medical science.14
Digital computers are another area where the federal government had great impact.15 By 1945, the US military saw that it faced important problems—from the automated control of naval guns to the management of complex early-warning radar systems—that called out for faster computation than was humanly possible. Military funding for both basic research and more applied development eventually made possible both new machines (hardware)—including transistors, which became the essential silicon-based component—and the instructions that run on those machines (software). From this national defense–oriented investment flowed everything that has changed how we handle, analyze, and use information—up to and including Apple’s iPhone.
The examples go on, including jet aircraft, satellites, improved telecommunications, and the internet. It is hard to find any aspect of modern life that has not been profoundly affected by innovation that can be traced back either to the Bush-era efforts or to inventions that were supported by various government programs in the years that followed.
Prior to 1940, university education was primarily a luxury, available to only a few people. After the expansion in the potential for technological progress—and the availability of government support for research and teaching—the number and quality of places to study science, engineering, and all their applications greatly increased. For the first time, the United States became the best country in the world in which to study, develop, and commercialize new technology.
The backbone of the US economy in the postwar years was built on a visionary model that created not just great companies and amazing products but also a large number of good jobs—the basis for the largest and most successful expansion of a middle class that the world has ever seen. The fruits of government investment were indirectly shared with all citizens through a US corporate sector that provided stable employment at high wages with relative equality, at least by today’s standards.
Median family income in the United States doubled from 1947 through 1970. The increase in wealth was shared throughout the country, with growth not only on the coasts but in the industrial Midwest and the newly dynamic South.
The broader benefits of this new technology were felt around the world. There was a general American desire to support a more stable world—primarily to avoid a repetition of the Great Depression and two world wars. However, what drove the spread of useful and productivity-enhancing technology was not primarily any form of altruism or even a deliberate desire to help. Ideas, once manifest in the form of a usable technology, are hard to control and spread to wherever people find them appealing.
Naturally, other countries responded—by investing in their own scientific endeavors, in effect trying to create their own version of what was working well in the United States. This became the age of deliberate government-supported but private sector–led technological innovation.
Despite a remarkable run of technological and economic success, the United States now faces serious problems. In World War II and during the Cold War, the country built a powerful and stable engine of growth through the application of scientific research to practical problems. The associated technologies proved transformative, resulting in new products, new companies, and an almost insatiable demand for American goods and services around the world.
Unfortunately, we failed to maintain the engine. From the mid-1960s onward, based on concerns about the environmental, military, and ethical implications of unfettered science, compounded by shortsighted budget math, the government curtailed its investments in scientific research. Economic difficulties during the 1970s, followed by the Reagan Revolution and the anti-tax movement, resulted in an even broader retreat from federally funded activities. Most recently, the 2008 global financial crisis and the economic pressures that followed—the Great Recession—have further squeezed investments in the scientific future.
Federal spending on research and development peaked at nearly 2 percent of economic output in 1964 and over the next fifty years fell to only around 0.7 percent of the economy.16 Converted to the same fraction of GDP today, that decline represents roughly $240 billion per year that we no longer spend on creating the next generation of good jobs.
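The same back-of-the-envelope arithmetic recovers that figure: assuming GDP of roughly $18.5 trillion (again an approximation on our part), a 1.3-percentage-point decline in the federal research share amounts to

\[ (0.020 - 0.007) \times \$18{,}500\ \text{billion} \approx \$240\ \text{billion per year.} \]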
Should we care? If there is socially beneficial research and product development to be done, surely the innovative companies of today will take this on?
In fact, they won’t. Invention is a public good, in the sense that every dollar of spending on science by a private company is paid for by that company (a private cost), while some of the benefits from discoveries invariably become public—ideas, methods, and even new products (once patents expire) are shared with the world.
The private sector, by definition, focuses solely on assessing whether the private returns—to the firm, its managers, and its investors—of any investment are high enough to justify the risks. Executives running these companies do not account for the spillover benefits that accrue from producing general knowledge, and they do not share proprietary research that might benefit others.
Moreover, new invention in the private sector is constrained by financing. The venture capital sector that has created so many high-tech success stories has, at the same time, avoided the type of very-long-run and capital-intensive investments that lead to technological breakthroughs—and create new industries and jobs.
As a result, the private sector has not fully offset the government retreat from research and development. Our stock of knowledge has grown more slowly than it would have otherwise—over time, this means lower growth and less job creation.
Missed opportunities for invention directly contribute to the stagnation of incomes. From World War II through the early 1970s, our economy—total gross domestic product—grew close to 4 percent per year on average.17 Over the last forty years, our growth performance has slipped, averaging under 3 percent per annum since the early 1970s and decelerating further to under 2 percent per annum since 2000.18 The Congressional Budget Office expects growth in total GDP to average only 1.7 percent per year through the mid-2020s.19
At its core, economic growth is all about what happens to productivity—how much we collectively produce per person.20 The information technology revolution is much hyped—smartphones for everyone!—but has proven profoundly disappointing in terms of its impact on productivity, and there is no sign this will soon change. The boom-bust decade that started the 2000s only further undermined our ability to grow.
Good jobs—at decent wages with reasonable benefits—are disappearing and being replaced by low-paying jobs that do not support a sufficient standard of living. A process of job destruction is a normal part of any market economy and existed even during the boom years of the 1950s and 1960s. But the new information technology, while failing to boost overall productivity growth, accelerated the elimination of high-paying jobs previously held by people with just a high school education. As a result, after doubling in the twenty-three years following World War II, median US household income grew by only 20 percent over the next forty-five years.
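Expressed as compound annual growth rates, a rough calculation from the figures above makes the contrast stark:

\[ 2^{1/23} - 1 \approx 3.1\ \text{percent per year, versus}\quad 1.2^{1/45} - 1 \approx 0.4\ \text{percent per year.} \]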
While we have retreated from Vannevar Bush’s innovation engine, the rest of the world is picking up the slack. Total research funding is growing at a much faster rate, relative to the economy, in the rest of the world than it is in the United States, led in many countries by active government policies. This is particularly true in our largest economic rival, China, whose rising investments have paid off, including in areas such as computing and, increasingly, medical research, where the United States once dominated.
The middle class is already under enormous pressure, with stagnant wages and a rising cost of higher education that makes it harder and harder to move up the economic ladder. At the same time, there is a discernible and hard-to-reverse geographic impact: good jobs are created disproportionately in a small number of cities, largely on the East and West Coasts. Restrictive zoning policies and high land prices in these cities make it difficult for many people to migrate to where the good jobs are, leaving them behind in slower-growing areas and contributing to a sense of economic unease.
We need a transformative and politically sustainable new way to boost growth and create jobs—by jump-starting our growth engine.
The economic slowdown of the past few decades is not inevitable. Our economy can become dazzling again—both in terms of inventions and, more importantly, in terms of the prospects for most Americans. To do this, America needs to become much more of a technology-driven economy. That sounds surprising because most of us think we are a country driven by leading technologies and technological players. After all, isn’t Silicon Valley already the engine of world growth?
Actually, no—Silicon Valley impacts only a small part of the US economy.21 The American private sector invests in new products but not in basic science. To really improve the performance of the American economy—and to raise incomes across the board—we need to invest heavily in the underlying science of computing, human health, clean energy, and more.
The necessary conditions are largely in place. We have the world’s leading universities, favorable conditions for starting new businesses, and plenty of capital willing to take risks. We have learned a great deal about what works and what does not in terms of the public-private partnership around science and innovation.
What we need is a sustained public- and private-sector push that scales up the innovation system, focusing on the creation of ideas that can be converted into technology—just as the early work on digital computers ended up creating an entirely different structure for the organization and dissemination of information. This will require the type of commitment to the federal funding of science that helped support our post–World War II boom.
We should support this with a major expansion in science education across all ages, with the goal of producing—and employing—many more university graduates with technical skills. This combined increase in demand and supply can, over time, create millions of new, high-paying jobs.
But to make this push both economically sensible and politically sustainable, we need to distribute the benefits of growth more broadly, in two senses.
First, we must ensure that the new high-tech jobs do not follow the pattern of the past forty years and fall into just a narrow set of “superstar” cities on the East and West Coasts. Dozens of other cities throughout the United States already have the preconditions for creating a new technology hub—a large pool of skilled workers, high-quality universities, and a low cost of living—and people in those places desperately want more jobs at good wages. But they are losing out today because they do not have enough scientific infrastructure to become new centers of innovation, nor the base of venture capital that can turn new ideas into profitable companies.
The federal government can select the best places using the type of competitive selection mechanism most recently employed by one of the country’s most valuable companies. In late 2017, as we mentioned in the prologue, Amazon announced that it would place a second headquarters operation somewhere in North America, creating perhaps fifty thousand good jobs. A total of 238 cities and regions from all over the United States (and Canada)—irrespective of political inclination—submitted bids, laying out various kinds of welcome carpets, including tax breaks and supportive infrastructure.
Amazon, however, eventually chose two locations that will help it make a presumably bigger profit—partly by receiving the largest possible tax breaks. This is what companies do: they serve the interests of their shareholders, not the public. The result is a zero-sum tax competition that does nothing to raise the wealth of the nation as a whole.
The competition we have in mind would serve the interest of the nation, not individual companies. Places would compete not on the basis of tax breaks but on the basis of their qualifications to become a new technology hub. This would involve demonstrating the proper preconditions for scientific innovation, including research infrastructure, and support for better scientific education, from high school through college. It would involve ensuring sustainable development plans for the area so that we don’t just create new congested and high-cost-of-living cities. Places would need to demonstrate partnerships with the private sector that can lead from lab science to product development.
Second, we should share the benefits of innovation more directly with the US taxpayer. For too long, the government has funded basic research—behind digital computers, the internet, and the Human Genome Project—whose gains have essentially become windfall profits for a small number of investors able to get in early on enough technology-development projects.22 The increasing shift in the returns from production toward capital owners (people who own companies, property, and so on, rather than workers), combined with falling effective rates of taxation on those returns to capital, leaves many Americans rightly suspicious of government investments that lead to more profitable firms.
As part of our competitive criteria for areas to attract the additional federal science funding, local governments would need to provide a way for taxpayers to share directly in the upside. For example, local and state governments could hold a large, publicly owned parcel of land for development in and around these new research hubs—with the government capturing the upside, in higher rents or capital appreciation, as this land becomes increasingly valuable. Profits would be paid out directly to citizens as a cash dividend every year.
We have a great model of how to do this from a relatively conservative state: the Alaska Permanent Fund, which distributes the revenues from natural resources (oil and gas) equally to all state residents. An annual innovation dividend would be paid out in cash terms equally to all Americans, illustrating vividly the returns from the public’s investment in advancing science.
Taxpayers take risks all the time, whether they know it or not. Ever since the creation of the American Republic—and much more so since 1940—the federal government has invested in pushing frontiers forward, first in a geographical sense and more recently in terms of technology.
When projects go wrong—like the collapse of solar manufacturer Solyndra, which borrowed more than $500 million from the federal government—there are accusations, investigations, and some attempt to assign blame. The taxpayer has to absorb the losses.
When projects go well—radar, penicillin, jet planes, satellites, the internet, and most recently the Human Genome Project—great fortunes are created, but only for the lucky few. It’s time for all Americans to get a serious piece of the upside from accelerated innovation.
The first part of the book focuses on the largely forgotten history of how publicly funded science contributed to victory in World War II and then created the underpinnings for the dynamic American economy of the postwar period. The heroes of this story are not household names, but they can rightly claim a substantial share of the credit for the US postwar economic boom. We then explain how scientific overconfidence, conflicts with politicians, and budgetary concerns inclined the public sector to curtail science funding.
The second part explains the economic case for a major push today on publicly funded research and development. We explain why the private sector systematically underinvests in science. We also show how publicly funded science continues to be innovative and job-creating, albeit at too low a level.
Finally, we bring these lessons to bear on how to rebuild the American growth engine, based on the enormous opportunities for growth outside of coastal megacities. We propose a detailed plan for expanding scientific efforts and ensuring the benefits are broadly shared.
June 1940 represented a moment of deep crisis for the world—and one to which, eventually, the United States responded dramatically. The issues we face today are less obviously about national security but, because they affect the sustainability of our economic well-being, may prove just as profound.
To what extent will the United States create the good jobs of the future? While we hesitate, other countries invest heavily in new science and its applications. We are already being overtaken in key sectors. We must respond now or risk, once again, being left in the dust of rival nations.