Chapter Twenty

A NEW ECONOMY, A NEW WORLD, A NEW WAR

WARS ARE OFTEN ENGINES of technological development, and none more so than the greatest war in history, the Second World War. The developments would have come in time regardless, but the war, with its bottomless funds for research and pressing demand for results, accelerated the process considerably, in some cases by decades. With further funding provided by the exigencies of the cold war, these new technologies transformed the world economy in only a generation.

The need for bombers capable of carrying heavy loads long distances produced a quantum leap in both airframe design and construction techniques. After the war these were quickly adapted to civilian uses, and the price of air travel declined to the point where usage increased dramatically. When the large airframe was combined with the jet engine, which allowed near-sonic speeds, and radar, which allowed the far closer spacing of aircraft near airports through air traffic control, the airplane quickly became the dominant form of long-distance passenger transportation.

Within a decade of the introduction of the Boeing 707 in 1958, both the Atlantic passenger steamship and the long-distance passenger train were near extinction and the world had shrunk by an order of magnitude. What had been a three-day trip between New York and Los Angeles was now five hours; the trip between New York and London took seven hours, not nearly a week.

The large rocket, capable of moving a significant payload hundreds of miles, was developed by the Germans and perfected in the V-2, which weighed fourteen tons and could deliver a one-ton warhead to a target two hundred miles away. V-2s began to hit Britain in late 1944, but they came into use far too late to affect the outcome of the war. Their obvious utility both as a terror weapon and as a means of exploiting the potential of outer space, however, caused a scramble at the end of the war between Western and Soviet forces to secure both the remaining stockpiles of rockets and the scientists who had developed them.

Both the Soviet Union and the United States poured vast resources into rocket research to develop larger vehicles, longer range, and greater accuracy. By the end of the 1950s the rocket, mated to the hydrogen bomb—another technology born in the Second World War—had changed the nature of war. ICBMs, capable of utterly destroying whole cities in an instant, made wars between Great Powers irrational because they were now unwinnable in any real sense. So the United States and the Soviet Union, locked in a profound geopolitical struggle, had to find other ways in which to compete.

But ICBMs also engendered a deep fear that events might spin out of control, as they had on the eve of the First World War, and lay the world waste in a nuclear holocaust. Twice, in the Cuban missile crisis of 1962 and the Yom Kippur War of 1973, that came perilously close to happening.

One means for the United States and the Soviet Union to battle for supremacy was via proxy wars, such as those in Korea, Vietnam, and Afghanistan. Another was by using the technology of the rocket to explore outer space. The Soviet Union stunned the world on October 4, 1957, when it launched the world’s first earth-orbiting satellite, Sputnik. (Its purpose was purely propagandistic, as its radio signal, heard around the world, transmitted no information, serving merely as a beacon. But it was very effective propaganda.) The United States soon launched a satellite of its own, and hundreds more followed. A “space race” quickly developed that was won in 1969 by the United States when it landed men on the moon.

Many of these satellites were for military purposes, such as spying, but many others had both civilian and military uses, such as communications and weather data gathering. By the late 1960s geosynchronous satellites were able to transmit television pictures that could be received simultaneously by anyone, anywhere, who was able to pick up the signal. The global village, so named by the media theorist Marshall McLuhan in 1960, had begun with the laying of the Atlantic cable nearly a hundred years earlier; now it was at hand.

The economic applications of space technology, especially since the end of the cold war when many of them were declassified, are nearly limitless and increasing every day. Agriculture, transportation, cartography, navigation, and communications are but a few. Geopositioning satellites have made it possible to determine, with the help of a simple device, one’s location within a few feet. The devices are now appearing in many automobiles, giving directions through the use of synthetic voices, a technology that would have seemed utterly miraculous only a couple of decades ago.

Communications satellites, together with an ever-increasing number of undersea cables, have helped greatly to lower the cost of long-distance telephony, leading to an astonishing upsurge in its use. In 1950 about a million overseas phone calls were initiated in the United States. By 1970 the number had grown to twenty-three million; by 1980 to two hundred million. By 2001, as the cost plummeted, the number was 6.3 billion and rising fast.

The collapse in the cost of international communications allowed the world’s financial markets to become tightly integrated and increasingly to function as one seamless market, operating twenty-four hours a day. Just as the teenage runners had once bound together the New York financial market by dashing back and forth between the stock exchange, banks, and brokerage houses, keeping all apprised of the latest prices, now a cat’s cradle of undersea cables and satellite links bound together the new global markets.

This had profound political as well as economic consequences. As early as 1980, a unified market trading the major world currencies was in place, one that would in time turn over more than a trillion dollars a day. In 1981 France elected a socialist government under François Mitterrand, which tried to implement a traditional socialist program, raising taxes on high incomes and nationalizing parts of the French economy, including the banks. The French franc quickly plunged on the currency market and kept falling until the French government had no choice but to reverse course.

It was a pivotal moment in world history. For the first time a free market was able to dictate policy to a Great Power. Just as when the newspapers became a mass medium in the mid-nineteenth century, a major new player in the game of national and international politics had come onto the stage. And the world’s governments learned that the old gold standard—which had been implemented by the quasi-governmental Bank of England—had been replaced by the global currency market standard. It was a standard more flexible, more exacting, and far more democratic than the one that it replaced. Inflation, the number-one economic concern twenty-five years ago, has largely vanished from the list of the world’s financial concerns.

 

NO TECHNOLOGY COMING OUT of the Second World War can begin to compare with the computer for creating a divide between the past and the present.

The word computer has been a part of the English language since the middle of the seventeenth century. But until the middle of the twentieth it referred to a person who calculated for a living, compiling such data as actuarial and navigation tables. (They were mostly women, who were thought more reliable for such work.) Humans, however, have two big limitations as computers. They can only do one calculation at a time, and they make mistakes. In the middle of the nineteenth century a mathematician named William Shanks calculated the irrational number pi out to 707 decimal places, an astonishing intellectual feat. It would be more than seventy years before it was discovered that he had made an error at the 528th decimal place and that the last 180 digits of his calculation are therefore wrong.

The idea of calculating by machine is a very old one. An Englishman named Charles Babbage, after correcting astronomical tables, lamented as early as the 1820s that “I wish to God these calculations had been executed by steam!” He later began to build, from finely machined brass parts, a hand-powered calculating machine, but never finished its construction. He also designed an analytical engine that was a mechanical precursor of a true computer because it was programmable. It too was never finished.

As governments and businesses grew in size and came to rely on ever more statistics, the need to speed up the processing of data became acute. The 1880 United States census, tabulated by hand, required seven years of mind-numbing work to complete. To help with the next census, a young mining engineer and statistician named Herman Hollerith devised a solution based in part on the early nineteenth-century Jacquard loom, which had allowed the machine weaving of complicated cloth patterns. Hollerith’s device used punch cards: when a needle passed through a hole in a card, it completed an electrical circuit by dipping into a tiny cup of mercury, and a counter ticked upward.
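The tabulating principle is easy to see in modern terms. The short sketch below is purely illustrative and not drawn from Hollerith’s machine or records; the card layout and field names are hypothetical. It treats each census card as a set of punched positions and advances one counter per hole, which is essentially what the tabulator did electromechanically.

```python
# Illustrative sketch only: a software analogue of Hollerith-style tabulation.
# The card layout and category names below are hypothetical, not historical.
from collections import Counter

# Each "card" is the set of positions punched into it.
cards = [
    {"sex:male", "age:30-39", "occupation:farmer"},
    {"sex:female", "age:20-29", "occupation:teacher"},
    {"sex:male", "age:30-39", "occupation:miner"},
]

counters = Counter()
for card in cards:
    for punched_position in card:        # a needle passes through each hole...
        counters[punched_position] += 1  # ...and the matching counter ticks upward

for position, count in sorted(counters.items()):
    print(f"{position}: {count}")
```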

Hollerith’s device was able to tabulate the data on punch cards at the rate of a thousand cards an hour, and the sixty-two million cards generated by the 1890 census were processed in only six months. (Ironically, a fire in 1921 destroyed the database of the 1890 census, and while the totals are known, almost all the individual data are lost.) Hollerith formed a company that merged with two other companies and in 1924 changed its name to the International Business Machines Corporation, IBM for short.

Because of the need to calculate the trajectories of artillery shells quickly and to decrypt codes, the American and British governments poured money into developing a true electronic computer during the Second World War. The first successful general-purpose one was called ENIAC (an acronym for Electronic Numerical Integrator and Computer), completed at the University of Pennsylvania by Presper Eckert and John Mauchly in 1946, after three years of effort.

It was a monster, the size of a bus. It filled forty filing cabinets, each nine feet high, with eighteen thousand vacuum tubes and thousands of miles of wiring. The vacuum tubes and cooling system used as much electricity as a small town, and it was, by modern standards, glacially slow. Programming was done by physically switching the wiring on switchboardlike grids. People had to stand by constantly to replace the vacuum tubes as they blew out and to remove the occasional errant insect (the origin of the term debugging).

Computers shrank rapidly in size and cost, especially after 1947, when Bell Labs, the research arm of American Telephone and Telegraph, developed the transistor. The transistor does exactly what a vacuum tube does, but it is much smaller, much more durable, and far cheaper to operate and to manufacture. By the 1960s banks, insurance companies, government agencies, and large corporations depended on computers to do, at a fraction of the cost, the work once done by hundreds of thousands of clerks. IBM dominated this market with such machines as the 7090, introduced in 1959.

But computers remained very large and mysterious, hidden away in special air-conditioned rooms, tended by men wearing white coats who spoke languages no one else understood. These computers intruded not at all into the daily lives of most people. They also remained very, very expensive, for a reason that engineers call the tyranny of numbers.

The power of a computer is relative not only to the number of transistors but also to the number of connections between them. If there are two transistors, only one connection is needed. If there are three, then three connections are needed to fully interconnect them. Four transistors need six, five need ten, six need fifteen, and so on. As long as these connections had to be made, essentially, by hand, the cost of building more powerful computers escalated far faster than did their power.
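Stated as a formula (a restatement of the arithmetic above, not a formula given in the text), fully interconnecting n transistors requires one connection for every distinct pair:

\[
C(n) = \binom{n}{2} = \frac{n(n-1)}{2}, \qquad C(2)=1,\quad C(3)=3,\quad C(4)=6,\quad C(5)=10,\quad C(6)=15.
\]

The number of connections thus grows roughly as the square of the number of transistors, which is why hand wiring made the cost of more powerful machines escalate far faster than their power.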

The solution to this problem was the integrated circuit, first developed in 1959 by Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor. An integrated circuit is simply a series of interconnected transistors laid down on a thin slice of silicon by machine. In other words, the transistors and the connections between them are manufactured at the same time. In 1971 Intel produced the first commercial microprocessor, which is nothing less than a very small computer laid down on a silicon chip.

The tyranny of numbers was broken. For while the cost of designing a microprocessor and building the machines necessary to produce it is extremely high, once that investment is made, the microprocessors themselves can be turned out like so many high-tech cookies, bringing the cost of each one down by orders of magnitude. They quickly increased in complexity and therefore in power and speed.

Gordon Moore, a founder of Intel, predicted in the early days of the company that the number of transistors on a chip, and thus the chip’s computing power, would double every eighteen months. He proved correct, and “Moore’s law,” as it is called, is expected to continue to operate for the foreseeable future. The first Intel microprocessor had only twenty-three hundred transistors. The Pentium 4, the current standard for personal computers, has forty-two million. As the power increased, the price per calculation collapsed. Computing power that cost a thousand dollars in the 1950s costs a fraction of a cent today. Its use, therefore, began to increase by orders of magnitude, an increase that has not begun to level off.

The computer, like the steam engine, produced an economic revolution, and for precisely the same reason: it caused a collapse in the price of a fundamental input into the economic system, allowing that input to be applied to an infinity of tasks that previously had been too expensive or simply impossible. The steam engine brought down the price of work-doing energy; the computer brought down the price of storing, retrieving, and manipulating information.

Previously, only human beings could do this sort of work; now machines could increasingly be employed to do it far faster, far more accurately, and at far, far lower cost. And just as the steam engine could bring to bear enormous energy on a single task, the computer can bring a seemingly infinite capacity to calculate and manipulate information. A computer model of the early universe designed in the 1980s was estimated to have required more calculations than had been performed by the entire human race prior to the year 1940.

Computers began to invade everyday life with astonishing speed. It was more than sixty years between Watt’s first rotary steam engine and the coining of the phrase Industrial Revolution, but it was clear that a computer revolution was under way less than a decade after the first microprocessor was produced. The first commercial products were handheld calculators that quickly sent the adding machine and the slide rule into oblivion. Word processors began to replace the typewriter in the mid-1970s. And microprocessors, unseen and usually unnoted, began to be used in automobiles, kitchen appliances, television sets, wristwatches and a hundred other everyday items. Many new products—cordless phones, cell phones, DVDs, CDs, VCRs, digital cameras, PDAs—would not be possible without them. By the 1990s they were ubiquitous. The modern world would cease to function in seconds if microprocessors were all to fail.

But while computers had become much smaller and much cheaper, they were still very difficult to use for anyone without considerable special training. In the early 1970s the Xerox Corporation, at its Palo Alto Research Center, developed many ways to make computers far easier for nontechnical people to use, including the mouse and the graphical user interface. But Xerox was unable to develop a marketable product using these new concepts. Steven Jobs and Stephen Wozniak, the founders of Apple Computer, did so. When IBM entered the PC market in 1981, using an operating system developed by Microsoft, the market for personal computers took off and has been increasing exponentially ever since as the price has dropped relentlessly.

Today tens of millions of children and adolescents have on their desks, and use constantly, computing power that would have been beyond the reach of all but national governments thirty years ago. Their developing brains are literally being wired to use computers as an adjunct of their own intellect. Further, they have at their beck and call what is without question the most extraordinary machine ever developed by a species with an abiding genius for machinery.

A personal computer can play chess, or any other game, better than all but grand masters; keep books; store and retrieve vast amounts of data; edit photographs; produce CDs; play movies; create art; and do a thousand other tasks. No one who lived before the last third of the twentieth century would regard a personal computer that costs no more than 5 percent of average annual income as anything but magic, or an elaborate fraud.

But PCs can do still more. They can communicate. Personal computers have become the portal into an entirely new but already major part of the human universe, the Internet. Just as the railroad proved the most consequential spinoff technology of the steam engine, so the Internet has been for the computer. Once again, it was war, or rather the possibility of war, that brought it into being.

After the launching of Sputnik in 1957, the Defense Department formed the Advanced Research Projects Agency (ARPA) to organize and coordinate science and technology projects with military applications. In 1962 Paul Baran of the RAND Corporation was asked to propose means whereby command and control could be maintained after a nuclear strike. Communications networks had always been either centralized, with all communication passing through a single hub, or decentralized, with a number of hubs, each serving its own subnetwork. Telegraph and telephone networks were structured in these ways, with switchboards serving as the hubs.

Both were highly vulnerable to a nuclear strike. This not only increased the possibility the system would fail, but also increased the possibility that one side or the other in a confrontation would be tempted to strike first, for fear that it might not be able to respond to a strike.

Baran proposed a “distributed network” with no central hubs, only an infinity of nodes, similar to the crossroads in a street grid. If one or more nodes were wiped out, messages could still travel by other routes. A web of computers that came to be called ARPANET was established in 1969 with a total of four computers connected to it by phone lines, three in California and one at the University of Utah.
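A toy sketch can make Baran’s idea concrete. The Python below is a modern illustration only, not anything drawn from Baran’s papers or from the actual ARPANET software, and the function names are invented for the example. It builds a street-grid-like network, knocks out a few nodes, and shows that a simple breadth-first search still finds a surviving route between two corners.

```python
# Illustrative sketch of a distributed, grid-like network: no central hub,
# so destroying a few nodes still leaves alternate routes for a message.
from collections import deque

def grid_network(size):
    """Build a size x size grid; each node is linked only to its neighbors."""
    nodes = {(x, y) for x in range(size) for y in range(size)}
    return {
        node: [(node[0] + dx, node[1] + dy)
               for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
               if (node[0] + dx, node[1] + dy) in nodes]
        for node in nodes
    }

def find_route(network, source, target, destroyed=frozenset()):
    """Breadth-first search for any surviving path from source to target."""
    if source in destroyed or target in destroyed:
        return None
    frontier = deque([[source]])
    visited = {source}
    while frontier:
        path = frontier.popleft()
        if path[-1] == target:
            return path
        for neighbor in network[path[-1]]:
            if neighbor not in visited and neighbor not in destroyed:
                visited.add(neighbor)
                frontier.append(path + [neighbor])
    return None  # the network has been cut in two

if __name__ == "__main__":
    net = grid_network(5)
    knocked_out = {(2, 2), (2, 1), (1, 2)}  # simulate destroyed nodes
    print("message delivered via:", find_route(net, (0, 0), (4, 4), knocked_out))
```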

In 1972 the first e-mail program was written, and a protocol named TCP/IP was designed the following year to allow different computer systems, even those using different languages, to connect easily via the net, which by then had twenty-three computers connected to it. One of the designers of TCP/IP, Vinton Cerf, coined the term Internet the year after that, because the net was beginning to connect not only individual computers but also subnetworks of them. In 1983, by which time there were 563 computers on the net, the University of Wisconsin developed the Domain Name System, which made it much easier for one computer to find another on the net. By 1990 there were more than three hundred thousand computers on the Internet, and the number was growing explosively, doubling every year.

But it was still mostly a network connecting government agencies, universities, and corporate research institutions. Then, in 1992, Tim Berners-Lee, an Englishman working for CERN, the European nuclear research consortium, wrote and released without copyright the first Web browser, a program that allows people to easily find and link to different sites set up for the purpose. The World Wide Web (WWW) was born. Individuals and corporations quickly saw the potential of this new means to communicate as well as advertise and sell their products. In 1994, when the number of users of the Internet was still only nearing four million, Pizza Hut began to sell pizza via its Web page.

Internet usage exploded in the mid-1990s, and today, a mere decade later, uncountable millions of computers around the globe are linked together by this system. It is by far the most powerful communications tool ever developed. The Internet has already caused profound restructuring of many businesses.

Indeed, all businesses that are essentially brokers—businesses that bring buyers and sellers together and take a small percentage of any sale that takes place, such as real estate and travel agencies, stock and insurance brokerages, and auction houses—are finding their business changing its nature or disappearing altogether. The Internet, especially together with such search engines as Google, simply makes it far easier for buyers and sellers to find each other without a broker.

Retailers as well began to sell more and more via the Internet, delivering around the country and overseas, often overnight by means of such delivery services as FedEx and UPS. Internet retail sales, which are often cheaper because of lower overhead and no sales tax, have been growing at the rate of better than 30 percent a year for the last seven years. Amazon.com, one of the first of these Internet retailers, now controls about 10 percent of the retail book market in the United States and is expanding quickly into other areas.

The news media as well are undergoing fundamental change as a result of the Internet. Soon after the dawn of the mass media in the 1830s, the cost of entering the news business increased sharply. James Gordon Bennett had begun the New York Herald in 1835 with only $500 in capital. The New York Times, founded sixteen years later, needed $85,000. Radio and television also required large amounts of capital (and a government license) to reach the public.

But the Internet made it possible for anyone with a PC and a Web page to enter the news business, and thousands did. In 1998 Matt Drudge of The Drudge Report broke one of the biggest news stories of the 1990s, the Monica Lewinsky scandal. Weblogs (called blogs for short, their authors bloggers) sprang up by the tens of thousands as people began to express their opinions through this new medium. The good ones attracted large audiences and quickly developed real influence. The effect of the Internet has been to sharply democratize the media by allowing many more voices to be heard.

Because the Internet needed almost no infrastructure not already in place, it developed spontaneously, with little government help or interference and no government direction. Once the world’s most communicative animal discovered this powerful and very inexpensive new means of communication, the species flocked to use it. Those countries whose elites depended on close control of the population, and of its access to information, to maintain their power found that power slipping and often vanishing altogether.

The computer and its most important creature, the Internet, became the most potent weapon against tyranny since the concept of liberty itself.

 

WHILE WAR HAD BROUGHT the computer into existence, the computer profoundly changed the nature of warfare. In the industrial age, success in war depended more than anything else on which side could field the most men with the most guns, ships, and airplanes. Quantity trumped quality. The Soviet Union, unable to match the West in technological development, had depended on this fact, and on the ability of its vast intelligence operation to steal Western technology, to maintain its superpower status and fight the cold war.

But with computer guidance, bombs became smart, allowing them to be much more accurate and far less lethal to civilians even in crowded urban neighborhoods. More sophisticated radars made possible by the microprocessor changed the nature of air battles. In 1982 the Israeli air force was able to use unmanned surveillance aircraft to mimic fighter planes attacking Syrian radar sites in the Bekaa Valley in Lebanon. When the radars turned on in order to track the aircraft, real fighter planes swept in, homing in on the radars’ own beams, and destroyed them. With its battle management radar, supplied by the Soviets, out of action, the Syrian air force was effectively blind, and the Israelis shot ninety-six Syrian planes, also Soviet supplied, out of the sky, losing none of their own.

With electronics advancing very rapidly, the Soviets could not keep up on their own, or even steal fast enough to do so. The military advantage of their huge army and massive numbers of ships, tanks, and planes was rapidly eroding. When the United States began supplying the Afghans with handheld Stinger antiaircraft missiles, the Soviets’ control of the air, and thus their military advantage in Afghanistan’s rugged terrain, vanished, and the war there became unwinnable. The Afghan war quickly became the Soviet Union’s Vietnam, and the Soviet government found itself unable to hide the truth from its people.

Ronald Reagan seized the advantage and pushed through Congress a massive rearmament program, increasing defense spending by 50 percent in real terms in the first six years of his administration. He also announced the development of a space-based missile defense system, promptly dubbed “Star Wars,” that would have cost billions but would have nullified the Soviets’ nuclear capacity if it worked. Reagan gambled, correctly, that the Soviets would not be able to take the chance that it would fail.

The president had quite deliberately decided to use the country’s strongest weapon—the American economy—to win the cold war, just as Roosevelt had used it to win the Second World War. The United States could afford these massive expenditures. The Soviet Union, it turned out, could not. Its economy, controlled by bureaucracies, not markets, and profoundly corrupt, was in far worse shape than American intelligence estimated.

The top-down Soviet government was paralyzed in the early 1980s by the deaths of three general secretaries in quick succession. But when Mikhail Gorbachev took over in 1985, he tried both to negotiate with the United States to reduce the Soviet Union’s defense expenditures and to loosen controls on the Soviet economy and society so that the country could become more productive and use the new possibilities created by the microprocessor more effectively.

But once the people sensed that the hand of tyranny was lifting, the Soviet government quickly lost control over events. First the Communist governments in the East European satellites collapsed and then the Soviet Union itself fell apart. The non-Russian republics declared independence, and the Soviet Union ceased to exist in 1991, when the hammer and sickle was lowered over the Kremlin and the old flag of Russia replaced it.

The Soviet Union, which had presented itself to the world for its entire existence as the wave of the future, a claim accepted—remarkably, in retrospect—by a large part of the Western intelligentsia, was revealed to have been nothing more than a Russian and very old-fashioned empire, the last empire on earth to have been based on military power. The third Great Power conflict of the twentieth century, a global one like the first two, was over after nearly fifty years.

The United States now stood alone, unchallenged as the most powerful country in the world, and with no challenger in sight. But the United States and its allies were not the only victors in the cold war. Capitalism and democracy were also victors, as socialism in all its many forms had been shown to be universally a failure as an economic system. It simply could not produce the goods and services that the United States and other capitalist countries had in such abundance and which the new communications media displayed to the world.

The so-called Second World, the Communist bloc, disappeared with the end of the cold war, leaving a world of modern, developed countries, such as the United States, Western Europe, and Japan; countries that were rapidly developing along modern lines and growing explosively, such as South Korea, Taiwan, mainland China, India, and Brazil; what used to be called the Third World, countries that had yet to cast off the old ways of top-down government and economies controlled by economic oligarchies, such as the Arab world and much of Latin America and Africa; and failed states such as Rwanda, Haiti, and Liberia, hell-holes of poverty and chaos.

 

WHILE CONGRESS HAD BEEN WILLING to fund Reagan’s military buildup, it had not been willing to enact his cuts in domestic social programs. As a result, the annual federal deficits mounted sharply, and the national debt, just as in the 1970s, more than tripled in dollar terms, from $909 billion in 1980 to $3.2 trillion in 1990. But because the rampant inflation of the 1970s had been stopped, the nominal increase was not eroded away this time, and the size of the debt relative to GDP—the true measure of a national debt—increased rapidly as well. Only 34.5 percent of GDP in 1980, it had reached 58.15 percent by 1990 and was climbing rapidly. It was the first time in American history that the national debt had increased in these terms in peacetime.

And the American economy, which had grown so robustly in the 1980s that it added productive capacity equal to that of the entire West German economy—the largest in Europe—to what it already had, began to stall shortly after the end of the Reagan presidency in 1989. The rising national debt, however, did not stall. By 1994 it had reached $4.6 trillion and equaled 68.9 percent of GDP.

The recession of 1990–91 was the mildest of the twentieth century, and the economy soon began to grow again, at first fitfully, and then more and more robustly as the possibilities of the Internet and the seemingly limitless potential applications of the microprocessor became manifest. The debt, if it did not shrink, grew far more slowly, thanks in part to the sale of the assets of failed S&Ls that had been taken over by the government, to rapidly declining defense expenditures, and to tax revenues that rose even faster than Congress’s prodigious spending. By 1998 the federal operating budget was in surplus for the first time in thirty years and stayed in surplus for the next three years.

About the only perceived weakness in the American economy was the unfavorable trade balance, which deepened relentlessly during this time. But that was to a large extent an artifact not of American weakness, but of the weakness of foreign economies. Japan’s once wondrous economy had peaked in 1989 and then sank into protracted recession from which it has yet to fully recover, its major stock market index falling by three-quarters from its high. Europe, the other of the globe’s major economic centers, was also not growing anywhere near as quickly as the United States, and many countries there had unemployment rates that stayed stubbornly above 10 percent.

Wall Street boomed as never before. While the Dow Jones Industrial Average had tripled in the 1980s, it nearly quintupled in the 1990s, reaching more than 11,000 by the end of the decade. The NASDAQ index, heavily weighted with tech stocks, did even better. Under 500 in 1990, it rose above 5,000 in early 2000.

The late 1990s in the United States were the greatest period of wealth creation in the history of the world. The numbers are almost beyond imagination. In 1988 the richest man in the United States was Sam Walton, age seventy, founder of Wal-Mart, a retail chain that by that time was the third largest in the country. The secret of its success was its revolutionary use of the computer to track and control inventory and squeeze costs out of the operation. He was worth that year $6.7 billion.

Bill Gates, the founder of Microsoft and only thirty-three, was worth $1.1 billion, one of only forty-four Americans with a net worth over a billion dollars. That year it took $225 million to make the Forbes 400 list (that was up from $92 million in the list’s first year, 1982).

In 2000 the minimum wealth required to make the Forbes list was $725 million, and the average worth was $3 billion, with three-quarters of the people on the list worth more than $1 billion. The richest of all was now Bill Gates, worth $63 billion, almost ten times the wealth of the richest American only twelve years earlier. The Walton fortune, now in the hands of the heirs of Sam Walton, had grown to $85 billion, and Wal-Mart had become the largest retailer on earth with four thousand stores and $165 billion in annual sales, equal to the gross domestic product of Poland, a country with a population of nearly forty million.

As always in the American economy, most of the richest were self-made. Indeed, 263 of the 400 richest Americans in 2000, almost two-thirds, created their own fortunes from scratch; only 19 percent of the people on the Forbes list in 2000 inherited enough money to qualify for it.

The enormous rise in the stock markets in the late 1990s was bound to end in a correction, and the bubble began to burst in March 2000. But there was no crash. Instead the averages declined, sometimes sharply, sometimes gently, although many individual stocks, especially those that had been publicly offered toward the end of the great bull market, lost much of their value. There was no reason to believe that anything more than a normal, if considerable, stock market correction, accompanied by a very mild recession, was taking place.

 

THEN, ON THE MORNING of September 11, 2001, an almost lyrically beautiful late summer day, a hijacked airliner slammed into the north tower of the World Trade Center in New York. A few minutes later a second hit the south tower. In less than two hours both buildings collapsed, killing thousands of innocent people. A pall of smoke and dust spread across the nation’s largest city and its financial district, which had been the very beating heart of world capitalism for three generations. It was a direct attack on the financial capital of the empire of wealth.

A third plane struck the Pentagon, the symbol of American military might, and a fourth crashed in a Pennsylvania field as its passengers gave their lives to prevent it from striking elsewhere. For the first time since Pearl Harbor, the United States had been attacked on American soil. It was the first time since British troops landed in Louisiana in December 1814 that the mainland itself had been under assault.

For the fourth time in less than a century, the United States was at war with forces who sought to prevent the spread of modernity, whose hallmarks are democracy and capitalism. But this time it was not a nation-state that had attacked, but rather a shadowy conspiracy of fanatics. It was an enemy far weaker by all ordinary measures of geopolitical power than the enemies of earlier wars, but also an enemy whose war-making potential would be far harder to destroy. No one thought the war would be easy or short or cheap.

But all, except perhaps for a few of its enemies, blinded by ideology, thought that the United States would prevail in this new struggle. As Cicero had explained in the final days of the Roman Republic two thousand years earlier, “the sinews of war are infinite money.” The American economy at the dawn of the twenty-first century was more nearly capable of producing those sinews than any other economy the world has ever known.