I
The events of Tuesday, September 11, 2001, drew a thick line under the dotcom era. During one radiant late-summer morning, nineteen young men armed with box-cutters and knives destroyed many of the intellectual assumptions that had underpinned the Internet bubble. The myth of American invulnerability, which resulted from the end of the Cold War and victory in the Gulf War, was only the most obvious one. Allied to that belief was an overwhelming faith in technology. Perhaps the most disturbing aspect of the terrorist attack was its determinedly low-tech nature and the ease with which it overcame the overwhelming technical advantage of the perceived foe: the U.S. financial and military establishment. The terrorists occasionally exchanged e-mails, but for the most part they communicated via voice and paper. Their principal weapons—jetliners loaded with fuel—were invented in the 1940s. Against such a primitive but dedicated enemy, the U.S. government’s inventory of high-technology surveillance equipment—spy satellites, reconnaissance aircraft, phone taps, and so on—counted for nothing. A single informer with pencil and paper would have been far more useful.
The promise of the Internet wasn’t just technological: it was also ideological. Once digital networks had liberated them from the confines of tradition and physical location, human beings would come together and transcend ancient divisions: tribal, religious, racial, and economic. After September 11, it seems ludicrous to speculate about an escape from history or geography. The gulf between radical Islam and the West is deeply rooted in both. In retrospect, the period from 1989 to 2001 looks like an historic aberration, an extended time-out, during which the normal conditions of human conflict and international rivalry were temporarily suspended (for most people living in the West, at least). In this vacuum, technological utopianism and several other varieties of mushy thinking flourished.
One of the mushiest was the belief in eternal prosperity. The terrorist attacks hit the post-bubble economy where it was most vulnerable: in the minds of consumers. Stunned by the dreadful images from New York and Washington, many Americans deferred or canceled expenditures, and firms throughout the country (not just those in the travel industry) faced potentially catastrophic falls in demand. Until the terrorist attacks, the only thing preventing a deep recession had been the surprising willingness of consumers to keep spending at record levels despite the economic slowdown. Now this prop was removed. When the stock market reopened on Monday, September 17, it suffered one of its worst weeks ever, during which the Dow fell by more than 1,300 points, or 14.3 percent, to 8,235.81. This sell-off represented a perfectly rational reaction by investors to heightened uncertainty about the future, but it had negative economic consequences of its own. With more stock market wealth being eviscerated, both business and consumer spending were likely to be hit further. If they were, corporate profits would suffer more damage, which would put yet more pressure on stock prices.
In short, the September 11 attacks gave another sharp twist to the deflationary circle that had been buffeting the economy since the Nasdaq’s collapse in March and April 2000. Had they occurred at a moment when consumers and firms were more upbeat about the future, they would have been considerably less damaging. In financial terms, the direct impact of the attacks—the destruction of about 15 million square feet of office space and the temporary grounding of the airline industry—was relatively minor; far more important was the indirect impact. By September 2001, the American economy was already on the brink of a psychological breakdown. The bear market was more than a year and a half old: the Dow had peaked in January 2000, the Nasdaq in March 2000. The repeatedly promised economic recovery had failed to materialize, and Alan Greenspan seemed unable to do anything about it. Confidence in the future seemed to be slipping away. In such circumstances, the sight of jetliners crashing into the World Trade Center and the Pentagon was a devastating blow.
Economists like to tally things in dollars and cents, but the significance of what happened cannot be expressed in purely monetary terms. It was too fundamental for that. For capitalism to operate successfully, at least two things have to be in place: a working system of laws, to ensure safety of person and property; and hope for the future, to encourage capital accumulation. If either doesn’t exist—and the phenomenon of commercial jets being used as incendiary bombs to destroy office buildings is potentially fatal to both of them—the economic system will break down. That is what appeared to be happening in the days immediately following September 11, when large swaths of American business practically ground to a halt. It wasn’t until the sheer savagery of the attacks started to recede into memory that any semblance of normality returned. By then, the financial outlook seemed utterly changed, although, in reality, the attacks had largely accentuated existing weaknesses in the economy.
II
Now that the dotcom era is receding into history, it is easier to see what is left of it. To begin with, the Internet remains a technological wonder. According to the latest estimates, more than 300 million people around the world use it to communicate in ways that would have been unthinkable a generation ago. As a research tool, mail service, and repository of human fascinations, the Internet is incomparable. Vannevar Bush, were he to rise from the grave and subscribe to America Online, would be astonished, not only at the volume of information at his fingertips but at its geographic and cultural diversity. None of this will be affected by the collapse of the stock market bubble. In decades to come, when companies like Webvan and TheGlobe.com are long forgotten, historians will look back on the 1990s as the decade during which the information society became a reality.
What the historians will have to say about e-commerce will be less complimentary. Most of the early claims made about the online economy turned out to be grossly exaggerated. The Internet, it transpired, was not a “disruptive technology” that would destroy any company locked into old ways of doing things, such as selling books in stores, printing news on paper, or using people to sell stocks. The bookstores, newspaper companies, and brokerage houses are still in business, and most of them are doing fine. In each of these industries, old economy firms, far from being displaced by the Internet, have used it to get closer to their customers. Barnesandnoble.com allows people to return unwanted books to Barnes & Noble bookstores and continues to take market share from Amazon.com. The online edition of The Wall Street Journal boasts almost 500,000 subscribers, most of whom also buy the print edition. Merrill Lynch offers online trades for $29.95, but the majority of its clients also use the firm’s brokers, who are much more expensive. These “clicks and mortar” strategies work because they treat the Internet as an additional distribution channel rather than relying on it to the exclusion of all else. It is difficult to think of a single example in which an Internet company has supplanted a major old economy firm.
Most Internet start-ups failed because they were based on the mistaken premise that the Internet represented a revolutionary new business model, which it didn’t. It is a tool that companies can use to build their business if they can combine it with distinctive products and avoid ruinous price wars, but nothing more than that. The Internet companies that will survive will be those providing services that wouldn’t otherwise exist, such as eBay, and those providing services that complement existing products, such as Travelocity.com. It is no accident that both eBay and Travelocity.com deal in information goods. The Internet’s great strength is its capacity to process information. Any good that can be standardized and converted into ones and zeroes has the potential to support online commerce. When the legal issues surrounding copyright are finally resolved, this category of goods will include music. When the money is finally found to make broadband connections the norm rather than the exception, it will also include film. Some things it won’t include are groceries, pet food, and furniture. People still prefer to buy most goods in a store, where they can look at them, pick them up, and try them out. In 2000, online purchases came to less than 1 percent of total retail sales. In the publishing industry, where online stores were well established, only one book in fifteen was being bought online.
The main reason that Internet retailing has been such a financial disappointment is that it is more about retailing than the Internet. Any retailer is basically a distributor, purchasing goods from the manufacturers at wholesale and getting them into customers’ hands at retail. Whether the goods are delivered to the nearest shopping mall (regular retailers) or to the doorstep (online retailers) is a secondary matter. Either way, building a distribution system is an arduous and costly operation. Whereas a state-of-the-art retail Web site might cost $25 million a year to develop and maintain, a nationwide system of warehouses and delivery trucks can cost $1 billion to construct and another $200 million to operate. The savings that Amazon.com enjoyed by not having any stores turned out to be largely eaten up by the extra costs of packing, shipping, and promotion. (Since online firms don’t have a retail presence, they have to spend more on advertising to remind consumers of their existence.)
Competitive pressures also make the Internet a tough place to do business. In the debate about whether online commerce would most closely resemble the economic model of “perfect competition” or “winner-takes-all,” the supporters of Adam Smith have come out on top. Barriers to entry remain low, and being the first mover is no guarantee of success. Pets.com and eToys both had plenty of name recognition, but they still went bankrupt. As even more of the bubble companies go out of business, the intensity of competition on the Internet will diminish, but profits will still be hard to find. Online shoppers are extremely price-sensitive, and they tend to compare prices on a variety of sites. The Internet has even affected the cost of life insurance, hardly a product that most people associate with e-commerce. Jeffrey Brown, an economist at Harvard, and Austan Goolsbee, an economist at the University of Chicago, calculated that between 1995 and 1997 the presence of Web sites where people could compare prices reduced the price of term life insurance by somewhere between 8 percent and 15 percent.1
The impact of information technology on the economy as a whole is also coming into sharper relief. Even before the terrorist attacks, the three central tenets of the New Economy thesis—lengthy slumps are a thing of the past; Alan Greenspan is omnipotent; and the “productivity miracle” is real—had all been seriously undermined. In the first quarter of 2001, productivity growth was zero. It picked up again in the second quarter, but not to the previous rapid rate. If the economy’s efficiency had really taken a quantum leap, as the optimists claimed, then productivity growth should have remained strong despite the economic slowdown. The fact that it faltered suggests that a good deal of the previous surge was a temporary artifact of the boom, rather than a permanent change.
The “miracle” was further tarnished in August 2001, when the Bureau of Labor Statistics revised down its estimates of productivity growth for the previous few years: for 2000, from 4.3 percent to 3.0 percent; and for 1999, from 2.6 percent to 2.3 percent. For the period from 1995 to 2000 as a whole, the annual rate of productivity growth was revised down from 2.9 percent to 2.6 percent. Taken at face value, the latter figure indicates that productivity growth did pick up during the second half of the 1990s, if not as dramatically as once thought. (Between 1973 and 1995, productivity grew by just 1.4 percent a year.) But since some of the acceleration was undoubtedly cyclical, the underlying improvement was less dramatic than even the new numbers suggest. It now seems reasonable to assume that the underlying rate of productivity growth is about 2 percent, or perhaps 2.25 percent. If this is the case, the economy can sustain growth of about 3 percent, or perhaps 3.25 percent, a year.2 To be sure, this is an improvement over the Old Economy speed limit of 2.5 percent a year, but it hardly adds up to a miracle.
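The step from productivity growth to sustainable output growth rests on a standard growth-accounting identity: trend output growth is, to a first approximation, trend productivity growth plus growth in labor input. A rough sketch of the arithmetic follows, assuming labor input grows by about 1 percent a year—a stylized figure supplied here for illustration, not a number given in the text:

\[
g_{\text{output}} \;\approx\; g_{\text{productivity}} + g_{\text{labor}} \;\approx\; (2\ \text{to}\ 2.25\%) \;+\; 1\% \;\approx\; 3\ \text{to}\ 3.25\%\ \text{a year}.
\]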
There are several reasons why the New Economy argument turned out to be flawed. One of the most basic was that it exaggerated the role that information technology plays in the economy. Despite the rapid growth of the Internet, firms still spend more money on old-fashioned capital equipment, such as drills and welding machines, than they do on computers, telephones, and other information gadgets. In 2000, at the end of the Internet boom, information technology industries accounted for just 8 percent of gross domestic product. Many big industries that employ millions of people, such as construction, catering, and health care, were largely unaffected by the Internet. In the final analysis, manufacturing still has more to do with assembling bits of wood and metal than it does with exchanging information. Wings have to be attached to planes; roofs have to be put on houses; airbags have to be installed in SUVs. The Internet helps with the planning and organization of such tasks, but it doesn’t turn screws or lay bricks. Nor does it operate on patients or serve businessmen lunch.
What of Alan Greenspan’s theory that information technology made the economy more efficient by reducing the level of uncertainty facing companies? Alas, that too remains unproven. Technology giants like Dell Computer and Cisco Systems had installed sophisticated software to track their customers’ needs, but they failed to predict the disastrous slump in demand at the end of 2000, and they were left with large stockpiles of unwanted inventories. That is not the only problem with Greenspan’s theory. If it were correct, the firms and industries that invested most heavily in information technology would have enjoyed the biggest gains in productivity growth. But some of the biggest buyers of information technology, such as the banking and media industries, recorded hardly any productivity growth at all. The industries that enjoyed the most productivity growth during the 1990s were the producers of information technology (such as the personal computer industry), which benefited from Moore’s Law, not the users of information technology.
For the Internet to alter living standards substantially over the long term, it would have to boost productivity growth throughout the economy, and of that there is little sign. One of the most optimistic, and widely quoted, estimates of the Internet’s impact came from Robert Litan and Alice Rivlin, two economists at the Brookings Institution, a Washington think tank, who claimed it could boost overall productivity growth by 0.4 percent a year.3 However, in arriving at this figure Litan and Rivlin assumed that the Internet would eventually generate big efficiency gains in health care, financial services, and government services—all sectors where productivity growth has lagged in recent years. The forecast may turn out to be accurate, but at the moment it is little more than wishful thinking.
The recent slowdown in productivity growth raises anew the question of where the Internet ranks in the history of great inventions. It is probably more important than the air-conditioner (pace Barton Biggs), but what about electricity? Clean drinking water? The internal combustion engine? Petroleum? Radio and television? Robert Gordon, an economist at Northwestern University, compared the Internet to all five of these inventions, most of which were developed in the last two decades of the nineteenth century. His conclusion: “Internet surfing may be fun, but it represents a far smaller increment in the standard of living than achieved by the extension of day into night by electric light, the revolution in factory efficiency achieved by the electric motor, the flexibility and freedom achieved by the automobile, the saving of time and the shrinking of the globe achieved by the airplane, the new materials achieved by the chemical industry, the first sense of live two-way communication achieved by the telephone, the arrival of live news and entertainment into the family parlor achieved by radio and then television, and the enormous improvements in life expectancy, health, and comfort achieved by urban sanitation and indoor plumbing.”4 Some of Gordon’s individual points may be questionable, but his overall argument is difficult to dispute. The Internet is a revolutionary means of communication, but it hasn’t made people live longer, changed where they live, or made it any easier for them to get from Paris to New York.
III
When a speculative bubble bursts, there is usually an angry search for the culprits who pumped it up. After the South Sea bubble, several directors of the South Sea Company were arrested, and the chancellor of the Exchequer, the British equivalent of the secretary of the treasury, was imprisoned in the Tower of London. The United States doesn’t have a Tower of London, but it does have an adversarial legal system, and there have already been lawsuits relating to the role that stock analysts like Mary Meeker and Henry Blodget played in the Internet boom. Most likely, the recriminations from the dotcom era will keep the courts busy for years to come.
Meeker, Blodget, and their opposite numbers at other Wall Street firms are obvious targets. For years, they promoted investments that were ridiculously overvalued, motivated, at least in part, by the desire to win investment-banking business for their employers. Even when the bubble burst, the analysts largely refused to acknowledge the obvious: that they had compromised their objectivity in order to ride the boom. But Meeker and Blodget were hardly the only ones to blame. They were just the most visible figures in an entire industry that created and ballyhooed Internet stocks. From John Doerr, who claimed that the Internet had been under-hyped, to Harvey Houtkin, who described day trading as “entertainment,” countless individuals took part in the big sell. Whether they actually believed what they were saying is an intriguing question. Some did. The Internet appealed to the same human proclivity that spiritual sects and political fanatics have exploited down the ages: a desire to toss out received wisdom and embrace a new creed. Even in the humdrum world of business there was an almost religious aspect to the Internet fervor, and it was by no means restricted to Silicon Valley. Many of the Harvard and Wharton MBAs who flocked to online start-ups sincerely thought they were making business history, rather than playing out their allotted roles in a vast Shakespearean farce.
The attitude of investors and other participants in the bubble varied. Some of them swallowed the notion that the Internet changed everything. Others harbored serious doubts but suspended their disbelief in the interests of making money. (As history has often demonstrated, there is nothing like an investment portfolio that doubles in value every year or two to soothe the skeptical mind.) Still others got caught up in the crowd. If they tried to break away, they risked being punished. When Morgan Stanley and Goldman Sachs declined the opportunity to take TheGlobe.com public, Bear Stearns stepped in and organized the most successful IPO in history. When Mary Meeker downgraded At Home and the stock continued to rise, she looked like a fool.
The fundamental lesson of a speculative bubble is that behavior that seems rational at the individual level can lead to collective insanity. Trapped in the logic of herd behavior, Wall Street will inevitably keep inflating the bubble until it bursts. It is up to journalists and government officials to try to maintain sanity, but in this case neither proved up to the task. Despite some honorable exceptions, the overall standard of reporting on the Internet stock phenomenon was dismal. In many cases, CNBC being only the most obvious example, the media became an active participant in the bubble, turning entrepreneurs like Jeff Bezos into celebrities, and parroting the Wall Street line on stocks like Yahoo! and Amazon.com. Economic self-interest was largely responsible for this deterioration: the Internet boom created a vast new source of advertising, which newspapers, magazines, and television networks rushed to exploit. Some journalists genuinely believed in the Internet; others were reluctant converts. Either way, the result was the same: more hot air pumped into the bubble.
The Federal Reserve, which was created expressly to prevent speculative excesses, also failed in its duty. If anybody had the legal, moral, and intellectual authority to prick the bubble, it was Alan Greenspan, but he refused to exercise this power until too late. After his “irrational exuberance” speech in December 1996, Greenspan rarely mentioned the stock market, and when he did, it was usually to say that prices reflected the actions of well-informed investors. There were some valid reasons for Greenspan’s hands-off policy. The long economic expansion that accompanied the stock market boom reduced welfare rolls, raised wages for the poor, and drew many previously excluded members of society into the mainstream. Greenspan was understandably reluctant to raise interest rates when there was no sign of inflation, but rising consumer prices are not the only indicator of an unbalanced economy. During the latter stages of the Internet boom, a soaring trade deficit, a plunging savings rate, and sharply rising indebtedness were all signs that the stock market was driving the economy into an increasingly precarious position. Still Greenspan stood aside—a fact that cannot be totally separated from his ideological beliefs.
As a fervent disciple of the free market, Greenspan believed that people’s investment decisions were largely a matter for them, even if they didn’t make sense. When Nobel Prize–winning economists warned publicly about a dangerous speculative bubble developing, Greenspan refused to act. As a faithful apostle of Ayn Rand, he believed that American capitalism was renewing itself before his eyes. In speech after speech, he stressed the historic changes that were sweeping the economy thanks to the application of information technology. The New Economy thesis would never have become so widely accepted if Greenspan hadn’t seized upon it and made it his own. He liked to play the role of professor, hedging his public statements with qualifications, but Wall Street ignored these qualifications, especially when it was repackaging his lectures for distribution to investors. The message passed to the public was unequivocal: the Fed chairman believed that the New Economy was a reality, therefore higher stock prices were justified. Greenspan knew this was happening and did little to stop it.
Silicon Valley, Wall Street, the media, and the Fed all played roles in the Internet bubble, but when all is said and done it was primarily a story of greed and gullibility on the part of the American public. H. L. Mencken once wrote: “The notion that Americans are a sordid, money-grubbing people, with no thought above the dollar, is a favorite delusion of continentals, and even the English, on occasion, dally with it. It has, in fact, little solid basis. The truth is that Americans, as a race, set relatively little store by money; surely all their bitterest critics are at least as eager for it.”5 Had he lived through the 1990s, Mencken might have altered his views. At the height of the boom, the United States was consumed by the idea of getting rich on the Internet. College students, cabdrivers, construction workers, dentists, doctors, journalists, congressmen, Hollywood stars—they were all buying and selling Internet stocks. No glossy magazine or television newsmagazine was complete without its profile of the latest Internet billionaire. The Internet stock mania eventually spread to Britain, Germany, and Japan, but it was nowhere near as intense in those countries as it was in the United States, and for good reason.
The Internet economy was an American creation, dominated by American firms, peopled with instantly recognizable American types. There was Marc Andreessen, the gawky farm boy; Jerry Yang, the hardworking Asian immigrant; Jeff Bezos, the high school valedictorian; Henry Blodget, the smooth-talking preppie; James Cramer, the fast-talking Jewish kid; and Jim Clark, the crusty engineer. With this cast of characters, the Internet boom and bust was like an epic miniseries that grabbed the imagination of the country and held it rapt until the final episode had ended. But these viewers, unlike those of an ordinary television program, were not passive couch potatoes. By quitting their jobs and joining an Internet start-up, or by simply buying a few hundred shares in Yahoo!, they could become actors in the ongoing drama. The desire not to feel left out fuels all mass movements. Coupled with greed, it is virtually irresistible.
IV
After the bursting of the bubble, the mood of the country changed sharply. Dozens of day-trading firms closed down, while online trading firms such as E*Trade and Charles Schwab suffered big falls in turnover. George Gilder and Jim Cramer fell silent, or, if they didn’t, most people stopped noticing what they were saying. Americans finally switched off CNBC and concentrated on other things in life apart from the stock market. When At Home filed for bankruptcy at the end of September 2001, The New York Times relegated the story to the bottom of its business section.6 Especially in the wake of the terrorist attacks, many of the things that had seemed so important only two years previously now appeared mundane and minor. In 2000 and 2001, applications for the Peace Corps rose sharply, as college graduates and refugees from failed dotcoms looked for more worthwhile things to do. In the nation’s boardrooms, corporate executives set aside their plans for Internet tracking stocks and concentrated on more basic matters, such as preventing the company from going broke. In Washington, the bickering over how to spend the budget surplus was replaced by weighty debates about how to prevent an economic downturn from turning into a chronic slump. After several years of indulging itself in fanciful thoughts about the future, America had gotten serious, leaving behind the trivial pursuits of a speculative bubble. Until the next one comes along, that is.
Which probably won’t be for quite a while. Memories of the Great Crash persisted for decades, and it wasn’t until the 1950s that there was another major bout of stock market speculation. The Internet bust wasn’t as traumatic as the Great Crash, but its ultimate consequences remain unclear. It wasn’t the crash of October 1929 itself that seared those events into the public memory, after all, but the Great Depression that followed it. Until September 11, 2001, the notion that the Internet bust might be succeeded by another economic cataclysm seemed fanciful. The most likely outcome was a lengthy but relatively shallow recession, during which businesses gradually eliminated the excess capacity they had created during the boom years and consumers gradually rebuilt their savings. Following the terrorist attacks, the economic calculus changed dramatically. For the first time in at least a generation, it was possible to construct a plausible scenario for the American economy involving a sharp contraction in gross domestic product, a wave of major bankruptcies, and mass unemployment. This grim version of the future involved all three major categories of private-sector expenditure—business investment, consumer spending, and exports—falling together, as the United States, Japan, and Europe entered a global recession simultaneously. In such a situation, the onus would fall on fiscal and monetary policy, but as the experiences of the United States in the 1930s and Japan in the 1990s demonstrated, it is extremely difficult for a government to revive a depressed economy if the private sector doesn’t cooperate.
The depression scenario was only that, of course, and much depended on how quickly consumers would get over the initial wave of panic that the attacks produced. If Americans could be persuaded to return to the airports and the shopping malls pretty quickly, the economic outlook would improve appreciably, and there would be a fair chance of a reasonably rapid recovery once the excesses of the Internet boom had finally been wrung out of the system. Whichever outcome came to pass, however, one thing was certain: the halcyon days of the dotcom era would not return.
In November 2001, Henry Blodget resigned from Merrill Lynch after accepting a buyout offer that the firm had extended to thousands of its employees. Said Blodget to The New York Times: “It just seemed like a good time to pursue the next thing.”7