I
Speculative bubbles have occurred as far apart as Holland in the seventeenth century, Florida in the 1920s, and Japan in the 1980s. No one explanation fits all of them, but some common antecedents have been identified. Many bubbles, such as the 1840s railway mania in England and the Wall Street boom of the 1920s, are associated with exciting new inventions that create exaggerated hopes of profits. (In the 1920s, there were radio, talking pictures, and passenger aircraft.) War, particularly the end of a war, is another frequent precursor. The South Sea Bubble of 1720 followed the peace agreement that ended the War of the Spanish Succession between Britain and France. Here in America, it was after the Civil War that rascals like Jay Gould and Jay Cooke organized their speculative pools, which helped send stock prices (temporarily) into the stratosphere. And bubbles are usually associated with periods of prosperity, when the future seems bright, investors are cocky, and there is easy access to money and credit. The Japanese bubble of the late 1980s followed twenty years of unprecedented economic success.
The Internet bubble fit the broad historic pattern, but it had its own idiosyncrasies. Technology provided the focus for the speculative mania, but it can’t fully explain what happened. Revolutionary inventions are nothing new, after all. Television and the jet aircraft arguably changed people’s lives more than the World Wide Web, but neither led to a speculative binge. Evidently, there was something about American society in the middle of the 1990s that made it susceptible to an outbreak of stock market hysteria.
The end of the Cold War surely played a role. In the wake of Hiroshima and Nagasaki, technological progress was associated in many people’s minds with nuclear destruction. It was difficult to be bullish about a trend that seemed likely to result in the destruction of humanity. From Thomas Pynchon’s Gravity’s Rainbow to Stanley Kubrick’s Dr. Strangelove, the books, plays, and movies of the Cold War era were filled with apocalyptic imagery. When the Warsaw Pact collapsed, popular attitudes toward technology changed. For the first time in decades, people felt safe. Science and technology began to appear as benign forces that had created color television, the PC, and numerous lifesaving medicines. As the 1990s progressed, it became difficult to find anybody, Theodore Kaczynski (a.k.a. the Unabomber) and a few others apart, who remained opposed to technical progress.
Socialism’s demise had other important consequences. Capitalism—American capitalism specifically—was almost universally accepted as the only viable model for economic development. People the world over looked to the United States not just for military leadership, but also for lessons in economics. The key to American success was widely believed to be a combination of free markets and technical progress. Janos Kornai, a Hungarian economist who now teaches at Harvard, encapsulated this thinking in an essay in the Journal of Economic Perspectives on the eve of the millennium. The twentieth century yielded two lessons, Kornai argued: capitalism is a necessary condition for democracy; and socialism cannot survive because it doesn’t foster enough technological innovation.1 When the Internet arrived, it was seen as the latest triumph of American enterprise. The fact that Pentagon-funded academics created the network and a European invented its most important application didn’t seem to matter. In Britain, for example, Tony Blair’s New Labour government was consumed with trying to replicate the freewheeling culture of Silicon Valley.
The Internet’s appeal was partly ideological. In the summer of 1989, Francis Fukuyama, a senior official at the State Department, argued that history, viewed as the ongoing clash of rival ideologies, had ended. Writing in The National Interest, Fukuyama suggested that liberal democracy of the type pioneered by the United States might constitute “the final form of human government.”2 The significance of Fukuyama’s piece was widely debated (partly because it was written in the mystifying argot of G. W. F. Hegel, an obscurantist German philosopher), but it captured a moment when the future seemed blurred. If the ancient debate about how society ought to be organized was over, what was to be the new organizing principle for history? Where was mankind to go next? The Internet enthusiasts had a ready answer: into cyberspace. Here was a medium that transcended the old divisions of class, geography, and nation-state. The Internet was global and anonymous, but also personal. From Lima to Lagos to Laos, anybody with a phone line and a computer could communicate with people on the other side of the Earth. And once a person was online, no government could tell him or her what to do. A student in Shanghai could log on to the White House Web site (www.whitehouse.gov), which was launched in 1994, and read what the president’s spokesman had said about U.S.-China relations in his press briefing that morning. A researcher in Stockholm could swap notes with a colleague in Rome. A teacher in Sydney could have online sex with an accountant in Chicago.
To borrow an ugly phrase from the social sciences, the Internet represented a new paradigm for human development. Paradigms are stories (not necessarily true stories) that help people to organize their thoughts. Most of us use paradigms, even though we sometimes don’t realize it. The straight line is a paradigm. Nobody has ever seen a perfectly straight line, but it’s hard to imagine the world without the concept. The Internet paradigm was one that liberal utopians like Bertrand Russell and Robert Owen had dreamed about: a worldwide community in which distance had been conquered, national hatreds had disappeared, and borders meant nothing. “Computing is not about computers anymore. It is about living,” Nicholas Negroponte, head of the Media Lab at MIT, wrote in his 1995 book Being Digital.3 “As we interconnect ourselves, many of the values of a nation-state will give way to those of both larger and smaller electronic communities. We will socialize in digital neighborhoods in which physical space will be irrelevant and time will play a different role. Twenty-five years from now, when you look out a window, what you see may be five thousand miles and six time zones away. When you watch an hour of television, it may have been delivered to your home in less than a second. Reading about Patagonia can include the sensory experience of going there. A book by William Buckley can be a conversation with him.”
II
Thursday, August 12, 1982, didn’t seem like a momentous day in American history. Events in the Middle East dominated the news. The Israeli military, which had invaded Lebanon two months earlier, launched its biggest air raid yet on Beirut, killing more than a hundred people, many of them civilians. President Reagan expressed “outrage” at the Israeli attack, but the Israeli prime minister, Menachem Begin, appeared to take little notice. In Beverly Hills, Henry Fonda died at seventy-seven following a long illness. In sports, the White Sox handed the Yankees their third straight loss. On Wall Street, it was a quiet day. With the economy stuck in a slump, there was little interest in buying stocks. The Dow Jones Industrial Average fell by 0.29 points, closing at 776.92. The fledgling Nasdaq National Market fell by 0.93 points, to 159.84. Nobody realized it at the time, but when the closing bell rang at the New York Stock Exchange that afternoon, the last bear market of the twentieth century had ended. The following morning, the Federal Reserve, led by Paul Volcker, cut short-term interest rates by half a percentage point in an effort to revive the economy. It was the third rate reduction in six weeks, and it worked. From then on, the stock market and the economy recovered in tandem. By the end of 1982, the Dow was trading above 1,000. The greatest bull market ever—one that would see the Dow rise more than tenfold and the Nasdaq rise almost thirtyfold—was under way.
An entire generation of newborns would get their driver’s licenses before the stock market would see another lengthy downturn. High school students would go to college, get married, have children, and approach middle age to the accompaniment of a rising stock chart. Given the longevity of the bull market, it is hardly surprising that so many Americans came to regard buying stocks as an easy way to get rich. The Internet stock mania during the late 1990s was like the frenzy on the dance floor at the end of a wedding party. The disc jockey may have prompted the excitement by playing a popular tune, but the real reason that people were singing at the tops of their lungs and waving their hands above their heads was that the free bar had been open all day.
The stock market’s sustained ascent was the central and dominating fact of American history in the 1980s and 1990s. It was based on a restructuring of American capitalism. At the end of the 1970s, the conventional wisdom among economists was that overseas firms, especially Japanese firms, had become a lot more efficient than their American rivals. Starting in the recession of 1981–1982, firms like General Electric and General Motors slashed their payrolls, invested in laborsaving equipment, and shifted production to cheaper locations abroad. American workers suffered, at least in the short term, but corporate profits picked up sharply. Since a stock certificate is simply a claim on a firm’s future earnings, stock prices rose to reflect the brighter outlook for profits. By the summer of 1987, the Dow was trading above 2,000.
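The valuation logic here can be made explicit. In the simplest textbook model (a sketch with constant earnings growth, not a formula anyone on Wall Street applies literally), a stock’s price is the discounted value of the earnings stream its holder can claim:

\[
P \;=\; \sum_{t=1}^{\infty} \frac{E_1\,(1+g)^{t-1}}{(1+r)^{t}} \;=\; \frac{E_1}{r-g}, \qquad r > g,
\]

where \(P\) is the stock price, \(E_1\) is next year’s expected earnings per share, \(g\) is the expected growth rate of earnings, and \(r\) is the return investors demand. A brighter outlook for profits raises \(E_1\) or \(g\), and the price jumps at once; a fall in \(r\), such as the decline in interest rates after the early 1980s, has the same effect.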
At the same time as firms were restructuring, popular attitudes toward saving and investing were changing. Traditionally, many Americans viewed Wall Street with suspicion. Putting money into the stock market was seen as a rich man’s game. In 1983, the wealthiest 10 percent of households in America owned 90 percent of all stocks. The vast majority of households—about three out of four—owned no stocks at all. Most of these families, if they had any surplus income, kept it in the bank. Fifteen years later, in 1998, things had changed substantially. Almost half of American households owned stocks, either through individual shareholdings or through mutual funds, and the proportion of stocks owned by the top 10 percent had fallen to about 80 percent—still a very unequal share, but less so than it used to be.
The emergence of popular capitalism was largely an accident. During the 1970s, many economists were concerned that Americans were saving too little for retirement, thereby putting the Social Security system under strain and depriving businesses of the funds needed for investment. To encourage thrift, Congress introduced a series of tax incentives for saving. In 1974, the first individual retirement accounts (IRAs) were introduced, but the standards for qualifying were strict, and they didn’t really catch on. In the Revenue Act of 1978, legislators loosened things up a bit by allowing workers to contribute their cash bonuses to retirement savings accounts on a tax-deferred basis. The wording of this provision, section 401(k), was vague, and it attracted the attention of R. Theodore Benna, an employee benefits consultant in Langhorne, Pennsylvania.
One Saturday afternoon in 1980, Benna, who was then thirty-nine years old, was helping one of his clients, a local bank, to redesign its employee pension plan when he had a thought. If cash bonuses could be sheltered from tax under section 401(k), then why couldn’t regular income be sheltered in the same way? There didn’t appear to be anything in the statute that specifically ruled it out. “My approach was that if the code doesn’t say, ‘Thou shalt not,’ then thou should be able to,” Benna, a devout Baptist, later recalled.4 Acting on this flash of inspiration, Benna designed a retirement plan that would allow employees to contribute a portion of their paychecks on a pretax basis to a savings account. A few months later, Benna’s own firm, the Johnson Companies, launched the first 401(k) plan. In November 1981, the Internal Revenue Service gave Benna’s creation its official blessing. With legal approval, the new saving plans spread rapidly, and by 1985 more than 10 million employees had one. This was only the beginning. At the end of 2000, more than 40 million Americans had 401(k) plans, and the accounts contained about $1.7 trillion in assets—enough to put every public-school student in the country through an Ivy League school.
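The mechanics of the tax deferral are easy to sketch. The back-of-the-envelope comparison below uses purely hypothetical numbers (a 28 percent tax rate, a 7 percent annual return, a thirty-year horizon) and ignores employer matches, contribution limits, and changes in tax rates over time:

```python
# A rough sketch of why pretax saving mattered. All numbers are hypothetical.
TAX_RATE = 0.28        # marginal income tax rate (assumed)
ANNUAL_RETURN = 0.07   # annual investment return (assumed)
YEARS = 30             # holding period
SALARY = 1_000.0       # pretax dollars of salary saved once, up front

# Inside a 401(k): the full $1,000 goes in, compounds untaxed,
# and the entire balance is taxed once, at withdrawal.
inside = SALARY * (1 + ANNUAL_RETURN) ** YEARS * (1 - TAX_RATE)

# Outside the plan: income tax is paid up front and, simplifying,
# each year's return is taxed as it is earned.
after_tax_return = ANNUAL_RETURN * (1 - TAX_RATE)
outside = SALARY * (1 - TAX_RATE) * (1 + after_tax_return) ** YEARS

print(f"401(k) account:  ${inside:,.0f}")   # about $5,481
print(f"taxable account: ${outside:,.0f}")  # about $3,148
```

Under these assumptions, the tax-deferred account ends up roughly three-quarters larger than the taxable one, which is why Benna’s reading of the statute was worth so much to savers.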
Through their 401(k) plans, tens of millions of middle-class families were introduced to the stock market. In most cases, the initiation was via mutual funds—investment companies that pool their shareholders’ money and invest it in a range of financial assets. Mutual funds had been around for decades, but it wasn’t until the arrival of the 401(k) plan that they became a part of everyday life. The first mutual fund, it is generally agreed, was the Massachusetts Investors Trust, which opened its doors in 1924 with a promise to invest its shareholders’ money soundly and publish its holdings. At the time, such openness was a novelty. Many investment funds—or “investment trusts,” as they were then called—were run by swindlers and rogues, who spent their shareholders’ money as they saw fit. In the stock market crash of October 1929, many investment trusts, including some operated by eminent Wall Street firms, went broke. After that experience, most Americans with money chose to invest it on their own behalf, and this remained the case despite the Investment Company Act of 1940, which regulated investment pools. At the beginning of 1980, the total amount invested in stock and bond mutual funds was just $49 billion.
The growth of 401(k) plans and IRAs transformed the mutual fund industry. Between 1981 and 1985 the number of mutual funds increased from 665 to 1,527. Initially, much of the new money went into money market funds, but this soon changed. In 1985, for the first time, the total amount in stock and bond funds surpassed the amount in money market funds. It is no accident that the bull market and the development of the mutual fund industry coincided. Rising stock prices drew money into mutual funds, and mutual funds poured money into the stock market. During 1995, Americans invested more than $100 billion in stock funds, and the figure went up from there. Between 1996 and 1999, almost $170 billion a year flowed into stock funds.
By the middle of the 1990s, there were about 130 million mutual fund accounts, and names like Fidelity and T. Rowe Price were as familiar to Americans as Citicorp and Chase Manhattan. At the end of 2000, mutual funds contained more money than the banking system—about $7 trillion, of which more than $4 trillion was in stock funds. New funds were being created every day, and the mutual fund listings were taking up almost as much space as the stock tables. At the start of 2001, the point of absurdity was reached. The number of mutual funds topped eight thousand, which meant there were more mutual funds than there were stocks listed on the New York Stock Exchange and the Nasdaq combined.
III
It is a curious fact of history that mass movements often invent their own vocabularies. The Jacobins did during the French Revolution; the antiwar protesters did during the 1960s; and stock market investors did during the 1980s and 1990s. Much of the jargon they adopted dates from the 1950s and 1960s, when economists like Harry Markowitz, of the University of Chicago, William Sharpe, of Stanford, and Paul Samuelson, of MIT, invented the field of financial economics. (All three were later awarded Nobel Prizes for their work.) Before the 1950s, economists regarded the stock market as an emotional and unpredictable beast, which was better studied with the tools of psychology. John Maynard Keynes compared stock picking to a newspaper beauty contest in which readers are asked to pick the six prettiest girls from a hundred photographs, the winner being the person whose choices correspond most closely to the selections of the entrants as a whole. “It is not a case of choosing those which, to the best of one’s judgement, are really the prettiest, nor even those which average opinion genuinely thinks the prettiest,” Keynes wrote in The General Theory of Employment, Interest and Money. “We have reached the third degree where we devote our intelligences to anticipating what average opinion expects average opinion to be. And there are some, I believe, who practice the fourth, fifth and higher degrees.”5
Some people (including this author) would argue that Keynes’s metaphor remains an apt way of describing the way investors behave, especially during a speculative mania, but the economists of the postwar era looked at things differently. To them, stock market investing represented a straightforward scientific problem: How could prospective returns be maximized for a given level of risk? Or, equivalently, how could risk be minimized for a given level of desired returns? The solution that Markowitz et al. came up with, after many pages of complicated mathematics, was surprisingly simple: invest in a diversified portfolio of stocks and bonds, then hold on to it for as long as possible. Some of the investments will fall and others will rise, but over the long term the investor will do a lot better than he would by leaving his money in the bank or by studying the newspaper and buying individual stocks.
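Stated formally, the problem Markowitz posed looks like this (a condensed statement of the standard mean-variance framework, not a quotation from his paper):

\[
\min_{w}\; w^{\top}\Sigma\,w
\qquad\text{subject to}\qquad
w^{\top}\mu = \mu^{*}, \qquad \textstyle\sum_{i} w_i = 1,
\]

where \(w_i\) is the fraction of wealth placed in asset \(i\), \(\mu\) is the vector of expected returns, \(\Sigma\) is the covariance matrix of returns, and \(\mu^{*}\) is the target return. The off-diagonal entries of \(\Sigma\) do the real work: the optimal portfolio spreads money across assets whose returns do not move together, which is all that “diversification” means in this framework.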
To mutual fund managers, who made their living providing investors with diversification, this conclusion was most welcome. The mutual fund industry spent heavily to inform the public about the purported benefits of “risk management,” “portfolio diversification,” and “long-term investing.” (Not surprisingly, the way to achieve these goals was to buy more mutual funds.) Academics and financial journalists also helped spread the message that investing had been reduced to a formula, as did magazines like Money, Smart Money, and Fortune. Publishing investment guidebooks became a lucrative industry, although much of what these books said could be encapsulated in a short sentence: “Invest for the long term and spread your risk.” By the mid-1980s, millions of Americans regarded the two mantras of scientific investing—“diversify” and “buy and hold”—as gospel. Then, on October 19, 1987, the Dow fell by 508 points, or 22.6 percent, its biggest-ever one-day drop. Black Monday, as it came to be known, presented a challenge to the new investment doctrines. Diversification didn’t help investors much during the market’s collapse, because virtually every stock on the New York Stock Exchange plummeted, as did the vast majority on the Nasdaq and the American Stock Exchange. “Buy and hold” also took a severe knock. For several months before the crash, numerous analysts had been saying that the market was due for a fall. Investors who acted on this advice saved a lot of money.
The crash shook the public’s faith in the stock market, but didn’t break it. During 1988, American households took more money out of stock mutual funds than they put into them. Then, as stock prices recovered, confidence returned. In August 1989, the Dow passed its August 1987 peak. Ultimately, the message that many Americans took from October 1987 was that the market always comes back, so selling in a downturn is a big mistake. “Investors learned firsthand in the 1980s that it’s usually smart to stick,” Jane Bryant Quinn, Newsweek’s personal finance maven, wrote in September 1993, when some analysts were fretting that the Dow, then perched at about 3,650, was heading for a fall. “If you’d held every stock in the Standard & Poor’s 500 Stock Index over every five-year period since 1926, and reinvested the dividends, you’d have made money 89 percent of the time, reports Ibbotson Associates in Chicago. So quit worrying about your mutual-fund investments and worry about Bosnia instead.”6
Jeremy Siegel, a professor of finance at the Wharton business school, was the most influential proponent of the “buy and hold” philosophy. In 1994, Siegel published a book called Stocks for the Long Run, in which he argued that “stocks have been chronically undervalued throughout history.”7 Siegel was nothing if not thorough. He calculated that stocks had outperformed bonds in every single ten-year period since 1802. On the basis of this evidence, Siegel claimed it was always a good idea to buy stocks, regardless of how far prices had risen in previous months and years. Similarly, he argued, “there is no compelling reason for long-term investors to significantly reduce their stockholdings, no matter how high the market seems.”8 Even people who bought stocks on the eve of October 1929 achieved decent returns if they held on to them for long enough, Siegel demonstrated. Not surprisingly, this message proved popular on Wall Street. Siegel became something of a celebrity, appearing on television regularly and speaking at investment conferences all over the country. At one point, he was even summoned to make a presentation to Alan Greenspan and his fellow governors at the Fed.
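Siegel’s claim can be restated as a simple computation: take two long series of annual returns, slide a ten-year window along them, and count how often stocks came out ahead. The sketch below uses short, made-up return series in place of his two centuries of data:

```python
# Rolling-window comparison in the spirit of Siegel's calculation.
# The return series below are placeholders; Siegel used data back to 1802.

def cumulative_return(annual_returns):
    """Compound a list of annual returns into a single growth factor."""
    total = 1.0
    for r in annual_returns:
        total *= 1.0 + r
    return total

def count_stock_wins(stock_returns, bond_returns, window=10):
    """Count the rolling windows in which stocks outperformed bonds."""
    n_windows = len(stock_returns) - window + 1
    wins = sum(
        cumulative_return(stock_returns[i:i + window])
        > cumulative_return(bond_returns[i:i + window])
        for i in range(n_windows)
    )
    return wins, n_windows

# Hypothetical annual returns, just to make the sketch runnable:
stocks = [0.12, -0.05, 0.18, 0.07, 0.22, -0.10, 0.15, 0.09, 0.03, 0.11, 0.06, 0.14]
bonds  = [0.04,  0.05, 0.03, 0.06, 0.02,  0.07, 0.04, 0.05, 0.06, 0.03, 0.05, 0.04]

wins, total = count_stock_wins(stocks, bonds)
print(f"stocks beat bonds in {wins} of {total} ten-year windows")
```

The same loop, run with a five-year window, is the shape of the Ibbotson figure Quinn quoted above; the window length is the only thing that changes.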
IV
Between 1946 and 1964, 76 million Americans were born, a demographic bulge that has been variously ascribed to postwar euphoria, increasing prosperity, and the lack of diverting pursuits during the Truman and Eisenhower years. Whatever its origin, the surge in births inspired a popular theory that explains much of recent American history in terms of the aging of the baby boomers. “From V-J Day forward, whatever age bracket Boomers have occupied has been the cultural and spiritual focal point for American society as a whole,” William Sterling and Stephen Waite wrote in their 1998 book, Boomernomics. In the halcyon 1950s, according to this simple narrative, the boomers grew up; in the 1960s, they discovered drugs and rock and roll; in the 1970s, Richard Nixon and Jimmy Carter disillusioned them; in the 1980s and 1990s, they had kids, started saving for retirement, and discovered the stock market. Harry S. Dent Jr., a former management consultant, was the first person to trace the stock market’s rise to the boomers’ encounter with middle age. In The Great Boom Ahead, which was published in 1992, Dent compared historic data on birthrates to the subsequent path of stock prices. “Plot the two, and it looks like the same chart with a forty-five-year lag,” he claimed.9 Dent’s work yielded a simple prediction: the more people turning forty-five, the more saving there would be, and the higher the stock market would go. Since the number of boomers in their forties was increasing rapidly, Dent concluded that the Dow could “streak to around 8500 between 2006 and 2010.” In 1992, this seemed like an outlandish prediction.
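Dent’s method amounts to shifting the annual birth series forward by forty-five years and laying it over the stock index. A minimal sketch of that lagged comparison, with stand-in numbers rather than Dent’s data:

```python
# Dent-style overlay: compare births in year t with the stock index in
# year t + 45. All figures below are stand-ins, not Dent's data.
from statistics import correlation  # Python 3.10+

LAG = 45  # Dent's assumed lag between birth and peak saving age

births = [3.4, 3.8, 3.6, 3.7, 3.6, 3.9, 3.9, 4.0]              # millions, hypothetical
index_lagged = [900, 1050, 980, 1100, 1020, 1250, 1300, 1400]  # index level 45 years on, hypothetical

print(f"correlation at a {LAG}-year lag: {correlation(births, index_lagged):.2f}")
```

A high correlation at the forty-five-year lag is what “looks like the same chart” means in statistical terms, though correlation alone cannot show that the boomers were actually doing the buying.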
The boomer theory of stock market behavior remains popular with journalists and Wall Street stock strategists, but it has major weaknesses. For one thing, there is no evidence that the postwar generation is saving more than its predecessors did. To the contrary, personal savings rates in the United States have dropped to levels rarely seen before in the developed world. Back in the 1970s, when economists started to worry about a lack of thrift, Americans saved about ten cents out of each dollar they earned. By the mid-1990s, they were saving about five cents on the dollar, and the rate was still falling. In 1999, the personal savings rate turned negative, at least according to some measurements, which meant that, on average, people were borrowing money to finance their spending. Far from saving more for retirement, Americans were on an unprecedented binge—buying 7,500-square-foot McMansions, light trucks disguised as “utility vehicles,” and $5,000 outdoor grills.
True, more of the saving that did take place during the 1990s found its way into the stock market, but this had as much to do with economics as demographics. Following the recession of the early 1980s, inflation and interest rates on bank deposits fell in tandem, which made stocks look relatively attractive. But they looked attractive to Americans of all ages, not just to boomers. As always, older people, who have more money, were the biggest buyers of stocks. If the aging of the boomers had any effect on stock prices, it was indirect. By taking part in the biggest spending spree in American history, they provided a fast-growing market for many businesses, which, in turn, led to rising profits and higher stock prices. In the auto industry, for example, annual sales of cars and light trucks rose from about 12 million in the early 1990s to 17 million in 2000. There may also have been something to the argument that the “me generation,” with its openness to individual experimentation, came to view investing in stocks as another form of self-expression. When Joey Ramone, the late punk rock icon, was writing a stock market newsletter, something significant had happened to American culture.
Since popular attitudes are so difficult to measure, such cultural generalizations must remain the currency of pop sociologists and media directors. The fact is that the stock market culture was based on the growth of 401(k) plans and IRAs. Between 1983 and 1998, the share of American households with some type of retirement savings account rose from 11 percent to 23 percent. During the same period, hourly wages hardly grew at all for production workers, especially unskilled workers, and middle-class families struggled to maintain the standard of living that they had come to expect. For many American households that were investing in the stock market for the first time, their 401(k) plans and IRAs offered the only prospect of a decent increase in spending power. This wasn’t the upbeat message that the financial industry wanted to convey, but it was a fact. Rising interest in the stock market also coincided with a big run-up in consumer debt—another sign that many families were struggling to achieve their financial goals.
Some Americans invested in the stock market simply because they liked to gamble. According to a study done for Congress, during 1998 more than 125 million Americans wagered money in one way or another, and more than 7 million were problem gamblers. In lotteries and in the casinos, the odds are stacked heavily against the gamblers. On Wall Street, at least for a time, the chances of making money seemed more favorable.