Chapter 9

Entrepreneurs and the global economy, 1980–2020

In the decades from 1980 to the end of the twentieth century, the American business landscape changed markedly. The post–World War II period was the high point of managerial capitalism, during which managers at large US firms found markets for their products around the world. Executives and managers working at these firms often exhibited a strong degree of loyalty to their enterprises, banks were heavily regulated and pursued strategies of low risk and steady profits, and union membership was relatively high (about 35 percent of private-sector workers were union members in the mid-1950s).

By contrast, the closing decades of the twentieth century were more difficult to categorize, with several important yet disparate developments that significantly affected American business. First, these decades were marked by deregulation. Starting in the late 1970s, American policymakers sought to reduce government restrictions in some industries in an effort to make companies more competitive. President Jimmy Carter signed the Airline Deregulation Act of 1978, the Motor Carrier Act of 1980, which deregulated trucking, and the Staggers Rail Act of 1980, which deregulated the railroad industry—an industry that had suffered dramatically for decades from automobile and trucking competition. His successor, Ronald Reagan, extended deregulation to interstate bus service, ocean shipping, and other industries.

This wave of deregulation drew on the work of Nobel laureate Milton Friedman, who promoted free-market economics as the remedy for rising unemployment and inflation. The application of such ideas, especially during the Reagan presidency, represented a major departure from the thinking of the New Deal. As Reagan proclaimed in his inaugural address, “Government is not the solution to our problem; government is the problem.”

Second, the closing decades of the twentieth century saw the rise of shareholder capitalism, a term popularized by Friedman, in which the long-standing balance of power among shareholders, boards of directors, and managers began to shift. Previously, shareholders at large public companies could seldom influence managers or executives. Beginning in the 1970s, however, institutional investors, such as giant pension funds and hedge funds, grew large enough to monitor and even influence corporate leaders. Board members and shareholders became increasingly active in response to heightened international competition, corporate failures (such as the massive Penn Central bankruptcy in 1970), and strategic mistakes (including the wave of conglomerate mergers and subsequent divestitures). Moreover, in the 1980s, reports of “hostile takeovers” and “corporate raiders” became increasingly common, as shareholders found more aggressive ways to circumvent the authority of corporate managers.

In response to these developments, businesses began to rethink corporate governance—who runs a corporation and how it is structured—as the power of managers and executives waned relative to that of increasingly assertive directors and shareholders. Ideas about how to value a company changed in parallel: whereas valuation had previously been tied to corporate assets, products, employees, and other concrete measures, in the 1970s it shifted toward a market-driven approach. The economist Michael Jensen argued that companies should focus on “shareholder return” and advocated tying the compensation of CEOs to stock market performance—an idea that contributed to skyrocketing pay packages.

Finally, the closing decades of the twentieth century were also a period of financial innovation, as banks broke away from the relatively conservative strategies of the 1950s and 1960s. This shift was largely the result of a more lenient regulatory environment—evidenced, especially, by the 1999 repeal of the Depression-era Glass–Steagall Act, which had mandated the separation of commercial and investment banking. In the same years, activity on the stock market grew rapidly: roughly 160 million shares were traded per day on the New York Stock Exchange in 1990, a figure that rose to 1.6 billion in the early twenty-first century. The financial future of many Americans became entwined with the activities of the market, as nearly one-quarter of household wealth was tied to stocks. Financial transactions also became more complex as hedge funds, which pooled money from institutional investors, increased in number and traded in “derivatives,” new higher-risk products derived from conventional securities.

In the context of this changing business landscape, computer technologies proliferated in the final decades of the twentieth century, reshaping oversight, governance, and finance by improving communication, calculation, and information processing.

Silicon Valley

While many large-scale companies in traditional sectors—for example, automobiles and consumer electronics—were rethinking their strategy and structure, a new site of innovation emerged on the West Coast, stretching from San Jose to Seattle. It would become a center for high-tech industries such as computing, aviation, and advanced engineering.

The growth of a high-tech cluster on the West Coast would have seemed unlikely in the period immediately after World War II. Then, the nascent computer industry came to be dominated by established firms, including IBM and the so-called Seven Dwarfs—NCR (founded in Dayton, OH), Sperry Rand (Brooklyn, NY), Control Data (Minneapolis, MN), General Electric (Schenectady, NY), RCA (New York, NY), Honeywell (Wabash, IN), and Burroughs (Plymouth, MI)—all located east of, or near, the Mississippi River. Moreover, the area around Boston’s Route 128, home to Digital Equipment Corporation and Wang, led the development of the minicomputer, a smaller and more affordable alternative to the mainframe.

After the 1970s, however, the center of technological innovation in computers shifted to the West Coast. The small strip of California now known as Silicon Valley began as a fertile stretch of land filled with plum, almond, and apricot orchards. The area’s fruit-drying and -packing plants had been converted to war production during World War II, and after the war, Lockheed, Northrop, and Litton Industries used the facilities to produce aircraft, radios, and engines for the civilian economy. Beginning in the 1960s, the area established itself as a center for innovation in the sciences and, importantly, in technology.

Frederick Terman, a professor of engineering at Stanford University, was a pioneering figure in the early years of Silicon Valley. In 1945, Terman became dean of the engineering school and, a decade later, university provost. He believed that universities were a “natural resource” for industry—a necessary element of technological entrepreneurship. As Terman wrote, “Industry is finding that, for activities involving a high level of scientific and technological creativity, a location in a center of brains is more important than a location near markets, raw materials, transportation, or factory labor.” In this spirit, Terman set aside 209 acres of university land in 1951 to lease to private tech companies. Stanford Industrial Park, as it was known, became home to Hewlett–Packard, Lockheed, Xerox (PARC), Varian Associates, and GE, among others. Terman fostered academy–industry cooperation by encouraging faculty consulting, inviting industrial researchers to teach specialized courses, and creating an honors cooperative program that allowed students to earn degrees while working full-time.

Fairchild and Intel

One of the most influential figures in Silicon Valley was William Shockley. A solid-state physicist trained at the Massachusetts Institute of Technology, Shockley was instrumental in the invention and development of the transistor in the 1940s. In 1954 he left Bell Labs in New Jersey and moved to Palo Alto, where he had spent his youth. There, he started a transistor manufacturing company called Shockley Semiconductor Laboratory. He was among the first to use silicon to build transistors, rather than the more commonly used germanium.

Despite Shockley’s early success, his autocratic management style and erratic personality prompted a mutiny when he decided to stop research into silicon-based semiconductors. Eight senior scientists left Shockley Semiconductor and, with a $1 million investment from New York financiers Arthur Rock and Sherman Fairchild, established Fairchild Semiconductor in 1957. Two of the so-called Traitorous Eight—Robert Noyce and Gordon Moore—left Fairchild Semiconductor in 1968 to found Intel, where they continued work on memory chips and eventually the microprocessors that revolutionized computing technology.

In 1965, Moore made a prediction, known to posterity as “Moore’s Law”: he estimated that the number of transistors on a single microchip would double every year. (In 1975, he revised the estimate to every two years.) He was right: computing capacity doubled at roughly the pace he predicted until the end of the twentieth century. In 1960, a single transistor cost about a dollar; by the end of the century, a single dollar could buy 10 million transistors. As the devices got smaller, they also became more powerful and more efficient. By 1976, Intel was the largest supplier of semiconductor components in the world, and by 1979 its profits reached $78 million—nearly forty times what the company had earned in 1972.
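
To see the arithmetic behind such compounding, the doubling claim can be written as a simple exponential (a sketch for illustration, not Moore's own notation), where N_0 is the starting transistor count, or transistors per dollar, and T is the doubling period: one year in the 1965 estimate, two years after 1975.

\[
N(t) = N_0 \cdot 2^{\,t/T}
\]

Checked against the cost figures above, going from one transistor per dollar around 1960 to roughly 10 million per dollar by 2000 takes about 23 doublings (since 2^23 is roughly 8 million) over forty years, a doubling period of a little under two years, consistent with Moore's revised estimate.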

Like Intel, many tech start-ups received support from venture capital funds. In 1961, the New York–based financier Arthur Rock moved to California and, in partnership with Thomas J. Davis Jr., founded the venture capital firm Davis & Rock; over his career, Rock helped finance Fairchild, Intel, Apple, Scientific Data Systems, and Teledyne. The collaboration among scientists, entrepreneurs, educators, federal and state research sponsors, and venture capitalists allowed Silicon Valley to blossom into a center of innovation, with the natural boundaries of mountains and ocean keeping companies in collaborative proximity. By the 1990s, the region had the fastest-growing high-tech sector in the country, with more than 250,000 people working in high-tech jobs.

Silicon Valley firms also became notable for their distinct corporate culture. Employees did not expect career-long employment as they had at big firms in the 1950s and 1960s. Instead, job-hopping—staying less than two years at a particular company—became increasingly common. Day-to-day work culture was less formal, with more casual dress; corporate hierarchies were flatter; and the upper levels of management were slightly more diverse. Although they were by no means representative of the diversity of American society, these firms were nonetheless more inclusive than the homogeneous ranks of managers described in William Whyte’s The Organization Man (1956). Silicon Valley firms came to draw talent from all over the world; in 1998, entrepreneurs originally from China and India were running 25 percent of start-ups in the region.

The rapid rise of Silicon Valley led to renewed interest in understanding the cycles of corporate growth and failure in the 1990s. In The Innovator’s Dilemma (1997), Harvard Business School professor Clayton Christensen looked to the work of Joseph Schumpeter, an Austrian-born political economist. Schumpeter coined the phrase “creative destruction” to describe the process through which entrepreneurs continually disrupted existing firms and work traditions by making innovations in technology, management, marketing, or finance. Christensen applied this idea to explain why well-managed, leading companies had fallen behind. He pointed to the decline of Sears in the face of discount retailers, of IBM in minicomputers, and of Xerox in the era of personal computers and desktop copiers. Technological innovation, Christensen argued, demanded a different type of management than the kind that had built and sustained successful companies in the past. Because large companies focused on incremental improvements to products for their established customers, they tended not to invest in emerging technology markets. Smaller, nimbler, more “entrepreneurial” firms, by contrast, were not hindered by bureaucracy and could target new customer bases and “disrupt” the market with new products, patented technology, and services.


10. In this declassified CIA satellite image of University Avenue in Palo Alto, California, taken in 1984, buildings and streets resemble the silicon chips, transistors, and resistors of a circuit board. Both the technology behind the image and the scene it captures reflect a world shaped by enterprise, far removed from that of the colonial cartographers featured earlier in this book.

The personal computer

Perhaps the most disruptive new technology of the late twentieth century was the personal computer. In the early 1970s, Intel, Texas Instruments, Motorola, and others were producing integrated circuits, mostly to provide memory for large mainframe computers. There was also a growing number of hobbyists and tinkerers eager to experiment with new technology. The Homebrew Computer Club, founded in 1975 in Menlo Park, provided a place for enthusiasts to gather, share ideas, and view new inventions. Steve Wozniak, who had studied engineering at Berkeley and had an immense talent for design, went to one of the club’s meetings and afterward set out to build what would become the Apple I computer.

Apple cofounder Steve Jobs was instrumental to the company’s early success. Rather than simply selling the design to an established firm, Jobs convinced Wozniak that they could make and market the computer themselves, and they set up a makeshift manufacturing operation in Jobs’s garage. Launched in 1976, the Apple I was essentially a well-designed circuit board to which users attached their own keyboard and monitor. They sold it through an electronics store called the Byte Shop for $500 each. The Apple II—a finished machine with an integrated keyboard and an optional floppy disk drive for additional data storage—was released the following year and proved more marketable. In less than ten years, Apple had sold more than 2 million Apple II computers.

The main difference between Wozniak’s design and existing prototypes for nonindustrial computers was its focus on the potential needs of the home user. Its graphic display, which Wozniak had built with an old television, was more advanced and visually appealing than anything else on the market. This focus on the user’s experience and the possibilities of nonbusiness use for the computer turned out to be the keys to Apple’s survival in the face of competition.

Indeed, Apple did not have the market to itself for long. IBM, which had been the largest company in the mainframe computing market, soon established an independent unit devoted to the development of a desktop computer, or personal computer (PC). To bring a product quickly to market, executives decided to outsource the production of the microprocessor, memory, and software. Microsoft, founded in 1975 by Bill Gates and Paul Allen, won the contract to develop the computer’s operating system.

In 1981, the IBM Personal Computer was officially introduced, with a price tag of $1,565, and it was wildly successful. Sales were so great that in 1983 the company created a division to manage the manufacturing and distribution of the PC. This group’s output and revenues were large enough that, had it been an independent company, it would have made the Fortune 500 list that year. In 1985, Microsoft released Windows 1.0, a graphical extension of its DOS operating system that brought a point-and-click interface to the PC.

IBM’s decision not to build its own software or a proprietary operating system led to a boom in entrepreneurial firms creating peripheral components and software applications for the IBM PC. Rather than competing by building new operating platforms, innovators focused on designing software for spreadsheets, database management, word processing, graphic design, and finance. Although the shared operating system allowed the IBM PC to become the industry standard, however, it also enabled competitors to build clones: some of the firms that used it began to mass-produce high-quality imitations of the PC. Companies including Compaq, Dell, and Gateway steadily drew customers away from IBM.

Apple avoided this problem in large part because of Steve Jobs’s belief that the company needed to control both hardware and software design to achieve a seamless user experience—a strategy he pursued with the release of the Macintosh computer in 1984 and later with the iPod (2001), iPhone (2007), and iPad (2010). Apple also introduced the iTunes software in 2001 and, two years later, an online store that allowed users to purchase music and sync it to their Apple devices. Jobs’s insistence on integrating aesthetics and function, as well as his emphasis on graphic interfaces and user-friendliness, helped build customer loyalty. In this respect, Jobs was inspired by Polaroid founder Edwin Land, who believed that a successful business must incorporate elements of both science and art. But Jobs’s appeal went far beyond customer loyalty. His mark on the company was so strong that people soon referred to “the cult of Steve Jobs.” His rise from humble beginnings, and his production of truly beautiful technology, endeared him to many around the world. In 2018, Apple became the first American trillion-dollar corporation, just over a century after U.S. Steel became the nation’s first billion-dollar corporation (in 1901).

In the 1980s, the rest of American business looked to Silicon Valley and aimed to harness the entrepreneurial spirit that flourished in the Bay Area. The admiration for Jobs extended across the American business community and rekindled respect for entrepreneurial values and charismatic CEOs. Such thinking helped set the scene for business leaders such as Lee Iacocca, who sought to revive the struggling Chrysler Corporation in the 1980s, bringing out the Dodge Caravan and Plymouth Voyager and promoting smaller cars such as the Dodge Omni and Plymouth Horizon.

The Internet

“The Internet did not originate in the business world,” wrote sociologist Manuel Castells; “It was too daring a technology, too expensive a project, and too risky an initiative to be assumed by profit-oriented organizations.” Instead, the communication technology now called the Internet was pioneered by the Department of Defense, specifically the Information Processing Techniques Office of the Advanced Research Projects Agency. After the launch of the Soviet satellite Sputnik in 1957, the United States redoubled efforts to outpace the Soviet Union with superior military technology. ARPANET, the first network enabling interactive computing, was one innovation to come out of this military research agenda. Initially limited to government and academic research, the network was opened to public and commercial use by the mid-1990s, just as the World Wide Web, invented by the English computer scientist Tim Berners-Lee, was beginning to revolutionize the way business was done around the globe.

Navigating the Web became far easier with the arrival of browsers: Netscape Communications released Netscape Navigator in 1994, and Microsoft released Internet Explorer the following year. By 1994, there were about 2,700 websites, including ones for the White House, the Economist magazine, and a fan site for the animated television show The Simpsons. (Roughly twenty-five years later, there were an estimated 1 billion websites.) With the personal computer and the Internet, businesses suddenly had the capacity for the constant and almost instantaneous exchange of information.

The Internet diffused rapidly. By 2014, about 75 percent of Americans owned a desktop or laptop computer that could access the Internet, while roughly 38 percent of the world’s population owned a computer. In the 2010s, with the rise of the smartphone and the spectacular range of applications made for it, connectivity grew even more dramatically: in 2011, just 35 percent of Americans owned a smartphone; by 2018, that number had grown to nearly 80 percent. In less than forty years, individually owned computers had gone from isolated, television-sized devices to machines that fit in a coat pocket, instantly available and connected to machines around the world.

Personal computers, the Internet, and smartphones changed the strategies of business in nearly every industry—including manufacturing, services, transportation, and retail (for instance, through inventory-tracking systems and point-of-sale registers). Computers also brought transformational change to the financial services industry. In 1950, banks did not have computers; by the early twenty-first century, every facet of the banking industry depended on them, from interaction with customers (through ATMs and online banking portals) to internal management, research, tracking investments, handling foreign currency, and competing with other banks.

Crisis and aftermath

Regulators and investors had difficulty monitoring and evaluating the new Internet-based business ventures. The dot-com bubble of 1995 to 2000, for example, was a period of excessive speculation in Internet stocks, with some new initial public offerings achieving lasting success, but many more failing. (The Internet apparel company Boo.com, for instance, raised $185 million in venture capital funds, but the company lasted only about eighteen months.)

The early 2000s thus brought a period of both financial innovation and financial scandal. Perhaps the best-known example of the latter was the energy company Enron, which folded after revelations in 2001 of long-running fraud and corruption. The extent of the fraud was such that Enron’s failure reverberated across much of the financial services industry and led to the passage of the Sarbanes–Oxley Act (2002), which required greater transparency in the filing of financial reports.

The great crisis, however, was yet to come. The global financial crisis of 2007–8 began in the subprime mortgage markets and escalated into a global banking crisis. One casualty was New York–based Lehman Brothers, the international financial services firm founded in 1844 as a dry goods business in Alabama. In the early 2000s, Lehman Brothers, then under the direction of CEO Richard Fuld, became a leader in the subprime mortgage industry—mortgages granted to individuals with poor credit scores. Banks compensated for the high risk associated with these mortgages by charging borrowers higher interest rates. In 2006 alone, Lehman securitized $146 billion in mortgages. But that year also saw rising rates of loan default and home foreclosure. Lehman, which had reported making tens of billions of dollars in home loans, declared bankruptcy in September 2008. Washington Mutual, then the largest savings and loan association in the United States, also failed.

In 2008, the federal government also bailed out the failing insurance giant American International Group (AIG), for $180 billion, in part because the firm was considered “too big to fail”—meaning that its collapse would also bring down its trading partners, including Goldman Sachs, Morgan Stanley, and Bank of America. The widespread financial crisis was followed by a global downturn, the Great Recession, which lasted through 2010.

The Great Recession marked a period of transition for Silicon Valley and Seattle. The West Coast, long known as a hub for entrepreneurial start-ups, became home to several massive corporations that had weathered the crisis, including some that had been around for decades, such as Apple and Microsoft, and some newer ones, such as the e-retailer Amazon (founded in 1994), the search engine Google (1998), and the social media company Facebook (2004).

These multinational companies came to dominate many product areas beyond their original interests. Google launched a number of programs that offered users astonishing amounts of information and new capabilities, including Google Earth (2001), Google Maps (2005), and Google Translate (2006). Google also became a provider of content with the $1.65 billion acquisition of YouTube in 2006 and a manufacturer of hardware with the introduction of its own smartphone in 2010.

Amazon was another impressive example of large-scale global diversification. Jeff Bezos got the idea for his Web-based “everything store” while working at a New York hedge fund in the 1990s. At first dubious about commercial applications of the Web, Bezos was impressed by the record-breaking growth of Internet usage in the early 1990s. He became convinced that the Internet was the next major retail platform, both because of its broad accessibility and because, freed from the limitations of traditional stores, it allowed companies to offer consumers comparatively limitless selection.

Amazon began as an Internet retailer selling nothing but books. After just one year in operation it had 180,000 customer accounts and took in $16 million in revenue. It proved enormously disruptive to the publishing industry, and by 1998 its sales exceeded those of Barnes & Noble and Borders, then the largest brick-and-mortar booksellers. It went on to sell music and videos, then toys, software, games, and consumer electronics, on its way to becoming the largest Internet retailer in the world by revenue and a leading multinational in e-commerce, cloud computing, and digital streaming. The legal scholar Lina Khan compared Bezos’s strategy to John D. Rockefeller’s at Standard Oil in the late nineteenth century: like Rockefeller, Bezos worked deliberately to build a business that could not only capture market share rapidly but also defend against new market entrants, for example by engaging in predatory pricing.

Another part of Amazon’s strength lay in its leveraging of user data. By analyzing clicks on its website, Amazon gleaned detailed information—never before available to retailers or businesses of any kind—on the millions of users who freely shared their likes, product preferences, and consumption habits. The acquisition of user data by the largest Internet companies, including Amazon, Facebook, and Google, marked the emergence of a new business model, as companies turned human experience—everything from personal photos and correspondence to exercise metrics and shopping preferences—into data to be bought and sold. This information was analyzed, shared, and fed into mathematical formulas that sought to predict individual behavior and to modify what news, advertisements, and prices consumers would see. For many, the harvesting and application of such data had chilling consequences for personal privacy and the right to control one’s own information.

The computer industry had undergone a remarkable transformation in the years since the personal computer was introduced. By 2020, among the companies on the Fortune 500 list, Amazon, Apple, Google (Alphabet), Facebook, and Microsoft ranked well above oil companies (including Exxon Mobil), automobile companies (such as GM), and traditional strong performers (including Coca-Cola and GE) in terms of market value. Moreover, the activities of these Internet-based firms were so vast as to raise deep political and social questions—just as automobile companies had in the twentieth century with regard to pollution and personal safety. For most users of new Internet services, the problem was how to balance the benefits of seemingly limitless information against concerns about privacy. For the business scholar Shoshana Zuboff, the problem was more far-reaching. She argued that such web-based firms had introduced a new and dangerous form of “surveillance capitalism,” one still in its infancy and one that would pose a continuing challenge for regulators and the public.