4. MARC ANDREESSEN

“By the power vested in me by no one in particular”

As Bill Gates and others had shown, there were fortunes to be made by an enterprising hacker willing to sell his computer expertise to the general public. But there still was a wide gulf that cut off the tech leaders of this era from social and political power. On one side of the gulf were the leaders themselves, engineers and coders obsessed with computers; on the other were their customers, members of the public or businesses who relied on computers to carry out important tasks but nonetheless thought that having a deep, personal connection with a machine was a bit strange, perhaps even perverse.1 Tech businesses like Microsoft, IBM, and Hewlett-Packard were already driving the economy, but until this gap was bridged—until regular folks shared a hacker’s delight in having a machine fulfill their every whim—these leaders would lack the influence to reshape society according to their values.

This observation may seem, in part, obvious: until computers were intertwined with people’s daily lives, improvements in software or hardware wouldn’t matter much. But there was another aspect as well. Before there could be a generation of Know-It-Alls to bring Silicon Valley–style disruption to America and the planet, the public’s view of hackers needed to radically change. Instead of being feared or pitied, they had to be respected. Their fixation on teaching a machine how to act human—an outgrowth of John McCarthy’s original computer-based artificial intelligence project—had to seem benevolent, not menacing or peculiar. In fact, a comprehensive recalibration was about to begin. Not only would hackers soon be admired for their great wealth and ingenuity, but a different class of people—the Luddites who saw computers as a threat to civic life—would take the hackers’ place as the new obsessives who lacked basic social skills.

A key marker in the public’s changing attitude toward computers—and the young men who loved them—was the arrival of the World Wide Web in the early 1990s. Each of us would grow to love our computers, too. And why not? A computer with access to the Web was nothing less than a revelation, taking you freely and easily from essays to stock quotes to pornography to an old forgotten friend to new music to classical literature to video games and back again. A network of clickable hyperlinks propelled a “surfer” along on the Web, an intuitive mechanism for navigating online, especially when compared with the arcane commands and long strings of numbers hackers had used to access the Internet. By this time, there already were paid-subscription services like Prodigy, CompuServe, and America Online, which offered customers a few online tools like email and chatrooms, but the Web was different, promising more, and for free. Unlike the people who signed up for a subscription service and contributed only to that service or its affiliates, everyone on the Web was pulling together. The collective potential of the Internet, which had largely been hidden from the public since its creation as Arpanet in 1969, would be made plain through the Web.2

The first plans for how the Web might work were sketched out in 1989 by Tim Berners-Lee, a consultant at CERN, the European particle-physics research center located in Switzerland. A circumspect, thirty-five-year-old physicist born in Britain, Berners-Lee fit a very different profile than the one we have grown accustomed to among our brash tech leaders. He stumbled on the idea of the World Wide Web, he says, while building a computer database to keep track of the many scientists and support staff members who shuffled in and out of CERN’s laboratories. Over time, Berners-Lee discovered that he was more fascinated by the links between individuals than by the individuals themselves. His project, which came to life the following year, was meant to highlight and strengthen the ties among the CERN staff, like a shared language or similar research specialty. “The philosophy was: what matters is in the connections,” he recalled. “It isn’t in the letters, it’s the way they’re strung together into words. It isn’t the words, it’s the way they’re strung together into phrases. It isn’t the phrases, it’s the way they are strung together into a document.”3

There was a power in this simple philosophy, for it meant that the Web grew thicker and more interconnected every time someone or some group created a new page or simply added a link to a related page. A decentralized network of users, including many not so adept at computers, was creating what was already an unprecedented digital resource. “The Web made the Net useful because people are really interested in information (not to mention knowledge and wisdom!) and don’t really want to have to know about computers and cables,” Berners-Lee explained.4 Regular folk, not programmers, were steering this ambitious computer project along, and its designer was fine with that.

In fact, so much about the birth of the Web came, as soccer fans might say, against the run of play in computer innovation. A template had emerged from Silicon Valley: young programmers and engineers, nurtured in the windowless computer labs of America’s great universities, came up with projects that they were encouraged to take to market with the backing of venture capitalists. Instead, this most recent breakthrough had bubbled up from a physics lab in the middle of Europe, under the direction of an anti-hacker of sorts who apparently made no effort to profit personally from his discovery!

These developments led to more than a little consternation back in the States, mixed with a determination to get in the game. After all, the hackers were all about how computers, in the right hands, could change the world. A civilization-bending project like the Web was simply too attractive—too potentially profitable—to be ignored and left to its own devices. Users would demand a better system, and Silicon Valley would give it to them. In short order, the Web was pulled toward America, where Berners-Lee’s decentralized, noncommercial vision succumbed to the innovative powers of a new generation of hackers-turned-entrepreneurs. First among this generation was Marc Andreessen, the influential Silicon Valley venture capitalist who at the time was an ambitious twenty-one-year-old computer science major at the University of Illinois, Urbana-Champaign. He was quick to recognize the business potential of a global network like the Web and nearly as quick to act.

Born in 1971, Marc Andreessen grew up unhappily in New Lisbon, Wisconsin, about eighty miles north of the liberal college town of Madison and a world apart. During his childhood in the 1970s and ’80s, New Lisbon had a population of about fourteen hundred and was more than 97 percent white. Marc’s father, Lowell, was a sales manager for a company that sold hybrid corn seed, and his mother, Pat, worked in customer service at, among other places, Lands’ End. Marc Andreessen describes a life of relative privation—a shared “party” telephone line at home; relatives who had an outhouse; a winter of “chopping fucking wood” when his father decided to stop paying for gas.5 (His friend and business partner, Ben Horowitz, is genuinely nonplussed by the thought of Andreessen, a large man who stands six foot five, wielding an axe: “It is still hard for me to really visualize Marc chopping wood. It’s like asking Einstein to mine coal. How crazy that must have been.”)6

Andreessen displayed classic “compulsive programmer” characteristics as a child, to use Joseph Weizenbaum’s resonant phrase. To start, there was his preteen fascination with the TRS-80 personal computer, which he bought with money saved from mowing lawns, supplemented by a contribution from his parents. Andreessen also had the requisite anti-authoritarian streak. Like John McCarthy at a similar age decades earlier, Andreessen fought a losing battle to be excused from gym class.7 He thought his public high school was an embarrassment, and didn’t hide that opinion. The school had a blandly religious culture, which opposed science, Andreessen recalled, while one history class was “taught out of our teacher’s unpublished 800-page manuscript on the JFK assassination conspiracy.”8 There was a computer lab in high school, but it lacked a modem to connect to the wider world of Internet bulletin boards and university computer centers. Young Marc Andreessen was trapped in rural Wisconsin and, alas, even computers offered little help in making an escape.9

Yet escaping was precisely what Andreessen’s neighbors remembered was on his mind. “I got the feeling that New Lisbon wouldn’t keep him,” said Paul Barnes, the manager of the local supermarket where Andreessen worked as a bagger and stocker, who recalled being impressed by his employee’s large vocabulary and big ideas.10 Andreessen can rattle off the deprivations of being raised in New Lisbon: the cold weather; the poor diet; the ignorant, superstitious farmers; the husbands waking up early to go ice fishing to avoid their wives; the barren intellectual landscape.11 In a profile that appeared in the New Yorker, Andreessen complained of driving an hour to the west to La Crosse, where all you could find was a Waldenbooks with nothing but cookbooks and cat calendars. “Screw the independent bookstores,” Andreessen said in praising the disruption later brought by Amazon.com, which began its march through e-commerce by selling books. “There weren’t any near where I grew up. There were only ones in college towns. The rest of us could go pound sand.”12

Notwithstanding Andreessen’s especially dim view of rural life, he still acts as the protector of his former townsfolk, who, as he once expressed in a post to Twitter, are “well aware that the left, intellectuals, politicians, et al look down on them.”13 The chip on Andreessen’s shoulder about his rural upbringing—expressed today from the comfort of the Bay Area—raises questions about what drives his enthusiasm for Internet-based social disruptions. Is it faith in the wonderful new world to come, or anger at the hurdles, real and imagined, that he faced as a super-smart teenager growing up so far from the action? If we as a society are going to accept so much disruption and destruction, the assurances that it will be worth all the suffering should come from a place of compassion, not resentment.

Andreessen’s first steps toward leaving New Lisbon and finding that action included winning a Merit Scholarship in high school and enrolling at the University of Illinois, where he planned to study electrical engineering. There was nothing romantic or idealistic about these choices, he insists. They were purely mercenary. While in high school, Andreessen read an issue of U.S. News & World Report from 1986, which ranked undergraduate majors by their graduates’ starting salaries; electrical engineering was at the top of the list. The same issue ranked the University of Illinois among the top three schools in electrical engineering, and in short order his college decision was made as well.14 As to why someone fleeing rural Wisconsin would choose a school in nearby rural Illinois, Andreessen says he thought he was heading to a city of sorts, and then discovered that in Urbana-Champaign, “they had a cow with a hole in its gut so you could see it digesting its food. It was that kind of school.”15

Soon after arriving at the university, Andreessen concluded that electrical engineering was too demanding and switched to computer science, still something of an obscure discipline in the late 1980s. “Sometimes I just made things up, but then the field was so new, my professors were making things up, too,” he recalled.16 Andreessen found a purpose in programming and was good at it, to boot. He was chosen for a coveted part-time job on campus at the National Center for Supercomputing Applications (NCSA), one of five such centers created in the 1980s by the National Science Foundation.17 At the center, Andreessen made just $6.85 an hour, but on the bright side he had a desk and an expensive Indigo computer, which he managed to connect to the cable TV box so he could have CNN playing in the background. Around the lab, Andreessen was known as a generally grumpy figure apt to reject tasks as boring or beneath him . . . until that time when a project worthy of his ambitions appeared.18

In 1992, the World Wide Web was up and running but still lacked a browser that worked on all computers, not just the NeXT computer that Berners-Lee first programmed on. A proper browser, which was highly compatible . . . now that was a worthy project! After all, the Web was the new, new thing on the Internet and a browser was the crucial program for the Web—your transportation, your translator, your window, your pad and pencil, your safety blanket. Later, the Web browser would take on more sinister responsibilities—ankle bracelet, chaperone, corporate listening device. But let’s not get ahead of ourselves. In that ancient year, 1992, the NCSA was among a number of computer labs quick to take up the challenge of producing a browser that was easy to install, could work on different operating systems, and would improve on the intuitive navigation of the Web.

Andreessen lobbied hard to land the assignment and in the fall he was paired with an experienced staff programmer, Eric Bina, who took on the difficult coding, freeing up young Andreessen to keep his eye on the big picture. “Marc is a strong driving force for changing the world. He is clearly driven to do that,” Bina said. He added, by way of contrast, “I don’t feel driven to change anything but my own situation.”19 Bina and Andreessen and their growing team of coders worked out of the dark basement offices of the old Oil Chemistry building, offices that soon filled up with piles of pizza boxes, stray cookie packages, empty soda cans, and Skittles wrappers.20 In a matter of months, they had created a working version of the browser. Significantly, their browser, which was given the name Mosaic, could embed images directly on the page rather than clumsily requiring images to pop up in a new browser window. On January 23, 1993, Andreessen posted a file containing a first working version of Mosaic, under the words, “By the power vested in me by no one in particular, X-Mosaic is hereby released.”21

Andreessen expected there to be immediate demand for what was, by all accounts, a vastly superior browser. “We just tried to hurry and get it out there, initially to a limited group of 10 or 12 alpha and beta testers,” he said. “Of course, the Internet is a great way to distribute viruses, too; put a virus out and then it propagates.”22 Born on a university campus, Mosaic had additional advantages in quickly finding an audience. To start, the campus itself was filled with young people with computers and unusually fast Internet connections who were eager to try something new. When the public did in fact quickly engage with Mosaic—and the original twelve downloads grew to several hundred thousand by December—the team could depend on the university’s infrastructure to keep up with demand.23 No surprise, then, that the most prominent Web browser (Mosaic/Netscape), portal (Yahoo), search engine (Google), and social network (Facebook) all germinated at universities, whether Illinois, Stanford, or Harvard.

The immediate popularity of Mosaic meant that there would be two very different guardians of the nascent Web: Berners-Lee, the scientist who conceived it, and Andreessen, the Midwest-born college student who helped it to catch on quickly. Both men were obviously transfixed by the Web’s potential, but if anything, Andreessen was the one more enthralled: great idea, Tim, now let’s get on with it. There had never been anything like the Web before; who knew if it would even be popular? Thus the Mosaic team focused on making the Web experience simple, intuitive, and eye-pleasing, starting with the browser’s newfound compatibility with images. Andreessen acted more like the leader of a hungry start-up than a member of a university research team. Berners-Lee noticed that after the first version of Mosaic was released, Andreessen maintained “a near-constant presence on the newsgroups discussing the Web, listening for features people were asking for, what would make browsers easier to use . . . almost as if he were attending to ‘customer relations.’”24

Berners-Lee, by contrast, made a priority of promoting the values of individual autonomy and collaboration on the Web. He insisted that a browser should be a text editor as well, so that Web surfers would be encouraged to add to the interconnections, not just surf across them. Users would have handy tools to create and publish a page on their own or cooperate with friends to write, edit, and publish together. Think of Wikipedia, the online encyclopedia where thousands of contributors create articles individually, but usually improve them collectively, or a shared Google document, which similarly grows as more people are invited to contribute. Berners-Lee wanted these experiences to be the norm. Not just a part of the Web experience, but central to it. This was the democratic instinct applied to the Internet: the general public, rather than programmers, driving the development of the Web, with all the inefficiency and lack of professionalism that implies, but also all the unpredictability and personal control.25

When Andreessen and Berners-Lee finally met face-to-face in 1993 in Illinois, there already was a “strange tension,” Berners-Lee reported.26 The Mosaic team exuded a confidence that they represented the future of Web development, which rankled Berners-Lee. They described material online as being “on Mosaic,” rather than “on the Web,” another annoying trait.27 More significant, the Mosaic developers early on dropped the collaborative, text-heavy tools that Berners-Lee championed as empowering the public, seeing them as inefficient and a distraction from the central mission of creating a compelling, entertaining Web experience. With a better browser and faster Internet connections, the Web could become more like the television that Andreessen had already wired into his computer—passive and commercially friendly. But with a crucial difference: a television signal that could reach the entire world at once!

From a Silicon Valley perspective, the Mosaic team was strategically “pivoting” the Web browser toward Andreessen’s commerce-friendly vision and away from Berners-Lee’s, which wasn’t. The timing couldn’t have been better. By the early 1990s, the last official barriers to business and commerce on the Internet were torn down through a combination of congressional legislation and new rules from the National Science Foundation, the organization that supported the Internet.28 The noncommercial status of the Internet was rooted in its history as a government-funded project operating mainly through universities and government agencies, but businesses were persistent in arguing that they belonged online as well. In 1993, the Internet became fully open for business with the passage of the National Information Infrastructure Act, which “clearly took the development of the Internet out of the hands of the government and placed it into the hands of the competitive marketplace.”29 This shift didn’t necessarily mean that the young programmers like Andreessen who built Mosaic would benefit from its success. They were merely salaried employees at a lab; the University of Illinois retained the rights to Mosaic.30

Years later, those early design choices by the Mosaic programming team still made Berners-Lee cringe. “The Web, which I designed to be a medium of all sorts of information, from the very local to the very global, grew decidedly in the direction of the very global, and as a publication medium but less of a collaboration medium,” he said in dismay.31 The experience was a useful harbinger, however. Going forward, the Web experience would largely be in the hands of hacker-entrepreneurs committed above all else to bringing the most users to the Web, at first to make sure the project would survive, and later to reach profitability. If gaining a huge global audience was your primary goal, even Berners-Lee had to concede, why would you fight for tools to encourage collaborative editing, which “didn’t seem to promise that millionfold multiplier”?32

Andreessen certainly pled guilty to wanting to please the largest possible audience. “I’m a Midwestern tinkerer type,” Andreessen says. “If people want images, they get images. Bring it on.”33 In response to Berners-Lee’s other concern, that Andreessen was hijacking control of the Web, the young hacker would turn the question back on him. Andreessen’s goal was to share the Web with the world and give users a chance to shape its development by carefully watching which features were popular and which were not, and revising accordingly. He would later press for changes to the browser that helped businesses operate online and, in the process, usher even more users to the Web. Berners-Lee, by insisting that the Web be collaborative and less flashy whether the public wanted these features or not, was the better example of a programmer trying to impose his will on the public. “The Web had already become a brush fire, and he was uncomfortable that he was no longer controlling it,” Andreessen said about Berners-Lee in those early days.34

Brush fire indeed. Aided by the steady adoption of the Mosaic browser, the amount of information being conveyed by the Web grew more than two thousand times from January 1993 to January 1994, a figure that caught the attention of people attuned to how the economy might be changing, including a young investment banker, Jeff Bezos, considering whether to leave finance to start his own business. “Things just don’t grow that fast,” he observed.35

In December 1993, Andreessen left the University of Illinois to head to Silicon Valley to start earning the high salary promised in that issue of U.S. News & World Report. He had just graduated, so this was a natural time to be departing, but he would also be leaving behind Mosaic, the project that had defined his time there. Andreessen’s frustration with the Mosaic team had been growing, as the success of the browser outside the lab had caused administrators to take notice.36 They began to schedule regular meetings to review progress and kept adding members to the programming team. Meanwhile, the university was weighing proposals to license the Mosaic code. The terms of participation were now very different for Andreessen. He had been running the equivalent of a lean, fast-moving Web start-up. Great fun. Going forward, he would be navigating academic turf wars as a recent graduate. Not so much fun. “There was no reason to stay there,” Andreessen explained. “The environment was falling to pieces compared to what it had been, simply because there was this influx of money. The project grew from 2 people to 20. It was completely different.”37

Just before Andreessen left the lab, however, he was given a bloody flag to wave to unite his fellow hackers. That December, John Markoff wrote a prescient article in the New York Times about Mosaic, calling it “a map to the buried treasures of the Information Age.” The big photo accompanying the article featured Larry Smarr, the director of the NCSA, and Joseph Hardin, who directly supervised the Mosaic project, but no one else. Neither Andreessen nor Bina, nor any of the other programmers, was mentioned by name.38 This was the traditional academic model in a nutshell: the students do the work, the professors get their names first on the journal article and in the news media. This public diss would prove quite helpful in the months that followed, as Andreessen tried to lure members of his old team to a new commercial project.

When Andreessen first headed to Silicon Valley in the winter of 1994, however, he had no intention of resuming work on a browser. He joined a small company, Enterprise Integration Technologies, and lasted three months before being recruited by Jim Clark, a former Stanford electrical engineering professor turned entrepreneur. They were to work together on the hot business idea of the time, interactive television. Only when the plan for interactive TV fell through, and Andreessen and Clark were brainstorming ideas, did Andreessen bring up the idea of creating a commercial browser to compete with Mosaic.39 Clark and Andreessen began hiring as many of the disgruntled NCSA programmers to their new company as possible; they would be joined by a few other early Web programmers as well as some more seasoned hands Clark knew from his previous company, Silicon Graphics. Smarr, who considered Clark a friend, felt betrayed at the time by the “raiding” of his lab’s talent.40

Clark, an engineer himself, believed that hiring the best technical talent would be the key to the success of his and Andreessen’s new company, Mosaic Communications, MCom for short. Clark flew out to Illinois to close the deals personally. Bina, for example, only signed on after Clark personally agreed to let him work from Illinois so he could stay with his wife, who was a professor.41 Another important hire, Lou Montulli, a recent graduate from the University of Kansas who had created Lynx, a text-based Web browser, recalled being summoned to Champaign and flying with a last-minute ticket that was so expensive that he wanted to be assured he would be reimbursed for its cost.42 Nothing to worry about, he was told. Clark’s new company would make sure the programmers were appropriately compensated, starting with Andreessen, but extending to the entire programming team. Clark had learned a painful lesson from Silicon Graphics, which grew out of his research at Stanford and was founded with a team of departing Stanford graduate students: over time, the financiers had profited much more from the company’s initial success than the engineers, even Clark himself, who at the start had sold a 40 percent stake in the company to an investor for $800,000.43 Ultimately, the investors took over the board and made business decisions that forced Clark to leave.

The MCom team set to work on creating a new browser from scratch, Mosaic Netscape. Clark had raised the idea of paying the University of Illinois a fee to license the Mosaic code, as other companies were doing, but Andreessen said no. Alma mater wouldn’t see a penny of MCom money if he had anything to do with it.44 The programming team would construct a better browser, which would be designed for the slow 14.4 kbps modems of the real world, not the fast cables of well-financed universities. What came next was a programming binge straight out of the hacker annals—Montulli, who had extensive responsibilities for coding the browser, painted a picture of programming life at MCom: “Essentially 10 Mountain Dews (full strength, no diet), horrible food and I think my regular schedule back then was to come in, work for 20 hours straight, we had a futon room, which is a little disturbing to think about now, it was a mattress in a conference room that was dark, I would catch 4 or 5 hours of sleep at the office, wake up, do another 20 hours, and then go home and sleep for 12 or 15 hours and start the whole cycle again.”45

Everything about the MCom work environment resembled the hacking days at MIT, not just the working hours and the bathing habits, but also the near-complete absence of women in any meaningful role. A 1994 Web page presents the team at the time, a total of twenty-three members, some with short descriptions like “Marc Andreessen—the Hayseed with the Know-How” or “Jim Clark—Uncle Jim’s Money Store.” Seemingly, not a woman among them. The motto of the group, taken from Sartre’s Being and Nothingness, is filled with collegiate angst: “All human actions are equivalent . . . and . . . all are on principle doomed to failure.”46 The Web, however, seemed to defy gravity or entropy, growing frantically—roughly 600 Web sites at the beginning of 1994, became 10,000 at the end of the year, became about 100,000 by the end of 1995.47 By late 1994, the original Mosaic browser had an estimated 3 million users. In the next year, 1995, about 18 million American homes had a computer with a modem for connecting to the Internet, an increase of more than 60 percent from the year before, and, for the first time, a majority of Americans used a computer either at home, at work, or at school.48

In the fall of 1994, Andreessen was invited to explain the burgeoning online ecosystem to a San Francisco conference for entrepreneurs eager to learn about Web commerce. He was twenty-three and still relatively new to Silicon Valley. His company’s built-from-scratch Mosaic Netscape browser had just been released. Working with an overhead projector and a bunch of transparencies, as one did at the time, Andreessen began his self-deprecating talk: “We tried really hard not to invent anything new or solve any hard problems, which makes it easier to get something done.” He then listed some of the obvious challenges his team happily ignored: “How do you search across the entire information space? I don’t know. How do you know where you are going? Beats me.” He then tried to describe to the audience the larger purpose of the Web, saying it is “fundamentally about communication. The applications that are going to be successful are the ones that tie together people.”49 It was a bravura performance that, we can now see, sketched out in a few words the commercial history of the Web, identifying niches in the ecosystem that would feed tech titans like Yahoo, Google, and Facebook.

Soon after he flicked off the projector, however, came the inevitable audience question: “What is the Mosaic Communications business model?” Without missing a beat, Andreessen answered: “Making money.”50 Another jokey comment, certainly, but perhaps the only one that could capture what he and Clark were thinking. Making money was the plan and, to that end, MCom would need the Web to be entertaining and useful. If no one wanted to use the Web, then no one would need whatever MCom planned on selling. If, however, the Web proved useful and entertaining, then businesses might pay MCom for servers to host their websites or for help running their online business. Or maybe MCom’s Web site would become a valuable portal in its own right, as people new to the Web inevitably visited it to learn more about the browser that opened up the Web.

Microsoft had already demonstrated that there was money to be made from becoming the standard software for operating your computer. In Gates’s day, you wanted to enforce your dominant position by preventing any copying or sharing—you withheld the program or operating system until the public came crawling, dollar bills in hand. In the two decades that separated Gates from Andreessen, however, the Web changed what qualified as an astute business strategy. A company like MCom didn’t need to wield its power quite so heavy-handedly. Why send interested users away if acquiring a large audience was how a Web business ultimately hoped to make its money? In that sense, Andreessen the professional programmer had very similar priorities to Andreessen the undergraduate programmer. Both pursued the “shareware model of free distribution. . . . Get it out there and into people’s hands.”51 The browser initially wasn’t free, but was free on a trial basis, and the company didn’t object if a user began a new trial over and over again. Gaining a large audience was so important that Andreessen was happy to suggest opportunities for entrepreneurs to explore, knowing that the browser maker should come out fine in the process, as the proverbial store that sells pickaxes during a gold rush.

Weeks after the San Francisco talk, MCom changed its name to Netscape Communications, and the browser’s name to Navigator, in response to a complaint from the University of Illinois that it could be confused with the original Mosaic browser. The university also claimed that despite a thorough rewrite, the Navigator browser still contained code from the original version of the program that Andreessen and Bina first put together at the NCSA.52 Andreessen is still angry about the experience, which he recounted on Twitter: “Netscape never got rights to Mosaic. We rewrote code base from scratch,” he wrote. “Univ Illinois then threatened to sue us and tried to kill our business. So we sued them to stop harassment. . . . Univ of Illinois got small-$ cash payoff. Refused stock.”53 Andreessen enjoys reminding the world that had the university wisely taken stock instead of the $2.7 million Netscape paid in compensation, it could have netted $5 million more.54

For all of his foresight in that early talk about the way the Web would grow and organize itself for commerce, Andreessen never mentioned advertising, which would become the predominant economic engine of the Internet. For the time being, Netscape’s path to profitability would center on promoting online commerce. The company would sell businesses powerful servers and other tools to reach their customers, while the Netscape browser would be made commerce friendly, with new features to allow secure financial transactions. “Our goal is to get millions of copies in use quickly to really start enabling the market for lots of commercial services to come online,” Andreessen said in the San Francisco talk.55

Despite Andreessen’s initial omission, the Netscape browser would ultimately prove crucial to introducing the advertising-centric, data-collecting Web we have today. Not that this was Netscape’s intention. The change in the browser that would have such long-term implications for the Web was a new snippet of code created by Montulli, who called it a “cookie.” This new code was meant to fix what was seen as a flaw in Berners-Lee’s original design for the Web, namely, that users traveled anonymously “from server to server afresh, with no reference to any previous transactions.”56 This lack of “memory” on the Web—that your past wouldn’t, couldn’t, accumulate—posed problems. Encounters with Web sites became “a bit like talking to someone with Alzheimer’s disease,” Montulli wrote.57 You may have visited a site ten times a day, every day, but you were nonetheless considered a stranger. Unless you registered at a site, sales would have to be conducted along the time-consuming “vending machine” model58: You want six different candy bars? Put in the money six times and pull the lever six times. The inability to retain even the most basic information about users could have been fatal to online commerce, and Montulli devised cookies as a way for Web sites to keep tabs on their regular visitors, the better to sell them things.

Montulli recalls coming up with the idea in July 1994, after meeting with an in-house team focused on supporting e-commerce. The team was planning an online shopping cart system, but the browser seemingly wouldn’t allow it, since there was no way for a shopper to pick something to buy, leave it in a cart, keep shopping, add something else to the cart, and again return to shopping. Montulli wrestled with a solution for days, proudly holding the line against a proposal to give each browser a unique ID as it traveled across the Web. Such a solution was antithetical to how Montulli understood the Web—yes, a browser ID would help a Web site keep track of its customers, but it would turn every visitor into an open book, as businesses could easily pool their knowledge to create an online portrait.59 Instead, Netscape’s cookies would be built around each visit, or “session,” between a particular Web site and a browser. This bit of memory would be enough to speed commerce by allowing a business to recall its customer’s past preferences, credit card number, earlier selections, and the like, though that information wouldn’t be carried to other Web sites.
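How would such a session cookie actually behave? The sketch below, a toy written in modern Python rather than anything drawn from Netscape’s codebase, shows the essential exchange under assumed names (the “cart_id” cookie, the “ShopHandler” class, and the treatment of each request as an “item” are all invented for illustration, and real Cookie headers can carry several values at once). The server hands the browser an opaque token on first contact; the browser returns that token with every later request to the same site; and the server uses it to recall the visitor’s cart.

    import uuid
    from http.server import BaseHTTPRequestHandler, HTTPServer

    carts = {}  # server-side memory: cart token -> items chosen so far

    class ShopHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The browser volunteers whatever cookie this site set earlier.
            cookie = self.headers.get("Cookie", "")
            cart_id = cookie.removeprefix("cart_id=") if cookie.startswith("cart_id=") else None
            if cart_id not in carts:
                # First visit: mint a token and ask the browser to remember it.
                cart_id = uuid.uuid4().hex
                carts[cart_id] = []
            carts[cart_id].append(self.path)  # treat each requested path as an "item"
            self.send_response(200)
            # The token is set by, and returned to, this one site only.
            self.send_header("Set-Cookie", f"cart_id={cart_id}")
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(f"cart so far: {carts[cart_id]}\n".encode())

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), ShopHandler).serve_forever()

The design choice Montulli held the line on is visible in that Set-Cookie header: the token means nothing anywhere else, so the site regains its memory without the visitor acquiring a Web-wide identity.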

Netscape’s business customers were told of this powerful new tool for commerce in fall 1994, with the release of the first browser, and advised how best to employ it. The ordinary users of that browser—who typically were not paying customers, after all—were never told.60 Here was a vivid example of the truth in the adage, “If you are not paying for it, you’re not the customer; you’re the product being sold,” which was coined in 2010 by a commenter on the Web site MetaFilter.61 Nonetheless, Montulli and the others felt good about what they had accomplished, confident that they had the interests of users—not just businesses—at heart. Yet, in one of those painful ironies that illustrate the importance of early design decisions, the user protections that Montulli cared so much about ultimately wouldn’t make a difference. Web businesses found a work-around and managed to create the kind of “cross-site tracking” he dreaded from the start.

The weakness these businesses exploited, as Montulli was forced to admit later, was that Web pages typically included embedded content from a variety of outside sources, “third parties,” each of whom was able to install a cookie on a visitor’s browser and keep track of where she had been and what she had done. Each visit to a Web site in actuality represented many simultaneous “sessions.” Certain third parties, particularly the companies that placed digital advertisements across the Web, were ubiquitous online; an ordinary Web user could cross paths with the same “third party” site after site. Thus, it might seem that the user of a Web browser was starting a new “session” with a site, even as she was continuing a session with an advertising company that began many Web sites earlier. A business like DoubleClick, which was acquired for $3.1 billion by Google in 2008, could therefore stitch together a detailed profile of a Web surfer’s online life on its own, exactly what Montulli had tried to avoid.
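A second toy simulation, again in Python and with entirely made-up domain names, makes the work-around concrete. The browser keeps a separate cookie for every domain it talks to, so while each first-party site sees only its own visitors, a third party embedded on many pages keeps receiving its own long-lived cookie and can stitch the visits together:

    cookie_jar = {}   # the browser's state: one cookie per domain
    tracker_log = {}  # the tracker's server-side state: cookie -> pages seen

    def send_request(domain):
        # The browser attaches whatever cookie this domain set before;
        # if there is none, the server mints one (our stand-in for Set-Cookie).
        if domain not in cookie_jar:
            cookie_jar[domain] = f"id-{len(cookie_jar)}"
        return cookie_jar[domain]

    def visit(page, third_parties):
        send_request(page.split("/")[0])    # the site the user meant to visit
        for tracker in third_parties:       # embedded ads, images, scripts
            cookie = send_request(tracker)  # the tracker's own session continues
            tracker_log.setdefault(cookie, []).append(page)

    visit("news-site.example/politics", ["ads.example"])
    visit("shoe-store.example/sneakers", ["ads.example"])
    print(tracker_log)
    # {'id-1': ['news-site.example/politics', 'shoe-store.example/sneakers']}

Neither site ever learns what the visitor did elsewhere; it is the shared third party that quietly accumulates the cross-site portrait Montulli had tried to rule out.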

When, in 1996, journalists in Britain62 and the United States informed Web surfers of the surprising news that “the Web sites you’re visiting may be spying on you,” there were protests over cookies and calls for the government to step in.63 Netscape was concerned enough to ask Montulli to think of a coding change to thwart third parties. “Tracking across Web sites was certainly not what cookies were designed to do, they were designed with the opposite intention,” he wrote in 2013 on his blog, explaining the predicament, “but what could be done at that point to fix the problem?” He agonized for weeks and then opted to do nothing, convinced that at this point businesses trying to profile Web users couldn’t be stopped: “If third-party cookies were disabled, ad companies would use another mechanism to accomplish the same thing, and that mechanism would not have the same level of visibility and control as cookies.”64

When Microsoft introduced a browser to compete with Netscape, there never was a question about whether it would have cookies too. The gains from cookies were tangible, the loss of privacy less so. Montulli’s warning about how the tracking would only get worse without cookies proved correct. Two decades later, Facebook, for example, has access to so much more information about its users than mere browsing history—what they like and dislike, whom they communicate with, their relationship status, what articles they click on, what articles they read to the end, even—and the mechanism to limit what it retains is much less visible.

Despite the many improvements Netscape introduced to assist in Web commerce, the company itself hadn’t achieved commercial success. Past practice on Wall Street was for a company to hold off on an initial public offering of stock until there had been at least three consecutive profitable quarters; in the summer of 1995, Netscape was still waiting for its first. But Clark argued that Netscape should play by its own rules—after all, no other company had experienced the kind of viral growth that Netscape had, approaching 90 percent of the browser market. If Clark’s experience in Silicon Valley had taught him anything it was to take the money when it’s sitting there. “I wanted us to go public, because I thought it’d be good for us from a P.R. standpoint, and I did go into this thing to make money, so I was looking for a reward as well,” he said.65

The day of the IPO, August 9, 1995, appeared to be perfectly timed, producing a mania that surprised even Clark. Shares spiked to nearly three times the opening price, from $28 to $75, before settling at $58. By the end of that day’s trading, Clark’s stake in Netscape was worth $663 million, a fact he recalled a little later when he needed to come up with a tail number for an airplane he bought. “I told them to use 663, because that meant something to me.”66 Andreessen clocked in with around $60 million. Jimmy Wales, the cofounder of Wikipedia, is one of a number of aspiring Internet entrepreneurs who consider 8/9/95 a life-changing date. Wales had already dropped out of graduate school to become a futures and options trader in Chicago, but “when Netscape went public and it was worth more than $2 billion on the first day,” he recalled, “it clicked in my mind that something big was happening on the Internet.”67

Shares of Netscape stock never eclipsed their peak of $171 in December, and from such heights the company seemingly had only one direction to go. The frenzy leading to the IPO made Microsoft finally take notice of the Web, after Bill Gates had dismissed it as so much hype.68 In a bit of painful payback for Andreessen, a version of Mosaic licensed by the University of Illinois to a local software company, Spyglass, helped Microsoft quickly challenge the Netscape browser. Working from Spyglass’s browser, Microsoft released its first version of Internet Explorer the same month as the Netscape IPO, followed in October 1995 by the release of the beta version of the much-improved Internet Explorer 2.0, which was available free for surfers and businesses alike. New versions quickly followed, and Microsoft aggressively promoted them. A turning point came two years later, in October 1997, when the company released Internet Explorer 4.0, which was tightly bundled with the Windows operating system. At that point, Netscape was still roughly twice as popular—65 percent of the market versus 32 percent for Internet Explorer.69 By November 1998, however, the tide had turned, and a declining Netscape was bought by AOL in a stock swap valued at the time at $4.2 billion.70

Andreessen’s wild run from Mosaic to Netscape to AOL thoroughly transformed the computing world. Microsoft may have succeeded in taking down Netscape—with Internet Explorer, for example, reaching a peak market share well above 90 percent—but that success took a toll. The company’s win-at-all-costs approach to “the browser wars” produced important evidence when the U.S. government filed an antitrust lawsuit against Microsoft in 1998, particularly testimony alleging that Microsoft had conspired to cut off Netscape’s “air supply.” After settling that lawsuit in 2001, Microsoft didn’t have the same strut or chokehold on how the public used computers. In a fitting final twist to the Netscape-Microsoft fight, the code for Netscape Navigator was released as free software—that is, free to be shared and improved upon by whoever acquired a copy. That code became the basis of the Firefox browser (originally named Phoenix, because it rose from Navigator’s ashes), which helped chip away at Internet Explorer’s dominant market share. Today, Firefox lives on as part of a nonprofit project supported by a community of programmers who were motivated to push back against Microsoft.

Andreessen is a dual figure, the hacker-entrepreneur. At the same time that the Netscape team helped make the Web commerce friendly, it also helped install anarchic hacker values on the Web. For example, on the Web, as in the artificial intelligence lab, little deference would be given to authority: anyone can publish online, no matter his age or experience. Your work speaks for itself. The Web also adopted the hackers’ belief that information should be free to circulate: music files, newspaper articles, movies, and software all bouncing from computer to computer, unrestrained by duplication costs and seemingly one step ahead of the authorities. There was also a similar consensus that freedom of speech should trump all other concerns: the Web would be beyond the reach of “politically correct” censors declaring some comments too hateful or cruel or obscene to appear. A more society-focused vision of the Web lost out, although it has been kept alive on the margins, often by European governments that try to prop up traditional newspapers as a stabilizing force and that bar some topics, like far-right political parties, from appearing online. Some European governments have gone so far as to resist computers’ unrivaled memory skills, promoting a right to be forgotten so that an individual can insist that material be taken off the Web if it is old and embarrassing.

In the early 1990s, administrators at the NCSA had briefly suggested that the Mosaic browser warn users if a Web site might not be suitable for children. The problem arose from a link on the What’s New page Andreessen maintained for the Mosaic homepage, back when a single person could actually try to keep up with what was new on the Web! A child of a lab employee had clicked on the link and was sent to an arts site with a prominent display of a nude sculpture. Administrators asked Andreessen to come up with a fix. It was the Stanford censorship case all over again, and the hackers’ loyalties hadn’t shifted. They knew that censorship was stupid and antithetical to the Web, and Andreessen offered a suitably stupid proposal. Let’s have a box appear before a user reaches any new Web site, he suggested, with the following warning: “ARE YOU SURE YOU WANT TO TAKE THIS CRAZY STEP AND KEEP SURFING?” The administrators decided to pass.71

One clear articulation of how hackers’ anti-authoritarian views were shaping the Web appeared in 1996 as “A Declaration of the Independence of Cyberspace,” written by John Perry Barlow, the libertarian cofounder of the influential digital rights group the Electronic Frontier Foundation. No government, Barlow declared, had the authority to limit the freedoms inherent to cyberspace. “Governments derive their just powers from the consent of the governed,” he announced. “You have neither solicited nor received ours. We did not invite you. You do not know us, nor do you know our world. Cyberspace does not lie within your borders.” More broadly, Barlow was arguing that nothing from the offline world—traditional rules, institutions, and codes of behavior, even history itself—carried any weight in cyberspace, which was “a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth.” Having ditched America’s living history of racism in less than a sentence, and ignored the misogyny outright, Barlow was then free to demand the familiar absolutist line about online speech. “Anyone, anywhere,” he wrote, “may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity.”72

The declaration is a political statement about as nuanced and considered as the hand-scrawled “Keep Out” sign that a teenager tapes on his door. Nonetheless, it accurately describes much of the Web today—the hostility to authority and rules or regulations of any kind; the privileging of freedom over empathy; the fantasy that the Internet is immune to the pull of history. Barlow certainly drew inspiration from the early hackers as he wrote his declaration, but the radical ideology promoted by the Know-It-Alls over the last twenty years involves so much more than the hackers’ desire to be left alone. Other troubling aspects of Silicon Valley values described in this book—blind faith in the power of markets to do good all the time, trafficking in people’s private information as a commodity, acquiring obscene personal wealth and pursuing economic and social disruption for their own sake with no thought to the human cost—have nothing to do with the hacker ethic. In fact, McCarthy and the other early hackers were critical of those who saw the computer revolution as a path to personal wealth or its close cousin, personal power. Even Lou Montulli recalls being taken aback by the promise of a quick fortune from Jim Clark. “He filled our heads with giant numbers of how we were going to make riches and be the most important people on the planet,” Montulli recalled,73 which conflicted with his own “sort of Marxist” belief that “you couldn’t make more than a million dollars honestly.”74

After a decade of trying to replicate the success of Netscape at other ventures, Andreessen in 2009 found his calling as a Silicon Valley venture capitalist. He promotes disruptive capitalism among a new generation of hacker-entrepreneurs who, as he memorably put it in the title of a Wall Street Journal op-ed, create software that “is eating the world.”75 Certainly he was better suited to be an investor than to run a company of his own. To start, his misanthropic personality, a liability in a manager, is seen as an asset in an investor.76 His partner, Ben Horowitz, explains: “If you say to Marc, ‘Don’t bite somebody’s fucking head off!,’ that would be wrong. Because a lot of his value, when you’re making giant decisions for huge amounts of money, is saying, ‘Why aren’t you fucking considering this and this and this?’”77

Think back, too, to Andreessen’s 1994 talk to aspiring online entrepreneurs: when it came to how the Web would adapt to become more business friendly he was brimming with ideas, but as to the details of the business plan for his own company, he offered the generic “make money.” In his role as a VC, he is expected to survey the economy, weighing in as a public intellectual and pointing to broad trends. Back in 2003, when Andreessen was still an entrepreneur, he scoffed at a reporter who wondered if he kept a personal blog. “No,” he responded, “I have a day job. I don’t have the time or ego need.”78 More recently, however, he has been inclined to marvel at the influence of his Twitter feed, where he had posted more than 100,000 times and acquired an audience of half a million followers. “Reporters are obsessed with it,” he bragged to a reporter. “It’s like a tube and I have loudspeakers installed in every reporting cubicle around the world.” Andreessen was using Twitter as a Trumpian bullhorn for his ideas of software-led disruption before Donald Trump was a serious enough figure to communicate effectively in 140-character bursts.

On Twitter, Andreessen’s praise of disruption has been exclusively economic, not political or social. For example, in a Twitter “essay” from 2014, Andreessen praised the slow and deliberate steps America has made in opening up its political system to all citizens. Don’t dwell on all the improvements yet to be enacted, he advised, but instead think about how far we’ve come. “Common thing one hears in US is ‘Political system broken; Founding Fathers never intended politics to be dominated by moneyed interests.’ But in 1776, voting ‘restricted to property owners--most of whom are white male Protestants over the age of 21.’ In 1789, George Washington was elected president. ‘Only 6% of the population can vote,’” he wrote, adding, “We have far broader-based voting and political participation today than ever before, due to hard work by many activists over 200 years. And we’re still by no means perfect; lots of progress yet to be made. But we’re leaps and bounds ahead of 50-100-150-200 years ago.”79

Imagine Andreessen’s reaction to someone who made a similar argument concerning the economic disruption caused by Internet companies—you know, think how far we have already come, let’s not act too hastily. Well, we don’t have to imagine, actually. In another Twitter essay, Andreessen argued that technological progress has benefited the poor much more than the rich—an observation he insists “flows from basic economics.” Therefore, he writes, “Opposing tech innovation is punishing the poor by slowing the process by which they get things previously only affordable to the rich.”80 To recommend patience in implementing technical changes is simply immoral. What’s the difference? Well, one difference is the power relationship. In the case of the disruptive democratic politics that Andreessen appears leery of, members of the public are being given greater control over their lives at the expense of an elite; in the case of disruptive technologies, an elite is driving the change.

In September 2016, just when Trump was deploying Twitter to strike out at his foes and communicate without speaking to the press, Andreessen stopped posting to Twitter, not long after having to apologize for posts that praised British colonialism in India as superior to democracy in providing for the poor.81 In a clever work-around, however, Andreessen has remained active on Twitter by “liking” as many as forty posts a day written by others. That way he seemingly expresses his opinion—and continues tangling with Web idealists—without bearing ultimate authorial responsibility for what has been said.

The first idealist Andreessen ever tangled with, of course, was Berners-Lee. In his memoir, Berners-Lee is quick to reject the anti-capitalist label, denying that he thinks the Web should be treated as some hallowed space, “where we must remove our shoes, eat only fallen fruit and eschew commercialization.”82 But he also clearly isn’t comfortable with how it has been twisted to generate runaway profits. At one point in this same memoir, Berners-Lee pauses to answer why he never tried to amass a fortune from his ideas, even as so many other key figures in the Web’s development did. “What is maddening is the terrible notion that a person’s value depends on how important and financially successful they are, and that that is measured in terms of money,” he writes. “To use net worth as a criterion by which to judge people is to set our children’s sights on cash rather than on things that will actually make them happy.”83

Berners-Lee tells a story about a technical breakthrough in the development of the Web that occurred on Christmas 1990. That day, his computer at CERN for the first time used a primitive browser/editor to communicate with a server hosting the Web’s first URL, info.cern.ch. The Web worked! Even so, Berners-Lee writes that he “wasn’t that keyed up about it,” because he and his wife were expecting their first child and, “As amazing as it would be to see the Web develop, it would never compare to seeing the development of our child.”84 Even at this late date, we would do well to try to restore the human-scale perspective and idealism Berners-Lee brought to the Internet project from the start. The lesson the Know-It-Alls took from those early years, however, was to grow big and grow fast.