MEMORY LANE

GREGORY: What was that? . . .

MAN #1: I think it was ‘Blessed are the cheesemakers.’

Life of Brian, Scene 3, Sermon on the Mount

The web is familiar to us but clearly not well understood. For instance, many people use the terms internet and web interchangeably; yet as you will see, the two entities are quite distinct.

The web has a complex, convoluted history. Unlike the Greek goddess Athena, it didn’t spring into existence fully formed. Rather, it emerged gradually and somewhat haphazardly from a hodgepodge of existing technologies, some dating back to the nineteenth century. It was as if a bunch of clever kids piecing together Legos surprised themselves one day by creating something truly amazing.

At the heart of the story is our species’ instinctive desire to amplify its voice and influence far and wide—an impulse that shows itself early in life. As tiny, helpless infants firing off our first loud screech, we are startled by its sheer power and captivated by its ability to arrest people’s attention.

Beginning centuries ago, that simple realization—the loudest voice in the room carries weight—drove us to invent ways of projecting our voice and influence ever farther: first by broadcasting, then by computing, and finally by networking, innovations that in 1989 led to the conception of the world wide web.

BROADCASTING

According to Guinness World Records, the intelligible range of a man’s voice in perfectly still, outdoor conditions is about 590 feet.114 In real life—where background noise makes for less-than-ideal conditions—that range is considerably reduced.

It appears the all-time record for unamplified oratory goes to the eighteenth-century evangelist George Whitefield. According to Braxton Boren, a music technologist at New York University, Whitefield’s stentorian voice could reach listeners roughly 400 feet (121 meters) away, far enough to address open-air crowds of between 20,000 and 50,000 people. “When it is considered in the context of the hundreds of such crowds he attracted over his lifetime,” Boren explains, “Whitefield probably spoke directly to more individuals than any orator in history.”115

American painter and inventor Samuel F. B. Morse far exceeded the natural reach of Whitefield’s voice by transmitting an electrical message over a long wire. In 1844, using a clever dot-dash code of his own devising, he telegraphed the message “What hath God wrought” (from the Bible verse Numbers 23:23) some forty miles, from Washington, DC, to Baltimore, Maryland.116

Scottish-American scientist and teacher Alexander Graham Bell bested Morse by conveying the human voice across many miles by wire. The microphone in Bell’s telephone had a diaphragm that fluttered when struck by sound waves. The fluttering membrane generated electrical ripples the way a fluttering hand in a swimming pool generates water ripples. The speaker (in effect, a reverse microphone) reconverted the electrical ripples into sound waves.

On January 25, 1915, Bell made history’s first transcontinental phone call. He placed it from New York City to his now-famous assistant, Thomas Watson, in San Francisco, some 3,400 miles away:

“Ahoy! Ahoy! Mr. Watson, are you there? Do you hear me?”

“Yes, Mr. Bell, I hear you perfectly. Do you hear me well?”

“Yes, your voice is perfectly distinct.”117

By the 1950s, with the help of cables laid across the Atlantic Ocean, the human voice could be telephoned halfway around the world.118 “Undersea cables, and long-distance communications in general, became the highest of high tech,” observes American science-fiction writer Neal Stephenson, “with many of the same connotations as rocket science or nuclear physics or brain surgery would acquire in later decades.”119

In the 1880s, on a completely different front, German physicist Heinrich Rudolf Hertz discovered that large electrical sparks gave off electromagnetic waves, the way a bomb gives off shock waves.120 The revelation immediately suggested the possibility of communicating without wires.

In 1901 the possibility became very real. Italian nobleman and electrical engineer Guglielmo Marconi successfully generated foot-long sparks, which produced electromagnetic waves so powerful they wafted clear across the Atlantic Ocean, from Cornwall, England, to St. John’s, Newfoundland—a distance of roughly 2,100 miles.121

Like a carrier pigeon, the invisible waves carried a message. It consisted of just the letter S in Morse Code (dot-dot-dot), but it spoke volumes about the potential of wireless communication.122
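For readers curious about how such a dot-dash code actually works, here is a toy sketch of my own in Python, not anything from Morse’s or Marconi’s apparatus. The tiny lookup table is only a fragment of the full Morse alphabet, just enough to reproduce the three dots Marconi’s receiver picked up.

    # A deliberately tiny fragment of International Morse code (illustration only).
    MORSE = {"S": "...", "O": "---", "E": "."}

    def encode(text):
        """Turn letters into dots and dashes, with one space between letters."""
        return " ".join(MORSE[letter] for letter in text.upper() if letter in MORSE)

    print(encode("S"))    # prints "..."  (the signal sent across the Atlantic)
    print(encode("SOS"))  # prints "... --- ..."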

Five years later, on Christmas Eve at 9:00 p.m. (EST), Canadian-American inventor Reginald Aubrey Fessenden aired the first radio voice broadcast. In Brant Rock, Massachusetts, Fessenden stepped up to a microphone and sent greetings to radio-equipped ships in the Atlantic and Caribbean within a radius of several hundred miles. Details are a bit sketchy, but he reportedly played Handel’s “Largo” on an Edison phonograph, performed “O Holy Night” on the violin, and then, after readings from the Bible, signed off with a cheery “Merry Christmas, everyone.”123

Broadcasting technology reached a climax of sorts on April 7, 1927, when scientists publicly demonstrated a way to marry voice to moving images. Then-Secretary of Commerce Herbert Hoover stood before a microphone and TV camera in Washington, DC, and solemnly declared to a small audience in New York City: “Human genius has now destroyed the impediment of distance in a new respect, and in a manner hitherto unknown.”124

Hoover was exactly right. In the two centuries since Reverend Whitefield’s record-setting oratory, the range of our voice and influence had increased from hundreds of feet to thousands of miles. An amazing achievement, to be sure. But just the beginning.

COMPUTING

At the start of the nineteenth century, numerical tables were all the rage. Astronomers used them for navigating the night sky, ship captains for plotting courses at sea, artillery officers for positioning and aiming their massive weaponry, and tax collectors for levying tariffs. But the numerical tables—hand-calculated by minions called human computers—were riddled with errors.

On June 14, 1822, Englishman Charles Babbage came before the august members of the Royal Astronomical Society with a seemingly far-out solution to the problem: a hand-cranked computing device he claimed could do tedious calculations with great accuracy. Babbage’s proposed contraption would require 25,000 precision-milled parts and weigh four tons.125

Alas, the persnickety inventor never completed the gigantic machine, and his vision of replacing humans with automated brainiacs pretty much died with him. It stayed moribund until the 1940s, when American physicist John Vincent Atanasoff at Iowa State College (now Iowa State University) and other scientists began developing rudimentary electronic computers.126

Their pioneering work inspired many separate efforts, which reached a high point on February 14, 1946. On that historic day, University of Pennsylvania electrical engineers publicly unveiled a thirty-ton, 1,800-square-foot programmable calculator named ENIAC, an acronym for Electronic Numerical Integrator and Computer.127

ENIAC’s lightning-fast electronic brain—comprising some one hundred thousand components: vacuum tubes, diodes, relays, resistors, and capacitors—could execute 50,000 instructions per second. It completed in a mere thirty seconds a task that took the average human computer twenty hours, and the best mechanical calculators of the day twelve hours, to do.128

ENIAC and other electronic digital computers were prohibitively expensive, however, so they remained rarities for decades. Even NASA had to rely on human computers to launch the space age. The 2017 hit movie Hidden Figures commemorates three such human computers—all black women—who helped calculate the flight paths for Alan Shepard’s 1961 and John Glenn’s 1962 history-making missions.129

NETWORKING

The final leg of our winding journey toward the web was piloted by a handful of visionaries. They saw computers as much more than just fancy adding machines.

Among the prophets, those living in the United States benefited greatly from a surprise event during the Cold War. In 1957 the Soviet Union launched the world’s first satellite—a mysterious, beach-ball–sized, beeping metal sphere called Sputnik.130

President Dwight D. Eisenhower reacted to the threatening incident by creating the Advanced Research Projects Agency (ARPA). Psychologist and computer scientist Robert W. Taylor recalls the president was eager to boost the nation’s scientific and technological prowess “so that we would not get caught with our pants down again.”131

One of ARPA’s first priorities was to remedy the intolerable situation with research computers of the day. Because they were gigantic, expensive, and scarce, very few scientists had access to one.

In 1962 Joseph Carl Robnett Licklider—the first director of ARPA’s Information Processing Techniques Office—floated an ingenious remedy: treat computers as elements of a telephonic grid. By using telephone lines to connect widely separated scientists and computers, he proposed, we could create an “intergalactic computer network.”132

In a 1968 paper titled “The Computer as a Communication Device,” Licklider and Taylor further prophesied: “In a few years, men will be able to communicate more effectively through a machine than face to face. That is rather a startling thing to say, but it is our conclusion.”133

But the vision of a telephonic computer network suffered from a glaring weakness. Legions of scientists getting on phone lines to access a small number of computers would surely create massive telephonic traffic jams.

Happily, engineers invented a device that worked like a telephonic traffic cop—the interface message processor—and voilà! On October 29, 1969, the ARPAnet was born. It tied together computers at just four locations: UCLA, UC Santa Barbara, Stanford Research Institute (SRI), and the University of Utah. But it was the true progenitor of what we now call the internet.

As a UCLA alumnus, I’m especially proud to report ARPAnet’s first data transmission went from the university’s engineering building, Boelter Hall, to SRI. It was an exciting moment, marked by an auspicious glitch.

UCLA computer scientist Leonard Kleinrock and his small team intended to transmit the word Login, but the fledgling net crashed prematurely and only Lo made it through. “We didn’t plan it,” Kleinrock recalls, “but we couldn’t have come up with a better message: short and prophetic.”134

In 1973 ARPAnet went international by hooking up to research computers in Norway and England. Thereafter, in a kind of recapitulation of earlier communications revolutions—the telegraph, radio, and TV—computer scientists quickly found better and better ways to send written, voice, and video messages over the burgeoning computer network.

In 1974 scientists coined the term “internet” and created Telenet (which later became Sprintnet), the world’s first commercial computer network. In 1976 Apple publicly released its first desktop computer. And on March 26 of that year Queen Elizabeth II became the first monarch to send an email. Her email address was HME2.135

During the following ten years, scientists further improved the internet’s range and sophistication. They joined the sprawling US network to vast subnets throughout Europe and Asia. In the process, they settled on a way to assign each computer on the internet a unique number, known as an internet protocol (IP) address, and on a kind of universal translator, the transmission control protocol/internet protocol (TCP/IP), for reconciling the growing babble of computer network languages.
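To make those two ideas a bit more concrete, here is a minimal sketch of my own, in Python, of what they look like in practice today; it is an illustration, not anything from the ARPAnet era. The hostname example.com is just a stand-in: the lookup returns that machine’s unique IP address, and TCP/IP quietly handles the byte-by-byte conversation between the two computers.

    import socket

    host = "example.com"  # hypothetical host, chosen purely for illustration

    # Every computer reachable on the internet has a unique numeric IP address;
    # this lookup asks the network for example.com's address.
    ip_address = socket.gethostbyname(host)
    print(host, "resolves to IP address", ip_address)

    # TCP/IP acts as the "universal translator": open a reliable connection to
    # port 80 (the traditional HTTP port) and exchange a few raw bytes.
    with socket.create_connection((ip_address, 80), timeout=5) as connection:
        connection.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        print(connection.recv(200).decode(errors="replace").splitlines()[0])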

Each of the incremental improvements was important. But the truly giant leap forward that produced the grand finale was taken by Tim Berners-Lee, an unassuming British computer scientist at CERN (Conseil Européen pour la Recherche Nucléaire, or European Council for Nuclear Research), the legendary atom-smashing laboratory located in Switzerland.136 On March 12, 1989, Berners-Lee proposed a concept his supervisor reportedly brushed off as “vague, but exciting,” in which he saw computers as electronic libraries and the internet as a way to access the libraries from anywhere in the world.137

In Weaving the Web, Berners-Lee recalls the struggle to find a name for his vision. Two possibilities were “information mesh” and “mine of information.” He rejected the first because it sounded too much like mess and the second because its initials, moi, spelled the French word for “me,” which struck him as overly possessive.138

He finally settled on calling it the world wide web, insisting on the three-word spelling “so that its acronym is three separate ‘W’s.’” Moreover, he stipulated, “There are no hyphens.”139

In Berners-Lee’s imagination, the world wide web would be a storehouse of human knowledge greater than the New Library of Alexandria, Harvard Library, and US Library of Congress combined. The www’s equivalent of library books would be websites composed of webpages filled with text, sound, and video—and hyperlinks via which users could instantly leapfrog from one webpage to another.

The www’s equivalent of catalog numbers would be addresses called URLs (uniform resource locators)—each starting with the now-familiar prefix http:// (hypertext transfer protocol)—that would enable users to locate precisely the information they wanted.140
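As a small aside of my own (nothing Berners-Lee wrote), the anatomy of a URL is easy to see with a few lines of Python. The address below is a made-up example, and the standard urllib library splits it into the protocol, the host computer, and the path to a particular webpage.

    from urllib.parse import urlparse

    # A hypothetical address, invented for illustration only.
    url = "http://www.example.com/books/chapter3.html"

    parts = urlparse(url)
    print(parts.scheme)  # "http" - the transfer protocol
    print(parts.netloc)  # "www.example.com" - the website (host) to contact
    print(parts.path)    # "/books/chapter3.html" - the particular webpage on that site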

On August 6, 1991, Berners-Lee made good on his idea, unveiling the world wide web and giving it to us free. Why gratis? “It was simply that had the technology been proprietary, and in my total control, it would probably not have taken off. The decision to make the Web an open system was necessary for it to be universal. You can’t propose that something be a universal space and at the same time keep control of it.”141

Berners-Lee’s intentions were quickly realized. In 1993 The New York Times reported that Mosaic, the first user-friendly program designed to browse the world wide web, “has grown so popular that its use is causing data traffic jams on the Internet.” Mosaic would eventually give rise to today’s Internet Explorer browser.

The growth of the internet was indeed explosive. In 1995 an estimated fourteen million people were online. Ten years later the number exceeded one billion. Ten years after that, more than 3.2 billion people worldwide were surfing the net.

In 2002 more information was stored on the web than on paper, fulfilling Berners-Lee’s vision of the world wide web housing more knowledge than all the world’s libraries combined. According to USC’s Annenberg School for Communication and Journalism, that year “could be considered the beginning of the digital age.”142

Today there are websites keeping constant tabs on the www’s continuing growth. According to http://www.worldwidewebsize.com/, on May 1, 2018, the “Indexed World Wide Web” comprised roughly 47 billion webpages. That’s how many webpages conventional search engines such as Google, Bing, and Yahoo are able to access. Upward of five hundred times more webpages exist in the so-called invisible or deep web.143 And a particularly shadowy realm called the dark web can be plumbed only by encrypted network browsers such as TOR and I2P.144

What, indeed, hath God wrought?

As we’ll see in the following three chapters, because of Berners-Lee’s decision to give the world wide web away free, it is disrupting life today even more than Samuel Morse’s telegraph did in its day. As the respected technology journalist Cade Metz observes: “This [giveaway] allowed the web to spread, but it also allowed it to evolve in ways few could have foreseen.”145