Chapter 15
As We May Think: The Third Industrial Revolution

Today, it is truer than ever that basic research is the pacemaker of technological progress. In the nineteenth century, Yankee mechanical ingenuity, building largely upon the basic discoveries of European scientists, could greatly advance the technical arts. Now the situation is different. A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade, regardless of its mechanical skill.

—Vannevar Bush, 19451

Beneath a concrete marker in Flushing, New York, is a message to the people of the year AD 6939. Designed by Westinghouse as part of its exhibit at the New York World’s Fair of 1939, the time capsule was buried on the autumnal equinox, September 23, 1939. In addition to the Bible and “The Book of the Record of the Time Capsule,” which contains messages on special paper in nonfading ink from Albert Einstein among others, the buried cache contains an electric lamp socket and electric wall switch, pieces of industrial machinery, samples of alloys, Portland cement, a newsreel, a watch, an alarm clock, a camera, a safety razor, fountain pen, swatches of cloth and asbestos, a lady’s hat, coins and a dollar bill, and a Mickey Mouse cup.

At one of the darkest moments in history in 1939, the introduction to “The Book of the Record of the Time Capsule” was defiant in its optimism: “In our time many believe that the human race has reached the ultimate in material and social development; others that humanity shall march onward to achievements splendid beyond the imagination of this day, to new worlds of human wealth, power, life and happiness. We choose, with the latter, to believe that men will solve the problems of the world, that the human race will triumph over its limitations and its adversities, that the future will be glorious.”

In the three decades that followed World War II, that optimism was vindicated, at least in the United States and its Western European and East Asian allies. The Westinghouse Corporation buried another time capsule nearby during the 1964 World’s Fair. The later time capsule contained items that had not existed at the time of its predecessor, including graphite from the world’s first nuclear reactor under Stagg Field at the University of Chicago in 1942, a reentry heat shield from the Mercury Aurora 7 spacecraft, the synthetic fibers Orlon, Dacron, and Lycra, a plastic heart valve, a laser rod, a transistor radio, parts of a satellite, and credit cards. And while there was also a Bible, the inclusion of the Beatles single “A Hard Day’s Night,” a bikini, and birth control pills suggested the social changes that had taken place. The technological advances of the mid-twentieth century amounted to a third industrial revolution.2

VANNEVAR BUSH

Thirteen miles from ground zero, Vannevar Bush lay on a tarpaulin thrown over the desert sand in the darkness of the early morning. The fifty-five-year-old director of the Office of Scientific Research and Development (OSRD) waited expectantly, next to his deputy, James Conant. They listened to the countdown by the physicist Samuel Allison: “Three . . . two . . . one . . . zero.” At 5:29:45 Mountain War Time on July 16, 1945, civilization changed forever.

The flash lit the distant desert mountains. Through a piece of dark glass, Bush looked at the fireball rising above the New Mexico desert. The Trinity test was a success. The United States had exploded the first atomic bomb. Returning to the gate of the nearby base, Bush waited for the physicist J. Robert Oppenheimer, the chief scientist in the project, to drive past on his way to a vacation. Bush tipped his hat. Oppenheimer wrote later: “We knew the world would not be the same. A few people laughed, a few people cried, most people were silent. I remembered the line from the Hindu scripture, the Bhagavad-Gita. Vishnu is trying to persuade the Prince that he should do his duty and to impress him takes on his multi-armed form and says, ‘Now, I am become Death, the destroyer of worlds.’ ”3

On August 6, the United States dropped an atomic bomb on Hiroshima, Japan, killing 100,000 of its inhabitants. On August 9, a second bomb incinerated Nagasaki. On August 15, Emperor Hirohito announced the surrender of Japan. Earlier, on May 7, following the suicide of Adolf Hitler on April 30, the German government had formally surrendered. World War II was over. And the third industrial revolution was under way.

Many individuals contributed to the third industrial revolution of the mid-twentieth century, which engendered nuclear energy and information technology, as the second industrial revolution had produced electric power generation and the internal combustion engine and the first had given rise to the steam engine and the telegraph. But even when he was playing only a supporting role, Bush was present at key moments in the early years of the third industrial era, with personal connections to everything from the Manhattan Project and the computer to microwave ovens.

Thomas Edison was similarly ubiquitous during the second industrial revolution, inventing or contributing to the development of transformative technologies including electric-power generation, the incandescent lightbulb, the phonograph, and the motion picture. Edison was a folk hero in his day and his fame endures. In contrast, Bush is little known, except to historians of information technology. What explains Edison’s continuing celebrity and Bush’s relative obscurity?

One reason is publicity. Partly in order to raise funds from private investors, Edison did everything in a blaze of self-generated publicity, often making promises of imminent breakthroughs that he could not keep. In contrast, the wartime work of Bush and the Office of Scientific Research and Development was classified.

Another reason for the failure of Bush to find a place in the imagination of later generations has to do with the changing nature of technological innovation. Already by Edison’s time, corporate and government laboratories and university research institutes in Germany, Britain, and the United States were displacing the individual inventor, even as enormous, consolidated managerial companies were succeeding small firms owned and operated by their founders. By World War II, the personnel and resource requirements of basic scientific research were so immense that only governments like the US federal government could organize and fund them.

As an engineer and scientist in his own right, Bush made many contributions to the evolution of technology and inspired countless others. But his most lasting contribution may be not any particular technology but the institutional structure that generates technological breakthroughs. In his report and later book Science, the Endless Frontier, commissioned by President Roosevelt, and his work with Congress in establishing the National Science Foundation, Bush helped to lay the foundation for the creative collaboration among government, the academy, and industry from which most transformative innovations in recent generations have emerged.

“AS WE MAY THINK”

In July 1945, as Bush was overseeing the Trinity test, the Atlantic published his essay “As We May Think,” in which he speculated about the possibilities for technological augmentation of the human mind.4 Among the new technologies that Bush correctly predicted in “As We May Think” were a “thinking machine” (the calculator), a “vocoder” which would type in response to dictation (voice-activated software), and a “cyclops camera” worn on the forehead (this has yet to arrive, although cell phones with digital photography are close approximations).

The most important of the imaginary devices that Bush described in his 1945 essay was the memex, a version of the personal computer that became a universal appliance in developed societies by 2000. He got the mechanism wrong, speculating that the combination of pictures and text would be stored on microfilm. But his vision of a desk and a screen inspired the engineers who developed the monitor and the mouse. And his speculations about a universal network with a library that could be accessed through “trails” mimicking the associative nature of human thought anticipated the Internet, online encyclopedias like Wikipedia, and hyperlinks.

The impact of “As We May Think” was magnified by media interest in the article. After its initial publication in the Atlantic, the essay was popularized on July 23, 1945, in Time under the title “A Machine That Thinks” and on September 10, 1945, Life magazine published an illustrated condensation: “A top U.S. scientist foresees a possible future world in which man-made machines will start to think.”5

What prevented “As We May Think” from being a catalog of gadgets was Bush’s discussion of how information technology could be used to augment human intelligence. Popularization of his work notwithstanding, he was more interested in a machine to help thinking than in a thinking machine. Because human thought is based to a large degree on associations among ideas and images, Bush believed that there was a need for “associative indexing, the basic idea of which is a provision whereby any item may be caused at will to select immediately and automatically another. This is the essential feature of the memex. The process of tying two items together is the important thing.”

Bush’s memex would tie together items by means of “trails.” He provided an example: “The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English long bow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected [on the screen]. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item.”

In addition to main trails and side trails, there would be the “skip trail which stops only on the salient items.” Individuals would save their information trails and share them with friends and colleagues. Bush predicted: “Wholly new forms of encyclopedia will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.”
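For readers who think in code, Bush’s trails map neatly onto modern data structures: items are records, and a trail is an ordered sequence of links among them. The sketch below is ours, not Bush’s; the class and method names are invented for illustration:

```python
# A minimal model of memex "trails": items tied together in named sequences,
# so that calling up one item leads immediately to the next.
# All names here are illustrative; only "trail" is Bush's own term.

class Memex:
    def __init__(self):
        self.items = {}    # title -> text of the stored item
        self.trails = {}   # trail name -> ordered list of item titles

    def add_item(self, title, text):
        self.items[title] = text

    def tie(self, trail, title):
        """Append an item to a trail -- 'the process of tying two items together.'"""
        self.trails.setdefault(trail, []).append(title)

    def follow(self, trail):
        """Walk a trail, yielding each item in the order it was tied."""
        for title in self.trails.get(trail, []):
            yield title, self.items[title]

m = Memex()
m.add_item("Encyclopedia: bows", "Sketchy article on the bow and arrow.")
m.add_item("History: Crusades", "The Turkish bow versus the English longbow.")
m.add_item("Owner's note", "Why was the short bow superior?")
for title in ("Encyclopedia: bows", "History: Crusades", "Owner's note"):
    m.tie("bow-and-arrow", title)
for title, _text in m.follow("bow-and-arrow"):
    print(title)
```

The analogy is loose, but the essential feature Bush emphasized survives: the links, not the items, carry the structure.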

In tribute to Bush, this chapter will be organized as though it were a series of associative trails on an imaginary memex. By following them, we discover that Bush is celebrated as a pioneer of virtually every aspect of today’s computer technology, from its physical form to hyperlinks and the Internet.

MISTER SCIENCE

Let us begin with the main trail of Bush’s biography. “It is interesting that Mister Science looks so much like Mr. America,” Coronet magazine reported in a profile of Bush published in 1952. “He reminds you of somebody—Will Rogers? Uncle Sam? Anyway, you’ve seen this face before, and it belonged to a man you liked.”6

Bush was the quintessential New England Yankee inventor, born the son of a Universalist minister in Everett, Massachusetts, in 1890. Earning degrees from Tufts, Harvard, and MIT, he worked for the navy during World War I on the problem of detecting submarines and then became a professor at MIT. While working on a “network analyzer” that simulated electrical networks, Bush and his MIT team developed the differential analyzer. The differential analyzer was an early computer that used electromechanical gears and spinning disks to do calculations, a version of the “difference engine” of which the Victorian British scientist Charles Babbage had dreamed. Bush continued to improve the analyzer after being appointed vice president of MIT and dean of its School of Engineering.
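In software terms, what the analyzer’s gears and spinning disks did mechanically amounts to step-by-step numerical integration, with the output of one integrator fed into the next. A rough illustrative sketch of that principle, not a model of Bush’s actual machine:

```python
# Illustrative sketch: a wheel-and-disc integrator accumulated the integral of
# one shaft's motion against another's. In software, the analogue is stepwise
# numerical integration. Here two chained "integrators" solve y'' = -y
# (simple harmonic motion), much as the differential analyzer chained
# mechanical integrators to solve differential equations.

def run_analyzer(steps=100_000, dt=0.0001):
    y, dy = 1.0, 0.0          # initial conditions: y(0) = 1, y'(0) = 0
    for _ in range(steps):
        ddy = -y              # the equation under study: y'' = -y
        dy += ddy * dt        # first integrator: y'' -> y'
        y += dy * dt          # second integrator: y' -> y
    return y                  # y at time t = steps * dt = 10

# The result should approximate cos(10), about -0.839.
print(run_analyzer())
```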

In 1938, as the world moved toward the second global war in a generation, Bush’s appointment as president of the Carnegie Institution of Washington brought him to Washington, DC. In June 1940, Bush met with President Roosevelt and argued that there was a need for an organization that could coordinate research in military technology. With Roosevelt’s backing, Bush became chairman of the new National Defense Research Committee (NDRC). In his memoirs Bush wrote: “There were those who protested that the action of setting up NDRC was an end run, a grab by which a small company of scientists and engineers, acting outside established channels, got hold of the authority and money for the program of developing new weapons. That, in fact, is exactly what it was.”7

In June 1941, the NDRC was superseded by the OSRD. As its director, Bush reported directly to the president. He presided over the greatest R&D organization of all time. With inexhaustible military resources at his command and teams made up of great scientists who had fled Hitler, such as Leo Szilard, Enrico Fermi, and Niels Bohr, as well as brilliant Americans, Bush supervised one breakthrough after another, in fields as different as radar and nuclear energy, jet engines, and early computers. Science was now organized, marshaled, and mobilized in the service of the war against Hitler and his allies. The press called Bush “the general of physics.”

Bush found the perfect patron and partner in Roosevelt. All his life FDR was entranced by visions of abundance and freedom made possible by technological advances. Following World War I, in his role as assistant secretary of the navy, Roosevelt promoted the infant American radio industry by creating the public-private Radio Corporation of America (RCA). In the only book he ever published, Whither Bound?, a 1926 lecture at the Milton Academy prep school, Roosevelt praised “the scientist people, and economists, and industrialists, and wild-eyed progressives” who were “bringing so many new things into everyday life.” He foresaw the day “when by the twist of a knob or the push of a button you see and talk with some true friend half the world away. . . . Cheaper power, canned power, compressed into light weight and small bulk, will make our present flying abilities childish within our own lives. So, too, with transportation by land.”8

As the war drew to a close, FDR and his chief scientist pondered what was to become of the federally funded research system that had produced so many innovations. Bush recollected: “Roosevelt called me into his office and said, ‘What’s going to happen to science after the war?’ I said, ‘It’s going to fall flat on its face.’ He said, ‘What are we going to do about it?’ And I told him, ‘We better do something damn quick.’ ”9

The president commissioned Bush to write a report on the future of science in America. By the time Bush presented it in 1945, Roosevelt was dead, and the report went to his successor, Harry Truman.10 Published as Science, the Endless Frontier, the book became a best seller. Bush proposed a national research foundation that would provide contracts for research and scholarships to young scientists. Populists in Congress like Senator Harley Kilgore of West Virginia opposed Bush’s vision of federal grants to top-ranking research universities as elitist, preferring a decentralized system of government laboratories modeled on the highly successful federal agricultural research stations. When it was finally created in 1950, the National Science Foundation (NSF) was more modest than Bush had hoped, but in the years that followed the system of federal support for science and technology developed largely along the lines he had sketched out.

THE MANHATTAN PROJECT

From this main trail of information about Bush, our imaginary memex permits us to move at any time to side trails linking Bush to specific topics relevant to the third industrial revolution. One is the link between Bush and the development of the atomic bomb.

In December 1938, the success of German scientists in splitting the atom alarmed the physicists Leo Szilard and Eugene Wigner. They persuaded Albert Einstein to write a letter to President Roosevelt warning him of the possibility that Hitler’s Germany might create atomic bombs. In October 1939, a month after Germany invaded Poland, the letter was delivered to FDR by a friend of his, Alexander Sachs. Concerned about the slow speed of research in the United States under the Uranium Committee supervised by Lyman Briggs, the physicists, under Einstein’s signature, wrote FDR again in March and April 1940. But it was not until Britain’s MAUD (military application of uranium detonation) committee concluded that an atomic bomb could be made in the next few years that the American government began to move quickly. In fall 1940, the British government sent two leaders of its wartime science effort, Sir Henry Tizard and Sir John Cockcroft, to Washington to share the results of their research into the creation of two fissile materials.

Bush’s involvement with the atomic bomb began in 1940, when he extended the jurisdiction of the newly created NDRC over the Uranium Committee. Under the auspices of the OSRD, organized in June 1941 under Bush, teams at the University of Chicago, Columbia, the University of California, and Princeton worked on the secret atomic project. In spring 1942, California’s Ernest Lawrence made a breakthrough in plutonium production.

On March 9, 1942, Bush wrote FDR: “Present opinion indicates that successful use is possible, and that this would be very important and might be determining in the war effort. It is also true that if the enemy arrived at results first it would be an exceedingly serious matter.” He estimated that bombs could be produced in 1944. Roosevelt answered two days later: “I think the whole thing should be pushed not only in regard to development, but also with due regard to time. This is very much of the essence.”11

In 1942, Bush arranged for what came to be called the Manhattan Project to be turned over to the Army Corps of Engineers under the leadership of General Leslie Groves, but he continued to supervise the work as chair of the Military Policy Committee, which advised the president. The cultures of the US military and American corporate engineers and executives frequently clashed with that of immigrant atomic scientists, many of whom were political leftists suspicious of business and the military.

The secret project had practically unlimited resources. The federal government spent more than $2 billion between 1939 and 1945 on atomic research. In December 1942, a team that included the Italian physicist Enrico Fermi produced the first self-sustaining chain reaction in the first nuclear reactor, located under the bleachers of Stagg Field on the University of Chicago campus. The team leader, Arthur Compton, reported the result to Washington: “The Italian navigator [Fermi] has just landed in the new world.”

Plutonium and uranium were produced in factories in Hanford, Washington, and Oak Ridge, Tennessee—the latter powered by Roosevelt’s Tennessee Valley Authority hydropower dams. The first atomic bombs were assembled at Los Alamos, New Mexico, by a scientific and engineering team led by J. Robert Oppenheimer. And so we return to Bush at ground zero on July 16, 1945, where we began.

THE INVENTION OF THE JET

Following another side link, we find that Bush played an administrative role in the development of the jet engine. Alarmed by intelligence reports about German advances in turbojet technology, in February 1941, General H. H. “Hap” Arnold of the army air corps asked Bush to form a committee on jet propulsion. Bush organized a Special Committee on Jet Propulsion, headed by W. F. Durand, that brought together representatives of GE, Westinghouse, and Allis-Chalmers. GE was selected to develop the turbojet engine designed by Britain’s Frank Whittle.

As often occurs in the history of invention, two inventors—Whittle in Britain and Hans von Ohain in Germany—came up with the idea of gas turbine engines to power aircraft around the same time. A gas turbine engine compresses air to raise its temperature, then forces it through a combustion chamber. The hot air spins the turbine and provides thrust as it escapes through an exhaust nozzle.
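In rough terms, the thrust such an engine produces is the mass flow of air through it multiplied by how much the engine speeds that air up. A back-of-envelope sketch with purely illustrative figures, not the numbers of any real engine:

```python
# Back-of-envelope jet thrust: thrust equals the mass flow of air times the
# increase in its velocity through the engine (ignoring fuel mass flow and
# pressure thrust). All numbers below are illustrative, not any real engine's.

def thrust(mass_flow_kg_s, exhaust_speed_m_s, flight_speed_m_s):
    return mass_flow_kg_s * (exhaust_speed_m_s - flight_speed_m_s)

# e.g. 50 kg/s of air accelerated from 200 m/s to 600 m/s:
print(thrust(50, 600, 200))   # 20000 N, i.e. about 20 kN
```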

In 1928, Whittle’s thesis for the RAF College suggested that piston engines and propellers were inadequate for fast flight at high altitudes. In 1929, he suggested that gas turbines be used, and he obtained a patent in 1932. While serving as an RAF officer, Whittle founded a company, Power Jets, in 1935, to build an engine for a high-altitude, fast mail plane.12

Meanwhile, Ohain, ignorant of Whittle’s work, obtained the backing of the industrialist Ernst Heinkel, along with Herbert Wagner of Junkers and Helmut Schelp of the German Air Ministry.13 Ohain’s engine powered the experimental Heinkel He 178 on August 27, 1939. Whittle’s engine powered the experimental Gloster E.28/39 on May 15, 1941.

Under Bush’s supervision, GE developed Whittle’s engine for the United States. With two GE engines, the Bell XP-59A became the first American jet aircraft to fly, on October 1, 1942.14

Jets went into military service only in the summer of 1944. After 1945, the Allies studied captured German aeronautical research. German swept-wing designs inspired Boeing to put swept wings on the B-47 jet bomber of 1947.15

Britain led the world into the jet age with the de Havilland Comet. After three Comets disintegrated in midair in 1953 and 1954, all Comet flights stopped, resuming only in 1958. The vacuum was filled by the United States with the Boeing 707 and the Soviet Union with its Tupolev Tu-104. Boeing followed up the 707 with the 727, which could use shorter runways. The global jet era started in 1958, when Boeing 707s began regular commercial flights across the Atlantic. Boeing’s 737 and its jumbo jet, the 747, used turbofan rather than turbojet engines. Turbofans deliver their peak thrust at lower speeds, making wide-bodied passenger jets and cargo jets possible.16

THE SPACE AGE

In October 1957, the Soviets launched Sputnik, the first artificial satellite. Americans were shocked to find that they had fallen behind in what became known as “the space race.” In response, in 1958 the Defense Department created the Advanced Research Projects Agency (ARPA) and the National Advisory Committee for Aeronautics (NACA) was turned into the National Aeronautics and Space Administration (NASA).

The development of missile and rocket technology for both military purposes and the exploration of space has been an important part of the third industrial revolution. In consulting the memex, we learn that Bush’s main role in this area was that of naysayer. For example, Bush predicted that the ballistic missile “would never stand the test of cost analysis. If we employed it in quantity, we would be economically exhausted long before the enemy.”17

In 1960, Bush told Congress: “Putting a man in space is a stunt: the man can do no more than an instrument, in fact can do less. There are far more serious things to do than indulge in stunts. . . . [T]he present hullabaloo on the propaganda aspects of the program leaves me entirely cool.”18 Bush’s opinion eventually was shared by the American government, which abandoned the Apollo program in the 1970s and then, in the 2010s, retired the space shuttle, leaving itself, ironically, with no way to send astronauts into space except aboard Russian rockets.

Along with exploration of the planets by robotic probes, the most important products of the space program have been satellites, used for military purposes, environmental monitoring, and communications. The United States created the Communications Satellite Corporation (Comsat) in 1962 and, in 1964, led the creation of the International Telecommunications Satellite Organization (Intelsat), a multigovernment consortium that was privatized in 2001. The United States controlled 61 percent of Intelsat’s original ownership, compared to 30.5 percent for Western Europe and 8.5 percent for Canada, Australia, and Japan. In order to be less dependent on the United States, the Europeans eventually founded the European Space Agency in 1975.19

As the basis of global communication, submarine cables were eclipsed by communications satellites in the years following Sputnik. By 2000, the majority of international telephony took place by means of satellites.

THE EVOLUTION OF THE COMPUTER

One trail leads us back to Bush’s differential analyzer, from which we follow another trail to the earliest origins of the computer. The US federal government bore paternal responsibility for the infant computer industry. In 1886, Herman Hollerith, an employee of the US Census Office, invented an electrical punch-card reader that could be used to process census information and other data. The company that Hollerith formed in 1896, the Tabulating Machine Company, evolved by 1924 into International Business Machines (IBM). In 1911, another Census Office employee, James Powers, devised an automatic card-punching machine and founded the Powers Tabulating Machine Company which, in 1927, merged with Remington Rand. In the decades that followed, Remington Rand and IBM dominated much of the private-sector development of information technology.

At MIT, Bush advanced the technology of computing with his electromechanical device. An early model of the analyzer inspired a front-page headline in the New York Times in 1930: “ ‘Thinking Machine’ Does Higher Mathematics; Solves Equations That Take Humans Months.”20 Inspired by Bush, others built differential analyzers at Aberdeen Proving Ground, General Electric, and the universities of Pennsylvania, Texas, California, and Cambridge. Other analyzers were constructed in Germany, Russia, Norway, and Ireland. Beginning in 1935, the Rockefeller Foundation invested in the analyzer’s development.21

But the future of the computer would be electronic and digital, not electromechanical. In 1939, IBM funded Howard Aiken, a graduate student at Harvard, on the basis of a memo that Aiken had written about digital computing. By 1944, IBM had developed the automatic sequence controlled calculator.

Bush’s analog approach to computing would be superseded by the far more efficient binary approach. Here, too, there is a link. One of Bush’s graduate students at MIT, Claude Shannon, in his master’s thesis, explored the idea of using electrical circuits to replace the clumsy mechanical components of Bush’s differential analyzer. Shannon proposed using a binary system based on Boolean algebra. When he went to work for Bell Labs, he influenced the evolution of telephone technology. His 1948 work, “A Mathematical Theory of Communication,” developed his binary system, which became the basis of modern telecommunications and computing. It has been called the Magna Carta of the information age.
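The essence of Shannon’s insight is that on/off switches obeying Boolean algebra can perform arithmetic. A small illustration of the principle (our example, not Shannon’s own): two logic gates combine into a half adder, a basic building block of binary addition:

```python
# Shannon's thesis showed that relay circuits obeying Boolean algebra can
# compute. Illustration of the principle: an XOR gate and an AND gate form a
# half adder, which adds two binary digits and produces a sum bit and a carry.

def half_adder(a, b):
    s = a ^ b        # XOR gate: the sum bit
    carry = a & b    # AND gate: the carry bit
    return carry, s

for a in (0, 1):
    for b in (0, 1):
        carry, s = half_adder(a, b)
        print(f"{a} + {b} = {carry}{s} (binary)")
```

Chain enough such gates together and you have a binary adder; chain adders and you have the arithmetic core of a digital computer.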

During World War II, Bush turned down an application to the NDRC for funding a project on digital computers from Norbert Wiener, a leading mathematician at MIT, for fear that it would divert resources from the war effort for a long-term project. For the same reason, Bush also refused to fund the electronic numerical integrator and computer (ENIAC), which was funded instead by the army.

The role of the US military in nurturing information technology began in the 1930s, when the army needed a computer capable of calculating artillery-firing tables. The army’s Ballistic Research Laboratory at Aberdeen Proving Ground provided funding for a team at Pennsylvania’s Moore School of Electrical Engineering led by John W. Mauchly and J. Presper Eckert. Inspired by theoretical work done earlier by Iowa State’s John V. Atanasoff, the Pennsylvania team in 1946 built the first all-purpose electronic computer, ENIAC. The army was joined in its sponsorship of the computer industry by the Office of Naval Research, NACA, and the Census Bureau, with its perennial interest in rapid data processing.22

The initiative then shifted back to Remington Rand and IBM. In 1950, Remington Rand acquired the Eckert-Mauchly Computer Corporation, along with its contract with the US Census Bureau. Then, in 1952, Remington Rand acquired another company, Engineering Research Associates, formed by veterans of work done for the navy on the use of computers in cryptology.

Meanwhile, in 1950, the head of IBM, Thomas Watson Sr., boasted that a single IBM computer on display in New York could “solve all the important scientific problems in the world involving scientific calculations.”23 Work done by IBM for the attempt of the US and Canadian governments to build a Semi-Automatic Ground Environment (SAGE) air defense against Soviet bombers for North America led to major breakthroughs. By the mid-1950s, IBM was responsible for two-thirds of all computer sales and, after it introduced its System/360 in 1964, it dominated the mainframe computer industry for a generation.

IBM is so important in the history of modern computer technology that we decide to follow a trail to learn more about the company.

“NOTHING IN THE WORLD WILL EVER STOP IT”: THE RISE OF IBM

In January 1926, Thomas Watson Sr., the founding president of IBM, predicted to the star salesmen at his One Hundred Percent Club convention: “This business has a future for your sons and grandsons and your great-grandsons, because it is going on forever. Nothing in the world will ever stop it. The IBM is not merely an organization of men; it is an institution that will go on forever.”24 To date his prediction has been borne out.

In its centennial year of 2011, IBM ranked eighteenth on Fortune’s list of America’s biggest companies, seventh most profitable, and fifty-second on the Fortune Global 500. In the same year, IBM ranked twelfth out of fifty on a list of the world’s most admired companies and number one in information technology (IT) services.25 The previous year, the company had filed eighteen thousand patents—more than any other company in the world—and spent $24 billion on R&D. Among its projects was Smarter Planet, a program to use computer networking to ease traffic and help power grids.

In 1997, an IBM computer, Deep Blue, defeated the world chess champion Garry Kasparov. In February 2011, Ken Jennings, the record-holding champion, and another contestant, Brad Rutter, battled Watson on the American television quiz show Jeopardy. Watson, a computer developed by IBM’s DeepQA project under David Ferrucci, was capable of understanding questions posed in natural language. It defeated its human rivals, winning the $1 million prize. In his final Jeopardy response, Jennings, alluding to a line from an episode of the TV cartoon show The Simpsons, wrote: “I, for one, welcome our new computer overlords.” IBM had influenced popular culture before. The company’s name is thought to have inspired that of HAL, the intelligent computer in Stanley Kubrick’s and Arthur C. Clarke’s 1968 science fiction film 2001: A Space Odyssey: each letter of HAL is one letter removed from IBM.

Watson was named after IBM’s founder, Thomas J. Watson Sr. The son of a farmer and lumber dealer in upstate New York, Watson began his career peddling pianos, organs, and sewing machines. He discovered his talents as a salesman as a protégé of John H. Patterson, the dynamic and eccentric president of National Cash Register (NCR). Along with Patterson and other NCR managers, Watson was accused by the government of violating antitrust laws as part of a scheme to dominate the used cash register market. Like Patterson, Watson was cleared, but six months later the temperamental Patterson fired him for disagreeing with him in public.

In 1914, Watson became head of the Computing-Tabulating-Recording Company (CTR), whose name he changed to International Business Machines in 1924. Founded a few years before his arrival, in 1911, CTR was the product of mergers of several other companies. The most important was the Tabulating Machine Company, whose founder, as we saw earlier, was the inventor and former Census Bureau official Herman Hollerith.

Watson was a rare combination of technological visionary, marketing genius, and supersalesman. In the late twentieth century, the stereotypical tech company founder in the popular mind was a brilliant bohemian from the San Francisco Bay area who favored informality and a casual approach to organization. Watson could not have been further from that archetype.

A strict Methodist, Watson insisted that his male employees wear only white shirts and dark suits and avoid embarrassing themselves with alcohol. Influenced by Patterson’s methods, Watson created a revival-like atmosphere to inspire his sales force at meetings and the conventions of his One Hundred Percent Club. His methods, alleged to have influenced Japanese and other East Asian managers, included motivating employees with inspirational slogans like the Five C’s—Conception, Consistency, Cooperation, Courage, and Confidence—and hymnlike songs, including this one, from his days at CTR:

Mr. Watson is the man we’re working for

He’s the leader of the CTR

He’s the fairest, squarest man we know.26

One of his mottoes was: “IBM products are not bought; they are sold.” His most famous slogan became a fixture in IBM offices and advertisements:

T-H-I-N-K.

In 1929, Watson funded a statistical laboratory at Columbia University, where Wallace J. Eckert worked closely with IBM. At Harvard in 1936, Howard Aiken, a graduate student in physics, proposed the creation of a massive computer, inspired by the work of the nineteenth-century British theorist of computing Charles Babbage. IBM’s chief engineer, James Bryce, brought the idea to the attention of Watson, who funded the project and assigned engineers to assist Aiken. The result was the five-ton Harvard Mark I, completed in 1943. Furious that Aiken neglected to mention IBM’s support at the press conference announcing the machine, Watson got his revenge by establishing the Watson Computer Laboratory at Columbia in 1945. Led by Wallace Eckert, the Columbia laboratory developed the Selective Sequence Electronic Calculator, which overshadowed Harvard’s Mark I when it debuted in 1948. Displayed on the ground floor of IBM’s headquarters in New York City, the computer became a sensation.

FROM SAGE TO SABRE

According to legend, a myopic Watson stated after World War II that there was a market for only a dozen or so computers in the world. In reality, IBM was working on numerous computer projects at the time. When the Korean War began in June 1950, IBM won a government contract to develop a “defense calculator.”

IBM’s most important military contract of the 1950s was the Semi-Automatic Ground Environment (SAGE) program. The ancestry of the SAGE project can be traced to a 1948 memo by Jay Forrester outlining a computerized national air-defense system, inspired by the radar defenses of World War II, to protect North America against Soviet bomber (and later missile) attacks. Forrester had been working since 1944 at MIT on Project Whirlwind, a digital air-combat-information program. When IBM received a contract to work on SAGE, it received the Whirlwind technology. In addition to IBM, contractors on the SAGE project included MIT’s Lincoln Laboratory, Western Electric, the System Development Corporation (SDC), originally a division of RAND, and the Burroughs Corporation. In 1958, the Computer Systems Division of Lincoln Laboratory became the MITRE Corporation, which worked on software and systems integration.

When complete, the SAGE system consisted of twenty-three concrete bunkers in the United States and one in Canada. Each of these direction centers contained an IBM AN/FSQ-7 computer, along with a standby. Each AN/FSQ-7 weighed 250 tons and contained forty-nine thousand vacuum tubes. The installations were designed to analyze simultaneously vast amounts of data coming in from radar on the ground, on ships, and on planes.

The most ambitious computer project in history to date, the SAGE system was completed in 1963 and remained operational until 1983. Although it was technologically obsolete almost as soon as it was finished, the system inspired later innovations. The linkages between its nodes helped shape J. C. R. Licklider’s musings, which in turn led to the development of ARPANET and the Internet.

For its part, IBM drew on its experience in the SAGE project in the early 1960s when it received a contract from American Airlines to devise a computerized airline reservation system called SABRE (Semi-Automatic Business Research Environment; even the name was modeled on SAGE’s). SABRE became the basis of modern airline reservation systems.

THE COMPUTER THAT IBM MADE, THAT MADE IBM

But it was in civilian office computing that IBM would make its greatest mark. In 1947, J. Presper Eckert and John Mauchly had incorporated the Eckert-Mauchly Computer Corporation to make the UNIVAC and other computers. Their commercial difficulties led the two to visit Watson and his son and eventual successor as head of IBM, Thomas Watson Jr. Perhaps remembering his earlier brush with antitrust law at NCR, the senior Watson had checked with IBM’s lawyers and told the inventors that the Justice Department probably would not allow IBM to absorb their company. Instead, the rights to UNIVAC were sold to James Rand, the president of Remington Rand, making that company a leading computer manufacturer. (In 1955, Remington Rand merged with the Sperry Corporation to become Sperry Rand, later Sperry; a merger between Sperry and Burroughs in 1986 produced Unisys.) On live television on election night in 1952, a UNIVAC computer correctly predicted a landslide for the Republican presidential candidate Dwight Eisenhower; the computer’s operators at first thought the machine must have been mistaken.

Motivated by competition with Remington Rand, IBM in 1953 brought out its inexpensive Model 650, which stored data on a rotating magnetic drum rather than relying on punched cards alone. Thomas Watson Jr. observed that “the 650 became computing’s ‘Model T.’ ”27

In the early 1960s, IBM took on the challenge of providing office computers that used compatible software. In secrecy, the company sponsored one of the greatest corporate research programs in history, code-named the New Product Line. Thousands of programmers and engineers labored urgently in multiple laboratories and IBM began to manufacture more semiconductors than any other company in the world. Finally in April 1964, IBM unveiled its System/360 product line of software-compatible mainframe computers. The System/360 was described as “the computer that IBM made, that made IBM.”

FROM VACUUM TUBE TO SILICON CHIP

Early computers were hobbled by their reliance on vacuum tubes, which took up space and generated heat. At AT&T’s Bell Laboratories, the physicist William Shockley led the research group that developed the solid-state transistor between 1947 and 1950. By the mid-1950s, Texas Instruments led in the manufacture of silicon transistors. The next step, propelled by US military demand for smaller computers, was the combination of many transistors in a single device. Working independently of each other, Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor invented the silicon “chip,” a single “integrated circuit” that combined transistors with components such as resistors and capacitors. Moore’s law, named after Fairchild (and later Intel) cofounder Gordon Moore, was based on the observation that the number of transistors per microchip doubled roughly every two years: from 1972, when the Intel 8008 chip had 2,500 transistors, to 2000, when the Pentium 4 processor had forty-two million transistors.
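For readers who want to verify the doubling rate, the transistor counts quoted above are enough; a few lines of Python (a quick sketch using only the chapter’s figures, not exact chip specifications) do the arithmetic:

```python
import math

# Transistor counts and dates as quoted in the text
t0, n0 = 1972, 2_500         # Intel 8008
t1, n1 = 2000, 42_000_000    # Pentium 4

doublings = math.log2(n1 / n0)        # how many times the count doubled
months = (t1 - t0) * 12 / doublings   # average months per doubling

print(round(doublings, 1), round(months))  # → 14.0 24
```

About fourteen doublings over twenty-eight years works out to one roughly every twenty-four months, consistent with the usual statement of Moore’s law.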

Soon there were three kinds of integrated circuits: memory chips; microprocessors, developed by Intel; and microcapacitors. Raytheon and the optical equipment manufacturer Perkin-Elmer developed photolithography fabrication methods that made possible the mass production of silicon chips.

SIDE TRAIL: RAYTHEON AND THE RADARANGE

Let us follow a side trail on the imaginary memex to learn more about Raytheon. It brings us to the microwave oven, the first major innovation in cooking since primitive hominids began to cook with fire.

Percy Spencer, an engineer working at Raytheon, was startled one day in 1946 when a candy bar in his pocket melted as he was working on a new vacuum tube called a magnetron. Realizing that the magnetron was the cause, Spencer successfully popped popcorn kernels by placing them nearby and then cooked an egg, which exploded in his face. By spring 1946, Spencer and a colleague, P. R. “Roly” Hanson, were working on a project given the secret code name Speedy Weenie. Their work led to Raytheon’s 1946 patent for the microwave oven. In 1947, a contest among employees to name the device produced a winner: “Radarange.” By 1976, more American households owned a microwave oven than owned a dishwasher.28

At the time it invented the microwave oven, Raytheon was a leading manufacturer of vacuum tubes, having acquired or merged with other companies in the field, including Acme-Delta and the Q.R.S. Company. In 1928, it had chosen the name Raytheon Manufacturing Company to replace its previous name, the American Appliance Company, because of the visibility of one of its products, the Raytheon (“light of the gods”). This was an electron tube used in a “battery eliminator,” a device that converted the alternating current in household wiring into the direct current that radios required, as an alternative to batteries.

The Raytheon electron tube was the invention of Charles G. Smith, who had founded the American Appliance Company in Cambridge, Massachusetts, in 1922, with two partners. One was an engineer named Laurence K. Marshall. The other founder of the company to be known as Raytheon was Marshall’s engineering school classmate and college roommate at Tufts University, the thirty-two-year-old Vannevar Bush.

THE ORIGINS OF SILICON VALLEY

Using a skip trail on the memex, we return to the main trail and Bush. Was there any connection between Bush and Silicon Valley? Under the label “Silicon Valley” we find an interesting side trail.

We learn that Professor Bush’s first graduate student at MIT was named Frederick Terman. After taking a job as a professor of electrical engineering at Stanford University, Terman was disappointed by the lack of employment opportunities for graduates of his department in Northern California. With his encouragement, two of his students, Bill Hewlett and Dave Packard, founded an electronics company named Hewlett-Packard in Packard’s garage. Some myths are true; the garage is now a historical landmark in Palo Alto.

After working during the war at Harvard to develop radar countermeasures, Terman resumed his post at Stanford and joined others in an attempt to make that university a center of high technology in collaboration with business and government. The railroad baron Leland Stanford, one of the captains of industry of early industrial America, had left his eight-thousand-acre ranch to the university that bore his son’s name. Terman and his colleagues leased out part of Stanford’s acreage, which became the Stanford Industrial Park, to select high-technology firms including General Electric and Eastman Kodak. One of Terman’s biggest prizes was William Shockley, who joined the faculty at Stanford and directed the Shockley Semiconductor Laboratory of Beckman Instruments.

Several of his protégés, nicknamed “the Traitorous Eight,” quit to form Fairchild Semiconductor. Its veterans in turn went on to found dozens of companies, including Intel, in what became Silicon Valley, an area that included Palo Alto and—here is a link to the second industrial revolution—Menlo Park, a name it shares with the New Jersey site of Thomas Edison’s famous research laboratory.

Meanwhile, Hewlett-Packard had grown into a substantial electronics firm that moved into the computer market. The earliest documented use of the term “personal computer” has been found in the October 4, 1968, issue of Science magazine, in an ad for Hewlett-Packard’s HP 9100: “The new Hewlett-Packard 9100A personal computer is ready, willing, and able . . . to relieve you of waiting to get on the big computer.”29 At forty pounds and costing nearly five thousand dollars, the HP 9100A could be improved upon. And it was—by, among others, Steve Wozniak, who worked for HP before teaming up with Steve Jobs to found Apple Computer.

In another garage, the garage of Jobs’s parents’ house, Wozniak and Jobs experimented with assembling small personal computers. Wozniak’s boss at Hewlett-Packard reportedly told him, “HP doesn’t want to be in that kind of market.”30 Jobs and Wozniak founded Apple Computer, Inc., which in 1977 brought out the first successful personal computer (PC), the Apple II.

Jobs went on to have one of the most remarkable careers in the history of American business. Apple developed a cultlike following with its Macintosh PC, but Jobs was forced out of the company by its board of directors. In 1985, he founded another company, NeXT. When NeXT was bought by Apple, Jobs returned, serving as CEO from 1997 to 2011 and overseeing the release of the innovative iPod, iPhone, and iPad.

Two other hobbyists, Bill Gates and Paul Allen, wrote a version of the Beginner’s All-purpose Symbolic Instruction Code (BASIC) programming language for fans of the Altair hobbyist computer. They went on to found Microsoft, which began as a small company with only a few dozen employees before settling in the Seattle area.

Having decided to enter the personal computer market, IBM concluded that it needed skilled outsiders to provide software. First it approached Gary Kildall of Digital Research. For reasons that remain disputed, IBM instead chose Microsoft, run by Allen and Gates. Microsoft bought software from a local firm, Seattle Computer Products, and developed it into the operating system MS-DOS. IBM brought out its personal computer, the IBM PC, in August 1981. Bundled with the IBM PC and compatible machines, MS-DOS became the industry standard; IBM’s choice of Microsoft for its operating software, and of Intel for its microprocessors, helped make Gates for a time the richest person in the world.

VENTURE CAPITAL

The term “venture capital” is frequently found in the memex discussion of Silicon Valley, so we follow a side trail to a treatment of that topic that begins with George Doriot.

Doriot is often identified as the founder of the American venture capital sector. The son of an engineer who helped build Peugeot’s early automobiles in France, Doriot moved to the United States after World War I and became first a student and then a professor at Harvard Business School, where he taught for more than four decades.

During World War II, Doriot went to work for the US Army’s quartermaster corps as head of research and development, overseeing the creation of the portable meals known as K-rations and water-repellent boots and clothes, and taking part in the crash program to develop synthetic rubber. Plastic armor capable of resisting bullets was named Doron after him.

Following the war, Doriot, who had been promoted to general, went back to Harvard Business School. He founded American Research and Development (ARD), a pioneering venture capital company that commercialized new technologies, many of them devised at MIT. One of the companies that ARD invested in was Zapata Off-Shore, founded by the son of Connecticut senator Prescott Bush, the young George Herbert Walker Bush.31

At Harvard Business School, General Doriot taught a popular class called Manufacturing. Doriot tried to interest one of his students, Tom Perkins, in succeeding him at ARD. Instead, Perkins teamed up with Eugene Kleiner, an Austrian Jewish refugee from the Nazis, to form Kleiner Perkins in 1972. Kleiner Perkins and other venture capital firms played an integral role in the development of the tech industry in Silicon Valley and elsewhere.

OF MICE AND HYPERTEXTS

Returning to Vannevar Bush, we follow another trail on the imaginary memex that connects him with Douglas Engelbart. In 1962, Engelbart, then an engineer at the Stanford Research Institute, wrote Bush: “I re-discovered your article about three years ago, and was rather startled to realize how much I had aligned my sights along the vector you had described.”32

Engelbart was in the navy, working as an electronics technician, when he read “As We May Think” at the time of its publication in 1945. Influenced by Bush’s description of the memex, he came up with the idea of a radar-like display capable of interacting with its users. He labored for years at the Stanford Research Institute developing his ideas for the NLS (oNLine System), before unveiling the finished product in San Francisco’s Brooks Hall on December 9, 1968. In what has been described as “the mother of all demos,” Engelbart demonstrated the use of the computer mouse to control symbols on a screen, along with texts and graphics sharing a screen, videoconferencing, and hyperlinks.33

Using “hyperlink” as the key phrase, our imaginary memex allows us to follow a “skip trail” to Theodore H. “Ted” Nelson. “Bush was right,” Nelson declared in a 1972 paper entitled “As We Will Think.”34 Nelson coined the term “hypertext” for the two types of trails that Bush imagined for his memex: side trails and skip trails. Tim Berners-Lee incorporated Nelson’s term into “hypertext transfer protocol,” or “http.” Berners-Lee also named the World Wide Web, the source of the familiar Internet address prefix www.

THE INTERGALACTIC COMPUTER NETWORK

On the imaginary memex, we return to “As We May Think” and read: “Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.” In his 1965 book Libraries of the Future, J. C. R. Licklider described “As We May Think” as the “main external influence on his ideas.”35

Licklider, a psychologist and computer scientist, worked for the Advanced Research Projects Agency (ARPA), created to restore the American lead in technology after the shock of the Soviet Union’s launch of the first satellite, Sputnik, in October 1957. ARPA was renamed the Defense Advanced Research Projects Agency (DARPA) in 1972, became ARPA again in 1993, and then became DARPA once more in 1996, a change intended to shield it from conservative opposition to government spending on science and technology unrelated to defense.

Licklider proposed a computer network allowing researchers working on defense contracts to communicate with each other. His 1962 memo about an “Intergalactic Computer Network” laid out a vision of the Internet, the first element of which was created by ARPA and MIT in the form of ARPANET, the world’s first packet-switching network. Contrary to folklore, the purpose of ARPANET was to allow researchers working on projects for ARPA to communicate with each other, not to create a communications system to survive nuclear war. In 1986, ARPANET was connected to NSFNET, a network created by the National Science Foundation (NSF) to allow researchers funded by its grants to communicate with each other. NSFNET was opened first to all academics and then to businesses and the general public, evolving into today’s Internet, which is global if not yet intergalactic.

A side trail leads us from NSFNET to Vannevar Bush’s brainchild, the NSF, from which another side trail goes to a discussion of the Digital Library Initiative (DLI). Among the graduate students funded by the DLI project were Larry Page and Sergey Brin, who was also supported by an NSF graduate student fellowship. Their research led them to create a superior search engine and in 1998—with an initial office in a garage, of course—they incorporated Google, Inc. With the help of Eric Schmidt as CEO, Page and Brin defeated competitors like Inktomi and Dogpile and built Google into the world’s dominant search engine.

Google’s search engine results were closer than anything yet to the trails on the imaginary memex. Searching for any number of topics involved in the third industrial revolution would lead to articles and books mentioning Vannevar Bush and “As We May Think.”

Right back where we began.