“Curiosity leads you to be interested in all sorts of disciplines, which means that you can stand at the intersection of the arts and sciences. To me, that’s where creativity occurs—at that intersection.”
Americans have always seemed to be innovative, inventing equipment, tools, processes, and services that have helped the country meet its needs, grow its capabilities, and, as a by-product, increase its wealth.
Nowhere has this been more true than in the technology sector since World War II. Engineering skills developed in the military and in academic life, combined with a renewed and increasingly driven entrepreneurial instinct, sparked a tech boom that changed the world in ways that once seemed unimaginable.
While the computer’s forerunner was actually conceived in nineteenth-century Britain, where Charles Babbage designed the Analytical Engine and Ada Lovelace wrote pioneering programs for it, the computer was improved and enhanced dramatically by a number of large American companies, principally IBM in the 1950s and ’60s. But the computer, initially quite large and cumbersome by today’s standards, spawned—most especially in the newly named Silicon Valley area of Northern California—a whole variety of products that revolutionized the business world as well as life itself.
Transistors, microprocessors, computer software, personal computers, smartphones, search engines, social media platforms, and artificial intelligence, among other tech innovations and inventions, created countless new ways to solve problems, get information, communicate, conduct business, and live.
And the geniuses, entrepreneurs, and technology creators who produced this revolution became household names for their creativity, drive, wealth, philanthropy, and, in some cases, eccentricities and lifestyles. David Packard, William Hewlett, Robert Noyce, Gordon Moore, Andy Grove, Steve Jobs, Bill Gates, Jeff Bezos, Sergey Brin, Larry Page, Mark Zuckerberg, and Elon Musk, among others, became tech and business icons as they helped shape the modern world.
Were they geniuses? Did they get lucky? What were their secrets? There is no simple answer, but Walter Isaacson provides perceptive insights into these questions in his book The Innovators.
Walter is an extraordinary writer, while also having many other legendary intellectual skills. He has focused his best-selling books on individuals who could fairly be seen as geniuses—Henry Kissinger, Benjamin Franklin, Albert Einstein, Steve Jobs, Leonardo da Vinci, and, most recently, Jennifer Doudna (recipient of the 2020 Nobel Prize in Chemistry for the CRISPR gene-editing process).
Those individuals, as well as the many individuals described in The Innovators, have several things in common. The first is that they tended to build on what had already been discovered or invented—virtually nothing was created out of “whole cloth.” The second is that they often worked as part of a team. Lone, madcap geniuses tended to be an image rather than the reality.
This interview took place virtually on October 5, 2020. I have interviewed Walter about almost all of his books. How he wrote so many of them while having a full-time job like chairman and CEO of CNN or president and CEO of the Aspen Institute is something I cannot really understand. He says he writes at night and is not distracted by television. He does not own one. If I had only known that was the secret to great writing.
DAVID M. RUBENSTEIN (DR): In your research and writing about great innovators like Leonardo da Vinci and Franklin, Einstein and Jobs, have you found any common traits? Do they tend to be the leaders, the loners, often associated with individual creativity and genius? Or is the reality more of a collaborative effort?
WALTER ISAACSON (WI): There are two interrelated traits that are common to all the innovators I’ve written about. The first of these is curiosity—pure and passionate and playful curiosity about everything. Like Benjamin Franklin, as a teenager, going over to England for the first time and measuring the temperature of the ocean water because he’s trying to figure out how the Gulf Stream works. Or Leonardo da Vinci, my favorite, who in his notebooks writes things in the margins like “Describe the tongue of the woodpecker.”
Who wakes up in the morning and wants to know what the tongue of a woodpecker looks like? A curious person does. And that’s Leonardo.
Einstein and Leonardo both wrote in their notebooks, “Why is the sky blue?” Now, we all see blue skies, but we forget to be curious about “Why is it blue?” And of course Steve Jobs had a voracious curiosity.
The other thing about their curiosity was that it crossed all sorts of fields. Whether it was Steve Jobs being curious about calligraphy and coding or Leonardo being curious about art and anatomy, they wanted to know everything you could know about everything knowable.
That curiosity leads you to be interested in all sorts of disciplines, which means that you can stand at the intersection of the arts and sciences. To me, that’s where creativity occurs—at that intersection. People can have a foot both in engineering and in the humanities or in science and the arts.
So, a wide-ranging curiosity, one that allows you to see that patterns exist across nature and how those patterns ripple, whether it’s Leonardo da Vinci’s spirals of water becoming the curls of hair of the Mona Lisa or Steve Jobs understanding the beauty of calligraphy and ingraining that in his first Macintosh.
DR: Is there something about the way the U.S. developed from its early days that encouraged innovation or creativity?
WI: Ingrained in the DNA of our country is that the people who came here were either pioneers or refugees. They were the second and third children who had to strike out on their own. They were people escaping oppression and looking for freedom. They were going on an errand into the wilderness.
They were people used to uprooting, changing their minds, and being part of a frontier. Whether that was the literal frontier that existed until around 1900 or things like the electronic frontier, people in the United States were more willing to uproot from their Old World and take the risks of embarking on an errand into the wilderness.
DR: In recent decades, the science and technology worlds appear to be dominated by American companies. Is that because the U.S. seems to be more encouraging of innovation than, to mention one area, Europe?
WI: It’s partly because the U.S. has less regulation and allows more freedom, and the fields in which there was less regulation were the ones where the most innovation happened. We think that the U.S. is the most innovative because we look at things like the Internet and computers and social networks and social media, which tended to be rather unregulated things. In fields that are more regulated, whether it be physical things like batteries and nuclear power or air travel or even, to some extent, pharmaceuticals, the U.S. doesn’t have quite the same advantage.
Secondly, the U.S. is better at allowing people to take risks. Especially out in Silicon Valley, if you haven’t failed two or three times, nobody’s going to take you seriously. Whereas in Europe, if you fail once or twice, you’re probably not going to get your foot in the door looking for financial backing.
DR: When did American leadership in such areas, like computers and semiconductors, really start?
WI: It happens right after World War II. During the war, Germany, Britain, and the United States were all developing digital computers, mainly to calculate artillery trajectories and to meet other wartime needs.
As Leonardo da Vinci knew from working for the warlord Cesare Borgia and the duke of Milan, war tends to stimulate technology. And that’s what happened with computers.
But the genius in America is that right after World War II, leaders like Vannevar Bush came up with the concept that we had to have science as the next frontier. They said that there was going to be a three-way partnership between universities, government, and corporate America, that the new types of labs for computers weren’t going to just be done in the government, the way the atomic bomb was done with the Manhattan Project. We were going to have places like RAND and Bell Labs, and the government was going to create the National Science Foundation to give grants to universities to do research.
ENIAC was really the first general-purpose computer, invented by the War Department and the University of Pennsylvania during World War II. That spins out into a private company, which builds UNIVAC and eventually becomes Sperry Rand and then Unisys. The ability to have that three-way partnership distinguished the United States from other countries.
DR: What was the impact on innovation from returning World War II veterans who had technology training?
WI: I’ll tell a personal story. Innovation in America is not just done by huge companies; it’s about thousands of foot soldiers in the progress of innovation.
My father left Tulane his senior year, with six buddies from engineering school, to make sure they could join the navy before the end of World War II. They were trained in radar and supply chains and even refrigeration and sent to the South Pacific.
When my father got back in 1947, he became an electrical engineer, because he had been trained. He invented new ways to air-condition department stores and movie theaters in New Orleans with his buddies.
And he was just one of tens of thousands of returning World War II veterans who got their technology training but also learned to take risks that you have to take during wartime. He became a great innovator as well as a small-business owner in New Orleans.
DR: Did the growth of venture capital after World War II, and particularly from the 1960s onward, have an impact on fostering innovation?
WI: When we think of great innovators we often think of scientists or engineers. But one of the most important inventions that happens in the 1960s is venture capital. It’s people like Arthur Rock, who had worked for investment banks in New York, who decide to go west and invest in new ventures.
Up until then you had the Rockefeller family, Laurance Rockefeller and his siblings, doing things like Venrock. But you didn’t have firms that raised capital and said, “We’re going to bet on new entrepreneurs.” So, if you look at great innovators of the digital revolution, Arthur Rock and his successors really came up with a new invention, which is raising capital funds. Getting people to have equity stakes in new ventures that hadn’t yet started up. Being angels for entrepreneurs.
Rock helps Robert Noyce and some of the rebels at Fairchild Semiconductor start what becomes Intel. That led to the birth of a venture capital industry, which was one of the ingredients that made Silicon Valley the cradle of innovation more than even Boston or New York.
It almost echoes Florence five hundred years earlier, when the Medici family and others invented new ways of doing double-entry bookkeeping, so they could do debit and credit financing. That provided funding that helped the Renaissance flourish.
DR: What were the big advances in computers in the post–World War II period?
WI: The biggest advance was the realization that computers should be personal. In World War II, there were these huge computers, like Colossus at Bletchley Park in England, where Alan Turing worked, or ENIAC at the University of Pennsylvania. After the war, companies like Sperry Rand and Digital Equipment Corporation thought that computers were going to be huge machines owned by corporations and the government and maybe universities.
What happened in the early 1970s, because of a confluence of forces, is that a group of people—hobbyists and hackers and rebels, computing-power-to-the-people types—decides, “Let’s make the computer personal.” That mind-set coincides with Intel creating microprocessors that allow people like Steve Wozniak and Steve Jobs to say, “We can make our own personal computers.”
What distinguished the United States in its digital revolution from other places is that great entrepreneurs snatched computing power from the big corporations and turned it into personal computers. The advent of personal computers enabled creativity, entrepreneurship, and innovation in garages and garrets and dorm rooms around America from then on.
DR: You think if they didn’t have garages, Silicon Valley would never have gotten anywhere?
WI: Larry Page and Sergey Brin knew the Wojcicki sisters, who had two things: they had a garage, and they had a friend who was in the venture capital business. The search engine Page and Brin built around their PageRank algorithm eventually became Google.
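For the curious, the core idea of PageRank fits in a few lines. Here is a minimal sketch in Python, using a hypothetical three-page link graph rather than anything from Google’s actual implementation: a page’s score is the chance that a “random surfer” clicking links ends up on it, so pages with more and better-connected inbound links rank higher.

```python
# Minimal power-iteration sketch of the PageRank idea (illustrative only).
# A page's score is the probability that a "random surfer" is on it.

links = {            # hypothetical three-page web: page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
damping = 0.85       # the damping factor used in the original paper
rank = {page: 1 / len(links) for page in links}

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for page in links:
        # Sum the rank flowing in from every page that links here,
        # split evenly among that page's outbound links.
        incoming = sum(rank[src] / len(outs)
                       for src, outs in links.items() if page in outs)
        new_rank[page] = (1 - damping) / len(links) + damping * incoming
    rank = new_rank

print(rank)  # pages with more (and better) inbound links score higher
```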
DR: What led to the development of transistors? What was their impact?
WI: During World War II, most of the computers used vacuum tubes, which only you and I are old enough to remember. They were like lightbulbs, and they burned out, and you had to replace them, and they were hot, and they consumed a lot of electricity.
Right as the war was ending, people who had been engaged in the war effort, who had worked at Bell Labs, came back. They were given the task of figuring out how to replace vacuum tubes so that the Bell system could amplify phone calls from coast to coast without having all these vacuum tubes that would burn out.
What allows Bell Labs to invent the transistor is that it was a place that mixed everybody from theorists, to practical engineers who had come back from the war, to experimentalists, to pole climbers with grease under their fingernails who strung phone lines, to people who understood how to turn something into a business. So in December 1947, a theorist like William Shockley, who understood and could visualize how electrons danced on the surface of semiconducting materials such as silicon, could pair with an experimentalist like Walter Brattain, who could take a chip of silicon and germanium and a paper clip and solder it together and put it underwater and see if it all worked.
The transistor allows the digital revolution to happen, just like the dynamo or the steam engine allows the Industrial Revolution to happen. The invention of the transistor is the key thing, because the transistor is simply a tiny on/off switch.
The digital revolution is based on the theory that information can be encoded as zeroes and ones—in other words, it’s on and off—and that you can build circuits that manipulate this information and say “yes,” “no,” and “if this, then that,” based on zeroes and ones. To make that work you needed an on/off switch that was tiny, and that’s what the transistor was.
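To make that concrete, here is a minimal sketch in Python (an illustration, not anything from the book) of how bare on/off values combine into that “yes, no, if this, then that” logic. Two simple gates are enough to add a pair of one-bit numbers, the kind of work billions of transistor switches do inside a chip:

```python
# On/off logic, the way circuits of tiny transistor switches do it.
# 1 means "on" and 0 means "off".

def AND(a, b):
    return a & b        # on only if both inputs are on

def XOR(a, b):
    return a ^ b        # on if exactly one input is on

def half_adder(a, b):
    """Add two one-bit numbers using nothing but logic gates."""
    total = XOR(a, b)   # the ones digit of the sum
    carry = AND(a, b)   # "if both are on, carry a one"
    return carry, total

for a in (0, 1):
    for b in (0, 1):
        carry, total = half_adder(a, b)
        print(f"{a} + {b} -> carry {carry}, sum {total}")
```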
DR: What led to the development of semiconductors? How did that speed up technology development?
WI: One of the distinguishing things about U.S. antitrust law and patent law is that it incents a big corporation, like the Bell system, to take a patent but license it out rather freely so that it wouldn’t be accused of an antitrust violation. Everyone from Fairchild Semiconductor to Texas Instruments, which originally was an oil field company, decides to license the transistor.
They try to figure out how to make the transistor better. At Fairchild Semiconductor, whose founders later started Intel, and also at Texas Instruments, they realized that you could etch many components, including transistors, on a single chip of silicon. That becomes the microchip, which is the next great advance in semiconductors.
Underlying that theory is the same simple on/off switch. Semiconducting materials such as silicon can be juiced up in ways to become on/off switches.
DR: What is a microprocessor? Is that the same as a microchip?
WI: No. A microchip is what you get when you take a lot of transistors, say, and etch them on a single chip. But at a certain point in the 1970s, Intel figured out a way to take this chip and put together all of the components you might need for a circuit—transistors and resistors and capacitors—and etch them all on the same chip.
The subtle but huge breakthrough is that instead of just making this as a special-purpose chip, like for a calculator for a specific company, Intel made it so that those chips could be reprogrammed. You could take a chip that had all these components on it and program it to do whatever you want.
That becomes a microprocessor, which is the kernel of a computer. Back in the early days of computers, these processing systems were huge. But after Intel invents the microprocessor, it becomes the heart of a computer on a chip that you can put in the palm of your hand.
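A rough software analogy for that breakthrough (my illustration, not Isaacson’s) is the difference between a function hard-wired to one task and a tiny interpreter that runs whatever program you load into it. The special-purpose chip can only ever do its one job; the microprocessor’s behavior comes from the program:

```python
# Hypothetical analogy: a fixed-function chip vs. a programmable processor.

def calculator_chip(a, b):
    # Special-purpose chip: the one behavior is frozen into the hardware.
    return a + b

def microprocessor(program, value):
    # General-purpose chip: behavior comes from whatever program is loaded.
    for op, operand in program:
        if op == "ADD":
            value += operand
        elif op == "MUL":
            value *= operand
    return value

print(calculator_chip(2, 3))                         # can only ever add: 5
print(microprocessor([("ADD", 3)], 2))               # programmed to add: 5
print(microprocessor([("MUL", 10), ("ADD", 1)], 2))  # reprogrammed: 21
```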
DR: When did the first minicomputers begin to replace the large-scale computers that IBM had developed?
WI: In the early ’70s, after Intel creates the idea of a microprocessor, people like Ed Roberts, who ran a hobbyist company in Albuquerque, say, “I can use these types of microprocessors and make a kit so that hobbyists can build a computer.” That becomes the Altair, the first personal computer. It was just done for hobbyists and hackers. Didn’t have much use.
As soon as Bill Gates saw that on the cover of Popular Electronics, he and his friend Paul Allen said, “We’re going to create software for this Altair.” In the meantime, at the Homebrew Computer Club up near Palo Alto, people like Steve Wozniak and Steve Jobs were hanging out. They said, “We can use this tiny microprocessor from Intel and we’ll build our own computer.” And they built the Apple I and then the Apple II.
So hackers and hobbyists, these hippie-like Whole Earth Catalog–reading people ranging from Steve Jobs to Ed Roberts who wanted to take computing power away from the big companies and give it to the people, along with veterans of the peace movement, the free speech movement, and “power to the people”—they all jell in places like the Homebrew Computer Club in the early ’70s.
And the hackers and the hobbyists and the Homebrew types all start building their own computers. Out of that comes the Apple I, the Apple II, and eventually the Macintosh, but also many other computers.
DR: You mentioned Bill Gates. What led to his company becoming the dominant software producer for computers? There were many other companies producing software in the early days of the so-called software revolution.
WI: Bill Gates had a singular insight, one of the most important, innovative insights in the business of technology, which is that it was not going to be about the hardware, it was going to be about software. And that eventually, whether you were Dell or Sperry Rand or IBM, the hardware would be pretty interchangeable, but whoever made the operating system software would be at the lead of innovation.
Early on, when computers were big machines owned by grand corporations, it was boys with their toys. The men made the computers and then they hired women like Grace Hopper and the six women who programmed ENIAC, thinking that programming was just a clerical thing that women could do. But when the inventors of ENIAC eventually start the company that built UNIVAC, they’re smart enough to hire Grace Hopper and the women who did the programming. And those women create things like COBOL.
There was a struggle over who was going to be in control, the hardware manufacturers or the software writers. Then Bill Gates, with help from his father, who was a great lawyer, figured out a way to write and adapt an operating system for a personal computer, and then not sell that software to IBM but instead give them a nonexclusive license to it. The software company, which becomes known as Microsoft, ends up more powerful than hardware companies such as IBM and Dell and DEC.
DR: Who really invented the Internet? The French had a predecessor called the Minitel. Why did that not take off around the world?
WI: With all due respect to Al Gore, the Internet has many inventors. The reason the French system didn’t catch on is that it was a centralized system. One of the rules of innovation in the digital revolution is empower the fringes and decentralize and distribute authority.
What happens in the United States is that the Defense Department is trying to create a system to link the research computers at the various universities they were funding so that they could time-share. They tell the professors at these universities, “You have to figure out a way to link to our network.”
The professors do what they always do. They delegate that task to their graduate students. About thirty of them join together to invent what becomes known as ARPANET, the predecessor to what is now the Internet. It was based on a system called packet-switching, which meant that, unlike Minitel in France and unlike the phone company in the United States, there were no central hubs in which the information was controlled by whoever ran the system.
In a packet-switch network, the information is all broken up into small packets. It scurries through a web, with address headers so it knows where to go and how to reassemble itself when the packets get where they’re supposed to be.
It means that every single node on that network has the power to create and store and transmit and forward information. It becomes a web in which there’s no central control mechanism.
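Here is a minimal sketch in Python of that idea (a simplified illustration, not the actual ARPANET protocol): the message is broken into packets, each carrying a destination header and a sequence number, and the receiving node reassembles them correctly no matter what order they arrive in.

```python
import random

# Simplified sketch of packet-switching: break a message into packets with
# headers, let them arrive in any order, and reassemble at the destination.

def split_into_packets(message, dest, size=8):
    return [{"dest": dest, "seq": i, "data": message[i:i + size]}
            for i in range(0, len(message), size)]

def reassemble(packets):
    # Order by sequence number, regardless of the route each packet took.
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = split_into_packets("There is no central control mechanism.", "node-42")
random.shuffle(packets)     # packets scurry along different routes
print(reassemble(packets))  # the message arrives intact anyway
```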
Later, at Time magazine, we once wrote that it was done to survive a nuclear attack from the Russians. If you have a centralized system and you take out one of the hubs, you can screw up the whole network. But with the Internet, if any one of the thousands or millions or billions of nodes gets knocked out, the information just knows how to route around that.
That distributed system, where every node has equal power to create information, is at the heart of the Internet. It’s not centrally controlled, unlike the systems that were being developed by British Telecom or Minitel by the French Telecom.
DR: What role did Marc Andreessen and the Mosaic browser that he helped to create play in fostering the widespread use of the Internet?
WI: Marc Andreessen’s contribution was huge. The World Wide Web, which is a set of protocols for easily navigating the Internet, had been created at CERN in Switzerland by a guy named Tim Berners-Lee. But the most important element turned out to be the piece of software called the browser, which allows a normal person to easily navigate the web.
When Marc Andreessen, a big, corn-fed Wisconsin guy, was at the University of Illinois, he does what great innovators do. He combines a feel for technology, because he was a great computer coder, with a feel for the humanities.
He knew how people interface with great products. He created the Mosaic browser, which had wonderful technical features and was done in a smart way. It was made public. It was made free. It was almost as if it were more open-source than proprietary.
So everybody got to use the Mosaic browser, and it caught on. That not only made the browser important, but it caused the web, the World Wide Web, to be the best way to navigate the Internet.
Only people like me, who are early Internet geeks, remember that it wasn’t inevitable that websites with hyperlinks and hypertext were going to be the way the Internet became easy to navigate. There were things like Gopher and Veronica and Archie and Send and Fetch and all these other ways to navigate the Internet.
But the Mosaic browser becomes the popular user interface. Just like Steve Jobs took the personal computers of the early days that were for hobbyists and hackers and said, “I’ll make an easy graphical user interface and make it easy for people to interface with their computer,” Marc Andreessen had the same innovative spirit. He says, “I’ll make it easy for people to interface with the World Wide Web.” And he did it in a way that let you hop around anywhere.
DR: What led to the development of the smartphone?
WI: Steve Jobs’s great innovative genius was connecting our technology to us as humans, which is what he did when he created the Macintosh, which is easy to use. When he comes back to Apple in the late 1990s, after having been fired from the company twelve years earlier, he and his brilliant team sit around grousing, “Our cell phones suck. They’re not intuitive. There’s no screen on which to see things.”
He made one of those great creative and innovative leaps. Having been in the business of personal computers, he said, “Let’s reinvent cell phones and do for them what we did in the early days of computers, which is make them intuitive and easy to use.” He had already invented the iPod, which had a way to put a thousand songs in your pocket. It was just a beautifully intuitive music player.
He said that if the people who made cell phones figured out a way to make them easy to use and to put music on them, it would kill the iPod. So he decided he was going to create a cell phone that was an easy-to-use music player, a cell phone, and also an easy-to-use personal assistant and computing device—all three rolled into one.
In what he did not initially realize was a stroke of genius, he creates a place where people can put apps. When he eventually opens up the App Store to outside developers, you get everything from Amazon to Uber to Airbnb.
DR: What was the innovation that led to the widespread development of e-commerce?
WI: Unlike some of the other entrepreneurs like Steve Jobs or even Bill Gates, who came at it from inventing a product, Jeff Bezos came at it from a business and finance mind-set as well. He figured out how to build an easy-to-use online store.
That coincides with the explosive growth of personal computing and then the advent of smartphones. By the late 1990s, when Amazon is coming along, it coincides with a period when everybody is getting easy-to-use personal computers and easy-to-use access to the Internet.
In the beginning of the 1990s, an ordinary citizen could not go on the Internet. You could go on an online service like America Online or CompuServe. But those were walled gardens that had their own ecosystems.
Al Gore gets made fun of, but the most important innovation in the early ’90s was the Gore Act [the High Performance Computing Act of 1991] and a subsequent act the following year [the Scientific and Advanced Technology Act of 1992], which opened up the Internet to people who wanted to dial in and use it for personal or commercial reasons. That legislation cleared the way for things like the “.com” address, which means that you don’t have to be at a university or a major corporation to get on the Internet. You can create your own business.
Gore opening up the Internet to things like dot-coms in the early 1990s, the spread of easy-to-use Internet interfaces such as Marc Andreessen’s Mosaic web browser—all of this laid a fertile field for a guy like Jeff Bezos to come in and say, “I’m now going to create a store that will sell books on the Internet, and I’m eventually going to make it an everything store.” E-commerce is one of those things that was largely driven by one great, creative visionary, and that was Jeff Bezos.
DR: What was the innovation that led to the development of social media?
WI: The first insight into social media, I think, came from Steve Case at America Online. This is before the Internet was opened by the Gore Act. People would go onto services like America Online or CompuServe or Prodigy. Those services had information you could get—stock prices, sports scores, weather, news.
What Steve Case realized with America Online was that community and social networking were the killer app—not only inventing that wonderful phrase “You’ve got mail” but creating easy-to-use bulletin boards and chat rooms and instant messaging services all embedded in the early AOL. That caused the rise of bulletin boards on the web.
Online communities like the WELL had already shown the appeal of connecting people. When the web takes off in the 1990s, various entrepreneurs drive the idea forward and create things like MySpace.
Then, famously, Mark Zuckerberg is in a dorm room at Harvard trying to create a college facebook service where you can connect with other people at your college. Zuckerberg ends up winning because he makes his the best and the easiest to use.
There’s also a network effect. If you’re on AOL and your friends are on CompuServe, it doesn’t quite work. Once somebody has the place everybody wants to go to, it goes into hyper growth mode, because everybody wants to be where everybody else is. Facebook won that race.
Facebook did it by creating a better product. It also did things that can be a bit harmful, like making the product addictive or incenting people to send out things that enrage or incite.
But it was mainly done by creating a product that made it easier for people to connect. Once again, it was led by the type of person who would connect technology to humanities, who understood “Hey, a like button will work” or “A share button will work.”
DR: Two final questions. Do you see any signs that innovation is slowing down in the U.S. compared to China or other countries?
WI: There is a danger. Part of it is that the four or five big technology companies, Facebook, Amazon, Apple, Google, maybe Microsoft, have such control over their particular fields.
In the nearly twenty years since things like Facebook, Google, and Amazon were invented, we haven’t had as much innovation in the digital technology realm. I don’t believe you have to break these companies up, but I do believe a little more antitrust enforcement, where these big companies can’t favor their own products over those of new innovative entrepreneurs, would be healthy, so that we’d have a greater market for creativity and innovation.
DR: How important is a country’s culture or government to the advent of innovation?
WI: It’s absolutely critical whether it’s a culture that allows failure; a culture that celebrates success and creating a business; a culture that can regulate with a soft and sensitive hand instead of an iron fist; a culture that knows how to protect intellectual property but not allow patents to get in the way of innovation.
These all require delicate balances. It’s not to go hell-bent for or against regulation, or for or against intellectual property; it’s understanding the delicate balance. Ever since the Patent Act of 1790 and the antitrust enforcement against Standard Oil, we’ve gotten the balance pretty much right in this country.
I fear that the hyperpartisanship we have now could cause the culture of America to lose that ability to say it’s all about balance or nuance when it comes to government, academia, corporations, and entrepreneurs all being part of an ecosystem that can flourish.