On January 9, 2007, the most iconic entrepreneur on the planet announced something new—a product that was to become the most profitable in history.1
It was, of course, the iPhone. There are many ways in which the iPhone has defined the modern economy. There is the sheer profitability of the thing, of course: only two or three companies in the world make as much money as Apple does on the iPhone alone. There is the fact that it created a new product category: the smartphone. The iPhone and its imitators represent a product that did not exist ten years ago but now is an object of desire for most of humanity. And there is the way the iPhone transformed other markets—for software, for music, and for advertising.
But those are just the obvious facts about the iPhone. And when you delve deeper, the tale is a surprising one. We give credit to Steve Jobs and other leading figures in Apple—his early partner Steve Wozniak, his successor Tim Cook, his visionary designer Jony Ive—but some of the most important actors in this story have been forgotten.
Ask yourself: What actually makes an iPhone an iPhone? It’s partly the cool design, the user interface, the attention to detail in the way the software works and the hardware feels. But underneath the charming surface of the iPhone are some critical elements that made it, and all the other smartphones, possible.
The economist Mariana Mazzucato has made a list of twelve key technologies that make smartphones work. One: tiny microprocessors. Two: memory chips. Three: solid-state hard drives. Four: liquid crystal displays. Five: lithium-based batteries. That’s the hardware.
Then there are the networks and the software.
Continuing the count: Six: fast-Fourier-transform algorithms. These are clever bits of math that let a computer rapidly break down signals such as sound, visible light, and radio waves into their component frequencies, the step that makes it practical to handle those signals digitally.
Seven—and you might have heard of this one: the Internet. A smartphone isn’t a smartphone without the Internet.
Eight: HTTP and HTML, the languages and protocols that turned the hard-to-use Internet into the easy-to-access World Wide Web. Nine: cellular networks. Otherwise your smartphone not only isn’t smart, it’s not even a phone. Ten: global positioning systems, or GPS. Eleven: the touchscreen. Twelve: Siri, the voice-activated artificial-intelligence agent.2
All of these technologies are important components of what makes an iPhone, or any smartphone, work. Some of them are not just important but indispensable. But when Mariana Mazzucato assembled this list of technologies and reviewed their history, she found something striking. The foundational figure in the development of the iPhone wasn’t Steve Jobs. It was Uncle Sam. Every single one of these twelve key technologies was supported in significant ways by governments—often the American government.
A few of these cases are famous. Many people know, for example, that the World Wide Web owes its existence to the work of Tim Berners-Lee. He was a software engineer employed at CERN, the particle physics research center in Geneva, which is funded by governments across Europe.3 And the Internet itself started as ARPANET—an unprecedented network of computers funded by the U.S. Department of Defense in the late 1960s.4 GPS, of course, was a pure military technology, developed during the Cold War and only opened up to civilian use in the 1980s.5
Other examples are less famous, though scarcely less important.
The fast Fourier transform is a family of algorithms that have made it possible to move from a world where the telephone, the television, and the gramophone worked on analog signals to a world where everything is digitized and can therefore be dealt with by computers such as the iPhone. The most common such algorithm sprang from a flash of insight by the great American mathematician John Tukey. What was Tukey working on at the time? You’ve guessed it: a military application. Specifically, he was serving on President Kennedy’s Science Advisory Committee in 1963, trying to figure out how to detect when the Soviet Union was testing nuclear weapons.6
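To get a rough feel for what the transform actually does, consider a toy sketch in Python using the NumPy library (a purely illustrative example; the tone, the sample rate, and every other number here are arbitrary, and nothing in it is drawn from Tukey’s work or Apple’s code): sample a pure 440 Hz tone, apply the FFT, and read off the dominant frequency.

```python
import numpy as np

# Illustrative sketch: digitize one second of a 440 Hz tone, then use the
# fast Fourier transform to recover its dominant frequency.
sample_rate = 8000                        # samples per second (arbitrary choice)
t = np.arange(0, 1.0, 1.0 / sample_rate)  # one second of sample times
signal = np.sin(2 * np.pi * 440 * t)      # a pure tone, already sampled/digitized

spectrum = np.fft.rfft(signal)                             # frequency content
freqs = np.fft.rfftfreq(len(signal), 1.0 / sample_rate)    # frequency of each bin
dominant = freqs[np.argmax(np.abs(spectrum))]              # strongest frequency

print(f"Dominant frequency: {dominant:.0f} Hz")  # prints roughly 440 Hz
```

Scale that same frequency-domain trick up from a single tone to music, photographs, and radio traffic, and you have the machinery that lets a phone compress, filter, and reconstruct signals fast enough to be useful.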
Smartphones wouldn’t be smartphones without their touchscreens—but the inventor of the touchscreen was an engineer named E. A. Johnson, whose initial research was carried out while he was employed by the Royal Radar Establishment, a stuffily named agency of the British government.7 The work was further developed at CERN—them again. Eventually multitouch technology was commercialized by researchers at the University of Delaware in the United States—Wayne Westerman and John Elias, who sold their company to Apple itself. Yet even at that late stage in the game, governments played their part: Wayne Westerman’s research fellowship was funded by the U.S. National Science Foundation and the CIA.8
Then there’s the girl with the silicon voice, Siri.
Back in 2000, seven years before the first iPhone, the U.S. Defense Advanced Research Projects Agency (DARPA) commissioned the Stanford Research Institute to develop a kind of proto-Siri, a virtual office assistant that might help military personnel do their jobs. Twenty universities were brought into the project, furiously working on all the different technologies necessary to make a voice-activated virtual assistant a reality. Seven years later, the research was commercialized as a start-up, Siri, Inc.—and it was only in 2010 that Apple stepped in to acquire the results for an undisclosed sum.9
As for hard drives, lithium-ion batteries, liquid crystal displays, and semiconductors themselves—there is a similar story to be told. In each case there was scientific brilliance and plenty of private-sector entrepreneurship. But there were also wads of cash thrown at the problem by government agencies—usually U.S. government agencies, and for that matter usually some arm of the U.S. military.10 Silicon Valley itself owes a great debt to Fairchild Semiconductor—the company that developed the first commercially practical integrated circuits. And Fairchild Semiconductor, in its early days, depended on military procurement.11
Of course, the U.S. military didn’t make the iPhone. CERN did not create Facebook or Google. These technologies that so many people rely on today were honed and commercialized by the private sector. But it was government funding and government risk-taking that made all these things possible. That’s a thought to hold on to as we ponder the technological challenges ahead in fields such as energy and biotechnology.
Steve Jobs was a genius, there’s no denying that. One of his remarkable side projects was the animation studio Pixar—which changed the world of movies when it released the digitally animated film Toy Story.
Even without the touchscreen and the Internet and the fast Fourier transform, Jobs might well have created something wonderful. But it would not have been a world-shaking technology like the iPhone. More likely it would have been, like Woody and Buzz, an utterly charming toy.