7
Conclusion
Given all the press coverage and publicity, it seems appropriate to end this narrative with the rise of Facebook and Twitter, so long as we do not lose sight of the general themes that have driven computing and computer technology from their origins. Foremost among them is the force that drove the invention of the electronic computer in the first place: numerical calculation. Traditional number crunching, such as solving differential equations as the ENIAC did, remains a primary use of large computers in the scientific world. The early computers presented their results in numerical form, as tables of numbers; scientists can increasingly see the results in graphical form as well. Complementing the number-processing capabilities of modern mainframes is an ability to handle very large sets of data, using ever-increasing capacities of disk storage and database software. This is an important point: behind many apps on a smart phone is a complex network of servers, routers, databases, fiber-optic cables, satellites, mass storage arrays, and sophisticated software.
Likewise, the current fascination with portable devices and Facebook should not obscure the continuing use of computers for business, the market that IBM exploited so well with its System/360 line in the late 1960s. Facebook may be the obsession right now, but corporations still use computers for prosaic applications like payroll and inventory, in many cases still programming in COBOL. Companies like FedEx employ networks of large mainframes to schedule their work and track shipments on the Internet. Customers can see where their package is by looking at their smart phone; behind that simple display the company operates a network whose complexity is orders of magnitude greater than what IBM and its competitors offered a few decades ago. The federal government, including the Department of Defense, remains a huge user of computers of all sizes.
ARPA (now called DARPA) continues to drive innovation, although it may be hard-pressed to come up with something like the Internet again. With DARPA’s support, the Defense Department has a counterpart to the Apple iPhone: small microprocessor-based devices embedded in weapons systems such as unmanned aerial vehicles (UAVs), which use satellite data, on-board inertial guidance, and robotic vision to fly autonomously and hit remote targets. These are seamlessly linked to human commanders in the field. Some of these weapons are adapted from devices developed for consumer use, a reversal of the initial one-way flow of innovation from DARPA to the civilian world. Intelligent, autonomous robotic weapons raise philosophical and moral questions about warfare that were once the province of science-fiction writers. This future is here, now.
We conclude by returning briefly to the four themes I identified at the outset, which I argue give a structure to the topic of computers and computing.
The digital paradigm has proved itself again and again as traditional methods of information handling and processing give way to manipulating strings of bits. It began with the pocket calculator replacing the slide rule. A few years ago, we saw the demise of film photography and the rise of digital cameras. Vinyl, an analog recording medium, has made a surprising comeback among music fans, but the current trend of downloading individual songs as digital files is not going to vanish. Airplanes are designed on computer screens and are flown by digital “fly-by-wire” electronics. After a few false starts, the e-book has established itself. A book is now a string of bits, delivered over a wireless network to an e-book reader. Traditional books will not disappear, but will they occupy a niche like vinyl records? The list will go on.
The convergence of communications, calculation, data storage, and control also goes on, although with a twist. The handheld device can do all these things, but its communication function—voice telephone calls, texting, or Internet access—is split among different networks. Some of this is a product of the political and regulatory structure of the United States, and one finds a different mix in other countries. Future developments in this arena may depend as much on the U.S. Congress or the Federal Communications Commission as on what comes from Apple’s galactic headquarters in Cupertino, California.
Moore’s law, the shorthand name for the steady advance of microelectronics, continues to drive computing. A recent study by the National Research Council hinted that although memory capacity continues to increase exponentially, the speed of microprocessors is not keeping pace.1 The question of whether Moore’s law illustrates the concept of technological determinism, so anathema to historians, remains a contested topic. The development of the personal computer refutes the deterministic thesis. So does the sudden rise of Facebook, even if such a program requires a large infrastructure of silicon to be practical.
The fourth theme, the user interface, also remains central even as it evolves. No one disputes the importance of clean, uncluttered design, a quality often cited in the success of Apple Computer’s products and of the most-visited Web sites, like Google. How do these issues relate to the design of antiaircraft weapons during World War II? That was when the discipline of operations research emerged. Operations research was a response to the realization that an engineer’s job was not finished once a machine was designed; he or she then had to fit that device into a human context—in that case, one of newly recruited soldiers and sailors who had little background in advanced technology yet who were being asked to operate sophisticated radar and other electronic devices.2 The context has changed, but the underlying issue remains: computers are effective when their operation—and programming—is pushed down to the user level, whoever that user may be. And the plasticity of computers, the very quality that sets them apart from other machines, will always mean that their use will never be intuitive or obvious. As with the other themes, this one too is evolving rapidly. It is startling to be reminded, for example, that many people today access computers without using a mouse, never mind a QWERTY keyboard.
A kind of Zeno’s paradox applies here: we can never fully understand computers and computing, at least not until innovation in the field slows or stops. Perhaps if Moore’s law slows down, software writers will have time to catch up with cleaner and less buggy programming. The only certainty is that the next few decades of the twenty-first century will be as dynamic and effervescent as the decades since 1945.