2 The Exponential Nature of Technology

Since the dawn of mankind, cultures and civilizations have developed many different technologies that have profoundly changed the way people live. At the time of its introduction, each technology changed the lives of individuals, tribes, and whole populations. Change didn’t occur at a constant pace, though. The speed at which new technologies have been developed and adopted has been increasing ever since the first innovations were introduced, hundreds of thousands of years ago.

Technological innovations are probably almost as old as the human species, and it is even possible some non-human ancestors used basic technologies more than 3 million years ago (Harmand et al. 2015). However, the rate at which new technologies have been introduced is far from constant. In prehistoric times, it took tens or even hundreds of thousands of years to introduce a new technology. In the last thousand years, new technologies have appeared at a much faster rate. The twentieth century saw new technologies appear every few years, and since then the pace has increased.

It is in the nature of technology that new developments are based on existing ones and, therefore, that every new significant development takes less time and effort than previous ones. This leads to an ever-increasing rate of technological change that many believe to be exponential over long periods as new and significant technologies are developed and replace the old ones as the engine of change (Bostrom 2014).

Prehistoric Technologies

Although stone and wood tools have been developed many times and are used by a number of species, we may consider that the first major technological innovation created by humans was the discovery (more precisely, the controlled use) of fire. Although the exact date for that momentous event is disputed, it probably happened between the lower and the middle portions of the Paleolithic period, between more than a million years and 400,000 years ago, and thus predated Homo sapiens.

The controlled use of fire must have brought a major change in the habits of prehistoric humans (Goudsblom 1992). Newer generations got used to the idea that meat would be easier to eat if grilled, and that some vegetables would taste better if cooked. Cooking makes food easier to digest and makes it possible to extract more calories from the available food supply (Wrangham 2009). The control of fire may have been the major reason for the increases in the sizes of humans’ brains.

From our point of view, it may seem as if not much happened for thousands of years, until the domestication of plants and animals. That, of course, is not true. Significant cultural changes happened in the intervening years, including what Yuval Harari (2014) has called the cognitive revolution—a change that enabled humans to use language to communicate abstract thoughts and ideas. A large number of major technological breakthroughs happened in the thousands of years before the agricultural revolution. Bows and arrows, needles and thread, axes, and spears all were probably invented more than once and changed the societies of the time. The exact dates of these inventions or discoveries are mostly unknown, but they all happened probably between 50,000 and 10,000 years ago. One major invention every 10,000 years (roughly 400 generations) doesn’t look like a breathtaking pace of change, but one must remember that hundreds of thousands of years elapsed between the time our ancestors descended from the trees and the time when any major changes in their habits occurred.

Roughly 10,000 to 11,000 years ago, humans had spread to most of the continents and had begun to live in villages. A few thousand years later, many cultures had begun to domesticate plants and animals. This represented what can be viewed as the second major technological revolution, after the controlled use of fire. Usually called the agricultural revolution, it deeply changed the way people lived and interacted. The change from the hunter-gatherer nomad way of life to sedentary agriculture-based living wasn’t necessarily for the better. With the population increasing, humans became more dependent on a fixed supply of food, and larger communities led to an increase in the incidence of contagious diseases and to battles for supremacy. Still, these changes must have occurred, in most places, over a large number of generations, and, most likely, they have not caused a profound change in the way of life of any particular individual.

The speed of technological evolution has not stopped increasing since then. Although the exact date is disputed, the first wheel probably appeared in Mesopotamia around 3500 BC. Surprisingly enough, several cultures as recent as the Aztecs, the Mayans, and the Incas didn’t use the wheel—probably owing to a lack of convenient draft animals. However, the majority of occidental and oriental cultures used that sophisticated implement extensively.

Many other inventions aimed at saving physical labor appeared during the past 10,000 years. Technological progress wasn’t uniform in all civilizations or in all parts of the world. Although the exact reasons are disputed, there is a good case to be made that minor differences in the timing of the initial technological developments were exacerbated by further technological advances (Diamond 1997) and led to very different levels of development. Relatively minor environmental differences, such as the availability of draft animals, the relative ease of travel between places, and the types of readily available crops, led to differences in the timing of the introduction of technological developments, such as agriculture, transportation, and weapons. Those differences, in turn, resulted in differences in the timing of the introduction of more advanced developments—developments that were correlated with the more basic technologies, although sometimes in a non-obvious way. Information-based developments such as counting and writing, and other societal advances resulted from technological developments related to the more fundamental technologies. In the end, the cultures that had a head start became dominant, which eventually resulted in the dominance of Western society and culture, supported in large part by superior technological capacity. It isn’t hard to see that civilizations which started later in the technological race were at a big disadvantage. Minor differences in the speed or in the timing of introduction of technological developments later resulted in huge differences. Western civilization ended up imposing itself over the entire known world, with minor exceptions, while other developed civilizations were destroyed, absorbed, or modified in order to conform to Western standards—a process that is still going on.

The First Two Industrial Revolutions

The industrial revolution of the late eighteenth century is the best-known discontinuity in technological development. Technological innovations in industry, agriculture, mining, and transportation led to major changes in the social organization of Britain and, later on, of Europe, North America, and the rest of the world. That revolution marked a major turning point in human history, and changed almost all aspects of daily life, greatly speeding up economic growth. Over the preceding millennia, the world economy had doubled every 900 years or so. After the industrial revolution, the growth rate of the economy increased; economic output now doubles every 15 years (Hanson 2008).

That industrial revolution is sometimes separated into two revolutions, one (which took place in the late eighteenth century) marked by significant changes in the way the textile industry operated and the other (which began sometime in the middle of the nineteenth century) by the development of the steam engine and other transportation and communication technologies, many of them based on electricity. These revolutions led to a profound change in the way consumer products were produced and to widespread use of such communication and transportation technologies as the telephone, the train, the automobile, and the airplane.

The first industrial revolution began in Great Britain, then spread rapidly to the United States and to other countries in Western Europe and in North America. The profound changes in society that it brought are closely linked to a relatively small number of technological innovations in the areas of textiles, metallurgy, transportation, and energy.

Before the changes brought on by the first industrial revolution, spinning and weaving were done mostly for domestic consumption in the small workshops of master weavers. Home-based workers, who did both the spinning and the weaving, worked under contract to merchant sellers, who in many cases supplied the raw materials. Technological changes in both spinning and weaving brought large gains in productivity that became instrumental in the industrial revolution.

A number of machines that replaced workers in spinning, including the spinning jenny and the spinning mule, made the production of yarn much more efficient and cheap. Weaving, which before the revolution had been a manual craft, was automated by the development of mechanical looms, which incorporated many innovations, such as the flying shuttle. This led to a much more efficient textile industry and, with the invention of the Jacquard loom, to a simplified process of manufacturing textiles with complex patterns. The Jacquard loom is particularly relevant to the present discussion because it used punched cards to control the colors used to weave the fabric. Later in the nineteenth century, punched cards would be used in the first working mechanical computer, developed by Charles Babbage.

Whereas the first industrial revolution was centered on textiles and metallurgy, the second was characterized by extensive building of railroads, widespread use of machinery in manufacturing, increased use of steam power, the beginning of the use of oil, the discovery and utilization of electricity, and the development of telecommunications. The changes in transportation technology and the new techniques used to manufacture and distribute products and services resulted in the first wave of globalization, a phenomenon that continues to steamroll individual economies, cultures, and ecologies.

However, even the changes brought by improved transportation technologies pale in comparison with the changes brought by electricity and its myriad uses. The development of technologies based on electricity is at the origin of today’s connected world, in which news and information travel at the speed of light, creating the first effectively global community.

During the first two industrial revolutions, many movements opposed the introduction of new technologies on the grounds that they would destroy jobs and displace workers. The best-known such movement was that of the Luddites, a group of English workers who, early in the nineteenth century, attacked factories and destroyed equipment to protest the introduction of mechanical knitting machines, spinning frames, power looms, and other new technologies. Their supposed leader, Ned Ludd, probably was fictitious, but the word Luddite is still in use today to refer to a person who opposes technological change.

The Third Industrial Revolution

The advent of the Information Age, around 1970, is usually considered the third industrial revolution. The full impact of this revolution is yet to be felt, as we are still in the thick of it. Whether there will be a fourth industrial revolution that can be clearly separated from the third remains an open question. Suggesting answers to this question is, in fact, one of the objectives of this book.

Of particular interest for present purposes are the technological developments related to information processing. I believe that, ultimately, information-processing technologies will outpace almost all other existing technologies, and that in the not-so-distant future they will supersede those technologies. The reason for this is that information processing and the ability to record, store, and transmit knowledge are at the origin of most human activities, and will become progressively more important.

The first human need to process information probably arose in the context of the need to keep accurate data about stored supplies and agricultural production. For a hunter-gatherer, there was little need to write down and pass information to others or to future generations. Writing became necessary when complex societies evolved and it became necessary to write down how many sheep should be paid in taxes or how much land someone owned. It also became necessary to store and transmit the elaborate social codes that were required to keep complex societies working.

Creating a written language is so complex a task that, unlike many other technologies, it has probably evolved independently only a few times. To invent a written language, one must figure out how to decompose a sentence or idea into small units, agree on a unified system with which to write down these units, and, to realize its full value, make sure that the whole thing can be understood by third parties. Although other independently developed writing systems may have appeared (Chinese and Egyptian systems among them), the only commonly agreed upon independent developments of writing took place in Mesopotamia between 3500 BC and 3000 BC and in Mesoamerica around 600 BC (Gaur 1992). It is still an open question whether the writing systems developed in Egypt around 3200 BC and in China around 1200 BC were independent or whether they were derived from the Mesopotamian cuneiform script. Many other writing systems were developed by borrowing concepts developed by the inventors of these original scripts.

It is now believed that the first writing systems were developed to keep track of amounts of commodities due or produced. Later, writing evolved to be able to register general words in the language and thus to be used to record tales, histories, and even laws. A famous early set of laws is the Code of Hammurabi from ancient Mesopotamia, which dates from the seventeenth century BC. One of the oldest and most complex writings ever deciphered, the code was enacted by Hammurabi, king of Babylonia. Partial copies exist on a stone stele and in various clay tablets. The code itself consists of hundreds of laws that describe the appropriate actions to be taken when specific rules are violated by people of different social statuses.

Writing was a crucial development in information processing because, for the first time, it was possible to transmit knowledge at a distance and, more crucially, over time spans that transcended memory and even the lifetime of the writer. For the first time in history, a piece of information could be preserved, improved by others, and easily copied and distributed.

The associated ability to count, record, and process numerical quantities was at the origin of writing and developed in parallel with the written language, leading to the fundamentally important development of mathematics. The earliest known application of mathematics arose in response to practical needs in agriculture, business, and industry. In Egypt and in Mesopotamia, in the second and third millennia BC, math was used to survey and measure quantities. Similar developments took place in India, in China, and elsewhere. Early mathematics had an empirical nature and was based on simple arithmetic.

The earliest recorded significant development took place when the Greeks recognized the need for an axiomatic approach to mathematics and developed geometry, trigonometry, deductive systems, and arithmetic. Many famous Greeks contributed to the development of mathematics. Thales, Pythagoras, Plato, Aristotle, Hippocrates, and Euclid were fundamental to the development of many concepts familiar to us today. The Chinese and the Arabs took up mathematics where the Greeks left it, and came up with many important developments. In addition, the Arabs preserved the work of the Greeks, which was then translated and augmented. In what is now Baghdad, Al-Khowarizmi, one of the major mathematicians of his time, introduced Hindu-Arabic numerals and concepts of algebra. The word algorithm was derived from his name.

Further developments in mathematics are so numerous and complex that they can’t be described properly here, even briefly. Although it took thousands of years for mathematics to progress from simple arithmetic concepts to the ideas of geometry, algebra, and trigonometry developed by the Greeks, it took less than 500 years to develop the phenomenal edifice of modern mathematics.

One particular aspect of mathematics that deserves special mention here is the theory of computation. Computation is the process by which some calculation is performed in accordance with a well-defined model described by a sequence of operations. Computation can be performed using analog or digital devices. In analog computation, some physical quantity (e.g., displacement, weight, or volume of liquid) is used to model the phenomena under study. In digital computation, discrete representations of the phenomena under study, represented by numbers or symbols, are manipulated in order to yield the desired results.

A number of devices that perform analog computation have been developed over the centuries. One of the most remarkable—the Antikythera mechanism, which has been dated to somewhere between 150 and 100 BC—is a complex analog computer of Greek origin that uses a complex set of interlocked gears to compute the positions of the sun, the moon, and perhaps other celestial bodies (Marchant 2008). Analog computation has a long tradition and includes the astrolabe, attributed to Hipparchus (c. 190–120 BC), and the slide rule, invented by William Oughtred in the seventeenth century and based on John Napier’s concept of logarithms. A highly useful and effective tool, the slide rule has been used by many engineers still alive today, perhaps even by some readers of this book.

The twentieth century saw many designs and applications of analog computers. Two examples are the Mark I Fire Control Computer, which was installed on many US Navy ships, and the MONIAC, a hydraulic computer that was created in 1949 by William Phillips to model the economy of the United Kingdom (Bissell 2007; Phillips 1950). Analog computers, however, have many limitations in their flexibility, mainly because each analog computer is conceived and built for one specific application. General-purpose analog computers are conceptually possible (and the slide rule is a good example), but in general analog computers are designed to perform specific tasks and cannot be used for other tasks.

Digital computers, on the other hand, are universal machines. By simply changing the program such a computer is executing, one can get it to perform a variety of tasks. Although sophisticated tools to help with arithmetic operations (such as the abacus, developed around 500 BC) have existed for thousands of years, the idea of completely automatic computation didn’t appear until much more recently. Thomas Hobbes—probably the first to present a clearly mechanistic view of the workings of the human brain—clearly thought of the brain as nothing more than a computer:

When a man reasoneth, he does nothing else but conceive a sum total, from addition of parcels; or conceive a remainder, from subtraction of one sum from another… These operations are not incident to numbers only, but to all manner of things that can be added together, and taken one out of another. (Hobbes 1651)

However, humankind had to wait until the nineteenth century for Charles Babbage to create the first design of a general-purpose digital computer (the Analytical Engine), and for his contemporary Ada Lovelace to take the crucial step of understanding that a digital computer could do much more than merely “crunch numbers.”

The Surprising Properties of Exponential Trends

It is common to almost all areas of technology that new technologies are introduced at an ever-increasing rate. For thousands of years progress is slow, and for many generations people live pretty much as their parents and grandparents did; then some new technology is introduced, and a much shorter time span elapses before the next technological development takes place. Reaching a new threshold in a fraction of the time spent to reach the previous threshold is characteristic of a particular mathematical function called the exponential.

In mathematics, an exponential function is a function of the form a^n. Because a^(n+1) = a × a^n, the value of the function at the point n + 1 is equal to the value of the function at point n multiplied by the base of the exponential, a. For n = 0, the function takes the value 1, since raising any number (other than zero) to the power 0 gives the value 1.

Of interest here is the case in which a, the base, is greater than 1. It may be much greater or only slightly greater. In the former case, the exponential will grow very rapidly, and even for small n the function will quickly reach very large values. For instance, the function 10^n will grow very rapidly even for small n. This is intuitive, and it shouldn’t surprise us. What is less intuitive is that a function of the form a^n will grow very rapidly even when the base is a small number, such as 2, or even a smaller number, such as 1.1.
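As a quick illustration (not from the text), a few lines of Python show how a base barely above 1 eventually runs away:

```python
# Growth of 1.1**n: unremarkable at first, then explosive.
for n in (10, 100, 200):
    print(f"1.1**{n} = {1.1 ** n:,.1f}")
```

After 10 steps the function has not yet tripled; after 100 it exceeds 13,000, and after 200 it is in the hundreds of millions.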

I will discuss two types of exponential growth, both of them relevant to the discussions that lie ahead. The first type is connected with the evolution of a function that grows exponentially with time. The second is the exponential growth associated with the combinatorial explosion that derives from the combination of simple possibilities that are mutually independent.

The first example I will use to illustrate the surprising properties of exponential trends will be the well-known history of the inventor of chess and the emperor of China. Legend has it that the emperor became very fond of a game that had just been invented—chess. He liked the game so much that he summoned the inventor to the imperial court and told him that he, the emperor of the most powerful country in the world, would grant him any request. The inventor of chess, a clever but poor man, knew that the emperor valued humility in others but didn’t practice it himself. “Your Imperial Majesty,” he said, “I am a humble person and I only ask for a small compensation to help me feed my family. If you are so kind as to agree with this request, I ask only that you reward me with one grain of rice for the first square of the chessboard, two grains for the second, four grains for the third square, and so on, doubling in each square until we reach the last square of the board.” The emperor, surprised by the seemingly modest request, asked his servants to fetch the necessary amount of rice. Only when they tried to compute the amount of rice necessary did it become clear that the entire empire didn’t generate enough rice.

In this case, one is faced with an exponential function with base 2. If the number of grains of rice doubles for every square in relation to the previous square, the total number of grains will be

2^0 + 2^1 + 2^2 + 2^3 + … + 2^63 = 2^64 − 1,

since there are 64 squares on the chessboard. This is approximately equal to 2 × 10^19 grains of rice, or, roughly, 4 × 10^14 kilograms of rice, since there are roughly 50,000 grains of rice per kilogram. This is more than 500 times the yearly production of rice in today’s world, which is approximately 600 million tons, or 6 × 10^11 kilograms. It is no surprise the emperor could not grant the request of the inventor, nor is it surprising he was deceived into thinking the request was modest.
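The chessboard arithmetic is easy to check directly; the sketch below uses the figures assumed in the text (50,000 grains per kilogram, 6 × 10^11 kilograms of yearly world production):

```python
# Total grains: 2**0 + 2**1 + ... + 2**63 = 2**64 - 1
grains = sum(2 ** k for k in range(64))

GRAINS_PER_KG = 50_000        # rough figure from the text
WORLD_PRODUCTION_KG = 6e11    # ~600 million tons per year

kilograms = grains / GRAINS_PER_KG
print(f"{grains:.2e} grains, about {kilograms:.1e} kg")
print(f"roughly {kilograms / WORLD_PRODUCTION_KG:.0f} years of world production")
```

The total works out to more than 600 years of today’s entire rice harvest.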

The second example is from Charles Darwin’s paradigm-changing book On the Origin of Species by Means of Natural Selection (1859). In the following passage from that work, Darwin presents the argument that the exponential growth inherent to animal reproduction must be controlled by selective pressures, because otherwise the descendants of a single species would occupy the entire planet:

There is no exception to the rule that every organic being naturally increases at so high a rate that if not destroyed, the Earth would soon be covered by the progeny of a single pair. … The Elephant is reckoned to be the slowest breeder of all known animals, and I have taken some pains to estimate its probable minimum rate of natural increase: it will be under the mark to assume that it breeds when thirty years old, and goes on breeding till ninety years old, bringing forth three pairs of young in this interval; if this be so, at the end of the fifth century there would be alive fifteen million elephants, descended from the first pair.

Even though Darwin got the numbers wrong (as Lord Kelvin soon pointed out), he got the idea right. If one plugs in the numbers, it is easy to verify there will be only about 13,000 elephants alive after 500 years. If one calls a period of 30 years one generation, the number of elephants alive in generation n is given by a_n = 2a_(n−1) − a_(n−3). This equation implies that the ratio of the number of elephants alive in each generation to the number in the previous generation rapidly converges to the golden ratio, 1.618.

The surprising properties of exponential growth end up vindicating the essence of Darwin’s argument. In fact, even though only 14 elephants are alive after 100 years, there would be more than 30 million elephants alive after 1,000 years, and after 10,000 years (an instant in time, by evolutionary standards) the number of live elephants would be 1.5 × 10^70. This number compares well with the total number of particles in the universe, estimated to be (with great uncertainty) between 10^72 and 10^87. In this case, the underlying exponential function a^n has a base a equal to approximately 1.618 (if one measures n in generations) or 1.016 (if one measures n in years). Still, over a long period of time, the growth surprises everyone but the most prepared reader.
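The convergence to the golden ratio can be verified numerically. The sketch below iterates the recurrence a_n = 2a_(n−1) − a_(n−3); the starting values 1, 2, 3 are illustrative, not the elephant counts used in the text, since the ratio converges to the same limit for any non-degenerate start:

```python
# Iterate a_n = 2*a_{n-1} - a_{n-3}; the ratio of consecutive terms
# converges to the golden ratio, the dominant root of x**3 - 2*x**2 + 1,
# which factors as (x - 1)*(x**2 - x - 1).
seq = [1, 2, 3]   # illustrative initial values
for _ in range(40):
    seq.append(2 * seq[-1] - seq[-3])

phi = (1 + 5 ** 0.5) / 2
ratio = seq[-1] / seq[-2]
print(ratio, phi)   # both approximately 1.618
```

With these particular starting values the sequence happens to reproduce the Fibonacci numbers, which makes the golden-ratio limit especially visible.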

A somewhat different example of the strange properties of exponential behavior comes from Jorge Luis Borges’ short story “The Library of Babel.” Borges imagines a library that contains an unthinkably large number of books. Each book has 410 pages, each page 40 lines, and each line 80 letters. The Library contains all the books of this size that can be written with the 25 letters of the particular alphabet used. Any book of this size that one can imagine is in the library. There exists, necessarily, a book with the detailed story of your life, from the time you were born until the day you will die, and there are innumerable translations of this story in all existing languages and in an unimaginable number of non-existing but perfectly coherent languages. There is a book containing everything you said between your birth and some day in your life (at which the book ends, for lack of space); another book picks up where that book left off, and so on; another contains what will be your last words. Regrettably, these books are very hard to find, not only because the large majority of books are gibberish but also because you would have no way of telling these books apart from very similar books that are entirely true from the day you were born until some specific day, and totally different from then on.

The mind-boggling concept of the Library of Babel only loses some of its power to confound you when you realize that a library of all the books that can ever be written (of a given size) could never exist, nor could any approximation to it ever be built. The number of books involved is so large as to defy any comparison. The size of the observable universe, measured in cubic millimeters (about 10^90 of them), is no match even for the number of different lines of 80 characters that can exist (about 10^112), much less for the 10^1,834,097 books in the Library of Babel.
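These exponents follow directly from Borges’ specifications. Each book contains 410 × 40 × 80 symbols, each drawn from a 25-letter alphabet, so the library holds 25^1,312,000 books; taking the base-10 logarithm gives the number of decimal digits:

```python
from math import log10

PAGES, LINES, CHARS, ALPHABET = 410, 40, 80, 25
symbols_per_book = PAGES * LINES * CHARS        # 1,312,000 symbols per book

line_digits = CHARS * log10(ALPHABET)           # distinct 80-character lines
book_digits = symbols_per_book * log10(ALPHABET)  # distinct books

print(f"distinct lines ~ 10**{line_digits:.0f}")
print(f"distinct books ~ 10**{book_digits:,.0f}")
```

The computation reproduces the figures in the text: about 10^112 lines and about 10^1,834,097 books.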

Even the astoundingly large numbers involved in the Library of Babel pale in comparison with the immense number of combinations that can be encoded in the DNA of organisms. We now know that the characteristics of human beings, and those of all other known living things, are passed down the generations encoded in the DNA present in each and every cell. The human genome consists of two copies of 23 chromosomes, in a total of approximately 2 × 3 billion DNA bases (2 × 3,036,303,846 for a woman, and a bit less for a man because the Y chromosome is much smaller than the X chromosome). Significantly more than 99 percent of the DNA bases in the human genome are exactly the same in all individuals. The remaining bases code for all the variability present within the species. The majority of the differences between individual human genomes are Single Nucleotide Polymorphisms (SNPs, often pronounced “snips”)—locations in the DNA where different people often have different values for a specific DNA base. Since there are two copies of each chromosome (except for the X and Y chromosomes), three different combinations are possible. For instance, in a SNP in which both the base T and the base G occur, some people may have a T/T pair, others a T/G pair, and yet others a G/G pair. SNPs in which more than two bases are present are relatively rare and can be ignored for the purposes of our rough analysis. Although the exact number of SNPs in the human genome is not known, it is believed to be a few tens of millions (McVean et al. 2012). One of the first projects to sequence the human genome identified about 2 million SNPs (Venter et al. 2001). Assuming that this number is a conservative estimate and that all genomic variation is due to SNPs, we can estimate the total number of possible combinations of SNPs, and therefore of different human genomes, that may exist. 
We can ignore the slight complication arising from the fact that each one of us has two copies of each chromosome, and consider only that, for each SNP, each human may have one of three possible values. We then obtain the value of 3^2,000,000 (roughly 10^954,000) different possible arrangements of SNPs in a human genome, and with the tens of millions of SNPs now believed to exist the count is vastly larger still. This means that, owing to variations in SNPs alone, the number of different humans that could exist is comparable to, and for the larger SNP counts dwarfs, even the unimaginably large number of books in the Library of Babel.
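The comparison can be made concrete with base-10 logarithms. The sketch uses the text’s figures: 3 genotypes per SNP, with either 2 million SNPs (the Venter et al. count) or 20 million (an assumed value in the “tens of millions” range):

```python
from math import log10

babel_digits = 410 * 40 * 80 * log10(25)   # books in the Library of Babel

for n_snps in (2_000_000, 20_000_000):
    snp_digits = n_snps * log10(3)         # 3**n_snps as a power of 10
    print(f"{n_snps:,} SNPs -> ~10**{snp_digits:,.0f} possible genomes")

print(f"Library of Babel -> ~10**{babel_digits:,.0f} books")
```

With 2 million SNPs the number of genomes (about 10^954,000) is on the same staggering scale as the library; with 20 million SNPs it exceeds the number of books by millions of orders of magnitude.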

Of course, the potential combinations are much more numerous if one considers all the possible genetic combinations of bases in a genome of such a size, and not only the SNPs that exist in humans. Think of the space of 4^3,000,000,000 possible combinations of 3 billion DNA bases, and imagine that, somewhere, lost in the immense universe of non-working genomes, there are genomes encoding for all sorts of fabulous creatures—creatures that have never existed and will never be created or even imagined. Somewhere, among the innumerable combinations of bases that don’t correspond to viable organisms, there are an indescribably large number of genetic encodings for all sorts of beings we cannot even begin to imagine—flying humans, unicorns, super-intelligent snake-like creatures, and so on.

Here, for the sake of simplicity, let us consider only the possible combinations of four bases in each copy of the genome, since the extent of this genomic space is so large and so devoid of meaning that any more precise computation doesn’t make sense. In particular, diploid organisms, such as humans, have two copies of each chromosome, and therefore one may argue that the size of the space would be closer to 4^6,000,000,000. We will never know what is possible in this design space, since we do not have and never will have the tools that would be needed to explore this formidable universe of possibilities. With luck, we will skim the surface of this infinitely deep sea when the tools of synthetic biology, now being developed, come of age. Daniel Dennett has called this imaginary design space the Library of Mendel, and a huge library it is, so large that it makes the Library of Babel look tiny by comparison (Dennett 1995).
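The scale of the Library of Mendel can be estimated the same way as the earlier examples, taking 4 choices per base over 3 billion bases (or 6 billion for the diploid variant):

```python
from math import log10

# Every sequence of an n-base genome, with 4 possible bases per position.
for n_bases in (3_000_000_000, 6_000_000_000):
    digits = n_bases * log10(4)
    print(f"4**{n_bases:,} ~ 10**{digits:,.0f}")
```

The haploid count alone is about 10^1,800,000,000, roughly a thousand times more digits in the exponent than the Library of Babel’s 10^1,834,097 books.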

I hope that by now you have been convinced that exponential functions, even those that appear to grow very slowly at first, rapidly reach values that defy our intuition. It turns out that the growth of almost every aspect of technology is well described by a curve that is intrinsically exponential. This exponential growth has been well documented in some specific areas. In 1946, R. Buckminster Fuller published a diagram titled “Profile of the Industrial Revolution as exposed by the chronological rate of acquisition of the basic inventory of cosmic absolutes—the 92 Elements”; it showed the exponential nature of the process clearly (Fuller and McHale 1967). In his 1999 book The Age of Spiritual Machines, Ray Kurzweil proposed a “law of accelerating returns” stating that the rate of change in a wide variety of evolutionary systems, including many technologies, tends to increase exponentially. A better-known and more quantitative demonstration of this exponential process is Moore’s Law, which states that the number of transistors that can be fitted on a silicon chip doubles at regular intervals of roughly two years (a phenomenon I will discuss in more detail in the next chapter).
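To see how quickly periodic doubling compounds, consider a minimal sketch of Moore's-Law-style growth. The two-year doubling period used here is the commonly cited figure, adopted as a working assumption; the exact period has varied over the law's history.

```python
def growth_factor(years, doubling_period=2.0):
    # Multiplicative growth after `years` of doubling
    # once every `doubling_period` years.
    return 2 ** (years / doubling_period)

print(growth_factor(10))   # one decade: 32x
print(growth_factor(50))   # half a century: 2^25, about 33.5 million-fold
```

A steady 32-fold gain per decade looks modest at first, yet over fifty years it compounds to a factor of more than 33 million, which is why such growth so reliably defeats linear intuition.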

It is not reasonable to expect the exponential growth that characterizes many technological developments to have been constant throughout the history of humanity. Significant technological changes, such as the agricultural revolution and the industrial revolution, have probably changed the base of the exponential, in almost all cases, if not all, in the direction of faster growth.

The exponential growth that is characteristic of many technologies is what leads us to systematically underestimate the state of the art of technology in the near-to-mid-term future. As we will see in the following chapters, many technologies are progressing at an exponential pace. In the next few decades, this pace will change the world so profoundly as to make us truly unable to predict the way of life that will be in place when our grandchildren reach our current age.

Digital technologies will play a central role in the changes to come. Just as looms, tractors, engines, and robots changed the way work was done in the fields and in the factories, digital technology will continue to change the way we perform almost all the tasks of our daily lives. With time, digital technologies will make many other technologies less relevant or even obsolete, just as they have already made typewriters and telegrams things of the past. Many professions and jobs will also become less necessary or less numerous, as has already happened with typists, bank tellers, and newspaper boys.

This book is an attempt to make an educated guess at what these developments will look like and how they will affect our society, our economy, and, ultimately, our humanity.

A Generation Born with the Computer Age

I was born with the computer age, and I belong to the first generation to design, build, program, use, and understand computers. Yet even I, who should be able to understand the fads and trends brought upon us by the ever-increasing pace of technological change, sometimes feel incapable of keeping up with them. Despite this limitation, I attempt to envision the possible developments of technology and, in particular, the developments in the convergence of computer and biomedical technologies.

It is now clear that computers, or what will come after them, will not only replace all the information delivery devices that exist today, including newspapers, television, radio, and telephones, but will also deeply change the ways we address such basic needs as transportation, food, clothing, and housing. As we come to understand better the way living beings work, we will become able to replace ever more complex biological systems with artificial or synthetic substitutes. Our ability to change our environment and, ultimately, our bodies will profoundly affect the way we live our lives. Already we see signs of this deep change in the ways young people live and interact. We may miss the old times, when people talked instead of texting or walked in the park instead of browsing the Web, but, like it or not, these changes are just preludes to things to come. This may not be as bad as it seems. An ordinary person in the twenty-second century will probably have more freedom, more choices, and more ability to create new things than anyone alive today. He or she will have access to knowledge and to technologies of which we don’t even dream. The way someone in the twenty-second century will go about his or her daily life will be, however, as alien to us as today’s ways would be to someone from the early nineteenth century.

This book is an attempt to give you a peek at things to come. I cannot predict what technology will bring in 100 years, and I don’t think anyone can. But I can try to extrapolate from existing technologies in order to predict how some things may turn out 100 years from now.

For the convenience of the reader, I will not jump forward 100 years in a single step. I will begin by considering the recent and not-so-recent history of technological developments, the trends, and the current state of the technology in a number of critical areas; I will then extrapolate them to the near future. What can reasonably be predicted to be achievable in the near future will give us the boldness required to guess what wonders the ever-increasing speed of technological development will bring. The future will certainly be different from how we may guess it will be, and even more unfamiliar and alien than anything we can predict today. We are more likely to be wrong for being too conservative than for being too bold.

I will begin with the technologies that led to the current digital revolution—a revolution that began with the discovery of electricity and gained pace with the invention of a seemingly humble piece of technology: the transistor, probably the most revolutionary technology ever developed by mankind.