1 The Red Queen’s Race

To a casual observer, computers, cells, and brains may not seem to have much in common. Computers are electronic devices, designed by humans to simplify and improve their lives; cells, the basic elements of all living beings, are biological entities crafted by evolution; brains are the containers and creators of our minds, with their loads of hopes, fears, and desires.

However, computers, cells, and brains are, in one way or another, simply information-processing devices. Computers represent the latest way to process information, in digital form. Before them, information processing was done by living organisms, in effect creating order out of chaos. Computers, cells, and brains are the results of complex physical and chemical processes as old as the universe, the most recent products of evolution, the winners of a race started eons ago.

Everything Is Connected

We are made of the atoms created in the Big Bang, 14 billion years ago, or in the explosions of distant stars that occurred for billions of years after that event, which signaled the beginning of time. At every breath, every one of us inhales some of the very same oxygen molecules that were in the last breath of Julius Caesar, reused over the centuries to sustain all life on Earth. We know now that the wing beat of a butterfly may indeed influence the path of a hurricane many months later, and that the preservation of one species may be critically dependent on the life of all other species on Earth. The evolutionary process that started about 4 billion years ago has led to us, and to almost everything surrounding us, including the many devices and tools we own and use.

This book touches many different areas I view as sharing strong connections. It covers computers, evolution, life, brains, minds, and even a bit of physics. You may view these areas as disjoint and unrelated. I will try to show you that they are connected and that there is a common thread connecting physics, computation, and life. I am aware that the topic of each of the chapters of this book fully deserves a complete treatise by itself (in some cases, several treatises). I hope the coherence of the book is not affected by the fact that each of the many areas covered is touched upon only lightly, barely enough to enable the reader to understand the basic principles.

In a way, everything boils down to physics. Ernest Rutherford supposedly said that “science is either physics or stamp collecting,” meaning that the laws of physics should be sufficient to explain all phenomena and that all other sciences are no more than different abstractions of the laws of physics. But physics cannot be used to directly explain or study everything. Computation, biology, chemistry, and other disciplines are necessary to understand how the universe works. They are not unrelated, though. The principles that apply to one of them also apply to the others.

This realization that everything is related to everything else is, in large part, a result of our improved understanding of the world and our ability to master technology. Only relatively recently have advances in science made it possible for us to understand that the same equations describe the light we receive from the sun, the behavior of a magnet, and the workings of the brain. Until only a few hundred years ago, those were all independent realities—separate mysteries, hidden from humanity by ignorance.

Technology, by making use of the ever-improving understanding provided by science, has been changing our lives at an ever-increasing pace for thousands of years, but the word technology is relatively new. Johann Beckmann, a German scholar, deserves credit for coining the word (which means “science of craft”) in 1772, and for somehow creating the concept (Kelly 2010). Before Beckmann, the various aspects of technology were individually known as tools, arts, and crafts. Beckmann used the word technology in a number of works, including a book that was later translated into English as Guide to Technology, or to the knowledge of crafts, factories and manufactories. However, the word was rarely if ever used in common language before the twentieth century. In the second half of the twentieth century, use of the word in common language increased steadily. Today it is a very common word in political, social, and economic texts.

With technology came the idea that innovation and novelty are intrinsic components of civilization. Constant changes in technologies, society, and economics are so ingrained in our daily lives that it is hard to grasp that this state of affairs was not always the rule. A few hundred years ago, change was so slow that most people expected the future to be much like the past; the idea that the future would bring improvements to people’s lives was neither common nor popular. All that changed when changes began to occur often enough to be not only perceptible but expected. Since the advent of technology, people have expected the future to bring new things that improve their daily lives. However, many of us now fear that the changes may come too fast, and be too profound, for ordinary people to assimilate.

Old Dogs, New Tricks

If you are more than a few decades old, you probably feel that technology is changing faster than you can keep up with, and that new devices and fads of dubious interest appear every day. The things young people do and use today are increasingly foreign to their elders, and it is difficult to keep up with new trends, tools, and toys. Young children are better than you with a smartphone, are more at ease with computers, play games you don’t understand, and flock to new sites that hold little interest for you. They don’t even know what a VCR is—that miracle technology of the 1980s, which was famously difficult to operate. Even CDs and DVDs—digital technologies that emerged in the last decades of the twentieth century—seem to be on their way out, having lasted only a few decades (significantly less time than the vinyl records and film reels they replaced, which lasted almost a hundred years).

If you were born in the present century, on the other hand, you don’t understand how your parents and grandparents can have such a hard time with technological innovations. New ideas and gadgets come naturally to you, and you feel at home with the latest app, website, or social network.

Yet reasonably literate and even technologically sophisticated people of my generation don’t feel at all ready to give up on advancing technology. Our generation invented computers, cell phones, the World Wide Web, and DNA sequencing, among many other things. We should be able to understand and use any new things technology might throw at us in the next decades.

I believe that technology will keep changing at an ever-increasing pace. As happened with our parents and our grandparents before us, our knowledge about technology is likely to become rapidly obsolete, and it will be difficult to understand, use, and follow the technological developments of coming decades. There is some truth to the saying “you can’t teach an old dog new tricks.”

Whether our children and grandchildren will follow the same inevitable path to obsolescence will depend on how technology continues to change. Will new technological developments keep coming, faster and faster? Or are we now in a golden age of technological development, a time when things are changing as rapidly as they ever will?

When I was eight years old, I enjoyed visiting my grandfather. He lived in a small village located in what I then believed to be a fairly remote place, about sixty miles from Lisbon, the capital of Portugal. The drive from my house to the village where he lived would take us half a day, since the roads were narrow and winding. He would take me with him, to work in the fields, on a little wagon pulled by a horse. To a little boy, being able to travel on a horse-drawn wagon was quite exciting—a return to the old days. Other things would reinforce the feeling of traveling back to the past. There were no electric lights in his house, no refrigerator, no television, and no books. Those “modern” technologies weren’t deemed necessary, as my grandparents lived in a centuries-old fashion.

By comparison, my parents, who left their village when they married, were familiar with very advanced technologies. They were literate, had a TV, and even owned a car. To my grandparents, those technologies never meant much. They had no interest in TV, newspapers, or books, most of which reported or referred to a reality so remote and so removed from their daily experience that it meant nothing to them. Cars, trains, and planes didn’t mean a lot to people who rarely traveled outside their small village and never felt the desire to do so.

Forty years later, my parents still have a TV, still read books, and still drive around in a car. I would have thought that, from their generation to ours, the technological gap would have become much smaller. I would have imagined that, forty years later, they would be much closer to my generation than they were to the previous generation in their ability to understand technology. However, such is not the case, and the gap seems to widen with each generation. My parents, who are now over 80, never quite realized that a computer is a universal machine that can be used to play games, to obtain information, or to communicate, and they never understood that a computer is just a terminal in a complex network of information devices that can deliver targeted information when and where one needs it. Many people of their generation never understood that—except for minor inconveniences caused by technological limitations that will soon disappear—there is no reason why computers will not replace books, newspapers, radio, television, and almost every other device for the delivery of information.

You may think that my generation understands what a computer can do and is not going to be so easily outpaced by technological developments. After all, almost all of us know exactly what a computer is, and many of us even know how a computer works. That knowledge should give us some confidence that we will not be overtaken by new developments in technology, as our parents and our grandparents were. However, this confidence is probably misplaced, mostly because of the Red Queen effect, the name of which is inspired by the character in Lewis Carroll’s masterpiece, Through the Looking-Glass: “Now, here, you see, it takes all the running you can do, to keep in the same place.”

The Red Queen effect results from the fact that, as evolution proceeds, organisms must become ever more sophisticated, not to gain competitive advantage, but merely to stay alive, because the other organisms in the system are constantly evolving and becoming more competitive.

Although the Red Queen effect has to do with evolution and the competition between species, it can be equally well applied to any other environment in which competition results in rapid change—for example, business or technology.

I anticipate that, in the future, each generation will be more dramatically outpaced than the generation before it. Thirty years from now, we will understand even less about the technologies of the day than our parents understand about today’s technologies. I believe this process of generational obsolescence will inevitably continue to accelerate, generation after generation, and that even the most basic concepts of daily life in the world of a hundred years from now would be alien to members of my generation.

Arthur C. Clarke’s third law states that any sufficiently advanced technology is indistinguishable from magic. A hundred years from today, technology may be so alien to anyone alive today as to look like magic.

From Computers and Algorithms to Cells and Neurons

New technologies are nothing new. However, never before have so many technological innovations appeared in such a short period of time. In coming decades, we will continue to observe the rapid development and convergence of a number of technologies that, until recently, were viewed as separate.

The first of these technologies, which is recent but has already changed the world greatly, is computing technology, which was made possible by the development of electronics and computer science. Computers are now so pervasive that, to many of us, it is difficult to imagine a world without them. Computers, however, are useful only because they execute programs, which are nothing more than implementations of algorithms. Algorithms are everywhere, and they are the ultimate reason for the existence of computers. Without algorithms, computers would be useless.

Developers of algorithms look for the best way to tell computers how to perform specific computations efficiently and correctly. Algorithms are simply very detailed recipes—sequences of small steps a computer executes to obtain some specific result. One well-known example of an algorithm is the algorithm of addition we all learned in school. It is a sequence of small steps that enables anyone following the recipe to add any two numbers, no matter how large. This algorithm is at the core of each modern computer and is used in every application of computers.
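The schoolbook recipe described above can be written out as a short program. This is an illustrative sketch of my own, not code from the text; the function name `add_digits` is invented for the example. It adds two non-negative integers column by column, exactly as one does on paper:

```python
def add_digits(a, b):
    """Schoolbook addition: add two non-negative integers digit by digit,
    right to left, carrying when a column sum reaches 10."""
    x, y = str(a), str(b)
    width = max(len(x), len(y))
    x, y = x.zfill(width), y.zfill(width)  # pad so the columns line up
    carry, digits = 0, []
    for i in range(width - 1, -1, -1):
        column = int(x[i]) + int(y[i]) + carry
        digits.append(str(column % 10))  # digit written in this column
        carry = column // 10             # carry passed to the next column
    if carry:
        digits.append(str(carry))
    return int("".join(reversed(digits)))

print(add_digits(478, 964))  # → 1442
```

Because the loop handles one column at a time, the same few steps add numbers of any length, which is precisely what makes it an algorithm rather than a memorized table of sums.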

Algorithms are described to computers using some specific programming language. The algorithms themselves don’t change with the programming language; they are merely sequences of abstract instructions that describe how to reach a certain result. Algorithm design is, in my view, one of the most elegant and fascinating fields of mathematics and computer science.

Algorithms are always developed for specific purposes. There are many areas of application of algorithms, and two of these areas will play a fundamental role in the development of future technologies.

The first of these areas is machine learning. Machine learning algorithms enable computers to learn from experience. You may be convinced that computers don’t learn and that they do only what they are explicitly told to do, but that isn’t true. There are many ways in which computers can learn, and we use the ability of computers to learn when we watch TV, search the Web, use a credit card, or talk on a phone. In many cases the actual learning mechanisms are hidden from view, but learning takes place nonetheless.
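As a minimal, hypothetical illustration of what “learning from experience” means, the sketch below is never told the rule behind the data (here, y = 3x); it infers the coefficient from example pairs alone by repeatedly predicting, measuring its error, and adjusting. The data, learning rate, and variable names are invented for the example:

```python
# Hypothetical training data generated by the hidden rule y = 3x.
examples = [(1, 3), (2, 6), (3, 9), (4, 12)]

w = 0.0                      # initial guess for the unknown coefficient
for _ in range(100):         # repeat: predict, measure error, adjust
    for x, y in examples:
        error = w * x - y
        w -= 0.01 * error * x   # nudge w to reduce the squared error

print(round(w, 2))  # → 3.0
```

The program ends up “knowing” that the coefficient is 3 without anyone ever writing that number into the code; real machine-learning systems do the same thing with millions of adjustable numbers instead of one.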

The second of these areas is bioinformatics, the application of algorithms to the understanding of biological systems. Bioinformatics (also known as computational biology) uses algorithms to process the biological and medical data obtained by modern technologies. Our ability to sequence genomes, to gather data about biological mechanisms, and to use those data to understand the way biological systems work depends, in large part, on the use of algorithms developed specially for that purpose. Bioinformatics is the technology that makes it possible to model and understand the behavior of cells and organisms. Recent advances in biology are intricately linked with advances in bioinformatics.

Evolution, the process that has created all living things, is, in a way, also an algorithm. It runs on a very different platform, and it has been running for roughly 4 billion years, but it is, in essence, an algorithm that optimizes the reproductive ability of living creatures. Four billion years of evolution have created not only cells and organisms, but also brains and intelligent beings.
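The idea of evolution as an algorithm can be caricatured in a few lines of code. The toy “genetic algorithm” below is a drastic simplification invented for illustration, not a model of real biology: genomes are bit strings, fitness is simply the number of 1-bits, and reproduction is copying with random mutation.

```python
import random
random.seed(42)

GENOME_LEN = 20

def fitness(genome):             # stand-in for reproductive success
    return sum(genome)

def mutate(genome, rate=0.05):   # copying errors during reproduction
    return [bit ^ (random.random() < rate) for bit in genome]

# Start from a random population; let selection and mutation run.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(30)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    survivors = population[:15]              # the fitter half survives
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(15)]

best = max(population, key=fitness)
print(fitness(best))   # typically close to the maximum of 20
```

No step in the loop “knows” what a good genome looks like; selection plus random variation is enough to push the population toward high fitness, which is the algorithmic core of the evolutionary process.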

Despite the enormous variety of living beings created by evolution and the range of new technologies that have been invented, members of the genus Homo have been the exclusive owners of higher intelligence on Earth. It is this particular characteristic that has enabled humans to rule the Earth and to adapt it to their needs and desires, sometimes at the expense of other important considerations.

Technology, however, has evolved so much that now, for the first time, we face the real possibility that other entities—entities created by us—could become intelligent. This possibility arises from the revolution in computing technologies that has occurred in the past fifty years, including artificial intelligence and machine learning, but also from the significant advances in our understanding of living beings—particularly in our understanding of the human body and the human brain. Computing technologies, which are only a few decades old, have changed so many things in our daily lives that civilization as we now know it would not be possible without computers. Physics and biology have also made enormous advances, and for the first time we may have the tools and the knowledge to understand in detail how the human body and the human brain work.

Advances in medical techniques have already led to a significant increase in the life expectancy of most people alive today, but in coming decades we are likely to see unprecedented improvements in our ability to control or cure deadly diseases. These improvements will result from our increased understanding of biological processes—an understanding that will be made possible by new technologies in biology, physics, and computation. Ultimately, we may come to understand biological processes so well that we will be able to reproduce and simulate them in computers, opening up new possibilities in medicine and engineering.

In this book we will explore the possibility that, with the advances in medical and computing technologies, we may one day understand enough of the way the brain works to be able to reproduce intelligence on a digital substrate—that is, we may be able to write a program, executed by a digital computer, that will exhibit intelligence. There are a number of ways in which this could happen, but any one of them will lead to a situation in which non-biological minds come into existence and become members of our society. I call them digital minds because, almost certainly, they will be made possible by the existence of digital computer technology. The fact that non-biological minds may soon exist on Earth will unleash a social revolution unlike any that has been witnessed so far. However, most people are blind not only to the possibility of this revolution but also to the deep changes it will bring to our social and political systems. This book is also an attempt to raise public awareness of the consequences of that revolution.

The majority of the predictions I will make here are likely to be wrong. So far, the future has always created things stranger, more innovative, and more challenging than what humans have been able to imagine. The coming years will be no exception and, if I am lucky enough to be alive, I will find myself surprised by new technologies and discoveries that were not at all the ones I expected. However, such is the nature of technology, and making predictions is always hard, especially about the future.

It is now time to embark on a journey—a journey that will take us from the very beginnings of technology to the future of the mind. I will begin by showing that exponential growth is a pattern built into the scheme of life but also a characteristic of the development of many technologies.
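The exponential pattern mentioned here can be made concrete with a toy calculation. The two-year doubling period below is an assumption borrowed from the folklore version of Moore’s law, not a figure from the text; the point is only how quickly repeated doubling compounds:

```python
# A quantity that doubles every 2 years grows a thousandfold
# in about 20 years (2**10 = 1024).
value, years = 1.0, 0
while value < 1000:
    value *= 2      # one doubling
    years += 2      # assumed doubling period
print(years)  # → 20
```

Ten doublings—a mere twenty years under this assumption—multiply the starting quantity by more than a thousand, which is why exponential change so reliably outruns intuition.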