MISPLACED WORRIES

DAN SPERBER

Social & cognitive scientist, Central European University, Budapest, & Institut Jean Nicod, Paris; coauthor (with Deirdre Wilson), Meaning and Relevance

Worrying is an investment of cognitive resources laced with emotions from the anxiety spectrum and aimed at solving some specific problem. It has its costs and benefits, and so does not worrying. Worrying for a few minutes about what to serve for dinner in order to please one’s guests may be a sound investment of resources. Worrying about what will happen to your soul after death is a total waste. Human ancestors and other animals with foresight may have worried only about genuine and pressing problems, such as not finding food or being eaten. Since then, however, humans have become much more imaginative and have fed their imagination with rich cultural inputs; for at least 40,000 years, possibly much longer, they have also worried about improving their lot individually and collectively (sensible worries) and about the evil eye, the displeasure of dead ancestors, and the purity of their blood (misplaced worries).

A new kind of misplaced worry is likely to become more and more common. The ever-accelerating scientific and technological revolution produces a flow of problems and opportunities that present unprecedented cognitive and decisional challenges. Our ability to anticipate these problems and opportunities is swamped by their number, novelty, speed of arrival, and complexity.

Every day, for instance, we have reasons to rejoice in the new opportunities afforded by the Internet. The worry of fifteen years ago, that it would create yet another major social divide between those with access to the Internet and those without, is so last century! Actually, no technology in human history has ever spread so far, so fast, so deep. But what about the worry that, by making detailed information about every user available to companies, agencies, and governments, the Internet destroys privacy and threatens freedom in much subtler ways than Orwell’s Big Brother ever did? Is this what we should worry about? Or should we focus on making sure that as much information as possible is freely accessible as widely as possible, forsaking old ideas of secrecy, and even of privacy, and trusting that genuine information will overcome misinformation and that well-informed people will be less vulnerable to manipulation and control; in other words, that with much freer access to information a more radical kind of democracy is becoming possible?

Genetic engineering promises new crops, new cures, improvement of the human genome. How much should we be thrilled, how much frightened? How much and how should the development of genetic engineering itself be controlled, and by whom?

New weapons of destruction (atomic, chemical, biological) are becoming more and more powerful and more and more accessible. Terrorist acts and local wars of unprecedented magnitude are likely to occur. When they do, the argument will be made, even more forcefully than it was in the U.S. after 9/11, that powerful states should be given the means to try to prevent them, including means that curtail democratic rights. What should we worry about more: terrorism and wars, or increased limitations on our rights?

Looking further into the future: Humans will soon be living with and depending on intelligent robots. Will this develop into a new kind of master-servant dialectic, with the masters alienated by their servants? Will the robots themselves, in fact, evolve into masters, or even into intelligent, purposeful beings with no use for humans? Are such worries sound or silly?

These are just some examples. Scientific and technical developments introduce, at a faster and faster pace, novel opportunities and risks we had not even imagined. Of course, in most cases you and I form opinions as to what we should worry about. But how confidently can we hold these opinions, pursue these worries?

What I am particularly worried about is that humans will be less and less able to appreciate what they should be worrying about and that their worries will do them more harm than good. Maybe, just as in rafting through rapids, one should try not to slow down but to optimize a trajectory one does not really control, not because safety is thereby guaranteed or optimism justified (the worst could still happen), but because there is no better option than hope.