MIND CHANGE

A GLOBAL PHENOMENON

Let’s enter a world unimaginable even a few decades ago, one like no other in human history. It’s a two-dimensional world of only sight and sound, offering instant information, connected identity, and the opportunity for here-and-now experiences so vivid and mesmerizing that they can outcompete the dreary reality around us. It’s a world teeming with so many facts and opinions that there will never be enough time to evaluate and understand even the smallest fraction of them. For an increasing number of its inhabitants, this virtual world can seem more immediate and significant than the smelly, tasty, touchy 3-D counterpart: it’s a place of nagging anxiety or triumphant exhilaration as you are swept along in a social networking swirl of collective consciousness. It’s a parallel world where you can be on the move in the real world, yet always hooked into an alternative time and place. The subsequent transformation of how we might all be living very soon is a vitally important issue, perhaps even the most important issue of our time.1 Why? Because it may be that a daily existence revolving around smartphone, iPad, laptop, and Xbox is radically changing not just our everyday lifestyles but also our identities and even our inner thoughts in unprecedented ways.2 As a neuroscientist, I’m fascinated by the potential effects of a screen-oriented daily existence on how we think and what we feel, and I want to explore how that exquisitely adaptable organ, the brain, may now be reacting to this novel environment, recently dubbed the “digital wildfire.”3

In the developed world, there is now a one in three chance that children will live to 100 years of age.4 Thanks to the advances of biomedicine, we can anticipate longer and healthier lives; thanks to technology, we can foresee an existence increasingly freed from the daily domestic grind that characterized the lives of previous generations. Unlike so much of humanity in the past and still in many nightmare scenarios around the world, we take it as the norm and as our entitlement not to be hungry, cold, in pain, or in constant fear for our lives. Unsurprisingly, therefore, there are many in our society who are convinced that we’re doing just fine, that these digital technologies are not so much a raging wildfire as a welcoming hearth at the heart of our current lifestyles. Accordingly, various reassuring arguments are ready at hand to counter reservations and concerns that might otherwise be viewed as exaggerated, even hysterical.

One starting premise is that surely everyone has enough common sense to ensure that we don’t let the new cyberculture hijack daily life wholesale. Surely we are sensible and responsible enough to self-regulate how much time we spend online and to ensure that our children don’t become completely obsessed by the screen. But the argument that we are automatically rational beings does not stand the test of history: when has common sense ever automatically prevailed over easy, profitable, or enjoyable possibilities? Just look at the persistence of hundreds of millions worldwide who still spend money on a habit that caused a hundred million fatalities in the twentieth century and which, if present trends continue, promises up to one billion deaths in this century: smoking.5 Not much common sense at work there.

Then again, the reliability of human nature might work in our favor if only we could assume that our innate genetic makeup leads most of us to do the right thing, regardless of any corrupting external influences. Yet in itself, this idea immediately runs counter to the superlative adaptability of the human brain, which allows us to occupy more ecological niches than any other species on the planet. The Internet was initially created as a way for scientists to contact each other, and this invention spawned phenomena such as 4chan, a collection of message boards where people post images and short text comments, mostly anonymously and with no holds barred.6 This form of self-expression is a new niche to which we may adapt, with consequences as extreme as the medium itself. If it is the hallmark of our species to thrive wherever we find ourselves, then the digital technologies could bring out the worst in human nature rather than being rendered harmless by it.

Another way of dismissing out of hand concerns about the effects of digital technology is a kind of solipsistic stance in which the screen enthusiast proudly points to his or her own perfectly balanced existence, which combines the pleasures and advantages of cyberculture with life in three dimensions. Yet psychologists have been telling us for many years that such subjective introspection is an unreliable barometer of mental state.7 In any case, it should be obvious enough that just because a single individual may be able to achieve an ideal mix between the virtual and the real, it does not automatically mean that others are capable of exercising similar restraint and sound judgment. And even those individuals who think they’ve got everything just right will often admit in an unguarded moment that “It’s easy to waste a lot of time on Facebook,” that they are “addicted” to Twitter, or that, yes, they do find it hard to concentrate long enough to read a whole newspaper article. In the United Kingdom, the advent of i, an abbreviated version of the national quality paper The Independent, and the introduction on the BBC of the 90 Second News Update stand as testimony to the demands of an ever larger constituency of readers and viewers—not just the younger generation—who have a reduced attention span and are demanding print and broadcast media to match.

Another consolation is the conviction that the next generation will work out just fine, thanks to parents who take control and intervene where necessary. Sadly, this idea has already proved to be a nonstarter. For reasons we shall explore shortly, parents often complain that they cannot control what their offspring do online, and many already despair at their inability to pry their children away from the screen and back into a world of three dimensions.

Marc Prensky, an American technologist, coined the term “Digital Native” for someone defined by his or her perceived outlook and abilities, based on an automatic facility and familiarity with digital technologies.8 By contrast, “Digital Immigrants” are those of us who, according to Prensky, “have adopted many aspects of the technology, but just like those who learn another language later in life, retain an ‘accent’ because we still have one foot in the past.” It is unlikely that anyone reading these words will not have strong views as to which side of the divide he or she belongs on and whether the distinction is cause for unalloyed celebration or deep anxiety. Generally speaking, it corresponds to age, although Prensky himself did not pinpoint a specific line of demarcation. The date of birth of the Digital Native seems therefore to be uncertain: we could start as far back as the 1960s, when the term “computer” entered into common parlance, or as late as 1990, for by the time a young Digital Native born then could read and write, email (which entered widespread public use in the early 1990s) would have become an inescapable part of life.

The important distinction is that Digital Natives know no way of life other than the culture of the Internet, laptop, and mobile. They can be freed from the constraints of local mores and hierarchical authority and, as autonomous citizens of the world, will personalize screen-based activities and services while collaborating with, and contributing to, global social networks and information sources.

But a much gloomier portrait of the Digital Native is being painted by pundits such as the British American author Andrew Keen:

MySpace and Facebook are creating a youth culture of digital narcissism; open-source knowledge sharing sites like Wikipedia are undermining the authority of teachers in the classroom; the YouTube generation are more interested in self-expression than in learning about the world; the cacophony of anonymous blogs and user-generated content is deafening today’s youth to the voices of informed experts.9

Then again, perhaps the Digital Native doesn’t actually exist after all. Neil Selwyn, of the Institute of Education in London, argues that the current generation is actually no different from preceding ones: young people are not hardwired to have unprecedented brains.10 Rather, many young people are using technology in a far more sporadic, passive, solitary, and, above all, unspectacular way than the hype of the blogosphere and zealous proponents of cyberculture might have us believe.

Irrespective of whether the digital age has spawned a new type of superbeing or just ordinary humans better adapted to screen life, suffice it to say that, for the moment, parents are most likely to be Digital Immigrants and their children Digital Natives. The former are still learning the enormous potential of these technologies in adulthood, while the latter have known nothing else. This cultural divide often makes it hard for parents to know how best to approach situations that they intuitively perceive to be a problem, such as seemingly excessive time spent on computer-based activities; meanwhile, children may feel misunderstood and impatient with views they regard as inappropriate and outdated for present-day life.

Although reports and surveys have focused largely on the next generation, the concerns I want to flag are not limited to the Digital Native alone. Far from it. But a generational divide has undoubtedly arisen from the vertiginous increase in the pace of ever smarter digital devices and applications. What will be the effects on each generation, and on the relationship between them?

In a 2011 report, Virtual Lives, researchers for the U.K. children’s charity Kidscape assessed the online activities of more than two thousand children between the ages of eleven and eighteen. Just under half of the children questioned said they behaved differently online compared to their normal lives, with many claiming it made them feel more powerful and confident. One explained: “It’s easier to be who you want to be, because nobody knows you and if you don’t like the situation you can just exit and it is over.” Another echoed this sentiment, noting: “You can say anything online. You can talk to people that you don’t normally speak to and you can edit your pictures so you look better. It is as if you are a completely different person.” These findings, the report argues, “suggest that children see cyberspace as detachable from the real world and as a place where they can explore parts of their behavior and personality that they possibly would not show in real life. They seem unable to understand that actions online can have repercussions in the real world.”11 The ready availability of an alternative identity and the notion that actions don’t have consequences have never previously featured in a child’s development, and they pose unprecedented questions as to what might be for the best. While the brain is indeed not hardwired to interface effectively with screen technologies, it has evolved to respond with exquisite sensitivity to external influences—to the environment it inhabits. And the digital environment is getting ever more pervasive at an ever younger age. Recently Fisher-Price introduced a potty-training seat complete with an iPad holder,12 presumably to complement an infant lifestyle where the recliner in which the baby may spend many hours is also dominated by a screen.13

This is why the question of the impact of digital technologies is so very important. Hardened captains of industry or slick entrepreneurs will often sidle up to me during the coffee break at corporate events and let their professional mask slip as they recount in despair the obsessional fixation of their teenage son or daughter with the computer. But these anxieties remain unchanneled and unfocused. Where can these troubled parents share their experiences with others on a wider platform and articulate them in a formal and cogent way? At the moment, nowhere. In the following pages, we’ll be looking at many studies on preteens as well as teenagers; unfortunately, there are far fewer studies on adults, perhaps because they are less cohesive and identifiable as a group than a volunteer student body or a captive classroom. But, in any event, it’s important to view the data not as a self-help guide for bringing up kids but rather as a pivotal factor in the bigger picture of society as a whole.

Another argument sometimes used to dismiss any concerns about digital culture is the idea that we’ll muddle through as long as appropriate regulation is in place. All too often we hear something like this from professional policy makers and government officials: There is no conclusive evidence for concern as yet. If and when there is, all the appropriate checks and balances will of course be duly put in place. In the meantime, as long as we are sensible and proportionate, we can enjoy and benefit from all the advantages of the cyberlife. Technology clearly brings us previously unimagined opportunities, and such advances will of course be balanced out by always being alert to potential negative impacts.14 Yet while moderation may well be the key, technology is not necessarily being used in moderation. Young people in the United States, on average, use entertainment media more than fifty-three hours per week.15 When media multitasking, or using more than one medium at once, is taken into account, young people average nearly eleven hours’ worth of entertainment media use per day—hardly moderate.

The deeper problem with seeing regulation as the “solution” is that it is always reactive. Regulatory procedures can only respond to, and then sweep up behind, some new event, discovery, or phenomenon in order to eliminate clear harm, as with junk food, air pollution, or, to use an Internet example, the sexual grooming of children or their access to extreme violence. But regulation always has to play catch-up: politicians and civil servants will always be leery about predictions because they are rightly aware they are spending taxpayers’ or donors’ money on what could be regarded as speculation. However much guidelines and laws may be needed for the obvious and immediate dangers of the cyberworld, they are inadequate to the task of looking forward, of imagining the best uses to which new technologies can be put. For that we need long-term imagination and bold thinking, qualities not necessarily associated nowadays with cash-strapped civil servants or politicians with an eye to imminent reelection and easy wins in the short term. And so it is up to the rest of us. Technology can be empowering and can help us shape more fulfilling lives, but only if we ourselves step up to the plate and help take on the task.

Digital technologies are eroding the age-old constraints of space and time. I’ll always remember a speech by former U.S. president Bill Clinton that I attended in Aspen, Colorado, back in 2004, where he described how the history of civilization could be marked by three stages: isolation, interaction, and integration. Isolation characterized the segregation of the remote empires of the past, access to which even until the last century was intermittent, time-consuming, and hazardous. Interaction, as Clinton pointed out, subsequently proved to be both positive, in the form of trade, exchanges of ideas, and so on, and negative, with the increased facility and scale of warfare. But this century is perhaps exemplifying for the first time the realization of a massive integration.

And yet this idea, at least as a hypothetical scenario, is not that revolutionary. As long ago as 1950, the French philosopher and Jesuit priest Pierre Teilhard de Chardin developed the idea of globalized thought, an eventual scenario he dubbed the “noosphere.”16 According to Teilhard de Chardin, the noosphere would emerge through, and be composed of, the interaction of human minds. As humanity progressed into more complex social networks, the noosphere would rise to higher levels of awareness. Teilhard de Chardin saw the ultimate apotheosis of the noosphere as the Omega Point, the greatest degree of collective consciousness to which the universe would evolve, with individuals remaining distinct entities. Tempting as it is to believe that the digitally induced globalization of instant thought sharing and worldwide communication is realizing his vision, we cannot assume that this erstwhile hypothetical idea is now becoming our reality. What if one immediate outcome of global outreach and a correspondingly homogenized culture were that we all started to react and behave in a more homogeneous style, one that eventually blurred cultural diversity and identity? Obviously, while there are huge advantages to understanding previously alien lifestyles and agendas, there is a big difference between a world enriched by other, contrasting ways of living and one that shares a single standardized, cookie-cutter existence. While diversity in societies brings great insights into the human condition, such comparisons can only be made from a clear and confident sense of one’s own identity and lifestyle. A mere global homogenization of mindset might in the long run have serious consequences for how we see ourselves and the societies in which we live.

While speed, efficiency, and ubiquity must surely be good things, this new life of integration may have other, less beneficial effects that we need to think about. In days gone by we waited for the delivery of the mail at fixed times daily. An international phone call was, for everyone other than the very rich, generally an option only for special or emergency circumstances. But we now take for granted the constant availability of international communication. We tend to expect instant responses, and in turn assume we ourselves will reply immediately, oscillating incessantly between transmit and receive modes.

At a formal breakfast I attended recently where the main speaker was the British deputy prime minister, Nick Clegg, the woman sitting next to me was so busy tweeting that she was at a breakfast with Clegg that she wasn’t actually listening to what he was saying. In 2012, 24 percent of adult social networking site users in the United States reported a curious phenomenon: they had missed out on a key event or moment in their lives because they were so absorbed in updating their social networking site about that event or moment.17 Alternatively, you can monitor the flood of consciousness of others, almost as a way of life. When I asked a colleague how often she used Twitter, she showed me an email from a friend that is not uncommon in what it describes: “I have Twitter open on my PC all day so I look at it between calls, when on hold on the phone etc. I’d say pretty much our whole office does.”

We no longer need to wait, to acknowledge the passing of time between cause and effect or between action and reaction. For most people who a few decades ago would never have contemplated foreign travel or having a network of friends beyond the local community into which they were born, there are now nonstop thrilling opportunities for encompassing the entire planet. The advantages of this effortless communication are many. No one could make a convincing case for turning back the clock to when postal deliveries took days. But perhaps there is some merit in having time to reflect before responding to views or information. Perhaps there are benefits to pacing your day according to your own choice, at your own speed.

The crucial issue here is how we digest internally what is happening around us as we travel through each day. The Austrian physician who developed the current treatment for Parkinson’s disease back in the 1960s, Oleh Hornykiewicz, once offered this insight: “Thinking is movement confined to the brain.” A movement is characterized by a chain of linked actions that take place in a particular order. The simplest example, walking, is a series of steps in which placing one foot forward leads to the other foot overtaking it; one step thus leads to the next in a cause-and-effect chain that is not random but a fixed linear sequence. So it is with thought. All thought, be it fantasy, memory, logical argument, business plan, hope, or grievance, shares this basic common characteristic of a fixed sequence. And since there is clearly a defined beginning, middle, and end to the sequence, there has to be a time frame. As I see it, this idea of sequence is the very quintessence of a thought, and it is this succession of mental steps that distinguishes a line or train of thought from a one-off instantaneous emotion captured in a shriek of laughter or a scream. Unlike a raw feeling that occurs as a momentary reaction, the thought process transcends the here and now and links past with future.

Human beings are not alone in possessing sufficient memory to link a previous event, a cause, with a subsequent one, an effect, and even to see a likely result in the future. A rat that receives a food pellet for pressing a bar can soon “think” about its next best move and learn to press the bar again. The link between stimulus and response has been forged. But we humans are unique in being able to link events, people, and objects that are not physically present in front of us into a stream of thought. We have the ability to see one thing, including an abstract word, in terms of another. Unlike all other animals, and even human infants, we have spoken and written language. We are liberated from the press of the moment around us because we can turn toward the past and then to the future by using symbols, words, to stand for things that are not physically present: we can remember and plan and imagine. But it takes time to do so, and the more complex the thought, the more time we need to take the necessary mental steps.

But if you place a human brain, with its evolutionary mandate to adapt to its environment, in an environment where there is no obvious linear sequence, where facts can be accessed at random, where everything is reversible, where the gap between stimulus and response is minimal, and above all where time is short, then the train of thought can be derailed. Add in the sensory distractions of an all-encompassing and vivid audiovisual universe encouraging a shorter attention span, and you might become, as it were, a computer yourself: a system responding efficiently and processing information very well, but devoid of deeper thought.

Thirty or so years ago, the term “climate change” meant little to most people; now it is understood by virtually everyone as an umbrella concept encompassing a wide variety of topics, including carbon sequestration, alternative energy sources, and water use, to cite just a few examples. Some feel that we’re doomed, others that the different problems are exaggerated, and still others that science can help. Climate change is therefore not only global and unprecedented but also multifaceted and controversial. When we turn to the question of how future generations will think and feel, “Mind Change” can be an equally useful umbrella concept.

The argument underlying the notion of Mind Change goes like this. The human brain will adapt to whatever environment it is placed in. The cyberworld of the twenty-first century is offering a new type of environment. Therefore, the brain could be changing in parallel, in correspondingly new ways. To the extent that we can begin to understand and anticipate these changes, positive or negative, we will be better able to navigate this new world. So let’s probe further into how Mind Change, just like climate change, is not only global, as we’ve just seen, but also unprecedented, controversial, and multifaceted.