Climate change, according to the Intergovernmental Panel on Climate Change, “may be due to natural internal processes or external forcings, or to persistent anthropogenic changes in the composition of the atmosphere or in land use.”1 Nobody could dispute that there’s a multitude of issues involved here. So it is with Mind Change, which I’m suggesting is comparably multifaceted, throwing up a range of different questions that need to be explored independently. These different questions fall into three main areas, worth previewing here: social networking and the implications for identity and relationships; gaming and the implications for attention, addiction, and aggression; and search engines and the implications for learning and memory.
In no particular order of priority, let’s start with social networking. A recent radio program on the BBC featured Kaylan, an eighteen-year-old who decided to take advantage of an option Facebook offered as of September 2011: removing all privacy settings on his page so that any number of followers could track his daily life in the public domain. He boasted of having some one hundred thousand followers at the time of the broadcast. Kaylan also admitted he had done nothing at all to deserve fame. His posts were often mundane: photos of himself throughout the day, leading a “crazy life.”
So what was it that was so attractive to his followers? Well, among them were plenty of similar folk who could engage in arguments with one another. Then the rest of the followers could take sides. Yes, Kaylan had his fair share of “haters.” After all, he added, “you can’t be nice on Facebook.” By saying unpleasant things such as “Kill yourself,” these haters would then garner maximum praise and “fame” for themselves. While Kaylan is obviously very far from being a typical Facebook user, both he and his hundred thousand followers serve as an example of the unprecedented extremes to which the medium can be taken. Your importance, as revealed by your social networking activity, can now even be quantified.2
The majority of Facebook users are far less dramatic. Still, in a Pew Research Center survey, U.S. social network users aged twelve to nineteen overwhelmingly chose negative rather than positive adjectives to describe how people act on social networking sites, including “rude, fake, crude, over-dramatic and disrespectful.”3 For example, one middle school girl respondent commented: “I think people get, like when they get on Facebook, they get ruthless, stuff like that.… They act different in school and stuff like that, but when they get online they’re like a totally different person, you get a lot of confidence.” Another girl said: “That’s what a lot of people do. Like they won’t say it to your face, but they will write it online.”
A recent meta-analysis looking at data collected over thirty years from fourteen thousand U.S. college students indicates that overall levels of empathy may be declining, with an especially steep drop in the last ten years—a time frame that corresponds well with the advent of social networking among Digital Natives.4 Of course, a correlation is not a causal link, but this is just the type of close correspondence that should serve as a starting point for rigorous epidemiology to establish whether there might be a direct causal link between screen time and a reduction in empathy. We should also be asking why those who already have problems empathizing, such as individuals with autistic spectrum disorder, are particularly comfortable in the cyberworld. More generally, could this sanitized and limited type of interaction account for the ease with which bullying, always a dark part of human nature, has now found unconstrained expression in the cyberworld? After all, if you haven’t rehearsed the basic nonverbal communication skills of eye contact, voice modulation, body language perception, and physical contact, you won’t be particularly good at them, and it will be harder for you to empathize with others.
More than a billion people worldwide use Facebook to keep in touch with friends, share pictures and video clips, and post regular updates of their movements and thoughts.5 Another estimate puts the figure at 12 percent of the entire global population, with 50 percent of North Americans, 38 percent of antipodeans, 29 percent of Europeans, and 28 percent of Latin Americans signed up.6 (These figures are based on total population; if we exclude newborn babies, the severely infirm, and others with no computer access, the proportion of the computer-using population on Facebook is probably far higher.) A further two hundred million people actively use Twitter, the “microblogging” service that lets users circulate short messages about themselves, post pictures, and follow the minutiae of others’ stream of consciousness or daily routines.7
Nowadays, all generations are represented on the sites, with octogenarians able to stay in touch with grandchildren living far away, but it is the Digital Natives who are the most avid users. In the United Kingdom, 64 percent of adult Internet users aged sixteen and over are social network site users, whereas 92 percent of Internet users aged sixteen to twenty-four have profiled themselves on a social networking site.8 In the United States, 80 percent of online teens aged twelve to seventeen use social networking sites, mostly Facebook and MySpace.9 U.S. users have on average 262 friends,10 a figure higher than the world average of roughly 140.11 Twelve- to twenty-four-year-old Facebook users have, on average, more than five hundred Facebook friends.12 Of these friends, roughly 22 percent are from high school, 12 percent are immediate family, 10 percent are co-workers, 9 percent are from college, and 10 percent have been met in person only once or never at all.13
On an average day, 26 percent of Facebook users “like” a friend’s status and 22 percent comment on a friend’s status, while only 15 percent update their own status.14 So more people spend their time interacting with other users’ content than posting their own. All of which points to a blindingly obvious truth: social networking has become a central factor in the culture of all but the very poorest, most deprived, or most ideologically repressed regions of the world. A critical question then is, quite simply, what is so special about social networking? What is the basic need that this new culture is meeting in an apparently unprecedented and yet effective way? If we are to understand and appreciate the changing mind of the mid-twenty-first century, this is one of the most important questions to ask.
The benefits of social networking seem irrefutable: direct marketing to the consumer, dating sites, career building, contact with old friends. Being “connected” is often cited with an enthusiasm that automatically assumes it is a desirable scenario. But what worries me is whether this almost incessant communication through the screen might have a downside as well. As always, there’s the key issue of being “sensible”: used in moderation, social networking sites could provide harmless fun and complement real friendships; used excessively, or to the exclusion of real relationships, they might affect in a very fundamental and unforeseen way how you view your friends, friendship, and ultimately yourself.
If you’re increasingly anchored in the present and consequently devoting all your time to the demands of the outside world, a robust sense of inner identity might be harder to sustain. Perhaps the constant accessing of social networking sites will mean living a life where the mere thrill of reporting and receiving information completely trumps the ongoing experience itself: a life where checking in at a restaurant, posting pictures of a meal, and yearning for “likes” and “comments” generate more excitement than the occasion of dining out itself. The momentary exhilaration you’d feel would no longer be generated by the firsthand experience but by the slightly delayed, indirect experience of everyone else’s continuing reaction and approval. If we’re going to be living in a world where face-to-face interaction is less practiced, and therefore less comfortable, then the “push” of an aversion to messy, real-life, three-dimensional communication, combined with the “pull” of a more collective identity built on external reassurance and approval, may be transforming the very nature of personal relationships. The knee-jerk speed required for reaction and the reduced time for reflection might mean that those reactions and evaluations are themselves becoming increasingly superficial: already people use terms such as “kill yourself” and “hater” on Facebook in contexts that convey far less depth of real feeling and of individual background history than these words would previously have implied.
Privacy appears to be becoming a less prized commodity: among young Americans aged thirteen to seventeen, more than half have given out personal information, including photos and physical descriptions, to someone they don’t know.15 Meanwhile, Digital Natives post personal information on their Facebook pages that is typically shared with more than five hundred “friends” at a time, fully aware that each of these friends could then pass that information on to hundreds more in their own networks.
It has become more important to have attention, to be “famous.” The trade-off for such fame is, and always has been, loss of privacy, as the mid-twentieth-century film star Greta Garbo famously exemplified in her repeated pleas of “I want to be alone.” So why do we now hold in increasingly casual disregard the privacy we once so treasured? Until now, privacy has been the other side of the coin to our identity. We have seen ourselves as individual entities, in contact with the outside world yet distinct from it. We interact with that outside world, but only in the ways and at the times we choose. We have secrets, memories, and hopes to which no one else has automatic access. This secret life is our identity, distinct from a professional one and even more intimate than a private life of individual friendships in which we vary what and how much we confide to others. It is a kind of inner narrative that, until now, has provided each individual with his or her own way of linking past, present, and future: an ongoing subjective, internal commentary that meshes past memories and future hopes with the happenstance of each day. Now, for the first time, this secret story line is being opened up to the outside world, to an external audience that can be uncaringly capricious and judgmental in its reaction. A particular identity is therefore no longer an internal, subjective experience but one constructed externally, and so is much less robust and more volatile, as has already been suggested in a recent report to the British government on “future identities.”16
A second cornerstone of the digital lifestyle is gaming. In the mid-1980s, children might have spent about four hours a week on average playing videogames at home and in arcades.17 But fast-forward a decade or so and the videogame has become an integral part of the home scene and beyond.18
A 2012 study of U.S. adolescents reported that boys between the ages of ten and thirteen were playing on average a staggering forty-three hours a week (although, admittedly, the number of subjects was fairly small, at 184).19 Yet even conservative estimates (from 2009) indicate that the average U.S. child between the ages of eight and eighteen spends seventy-three minutes a day recreationally on this one screen-based activity, up from twenty-three minutes in 1999.20 That means at least an hour a day spent not interacting with the real world, and in particular not studying. In a survey of U.S. youth between the ages of ten and nineteen, gamers spent 30 percent less time reading and 34 percent less time doing homework than non-gamers.21 Granted, it is hard to separate the chicken from the egg: perhaps children who perform more poorly in school are likely to spend more time playing games, which may give them a sense of mastery that eludes them in the classroom. We need to go beyond correlation to cause, but what we can’t do is simply ignore the issue altogether.
Videogames open up fertile territory for controversy. On one hand, there are clear positives, as we’ll explore in detail later: for example, improved sensorimotor coordination and perceptual learning. On the other hand, many stories from around the world paint a terrible picture of a modern lifestyle of overindulgence in the unfettered fun of playing videogames. For example, in Taiwan in February 2012, a twenty-three-year-old man was found dead in an Internet cafe after twenty-three hours of continuous gaming.22 Another young man in Taiwan, aged eighteen, died in July 2012 after forty hours of continuous gaming.23 Then there was the report of two parents who neglected their own real baby, who subsequently died, in order to raise an online virtual baby.24 In December 2010 a man in the north of England received a life sentence for killing a toddler immediately after losing at a violent videogame.25 Then there was the case of a gamer who hunted down his virtual opponent in real life and stabbed him in revenge for being stabbed in the game.26 And that is not to mention the list of high-profile suicides of gamers.
The immediate defense of a gaming fan would probably be that (1) this is all scaremongering and unlikely to be true; (2) it is unlikely to be the whole story, with other, more important factors really to blame or mitigating the circumstances; or (3) these examples, horrendous though they may be, are isolated cases that are actually extremely rare. These possibilities are not mutually exclusive, and any or all of them may indeed be the case, but they should be conclusions, not starting premises. Moreover, even if such stories are exaggerated and uncommon, they may still be important as caricatures of certain trends now emanating from society, albeit there in much milder form: a profile of addiction, aggression, impulsivity, and recklessness.
Modern gamers enter a visually rich world where they can assume a character completely unlike themselves or, in some games, create whatever kind of character (avatar) they desire. They navigate these fictional beings through situations involving moral choices, violence or aggression, and role playing, with intricate reward systems built into the games that provide the incentive to carry on living out the fantasy. Some individuals can become so immersed that they lose track of the real world and time; they report that they turn into their avatars when they load the game. Alternatively, gamers may develop an emotional attachment to their character. So how are these highly stimulating, often violent games with possible addictive qualities actually affecting us?
One outcome could be enhanced aggression. Experimental studies are revealing that violent videogames lead to increases in aggressive behavior and aggressive thinking, accompanied by decreases in prosocial behavior.27 It seems that videogame-induced aggression stems not only from immediate provocation but also from biological predispositions and environmental influences, as an individual gradually develops a more adversarial worldview. Although violent games have not been proved to be the immediate trigger for criminally violent behavior, there is strong evidence that playing them may increase the type of low-grade hostility that occurs every day in schools or offices.
It may also be that videogames lead to excessive recklessness. In one recent investigation using brain imaging, the key finding was an enlargement, in frequent gamers, of a specific area of the brain (the nucleus accumbens) of the kind typically seen in the brains of compulsive gamblers.28 Most intriguing of all, this particular brain region is a major site of release for dopamine, a key chemical messenger whose release is increased by all addictive psychoactive drugs. These chemical similarities between the brains of gamers and those of gamblers do not prove that gaming is technically addictive, but both may well share a further feature: recklessness. After all, it is a dangerous lesson to learn that death lasts only until the next round; it may suggest that actions in the real world don’t have real consequences.
The crucial factor once again will be whether an individual is being, in the minister’s words in our House of Lords debate back in 2011, “sensible and proportionate” about playing games. It’s a bit like eating chocolate: the occasional treat in an otherwise balanced diet is relatively harmless and enjoyable, whereas an unremitting daily diet consisting exclusively of chocolate would have dire consequences. The problem is not with those who play games occasionally as one pastime in a portfolio of other interests and activities in the real world, but with the number of frequent gamers who, to judge from the amount of time they spend gaming to the exclusion of all else, end up obsessional or hooked.
Finally, in addition to social networking and gaming, there’s a third aspect to Mind Change: surfing the Internet, particularly with search engines. If you are not using digital technologies interactively to engage in a relationship or to play a game, then the screen can still have intoxicating appeal simply because of what it can tell and show you; some might go so far as to say teach you. It’s almost unbelievable that this essential facility started less than twenty years ago, in 1994, when Yahoo! was created by Stanford University students Jerry Yang and David Filo in a campus trailer, originally as an Internet bookmark list and directory of interesting sites. Then in 1996 Sergey Brin and Larry Page, two other Stanford students, tested BackRub, a new search engine that ranked sites according to relevance and popularity. BackRub was destined to become Google, which currently has around 80 percent of the global market share in search, while its nearest competitors are in single digits.29 The brand name has become a verb: almost everyone “Googles.”
Sometimes, for no obvious reason, seemingly pointless activities, such as striking a funny posture (“planking”) or performing a little dance such as the Harlem Shake, draw crowds. I have my own direct experience of how powerful such viral phenomena can be. Back in April 2010 I was being interviewed by Alice Thomson of the UK Times about the impact of digital technology on how we feel and think. We had progressed to discussing how fast-paced technology might mandate correspondingly fast views and reactions. Trying to provide her with a sound-bite summary, I raised the prospect of humans being reduced to simple negative or positive gut reactions, such as “yuck” or “wow,” to whatever flashed on the screen. Because I tend to talk quickly, Alice misheard and transcribed what I’d said as “yaka-wow.” This may be amusing enough in itself, but the point is that just twenty-four hours later one could find seventy-five thousand results for this term on Google. Moreover, someone bought the domain name, and soon I was astonished to see mugs and T-shirts sporting the term “yaka-wow.” On one website, the First Church of the Yaka-Wow welcomed “breezy people to a world of no consequences.” The term had gone viral within a time frame that would have been unthinkable only a decade or so earlier.
So what is the potential of digital technologies to help everyone, of any age, to learn things, in the broadest sense of the term? Presumably, when people surf they are feeding into a search engine specific terms or names, if not formal questions, and receiving relevant information in response. They are “learning.” The dictionary defines learning as “the act or process of acquiring knowledge or skill.” Current digital technology may enhance this ancient, superlative human talent, or then again it may jeopardize it, but we need to unpack the various issues involved. The appeal of the surfing experience, the differences between silicon and paper, the educational value of digital technologies, and, above all, access to a nearly infinite amount of information all operate as distinct and unprecedented factors shaping our thought processes.
Search engines are now part of our lives, and for many they are the immediate and obvious first stop for finding out a fact or learning more about a subject. So screens could shape our cognitive skills in a fundamentally new way. Surely one of the most important issues to explore is whether the next generation might be learning very differently from their predecessors, who used books. The most obvious difference is a tactile one: we handle paper very differently from the way we handle screens. That being so, how might the pleasures of reading on a screen match up to those of paper? Flicking pages back and forth, highlighting sentences, and scribbling in the margin may all be positive features that contribute to the absorption of what you are reading, so the potential for personal interaction with a paper book may be greater than with a screen.
Anne Mangen at the University of Stavanger explored the importance of actually touching paper by comparing the performance of readers of printed texts with that of readers of the same texts on a screen. Her investigation indicated that reading on a computer screen entails different strategies, covering everything from browsing to simple word detection, that together lead to poorer reading comprehension than reading the same texts on paper.30 Moreover, apart from the physical features of the printed page compared to the pixelated one, the screen can have an additional feature that the book can never have: hypertext. Above all, a hypertext connection is not one that you have made yourself, and it will not necessarily have a place in your own unique conceptual framework. Therefore, it will not necessarily help you understand and digest what you’re reading, and it may even distract you.
But the whole point of screens is not simply that they can serve as substitute books. A still deeper issue is how computers, tablets, and e-readers can provide information in an utterly different, nonverbal way, and thereby perhaps actually transform how we think. If inputs arrive in the brain as images and pictures rather than as words, might that, by default, predispose the recipient to view things more literally rather than in abstract terms?
These, then, are the ever more invasive and pervasive technologies that have the power to transform not just what we think, but how. Yet Mind Change involves more than innovative gadgetry: just as critical is the mind that is to be changed. It is the growth of, and the connections between, the brain cells we are born with that turn us into the unique beings we are, with brains capable of individual and original thought. There are many talents we as a species lack: we don’t run particularly fast or see particularly well, nor are we particularly strong compared to others in the animal kingdom. But our brains have the superlative talent of adapting to whatever environment they are placed in, a process known as plasticity. As we make our personal, idiosyncratic way through life, we develop our own particular perspective as a consequence of these personalized connections in our brains. It’s this unique pattern of connectivity that I’d like to suggest amounts to an individual mind. So in order to appreciate the impact of these global, unprecedented, controversial, and multifaceted technologies on the twenty-first-century human mind, we need next to look through the prism of neuroscience.