VIDEOGAMES AND ATTENTION

“The sounds of silence are a dim recollection now, like mystery, privacy and paying attention to one thing or one person at a time,” writes New York Times columnist Maureen Dowd, looking wistfully back to another era.1 Perhaps we shouldn’t be too surprised that, if we nowadays spend hours engaged in activities bombarding us with fast-paced stimuli, our exquisitely adaptable human brain will obligingly adapt to that environment, an environment that does not require sustained attention. And the more stimulation flooding in, the shorter the attention span that can be allocated to each input. So could videogames, given their fast-paced and vivid content, be affecting attention in a way that is unprecedented and unique compared to all the usual, more muted distractions of real life?

Before we can even think about answering this question, we need to sort out the common and understandable objection that the Internet in general, and gaming in particular, is being blamed for a range of problems that might just as justifiably be attributed to human nature, to the modern world as a whole, or at least to any screen-based technology, such as the good old TV. Such critics have a fair point. For example, at the Seattle Children’s Hospital, Dimitri Christakis examined more than a thousand children at one year of age and a similar number at age three.2 He found that 10 percent of the children sampled had attentional problems at seven years of age that were linked to the number of hours of television they had viewed per day between the ages of one and three. So while shortening the attention span is obviously not a good thing, gaming can’t have any additional impact compared to other, older screen-based experiences … or can it?3

Edward Swing and his team at Iowa State University have conducted the first long-term study on the specific effects of videogame use by elementary school children.4 The project involved 1,323 children between six and twelve years old who, together with their parents, recorded their television and videogame exposure at four points over a thirteen-month period. Teachers measured attention problems by reporting difficulties the participants had staying on task and paying attention, and whether a child often interrupted another child’s work. It turned out that those who had more than two hours of screen time (television and videogames combined) per day were more likely to be above the norm in showing attention problems. However, the results also revealed that playing games was linked specifically to a greater risk of developing attentional problems, and that it was in fact a more robust predictor than television viewing. Even after allowing for the effect of TV exposure, as well as any earlier attention problems the child already had, the amount of time spent playing videogames by each child accurately predicted increases in problems with attention just over a year later.5 So gaming would seem to have a specific detrimental effect.

Subsequent research has investigated in more detail the relationship between gaming and attentional problems and reached similar conclusions. At Iowa State University, Douglas Gentile and his team followed up with a sample of more than three thousand children and adolescents tracked over three years.6 Children who spent more time gaming had more attention problems, even when earlier attention problems, sex, age, race, and socioeconomic status were statistically controlled for. Interestingly enough, children who were more impulsive or had more attention problems subsequently spent more time playing videogames, indicating a possible bidirectional relationship between gaming and attention problems: the one exacerbates the other, and vice versa.

These investigations provide the strongest evidence to date that the association between videogame play and attention problems is not coincidental but causal. This possible interrelationship has potentially interesting implications for Mind Change. It demonstrates clearly how the brain and the environment are in such constant dialogue with each other that it’s often hard to tease out the chicken and the egg, as we’ve seen already. Someone who is impulsive and readily distracted might find in videogames the perfect vehicle for his or her disposition, while habitually spending time in a world mandating quick reactions and instant feedback will guarantee that the brain adapts to that fast-paced environment.

Modern videogames, with their visually rich and fast-paced play, are likely to place significant visuo-spatial and cognitive demands on a gamer, and these demands will in turn leave their mark, via the plasticity of the brain, on the individual’s subsequent behavior, though not necessarily with negative consequences. Research shows that gamers make excellent drone pilots, and even outperform real pilots on certain tasks.7 In the same spirit, scientists at the Duke School of Medicine have investigated just how effectively skilled gamers, compared to their student colleagues who didn’t play action games, might eventually become highly proficient drone pilots.8 Greg Appelbaum, an assistant professor of psychiatry, set the subjects a visual memory task to see how efficiently they could recall information they had just seen for the first time. The experienced gamers beat their rookie counterparts, showing that they could respond to visual stimuli much more quickly. This draws upon the skill needed in first-person shooter games, where gamers must decide every second what to “blast.” “Gamers see the world differently. They are able to extract more information from a visual scene,” Appelbaum concludes. “They need less information to arrive at a probabilistic conclusion, and they do it faster.”9

Some researchers have suggested that it is in fact the motivations of gamers, rather than superior visuo-spatial skills, that create the differences between gamers and nongamers.10 Think about it: gaming enthusiasts spend their free time using computers for the enjoyment and competition of game tasks, whereas nongamers recruited into such studies obviously will not have a preference for these activities if other options are available. Thus perhaps it’s simply that gamers have a certain mindset, one that leads them to be more competitive, to enjoy computer tasks, or to be more motivated to do well in the very test scenarios that reveal the visuo-spatial improvements.

A whole host of different processes and functions, such as vision and motor control, appear to be enhanced by regular gaming.11 Compared to nonplayers, seasoned action gamers have demonstrably better hand-eye coordination and visuo-motor skills, as well as greater resistance to distraction, greater sensitivity to information in their peripheral vision, and a better ability to count briefly presented objects. With the development of the PlayStation Move, Kinect, and Wii, videogames can also lay persuasive claim to developing motor skills by encouraging full-body movement.

One of the key studies showing the beneficial effects of gaming took place as long ago as 2003, when Shawn Green and Daphne Bavelier at the University of Rochester investigated the impact of action videogame playing on vision. They were interested in whether learning could improve performance in tasks other than those on which the training was focused. Initial experiments confirmed the expected improvements: in different aspects of visual attention (the ability to focus on one part of the visual field), the habitual videogame players outperformed the rookies. Most significant, however, was that in a final experiment the nonplayers who were subsequently trained on an action videogame showed a marked improvement that transferred to skills well beyond the training task. Green and Bavelier concluded, “Therefore, although videogame playing may seem to be rather mindless, it is capable of radically altering visual attentional processing.”12

Subsequently, multiple investigations have confirmed that playing certain videogames confers on the gamer a wide range of benefits, including enhancements in low-level vision, visual attention, and speed of processing.13 The fact that a number of properly controlled studies have repeatedly demonstrated a causal link between videogame playing and the enhancement of these abilities indicates that it is the videogames, and not any preternatural gifts of the players themselves, that are driving the improvement. Nor does the videogame experience have to result only in an immediate advantage in current tasks. A real benefit of playing appears to be the even more impressive ability to improve how gamers learn completely new tasks. These newfound talents have real-world applications. They include, for example, a superior ability to see small details, faster processing of rapidly presented information, higher capacity in short-term memory, increased capacity to process multiple objects simultaneously, and flexible switching between tasks, all useful skills in a variety of precision-demanding jobs. Laparoscopic surgeons who are habitual gamers turn out to be better surgeons than their nongaming peers in terms of speed of execution and reliability.14

Time spent on videogames is not simply rehearsal of a specific skill: remarkably, the benefits can generalize to other situations and to a wide range of unforeseen skills and behaviors. It is hardly surprising, therefore, that Nintendo advertises Big Brain Academy as a game that “trains your brain with a course load of mind-bending activities across five categories: think, memorize, analyze, compute, and identify.”15 Moreover, one of the promises is that, compared to traditional training methods, the game is engaging and entertaining.

And it is not just the normal, healthy Digital Native brain that appears to flourish. The evidence is convincing that games can have beneficial, remedial effects over a wide range of impairments, including a reversal of cognitive decline in the elderly. In one study, the researchers trained older adults in a videogame for a total of 23.5 hours.16 They assessed their subjects with a battery of cognitive tasks, including tests for executive control and visuo-spatial skills, before, during, and after videogame training. The subjects improved significantly within the game but, most important, also showed clear improvement in executive control functions, such as task switching, working memory, short-term visual memory, and reasoning. Specifically, participants trained on the videogame were able to switch between two tasks with less effort or cost to their attention than the control subjects, and showed short-term improvements in recall in the executive function tasks they were tested on before and after the training period.

When used to treat patients with a wide range of brain disorders, it seems, videogames can offer a truly beneficial and enjoyable experience. For example, they have been effective in reducing delusional symptoms in schizophrenic patients after just eight weeks.17 In a pilot study in adolescents with autistic spectrum disorders, there were visible changes in brain scans in response to emotional words and emotions during a six-week period of prosocial game playing.18 In the rehabilitation of the victims of motor vehicle accidents with post-traumatic stress disorder, the virtual-reality experience of driving or riding in a car in a computer game improved symptoms and promoted recovery.19 Videogames catering to specific psychological needs in certain disorders can offer effective complementary treatment options, such as for those with impulse control problems.20 Meanwhile, neuroscientists have been using popular iPhone games such as Fruit Ninja (where you simply slice fruit in half with your finger) to rehabilitate stroke victims.21

Playing videogames could also potentially have positive effects on more abstract aspects of brain function, such as social development and psychological well-being. For example, playing videogames together with parents has been linked with decreased levels of aggression and increased levels of prosocial behavior, albeit only in girls.22 However, the same research found that the length of time spent gaming, in general, was associated with increased aggression and lower prosocial behavior. Therefore, the beneficial effect here could be due more to the joint activity with parents than to the actual action being played out on the screen. Even gender stereotyping might play a part. The authors speculate that because boys play more videogames than girls, the time the boys spent playing games on their own may have diluted the beneficial effects of time spent playing games with parents. Additionally, they suggest that boys typically play more age-inappropriate videogames than girls, and this may also offset the benefits of gaming with parents.

As we’ve already seen with social networking, videogame worlds may be a realm where gamers can freely explore their identities.23 Research shows that leadership skills exercised in MMORPGs can spill over into leadership potential in the workplace.24 Such games could perhaps help develop new organizational training techniques; then again, it could just be the case that a gamer who has the potential to be a leader in a videogame ends up as a leader in the real world, while losers in the real world remain losers in a game. It’s still debatable whether videogames serve as a useful lesson for real life or as an escape from it. Games may indeed demonstrate to the gamer that choices are sometimes hard to make, as when gamers are trying to achieve a goal and must weigh consequences, benefits, and the strength of their individual skill set as they decide whether to confront or avoid a problem. On the other hand, experience in the real world will teach that anyway. After all, if there were no difference between real life and gaming, what would be the point of the game in the first place? But if there is a difference, would the game experience actually be that useful in terms of real-life applications?

Almost all real-life tasks could be considered dull in comparison to well-designed, highly stimulating games, and this difference may have seriously negative consequences. Kira Bailey and her research group at Iowa State University cautiously note that while some videogames may have positive educational and therapeutic effects, overall their data suggested “that high levels of videogame experience may be associated with a reduction in the efficiency of processes supporting proactive cognitive control that allow one to maintain goal-directed information processing in contexts that do not naturally hold one’s attention.”25 Or to put it more simply, gaming could be bad for sustained attention.

While extensive research has shown how action gaming can improve focusing on the screen, this gain may come at a cost. Videogames reward players for quickly modifying their behavior when conflict is experienced, and this specific feature of action games may have differential effects on proactive and reactive control. Think of reactive control as the just-in-time type of response to a stimulus that is used only when needed, whereas proactive control would be deployed consistently and in anticipation of future stimuli, indicating an individual’s capacity to choose what she pays attention to and what she ignores.26 While high-frequency gamers (playing more than forty hours a week) are well rehearsed in responding instantaneously to suddenly presented stimuli (reactive control), their ability to maintain proactive attention over an entire task is less impressive. Videogames may train an individual to respond rapidly to suddenly presented stimuli, but they may provide no advantage in being able to maintain focus during mundane tasks.27

In contrast, other recent work suggests that frequent gamers may be more persistent than infrequent gamers in sticking with complex puzzles involving anagrams and riddles.28 Frequent videogame players spent more time on unsolved problems than infrequent players did. These results were taken as evidence that videogame use can lead to greater perseverance across a variety of tasks. However, once again it could be that different character traits are responsible for the crucial difference within an experimental protocol. Gamers may be more competitive than nongamers, and a laboratory assessment task measuring skill of any type will automatically motivate a frequent gamer to want to win. Moreover, the gamers in this study may have seen the puzzle as a game in itself rather than as a boring task. So the question remains open as to whether frequent gamers have the ability to pay more attention generally, regardless of the task at hand.

How can we reconcile conflicting conclusions as to whether gaming improves or impairs attention? The answer may lie in the type of attention required to be successful at action games. There are a number of taxonomies that attempt to describe the human attention system. Selective or focused attention, defined as the ability to focus on a specific set of stimuli, is a kind of attention that is typically driven by internal motivations. Sustained attention, by contrast, is the ability to maintain vigilance over longer periods of time, and is often required during a tedious activity. While videogames might rehearse and therefore be beneficial to the type of attention requiring the processing of selective stimuli, the maintenance of attention over long periods in the absence of fast-paced moment-to-moment stimulation could well be diminished. So gamers may have a problem not with selective attention but rather with sustained attention.

One interesting question about these impairments in attention is their possible connection to the prevalence of attention deficit hyperactivity disorder (ADHD).29 For some, the idea that attentional disorders could be linked to gaming is mere speculation. In a review assessing the impact of digital technologies on human well-being, Paul Howard-Jones concluded, “We do not know [if] the use of digital technology by young children is a causal factor in developing ADHD.”30

Subsequently, Alison Parkes and her team at the University of Glasgow surveyed more than eleven thousand children and reported that videogames had no effect on their psychosocial development, including attentional problems.31 The size of the cohort studied here might seem impressive and hence the findings reassuringly conclusive. But there are some serious drawbacks to the underlying methodology. First, the study investigated children between the ages of five and seven, while almost all the rest of the research literature focuses on older children, who have greater opportunity to play stimulating, violent, or reckless action games not typically available to the very young. Second, possible symptoms of ADHD were assessed solely by the subjective reports of far-from-unbiased parents (hence the unusually large sample size, as data was relatively easy to collect); in contrast, other studies have used more comprehensive, time-consuming, and objective assessment tools. Third, the Glasgow project measured only weekday videogame use, and many more hours of videogames may be played on the weekend, so the study does not provide a complete picture of total videogame use.

In any event, before we can be sure of a link between attentional problems and gaming, various other issues need to be unpacked. A number of studies have investigated the relationship between excessive Internet use generally and ADHD symptoms.32 A huge caveat, however, is that gaming and excessive Internet use are two distinct activities: one might be related to ADHD, while the other might not. A further complicating factor is that certain genres of games may have different effects on ADHD. MMORPGs are actually associated with lower levels of impulsivity and ADHD symptoms, but in turn are linked to higher levels of anxiety and social withdrawal.33 Moreover, the relationship between ADHD and gaming might hinge on the actual frequency of playing, which will not necessarily have been taken into consideration. In addition, any relationship between excessive Internet use and ADHD may be attributable to an addictive state and not to the activity itself. That said, given that so many excessive Internet users are gamers, the relationship between excessive Internet use and ADHD needs to be explored.

Taking all the above considerations into account, and bearing in mind that there is no single “cause” of ADHD, there is still persuasive evidence that excessive gaming can indeed be associated with attentional disorders. In 2006 Jee Hyun Ha and his colleagues investigated large numbers of children in Korea in two stages. The first consisted of screening all participants for Internet addiction disorder and then, from those who screened positive, randomly selecting a smaller group for a thorough psychiatric assessment. Tellingly, the Internet-addicted children used the Internet primarily for Internet gaming. Over half of these youths (who were ages nine to thirteen) qualified for a diagnosis of ADHD.34 A year later a psychiatric comorbidity survey of more than two thousand Taiwanese high school students aged fifteen to twenty-three reported that 18 percent of students were classified as Internet addicts and that Internet addiction was strongly associated with ADHD symptoms.35

In addition to findings that restricting children’s exposure to TV and videogames reduces the likelihood of attention problems in class, a study by Philip Chan and Terry Rabinowitz at Rhode Island Hospital found that teenagers who played videogames for more than one hour per day displayed more features of ADHD, including inattention.36 However, the authors highlighted the now familiar chicken-and-egg problem: “It is unclear whether playing videogames for more than one hour leads to an increase in ADHD symptoms, or whether adolescents with ADHD symptoms spend more time on videogames.”37

While there is a significant association between the level of ADHD symptoms and the severity of Internet addiction in children, it also appears that the presence of ADHD in a child might predict the likelihood of developing gaming addiction. In a study of young people with and without ADHD, ages six to sixteen years, there were no differences in the frequency or duration of gaming between the two groups.38 However, the ADHD group had significantly higher gaming addiction scores, indicating that ADHD children may experience gaming activity with more intensity than non-ADHD children and thus may be particularly vulnerable to gaming addiction. So if ADHD and obsessive gaming are influencing each other, it may not be that one is causing the other, but rather that both are symptomatic of the same single common brain state: two sides of the same mental coin. A clue as to what that brain state might be comes from looking a bit more closely at a medication used to treat ADHD.

Methylphenidate, perhaps best known by one of its brand names, Ritalin, is a stimulant drug given widely to treat attentional disorders. In the United Kingdom, the number of prescriptions for methylphenidate soared from 158,000 in 1999 to 661,463 in 2010.39 In the United States, Benedetto Vitiello of the National Institute of Mental Health documented stimulant prescriptions between 1996 and 2008 and found that the number of prescriptions for children younger than nineteen increased significantly during that twelve-year period.40 Those aged six to twelve had the most prescriptions, but teens aged thirteen to eighteen had the biggest increase in prescriptions. A similar trend was found in Australia, where the use of stimulant drugs to treat ADHD in children has escalated dramatically, with prescriptions for Ritalin and its equivalents up 300 percent between 2002 and 2009.41

Of course, it could be that these colossal increases in prescriptions across three different continents have nothing to do with an increase in ADHD itself but owe more to a current clinical trend to medicalize a particular behavior and/or a greater willingness to prescribe medication for the condition.42 Nonetheless, the current association between ADHD medication and abnormally short attention spans brings into play our old friend dopamine, as methylphenidate results in an increase in this chemical messenger in the brain. It has proved a continuing riddle to neuroscientists why such a drug should be effective in treating a short attention span.

When dopamine goes to work in the brain, you become more aroused, more excited. The apparent paradox of a stimulant drug such as Ritalin effectively combating hyperarousal can be explained by its ability to desensitize dopamine’s normal chemical targets. As we’ve discussed already, the interaction of these chemical targets (receptors) with their respective brain neurotransmitter resembles a molecular handshake. But if the handshake is persistent and strong, then the hand (the receptor) will become numb, less sensitive (desensitized). The result is that the dopamine in the brain will be less effective and you will be less hyperactive. In an individual who does not have ADHD, the drug can prolong attention span, which could be viewed as a desirable “cognitive enhancement.”

Modafinil, a novel wakefulness-promoting agent, has a pharmacological profile similar to that of conventional stimulants such as methylphenidate. Psychologist Trevor Robbins and his team in Cambridge were interested in assessing whether modafinil might offer similar potential as a cognitive enhancer in those who were perfectly normal.43 Sixty healthy young adult male volunteers received a single oral dose of either a placebo (an inert substance that they thought could have beneficial effects) or of modafinil prior to performing a variety of tasks designed to test memory and attention. Only modafinil significantly enhanced performance on various cognitive tests, including visual pattern recognition memory, spatial planning, and reaction time. The subjects also said that they felt more alert, attentive, and energetic on the drug. A further effect seemed to be to reduce impulsivity. So might drugs such as modafinil give us an insight into the link between ADHD and excessive gaming?

In 2009 associate professor of psychiatry Doug Hyun Han and his team at the University of Utah prospectively studied a large number of teenagers, the great majority of whom were male. The subjects all had a history of ADHD, as well as track records of excessive videogame use. The idea was to examine whether both videogame play and methylphenidate increased dopamine release in a way that could enable the teenagers to concentrate better. Han administered Concerta XL (similar to Ritalin) and followed up the performance of the subjects after eight weeks. There was a reduction in Internet addiction scores and in total time of Internet use, indicating that methylphenidate could reduce such obsessive behavior in subjects with co-occurring ADHD and excessive gaming. Although the authors did not clarify how much of the Internet activity was gaming, they came to the fascinating conclusion that if ADHD and gaming really are two sides of the same coin, reflecting the same underlying brain state, then “Internet videogame playing might be a means of self-medication for children with ADHD.”44

If videogames are a kind of self-medication for those suffering from ADHD, the most obvious common factor is excessive dopamine release in the brain, in turn related to addiction, reward, and arousal. Paul Howard-Jones at Bristol University has even suggested that this process could be harnessed by allowing children to play videogames; they would thereby become more aroused and be cognitively enhanced in the classroom.45 So perhaps, under the right conditions, videogames might prove a valuable tool for teachers. Yet while the amounts of endogenous dopamine released naturally within the brain as a consequence of gaming would probably not lead to the same level of receptor desensitization that could occur with usual doses of modafinil or Ritalin, do we really want students to be in a permanent state of high arousal? It would surely not be that different from giving them low doses of amphetamine.

Most immediately, it seems there is a clear link between gaming and attention generally. Although selective visual attention for focusing on a screen object or avatar might be improved in the short term by gaming, it could be to the detriment of the all-important sustained type of attention over the longer term, the kind of attention needed to reflect and to understand something in depth. Moreover, the implication of dopamine as a central player in the brain of the gamer might provide a truly helpful insight into the appeal of the activity compared to real life. But could a mindset used to experiencing reliable, if not easy, rewards also be one inclined to aggression and recklessness?