CHAPTER FIVE

GAMBLING AND OTHER MODERN COMPULSIONS

Whether we are diagnosing Internet addicts, gambling addicts, and porn addicts or examining the motivations of chocoholics and shopaholics, our everyday speech has come to promote the idea that one can become addicted to almost any pleasurable activity. Certainly there’s a thread of truth in this assumption—compulsive behaviors can impact people’s lives to varying degrees. But how similar are such behaviors at a purely biological level? Are addictions to video games, gambling, or shopping really like drug or alcohol addiction in terms of brain function? Or are they just convenient examples of metaphoric language? In her book Desire: Where Sex Meets Addiction, Susan Cheever gets to the heart of the issue:

“Addiction” is the buzzword of the twenty-first century. What we call addiction ranges from the seriousness of methamphetamine addiction …to people who say casually they are addicted to Starbucks lattes …or sleeping on 600-thread-count sheets. In fact, we especially seem to use the word “addiction” for things to which we are not destructively addicted… . These are social habits, and we embrace the word “addiction” to describe them; using it erodes its powers and it identifies us as someone serious, but someone who knows when to take things lightly.1

While I agree with Cheever that designer-sheet addiction is overstating the case, both compulsive gambling and video game playing do meet many of the formal behavioral definitions of addiction that have been developed by psychologists, and there are certainly cases where people’s lives have been severely affected and even destroyed by such activities. However, behavioral addictions don’t necessarily have the same life trajectory as addiction to substances (like drugs, alcohol, or food). In fact, recent community-based studies (as opposed to studies of people in treatment, which are not a representative sample) show that about a third of gambling addicts and video game addicts are able to break their addictions within a given year without seeking outside help, something that rarely happens in drug addiction.

At the biological level there is now reason to believe that a broad definition of addiction—one that encompasses drugs, sex, food, gambling, video games, and some other compulsions—is valid. The developing story is that activation and then alteration of the medial forebrain pleasure circuit is the heart of all these addictions. Brain imaging studies have revealed that, like certain drugs or orgasm, both gambling and video game playing engage the medial forebrain pleasure circuit and cause dopamine release in VTA target regions. Recall our earlier discussion of patients who are given dopamine receptor agonist drugs to treat Parkinson’s disease: they have an unusually high incidence of compulsive gambling, and the strong urge to gamble abates when the drug is withdrawn.

In our zeal to fashion an overall theory of pleasure, reward, and addiction, we must be careful not to overgeneralize. After all, we all eat food and have sex, yet most of us don’t become food or sex addicts. In the case of drugs, most people who use alcohol or barbiturates or even cocaine do not develop addictions to these substances. Similarly, most people can gamble or play video games occasionally without this behavior becoming compulsive and ruining their lives.2 Why is this so? What factors in the biology and/or experience of some individuals will turn pleasure into pathology?

 

In his fascinating memoir of his life as a compulsive gambler, Born to Lose,3 Bill Lee describes a trail of pathology stretching back generations. His grandfather sold his father to another family in China to cover a gambling debt. Lee’s father was raised by this surrogate family and then emigrated to the United States, where he also gambled compulsively at mah-jongg and pai-gow poker. Growing up in San Francisco’s Chinatown, Lee would accompany his father to his gambling dens as a sort of “good luck charm.” In elementary school he was already cutting class to gamble for coins or baseball cards, often losing everything. By high school he was hustling pool and playing poker with varying success and running afoul of loan sharks. Still, he was successful in his studies and went on to get a college degree and a series of well-paying jobs in California’s Silicon Valley as a skilled manager and “headhunter” for high-tech firms. He married and had a son, Eric. But as his career advanced he began to gamble more often and for higher stakes. He played the stock market, trading in options. He would work all day in Silicon Valley, drive four hours to the Nevada casinos to play blackjack for a few hours, and then drive back half asleep on icy mountain roads to start work the next morning. This recklessness contributed to the end of his marriage and a subsequent bitter custody battle:

As the custody for Eric became extremely contentious, my urges to gamble became stronger and more frequent. Whereas my preoccupation with gambling used to begin a day or two before my next excursion, it began surfacing sooner. Eventually, the urges started almost as soon as I returned home [from the casino]. All I could think about was getting back to the tables. It wasn’t about feeling good or having fun; it was more about not feeling bad.4

Within a few years Lee was utterly bankrupt, having gambled away his entire life savings and his home. This destructive cycle continued, destroying a second marriage and squandering another small fortune, leading him to the brink of suicide. Lee joined the twelve-step group Gamblers Anonymous but dropped out several times over the course of many years as the urge to gamble became overwhelming. At one point, having gone ninety days without placing a bet, he went to sleep with a sense of accomplishment at having reached this goal. But then, he recalls, “I woke up drenched in sweat and shaking. My urge to gamble left my entire body feeling like one big mosquito bite, and no amount of willpower would have been able to stop me from scratching myself.”5 As of 2005, he had not placed a bet for four years and was an enthusiastic advocate of Gamblers Anonymous.

Bill Lee’s heart-wrenching story illustrates not only how gambling addiction can destroy lives, but also some general themes of this disease. As his experience suggests, compulsive gambling runs in families and is much more prevalent in men than in women. Almost certainly, an increased risk for compulsive gambling in people who have a close relative with a gambling addiction reflects both nature and nurture. Several studies have examined gambling addiction in male and female twin pairs, comparing monozygotic (identical) and dizygotic (fraternal) twins. These analyses have suggested that inherited factors account for about 35 to 55 percent of the variation in compulsive gambling among men. In women the story is less clear, and while some studies have reported no significant heritability, these analyses are complicated by the much smaller sample of female gambling addicts.6

In chapter 3 we discussed how carriers of the TaqIA A1 allele of the D2 dopamine receptor gene, who have reduced dopamine signaling in VTA target regions, are more likely to struggle with several different substance addictions: food, drugs, alcohol. Carriers of this genetic variant are also at greater risk for behavioral addictions such as compulsive shopping and gambling, as well as attention-deficit hyperactivity disorder (ADHD). Not surprisingly, genetic analysis has revealed a number of other variants (in the genes for the D4 and D1 dopamine receptors and the dopamine transporter) that reduce dopamine signaling and are likewise associated with gambling addiction and one or more of the other addictive behaviors.7 Such findings confirm what we already know anecdotally: Anyone who has spent even a little time in a casino has seen that nicotine addiction, alcoholism, and compulsive gambling are often concurrent, reflecting a common underlying disorder of the dopamine-using pleasure circuit. Indeed, the rate of alcoholism among compulsive gamblers is about ten times higher and the rate of tobacco use is about six times higher than in the general age-matched population in the United States.

Bill Lee’s story points to several other risk factors for gambling addiction. He was exposed to gambling, as both an observer and a participant, when he was very young and quite poor. And though it may seem relatively trivial, gambling opportunities, both legal and illegal, were also easily available to him in the form of the stock market, card rooms, casinos, and so on. Many studies conducted around the world have come to the same conclusion: When legal gambling becomes more readily accessible, the prevalence of gambling addiction increases. Online gambling, now popular throughout much of the world, is ideally suited to foster gambling addiction, as it can be indulged in twenty-four hours a day with little social constraint: no one is likely to be present to urge the gambler to stop.

It is also worth noting that, throughout his cycles of gambling and relapse, Lee managed to thrive in his professional life. Indeed, the risk-taking, hard-driving, and obsessive personality traits often found in compulsive gamblers can be harnessed by some to make them very effective in the workplace. Many gambling addicts are among the most successful, productive, and innovative figures in the business world, a profile that contributes to a self-image of being in control and makes them extremely reluctant to seek help, even in dire circumstances.

When we consider the addiction trajectory of a heroin user, we discover remarkable parallels to Bill Lee’s story. Tolerance, withdrawal, craving, and relapse are all present. Lee’s clear and resonant description of how his addiction eventually drained all the pleasure from gambling, leaving only a raw desire, could just as easily have been written by a heroin or cocaine addict. The slow transition from liking to wanting that he traces is precisely the same phenomenon experienced by a drug addict and is likely to represent a similar use-dependent rewiring of the pleasure circuit. Furthermore, just like those of drug or food addicts, Lee’s most devastating gambling binges were triggered by unusually stressful situations (initially by his divorce and custody battle and again, years later, when he witnessed a mass murder at his workplace). Even after swearing off gambling and joining Gamblers Anonymous, he relapsed several times—an experience typical of both drug and gambling addicts. In fact, one study in Scotland showed that only 8 percent of attendees at Gamblers Anonymous meetings had completely abstained from gambling one year later.8

Despite their similarities, it’s easy to imagine that compulsive gambling is somehow less destructive than addiction to drugs. However, there are some respects in which it is worse. Most gambling addicts go deeply into debt, and many wind up committing crimes to cover their losses. The life consequences of a disastrous gambling binge can linger for years. Perhaps this is one reason why the attempted suicide rate for gambling addicts is so very high: about 20 percent for Gamblers Anonymous members, rising to as high as 40 percent for a group of men in a residential treatment program for gambling addicts run by the U.S. Veterans Administration.9

 

A dollar picked up in the road is more satisfaction to you than the ninety-and-nine which you had to work for, and money won at faro or in stocks snuggles into your heart in the same way.

—Mark Twain

So how does someone learn to like gambling? One model holds that early reward is crucial. If you’ve never gambled before and sit down in the casino to play a few hands of blackjack, you might lose the first five hands, get frustrated, and walk away. You are left with only negative associations (losing money) with gambling and are therefore less likely to try it again. Alternatively, you might win a hand or two early on, thus positively reinforcing the gambling behavior. A subset of people get a small but noticeable pleasure jolt out of this early success, which increases their risk of developing a gambling addiction as they seek more and more stimulation to achieve a “set point of pleasure.”

While on the face of it this model seems reasonable, it’s likely to be either incorrect or incomplete. Many people who like gambling or even go on to develop gambling addictions did not have an “early win” experience. Similarly, the vast majority of compulsive lottery players, as only one example, will never win the jackpot in a lifetime of betting. An alternative model that has recently emerged from experiments with monkeys and rats suggests that our brains are hardwired to find certain kinds of uncertainty pleasurable (or “rewarding,” as it is termed in the cognitive neuroscience literature).10

In experiments conducted by Wolfram Schultz and his coworkers at the University of Cambridge, monkeys were trained to watch a computer screen for visual cues while a lick-tube was placed near them to deliver a drop of sweet sugar syrup. At the same time, electrodes were inserted into each monkey’s brain to record the activity of individual neurons in the VTA. On the screen there were lights that would turn on and stay on for about two seconds. When a light appeared on the screen—we’ll call it green—it indicated that two seconds later a syrup drop would always be delivered. Another light, this one red, indicated that two seconds later no reward would be given.11

Let’s work through the experiment by following an individual monkey. (This is a bit complicated, so please follow along using Figure 5.1 as a guide.) In the first trial, no light goes on; the delivery of a syrup drop is rapidly followed by a brief burst of dopamine neuron firing in the monkey’s brain: In its untrained state it perceives the syrup drop as intrinsically rewarding. (It should be noted here that dopamine neurons in the VTA and other regions are not completely silent at rest. The droplet-evoked burst of spikes is superimposed on a low level of background activity.) The monkey next receives a series of randomly intermixed red-light and green-light trials. In the first few green-light trials, the dopamine neuron fires a burst at the delivery of the syrup droplet, but not at the onset of the green light. But gradually, as the monkey learns that the green light is a reliable predictor of the syrup droplet, a fascinating change occurs. The dopamine neuron gradually stops responding to the reward itself and instead displays a similar burst of activity at the onset of the green light. The monkey also learns that the red light reliably predicts no reward, and so on red-light trials the neuron shows no burst of activity at any time point.


Figure 5.1 VTA dopamine neurons are activated in anticipation of a reward in the experiments of Schultz and coworkers. See the main text for a complete explanation. Illustration by Joan M. K. Tycko.

Let’s pause for a moment to consider how amazing this mechanism is: The activity of a single VTA dopamine neuron no longer merely indicates simple hardwired pleasure but now represents the learned association between the green light and the sugar droplet reward. While this may seem a trivial point, when pleasure and associative learning are mixed in this way, a minor miracle actually takes place. Now behaviorally compelling stimuli don’t have to be intrinsically pleasurable, like sex or food, or artificially pleasurable, like drugs. Any sound, smell, sight, or memory can become associated with pleasure and can thereby become pleasurable in its own right.

Back to our monkey story. Next, the experimenters did a clever thing: They broke the rules. For example, in a well-trained monkey, they flashed the green light but failed to deliver the syrup drop. In this case there was a burst of firing at the green-light onset, but then, two seconds later, when the anticipated syrup droplet failed to arrive, there was a brief decrease in background activity, temporarily driving the neuron to near silence. Alternatively, again using a well-trained monkey, the experimenters flashed the red light, but then violated the learned rule by delivering a syrup drop at red-light offset. This resulted in no burst at red-light onset but a burst immediately following delivery of the unexpected syrup drop. These responses have proven to be extremely useful for guiding learning in the real world. Oftentimes a particular learned association is no longer valid and has to be overwritten by new experience. In order to do this, the pleasure/reward circuitry of the monkey’s brain has to be able to calculate what learning theorists call reward prediction error: the difference between what actually happens and what is expected to happen. Or, in a simple equation:

dopamine neuron response (which encodes reward prediction error) = reward occurrence – reward prediction

When reward occurrence = reward prediction, as in green-light-followed-by-reward trials or red-light-followed-by-nothing trials in well-trained animals, the prediction error is zero and the dopamine neurons do not burst at light offset. However, when the learned rule is later violated, the dopamine neuron’s firing departs from baseline at light offset: a burst for an unexpected reward (a positive prediction error) or a brief dip for an omitted one (a negative prediction error). This tells the monkey’s pleasure circuit that the old rules don’t apply anymore and it may be time to learn a new association.
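For readers who want to see the idea in action, here is a minimal sketch, in Python, of a simple delta-rule learner driven by exactly this kind of prediction error. It is not taken from the Schultz papers; the learning rate, the number of trials, and the use of a one-unit reward are illustrative assumptions.

    import random

    learning_rate = 0.2
    # Learned reward prediction for each cue, starting at zero (the untrained state).
    prediction = {"green": 0.0, "red": 0.0}
    # Probability that a syrup drop actually follows each cue.
    reward_probability = {"green": 1.0, "red": 0.0}

    def run_trial(cue):
        """Present one cue, deliver or withhold the reward, and update the prediction."""
        reward = 1.0 if random.random() < reward_probability[cue] else 0.0
        # Reward prediction error: what happened minus what was predicted.
        # This is the quantity the dopamine burst (or dip) is thought to encode.
        rpe = reward - prediction[cue]
        prediction[cue] += learning_rate * rpe
        return rpe

    random.seed(0)
    for _ in range(300):
        run_trial(random.choice(["green", "red"]))

    # After training, the green prediction sits near 1 and the red prediction near 0,
    # so both kinds of trial now produce prediction errors close to zero.
    print({cue: round(value, 2) for cue, value in prediction.items()})

Breaking the rules in this trained state, say by withholding the reward after the green cue, produces a large negative error on that trial and begins pulling the green prediction back down, the computational counterpart of the dip in firing described above.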

By now you are probably wondering, with good reason, “What does this have to do with gambling, where the outcome is always unpredictable?” In a later experiment, Schultz and his coworkers added yet another visual cue, which we’ll call a blue light. The blue light signaled that two seconds after it flashed, a reward would be delivered randomly on 50 percent of the trials. Well-trained animals showed a brief burst of activity when the blue light came on, but then a strange thing happened: In the approximately 1.8-second-long interval between the end of this burst and the offset of the blue light, there was a gradual increase in the level of dopamine neuron firing, such that high rates of firing were achieved by the time the blue light shut off (Figure 5.1, bottom trace). Furthermore, when blue-light trials were presented using extra-large syrup drops, the maximal firing rate achieved during the “waiting interval” was increased.

Essentially, these researchers created a sort of monkey casino. The period between the onset and the offset of the blue light, when the reward outcome is uncertain, produced a gradually increasing activation of the pleasure circuit in the VTA target regions. This is analogous to the period during which a player is watching the slot machine or the roulette wheel spin or waiting for the turn of a card in blackjack. One reasonable interpretation of these results is that we are hardwired to get a pleasure buzz from risky events. In this model it’s not that we need an early reward to like gambling. Rather, the uncertain nature of the payoff is pleasurable in its own right. Evolutionary scenarios have been proposed in which neural systems that drive risk-taking were adaptive, helping an animal beset with indecision to find more reliable predictors of important events. In ancestral humans, this risk-taking may have been more adaptive for male hunters than for female gatherers, potentially underlying present-day males’ increased risk of gambling addiction and other impulse control disorders.
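A quick back-of-the-envelope calculation (mine, not the experimenters’) shows why a fifty-fifty payoff is the most uncertain one: for a reward delivered with probability p, the spread of possible outcomes, p(1 − p), is largest when p equals 0.5.

    # Outcome uncertainty for a yes/no reward delivered with probability p.
    # The variance p * (1 - p) peaks at p = 0.5, the blue-light condition;
    # the green (p = 1) and red (p = 0) cues carry no uncertainty at all.
    for p in (0.0, 0.25, 0.5, 0.75, 1.0):
        expected_reward = p
        uncertainty = p * (1 - p)
        print(f"p = {p:.2f}   expected reward = {expected_reward:.2f}   uncertainty = {uncertainty:.2f}")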

 

Though monkeys experience a sustained dopamine pleasure buzz in an experiment where the delivery of a syrup drop reward is uncertain, one can take the casino analogy only so far with regard to humans. First, we know that humans have a greatly expanded frontal cortex to guide planning and deciding, a mechanism that might crucially impact the responses to uncertainty. Second, syrup drops are a natural reward, and as we discussed in chapter 3, it’s reasonable to conclude that we’re hardwired to like sweet foods and drinks. Money, however, is an abstraction, and early ancestral humans certainly didn’t use it. Does money really trigger the human pleasure circuit?

Hans Breiter and his coworkers addressed these questions by adapting the monkey protocols of the Schultz lab for use in human brain scanning experiments.12 Initially each subject received an account containing $50 worth of credit. They were instructed that they were working with real money and that they would be paid the balance of their account in cash at the end of the experiment. In the brain scanner, they watched a video screen that showed one of three wheels, each of which was divided into three pie-shaped segments labeled with a monetary outcome. The “bad” wheel had only negative or neutral outcomes (–$6.00, –$1.50, or $0), an “intermediate” one had mixed results (+$2.50, –$1.50, $0), and a final “good” wheel primarily had rewards (+$10.00, +$2.50, $0). After a particular wheel type was presented on the screen, the subject would push a button that would initiate rotation of an animated pointer. The pointer would spin for about five seconds and then come to rest, seemingly randomly, on one of the three possible outcomes, where it would remain for five more seconds. The design of this experiment makes it possible to measure brain activation during both an anticipation phase (while the pointer is spinning) and an outcome phase (after the pointer has stopped). Of course, the software running the pointer is controlled by the experimenters so that it can deliver all of the possible monetary outcomes in a balanced manner (Figure 5.2).
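As a rough check on the wheel labels (my own arithmetic, assuming the three pie-shaped segments on each wheel are equally likely to come up), the average payoff per spin can be tallied as follows.

    # Average payoff per spin for each wheel, assuming each of the three
    # segments is equally likely; the dollar values are those listed above.
    wheels = {
        "bad":          [-6.00, -1.50, 0.00],
        "intermediate": [+2.50, -1.50, 0.00],
        "good":         [+10.00, +2.50, 0.00],
    }
    for name, outcomes in wheels.items():
        expected_value = sum(outcomes) / len(outcomes)
        print(f"{name:>12} wheel: about {expected_value:+.2f} dollars per spin")
    # bad: -2.50, intermediate: +0.33, good: +4.17 dollars per spin, which tracks
    # the graded anticipatory responses described in the next paragraph.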

The main finding was that, as with Schultz’s monkeys, VTA target regions (the nucleus accumbens, the orbital gyrus, and the amygdala) were activated during both the anticipation phase and the outcome phase when the outcomes were positive. The anticipation phase responses were graded according to the possible outcome: There was greater activity while the “good” wheel’s pointer was spinning than when that of the “intermediate” or “bad” wheel was spinning. And finally, during the outcome phase with the “good” wheel, greatest activation was seen for the largest monetary rewards. Thus even anticipation and experience of an abstract reward, like money, can activate the human pleasure circuit.

This experiment was also designed to test another hypothesis about monetary reward in gambling. Using a related task, Barbara Mellers and coworkers demonstrated that people regard a $0 outcome on the “good” wheel as a loss but a $0 outcome on the “bad” wheel as a win.13 If our minds were completely rational, we would value these outcomes the same way, but we don’t. We are influenced by the counterfactual possibility of “what might have been.” Was this irrational belief reflected in brain activation? The response strength to the $0 outcome on the “good” wheel was lower than that for the “bad” wheel. However, the responses to the $0 outcome on the “intermediate” wheel did not fall between the levels for the good and the bad $0 responses, as would be predicted. The theory that counterfactual comparison modulates brain pleasure circuit activation is therefore possible, but remains unproven.


Figure 5.2 The design of an experiment to test the response of human subjects to the anticipation and experience of monetary gains and losses. Adapted from H. C. Breiter, I. Aharon, D. Kahneman, A. Dale, and P. Shizgal, “Functional imaging of neural responses to expectancy and experience of monetary gains and losses,” Neuron 30 (2001): 619–39, with permission from Elsevier.

Another irrational idea about gambling involves near misses. For example, if a horse one bets on to win comes in second, or if two of three reels on a slot machine’s payline match, it will be experienced as a near miss rather than as a loss. A number of experiments have manipulated near-miss frequency and have shown that near misses promote continued gambling. In fact, there appears to be an optimal frequency of near misses to maximally extend slot machine gambling—about 30 percent.14 Manufacturers of video slot machines are well aware of this effect, and some have programmed their devices to increase the rate of near misses above random levels.15

In games of pure chance, like craps or the lottery, gamblers have the same probability of winning whether or not they have a direct involvement in the process (such as buying the lottery ticket or rolling the dice). Nonetheless, many studies have shown that gamblers will bet more and continue gambling longer if they do have a personal role in these fundamentally random events. In some cases, this even affects the style of the particular actions involved in the game. For example, craps players tend to throw the dice with less force when trying to roll low numbers.16 While both the near-miss effect and the direct-involvement effect are seen in general populations, they are even more prevalent in gambling addicts.

Considering these irrational aspects of gambling, Luke Clark and his colleagues at the University of Cambridge hypothesized that there would be significant activation of the pleasure circuit by near misses on a video slot machine and that this activation would be stronger on trials where the gambler had some personal control, as opposed to those in which the computer made all the choices.17 They placed forty subjects in a brain scanner and presented them with a simplified two-reel video slot machine in which one reel was fixed and the other spun (Figure 5.3). The position of the fixed reel was set by the subject on some trials and by the computer on others. Hits in which the two reels matched yielded a payout of 50 pence. Near misses were those trials in which the matching symbol of the spun reel came to rest either one row above or one row below the payline. Neither near misses nor full misses produced a payout. The computer was programmed to produce near misses on two out of six trials, hits on one out of six trials, and full misses on three out of six trials.
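The programmed rates above are enough to sketch the player’s side of this machine. The following toy simulation is mine, not the researchers’, and assumes independent draws at exactly those rates.

    import random

    # Programmed outcome rates from the task description above.
    OUTCOMES = ["hit", "near miss", "full miss"]
    WEIGHTS = [1, 2, 3]              # 1/6 hits, 2/6 near misses, 3/6 full misses
    PAYOUT_PENCE = {"hit": 50, "near miss": 0, "full miss": 0}

    rng = random.Random(1)
    trials = rng.choices(OUTCOMES, weights=WEIGHTS, k=600)
    winnings = sum(PAYOUT_PENCE[outcome] for outcome in trials)

    # Only hits pay: the expected return is 50 pence on one trial in six, about
    # 8.3 pence per spin, while a full third of all trials are near misses that
    # pay nothing at all.
    print({outcome: trials.count(outcome) for outcome in OUTCOMES}, f"total = {winnings}p")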

Before each trial the subject was asked, “How do you rate your chances of winning?” After each trial, the subject was asked, “How pleased are you with the result?” and “How much do you want to continue to play the game?” In confirmation of previous findings, personal control of the fixed reel increased both the subject’s estimation of his chances and his interest in continuing to play. Also, on winning trials, the pleased-with-result ratings were higher on the personal-control trials as compared with the computer-control trials. Near misses were experienced as less pleasant than full misses, yet they increased the desire to continue playing, though only on those trials where the subject had personal control of the fixed reel.


Figure 5.3 A near miss on a simplified video slot machine in the experiments of Clark et al. (2009). The arrow indicates the payline row. The left reel is fixed on a symbol chosen by either the subject or the computer, and the right reel spins to determine the outcome. From L. Clark, A. J. Lawrence, F. Astley-Jones, and N. Gray, “Gambling near-misses enhance motivation to gamble and recruit win-related brain circuitry,” Neuron 61 (2009): 481–90, with permission from Elsevier.

When the brain scanning data were examined, there were two main findings. First, in all trials, near misses activated much of the same VTA-target pleasure circuit as wins. Both outcomes activated the nucleus accumbens and the anterior insula. However, wins and personal-control trial near misses, but not computer-control trial near misses, also activated another nearby region: the rostral anterior cingulate cortex. These results might help to explain some of the irrational behavior involving gambling: Activation of win-related regions by near-miss outcomes is somehow pleasurable and is more pleasurable when the subject has personal control. This pattern of brain activation could underlie the ability of near misses to promote continued gambling. It’s interesting that near-miss outcomes on personal-control trials are simultaneously rated as less pleasant yet more motivating to continue playing. Perhaps this reflects the activation of the pleasure circuit, blended with loss-evoked feelings from other brain regions.

 

To recap, we know that winning money can activate the human dopamine-using pleasure circuit. We also know that blunted dopamine function in the human pleasure circuit has been found in both drug addicts and food addicts, leading to a suggestion that their addictions result from an attempt to achieve a set point of pleasure that nonaddicts can reach more easily. Could a similar model explain gambling addiction? To test this idea, Christian Büchel and his colleagues at University Hospital Hamburg-Eppendorf in Germany recruited twelve gambling addicts and twelve control subjects to take part in a guessing game with a monetary reward while their brains were scanned.18 Each subject started with 15 euros and was informed that he would receive the entire balance in cash at the end of the experiment. The simple game involved the presentation of video images of two playing cards, facedown. The subjects were told that one of the cards was red and were asked to guess which by choosing either the right or the left card with a button press. After a two-second delay, the selected card was flipped over. A red card won the subject 1 euro, while a black card lost the subject 1 euro. Of course, the experimenters manipulated the software to control the proportion of wins and losses and their order. The results were engineered so that, at the end of 237 trials, the subject would have a total balance of 23 euros (Figure 5.4).

In both the gambling addicts and the controls, there was significantly greater activation of the nucleus accumbens and the ventrolateral prefrontal cortex (another region that receives VTA dopamine projections) by winning compared to losing. However, when the winning trials were compared in the two groups, the results support the blunted dopamine hypothesis for gambling addiction. Both of these VTA target regions in the gambling addicts were significantly less activated by winning. Interestingly, while this reduction was present on both sides of the gambling addicts’ brains, the right side showed a larger reduction than the left. This result is consistent with the findings discussed earlier in which genetic variants that suppress dopamine signaling, particularly in the medial forebrain, were associated with higher rates of gambling addiction.

 

While money is not an intrinsic reward in the same way that food, water, and sex are, one could argue that it has come to represent the possibility of intrinsic rewards, and so activation of the pleasure circuit by money is not strictly arbitrary. This raises the question: Can the human pleasure circuit be activated by stimuli that are entirely arbitrary? Video games could be a good test case for this question, as they may not provide an intrinsic reward.

Allan Reiss and his coworkers at Stanford University performed brain scanning on subjects playing a simple video game.19 The subjects were eleven male and eleven female Stanford students, selected to have similar, moderate previous experience with video games and computers generally. The video game involved a screen with a vertical dividing line and leftward-moving balls on the right-hand side, which the player could click to remove (Figure 5.5). When a ball hit the divider, it caused the divider to move slightly leftward, reducing the player’s “territory” on the left-hand side of the screen. Conversely, for each second that the area near the divider was kept clear of balls, it would move rightward, gaining territory for the player. The only instruction given was to “click on as many balls as possible.” All players soon deduced the point of the game and adopted a click strategy to increase territory.
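To make the game mechanics concrete, here is a toy version of the territory rule; the step sizes and time scale are my own assumptions, not values from the study.

    DIVIDER_STEP_LEFT = 2.0    # territory lost each time a ball reaches the divider
    DIVIDER_STEP_RIGHT = 1.0   # territory gained per second the nearby zone stays clear

    def update_divider(position, balls_reaching_divider, seconds_zone_clear):
        """Return the new divider position; a larger value means more player territory."""
        position -= DIVIDER_STEP_LEFT * balls_reaching_divider
        position += DIVIDER_STEP_RIGHT * seconds_zone_clear
        return position

    # A player who keeps the zone clear for five seconds but lets one ball through:
    print(update_divider(50.0, balls_reaching_divider=1, seconds_zone_clear=5))   # 53.0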

In all subjects, game play activated a large number of brain regions, including those associated with visual processing, visuospatial attention, motor function, and sensorimotor integration. While these are not surprising results for this task, what was interesting was that key regions of the medial forebrain pleasure circuit were also activated, including the nucleus accumbens, the amygdala, and the orbitofrontal cortex. While both men and women showed activation in these regions during game trials, the effect was significantly stronger in men.

The most provocative aspect of these results is the general finding: Video game play, a completely unnatural behavior divorced from intrinsic reward, activated the pleasure circuit to some degree in all subjects. Perhaps video games tap into some very general pleasure related to goal fulfillment and personal involvement. It’s also likely that many video games offer a highly effective reward schedule: as with cigarettes, the pleasurable moments they provide are brief, rapid in onset, and repeated often.


Figure 5.4 Gambling addiction is associated with reduced activation of the medial forebrain pleasure circuit. Top: Brain scan images showing reduced activation of the nucleus accumbens on winning trials in gambling addicts. Bottom: In this graph, each plot point is a different experimental subject, and the scatter plot shows that subjects with the most severe gambling addiction tended to show the greatest reductions in pleasure circuit activation on winning trials. Adapted from J. Reuter, T. Raedler, M. Rose, I. Hand, J. Gläscher, and C. Büchel, “Pathological gambling is linked to reduced activation of the mesolimbic reward system,” Nature Neuroscience 8 (2005): 147–48, with permission from Macmillan Publishers Ltd., copyright 2005.

The increased level of activation in men is also interesting, but somewhat harder to interpret. Is there something general about video games that makes them more pleasurable for men? Or is there something about “gaining territory” in a video game that is particularly male-focused? My own suspicion is that the answer lies in the particular details of the game: If they repeated this study with a combined pattern-recognition and reflex game like Tetris, the gender difference would likely disappear.


Figure 5.5 The simple video game used to conclude that activation of the midbrain pleasure circuit during video game play is greater in males than females. The player clicked on the balls in the right-hand field in order to move the dividing line to the right and thereby claim more territory. Adapted from F. Hoeft, C. L. Watson, S. R. Kesler, K. E. Bettinger, and A. L. Reiss, “Gender differences in the mesocorticolimbic system during computer game play,” Journal of Psychiatric Research 42 (2008): 253–58, with permission from Elsevier.

An earlier study using a different form of brain scanning (positron emission tomography, or PET) revealed increased dopamine release in subjects playing a tank-driving video game.20 Furthermore, those subjects who scored highest in the game had the largest dopamine-release signals in the dorsal striatum and nucleus accumbens. While this study is consistent with others demonstrating dopamine pleasure circuit activation in video games, it is complicated by the fact that the subjects were paid (eight UK pounds) for each video game level they completed successfully—thus conflating monetary reward and game play.

If video games can activate the dopamine pleasure circuit, does that mean that one can become addicted to them? The answer seems to be a qualified yes. There is already a burgeoning industry, complete with standardized questionnaires and dubious therapies, that claims to aid in the treatment of video game addiction and Internet addiction.21 However, media accounts, particularly those originating from East Asia, have overstated both the extent of the problem and its severity. The best indications are that most video game addicts recover without intervention.

 

This chapter has seen our ideas of the dopamine pleasure circuit extended in some provocative directions. Initially it seemed that the pleasure circuit was either naturally activated by intrinsically adaptive stimuli like food, water, or sex, or artificially engaged by drugs or stimulating electrodes placed deep within the brain. We also discussed how the development of addiction could slowly modify the structure and function of the pleasure circuit and thereby drain the pleasure out of any of these activities, replacing liking with wanting. These observations are all true, but they don’t tell the whole story.22

We now know from Schultz’s monkey experiments that rapid associative learning can transform a pleasure signal into a reward prediction error signal that can guide learning to maximize future pleasure. It is likely that this same process is what enables humans to feel pleasure from arbitrary rewards like monetary gain (or even near misses in monetary gain) or winning at a video game. This line of thought leads to some interesting evolutionary and developmental questions. When, exactly, did the ability to feel pleasure from arbitrary rewards develop? And are these rewards really entirely arbitrary, or is there some common theme or quality that runs through them? Can a monkey derive pleasure from playing a video game if there is no intrinsically pleasurable stimulus like a syrup droplet or a jolt of cocaine as the reward? What about a rat? Or a human toddler?