We are what we repeatedly do. Excellence, then, is not an act, but a habit.
—Aristotle
Sylvia always entered therapy enthusiastically. However, her hope quickly dissipated after a few sessions, and she would argue that it was again not the “right fit.” She terminated after the third session, having detected an inkling of pressure to change her habits of buying beyond her budget and eating excessively while glued to the TV. She claimed these were her “stress relievers.” Though she was not quite a hoarder, her buying sprees got her out of the house and provided a rush of excitement at the moment of purchase. After she slipped back into the dark hole of her “comfort cocoon,” friends and family urged her to make another go of therapy. She would eventually agree to try again, noting that her problems centered on stress at work.
Her therapists would discover that her job stress resulted from her supervisor’s concerns about her lack of initiative and her failure to turn in complete, neat, and timely assignments. When she sought support from the employee assistance counselor at work, she would complain that “my coworkers demand too much of me,” the same complaint she had about her family. Because she denied depression, anxiety, or other negative emotions, my gut feeling was that she was putting me on notice that I could not provide what she was looking for. In reality, she had no idea what she wanted or needed, and finding agreed-upon goals and a motivation to change was going to be a challenge.
Clients are resistant to change partly because of the initial displeasure it brings. Many people, like Sylvia, come to therapy initially expressing motivation to improve their lives, and some are able to identify behaviors, moods, addictions, or thought patterns troublesome enough that they want to do something about them. However, too many people lose motivation and fall back into bad habits that they come to rationalize as viable coping tools—at least for a while.
All animals possess behaviors critical for their survival, motivated by a reward-seeking system (Panksepp & Biven, 2012). Our ancestors evolved reward circuits wired to take advantage of opportunities that were few and far between. When game and vegetable matter were available, they took as much as they could. Because food and other rewards were not constantly available, the reward circuit added a spark plug of anticipated gratification so that we would seek out opportunities. This reward circuit served well during our evolutionary history, motivating our ancestors to hunt and gather for survival when complacency would mean death.
For those of our ancestors living in colder climates with harsh winters, the D2 dopamine receptors worked fine, not slowing down the drive to seek rewards even when rewards were plentiful. Those of our ancestors who lived closer to the equator did not have to jump at every opportunity because resources were rich and available all year long. Their D2 dopamine receptors evolved to slow down reward seeking when rewards were plentiful; with constant plenty, continuous reward seeking would have been maladaptive. In the developed world, those with the A1 allele of the D2 dopamine receptor are at a disadvantage: people like Sylvia develop addictions to habits such as shopping, comfort food, and excessive computer, social media, or television use, while others develop addictions to drugs, alcohol, and gambling.
Adults with the A1 allele of the D2 dopamine receptor who have endured adverse childhood experiences tend to be impulsive novelty seekers who engage in high-risk behaviors, while those with the same genetics raised in a positive environment do not show these traits (Keltikangas-Jarvinen et al., 2009). People with the A1 allele of the D2 dopamine receptor may be more vulnerable to alcohol abuse and require a more nurturing environment to maintain mental and physical health. Adults who endured adverse childhood experiences also tend to have higher levels of dopamine and cortisol (Pruessner et al., 2004).
Sam was a hard-working insurance broker who said “most people die of boredom doing what I do.” He said there was always a different combination of policies to discover, “like three-level chess.” While he was proud of his work habits, there was another habit he couldn’t break: his obsession with spectator sports. It wasn’t just that he was a fan of one or two teams or had interest in one or two sports. His addiction was to every sport that could be broadcast on cable and radio channels, including pay-per-view events. While this may seem like an innocuous problem, his wife threatened to leave him if he did not quit. He admitted that even when they were out to dinner he would sneak into the bathroom to check his iPhone for up-to-the-minute scores.
His drive to engage in and maintain these bad habits was supported by a dynamic neurocircuitry, including the nucleus accumbens, a key part of the reward circuit and his salience network. Its proximity to the amygdala reflects how the implicit memory system is intertwined with motivation, determining how much work to put into seeking a potential reward. The information stored in implicit memory regulates the release of dopamine from the ventral tegmental area, which signals the desirability, value, and incentive salience of the reward.
Dopamine serves as a key player driving both Sam’s ambitious work ethic and his addiction to spectator sports. Often misconstrued as the pleasure neurotransmitter, dopamine is actually associated with the anticipation of and motivation to seek the reward, rather than the reward itself. Thus, wanting is associated with dopamine circuits and working memory (Berridge, 2009). Dopamine provides incentive value to a stimulus and helps generate curiosity, interest, and motivation. In contrast, liking or enjoying something you’ve already attained involves opioid and cannabinoid circuitry. But the liking circuits also involve wanting: though a variety of areas of the brain represent wanting alone, no regions represent liking without wanting. Many behaviors, including addictions, can transition into wanting without liking. Sam wanted to know all the scores of the games, though he found little pleasure in the knowledge. He stated that he could not wait to get home from dinner to watch all the games he had taped, yet he derived little pleasure from doing so. This distinction between wanting and liking represents a core aspect of addiction. A person such as Sam feels compelled (wants) to watch, even though he has grown to dislike it, and certainly dislikes himself afterward. This seemingly innocuous habit had taken over his life and pushed his wife away.
Not far from where I live in Santa Fe, some of the Native American Pueblos have developed casinos that entice foolish people to wager their paychecks on the chance that they might quadruple their money. Ted was one of them. He said that each time he drove past the casino he felt a jolt of excitement, “not unlike a drug high.” He complained of stress at work and poor health. When more stressed and unhealthy, he was more susceptible to the temptation of going to the casino.
Ted’s dopamine neurons fire faster when the anticipated reward is greater. Driving by the casino cues these dopamine neurons to fire at a fast rate, though not as fast as when he walks into the casino. The greater the rewards Ted expects, the greater the firing rate of his dopamine neurons. If the situation is particularly tempting, especially based on prior experience of rewards in similar situations, his dopamine neurons fire particularly fast in anticipation. Like a Geiger counter approaching a radiation source, the closer he gets to the casino, the faster his dopamine neurons fire; as he drives past, just like the Geiger counter, the firing slows (Trafton & Gifford, 2008). The more dopamine neurons fire when an opportunity for reward is present (in this case the casino), the more likely the neurons in the nucleus accumbens are to initiate reward-seeking habits (in this case, going to the casino to gamble). The outcome of the reward seeking determines whether the drive to seek reward in that situation will increase or decrease. When expected rewards are received, dopamine neurons fire as before the next time an opportunity arises. If the rewards are better than expected, firing increases next time. If the rewards are not received, firing declines during future opportunities. In this way, dopamine neurons encourage reward seeking when opportunities are present and learn and adjust predictions about rewards as experience is gained.
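The adjustment process described above parallels what computational researchers call a reward prediction error. The following Python snippet is a rough, illustrative sketch only, not drawn from Trafton and Gifford: it shows how an expectation (a loose stand-in for anticipatory dopamine firing to a cue) might rise after better-than-expected outcomes and fall when expected rewards never arrive. The function and variable names are hypothetical.

```python
# Illustrative sketch only: a simple reward-prediction-error update, loosely
# analogous to how anticipatory dopamine firing is described above.
# Function and variable names are hypothetical, not from the cited authors.

def update_expectation(expected_value, actual_reward, learning_rate=0.2):
    """Nudge the expectation toward the outcome just experienced."""
    prediction_error = actual_reward - expected_value  # better than expected -> positive
    return expected_value + learning_rate * prediction_error

# A cue (e.g., driving past the casino) starts with a modest expectation.
expectation = 0.5
outcomes = [1.0, 1.0, 0.0, 0.0, 0.0]  # two wins, then the expected reward never arrives

for reward in outcomes:
    expectation = update_expectation(expectation, reward)
    print(f"reward={reward:.1f} -> new expectation={expectation:.2f}")

# Expectation (and, by analogy, anticipatory firing to the cue) rises after
# unexpected wins and gradually falls when expected rewards are not received.
```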
Because dopamine triggers reward seeking (i.e., wanting, the drive to work for an anticipated quick benefit), dopamine is a strong driver of addiction. To become addicted, you do not necessarily need to like what you are addicted to; exaggerated wanting is enough. The activity of dopamine directed toward the nucleus accumbens generates activity in the reward-learning, or habit-driving, centers of Ted’s brain. Unfortunately for Ted, casinos have learned to use the quirks of the brain’s reward-learning systems to keep him coming back, such as slot machines making gonging noises when someone wins. These habit circuits amplify dopamine firing when rewards are unexpected, infrequent, and unpredictable. As a result, they encourage focused learning in situations in which the brain cannot yet reliably predict when a reward will be received. By keeping the rewards large but unpredictable, casinos make the opportunities to gamble trigger unnaturally rapid firing rates. The intense craving Ted has for gambling involves an exaggerated firing rate in his dopamine neurons and a “rush” of excitement. To keep comparisons between natural rewards and these commercially amplified dopamine responses to scale, Ted’s brain was forced to reduce dopamine neuron firing in response to natural rewards, making other life activities less motivating. Though Ted consciously may not enjoy gambling—like the lyric in the song, “the thrill is gone”—wanting that thrill lingers. He will continue to be drawn to the casino, as his learned gambling habits are initiated by the automatic response of his dopamine neurons to the cues he has associated with gambling wins in the past.
Figures 6.1a and 6.1b: Two possible sequences of dopamine activation leading to addiction. (based on Trafton, Gordon, & Misra, 2016).
Because Ted’s dopamine neuron firing rates reflect the potential for gambling wins, the faster they fire, the greater the expected win. Eventually, every association with gambling serves as a craving cue, generating a burst of firing that fuels the anticipation of a jackpot and increases the likelihood that he will drive to the casino.
The challenge of inhibiting immediate gratification is that the activation of the reward circuit in the striatum can happen quickly, sometimes so quickly that it precedes completion of parallel decision-making processes in the prefrontal cortex. The prefrontal cortex is the region of the brain that employs conscious reasoning and can work to recall the past negative effects of gambling. Because reward-learning systems can make and act on decisions before conscious choices are made, it is easier for Ted to decide not to drive to the casino than to resist gambling once he sees the blackjack tables, hears the clang of the slot machines, and has a few of the free drinks. These learned implicit cues predict potential rewards, so his brain is sensitized to respond to them without conscious awareness. Once he developed the habit circuits in his striatum, frequent use made the cells that fire together wire together, linking cues to gambling behaviors. The powerful drive for immediate gratification was hard to resist. His prefrontal cortex had a difficult time delaying the gratification directed by his reward circuits, sending stop signals only after gambling behaviors had been started. Too often gambling won out over his more carefully made long-term plans.
In the same way, Sylvia’s reward circuit drove her to that second piece of cake. Although her executive network had to learn to foresee the long-term consequences, such as gaining more weight and becoming lazy and depressed, it was too slow to prevent her from eating. Though her executive and salience networks both respond to dopamine, her executive network was focused on reasoning about how to achieve better long-term rewards such as more energy, better mood, and improved health. Prefrontal decisions might align with the short-term reward-learning circuit decisions, for example, when considering the short- and long-term benefits of eating a nutritious gourmet dinner. But they might be at odds over decisions with short-term benefits but long-term negative consequences, such as eating cake. In these cases, Sylvia’s choice would depend on whether the reward-learning circuits slowed down long enough to wait for input from the prefrontal cortex.
I can speculate that perceiving my smile of encouragement triggered a release of dopamine for Sylvia (Depue & Morrone-Strupinsky, 2005). Even when I offered a smile for a fraction of a second, followed by a neutral comment, that quick reward increased her positive expectations, so she became more motivated to do things to improve her self-care. At work, too, her new efforts were met with encouraging expressions that acknowledged her contributions.
Priming Sylvia’s motivation at work necessitated feedback between her executive network and her salience network’s reward system. Her executive network kept track of her healthy intentions and goals, while her salience network processed interoceptive information about feeling full and satisfied, all of which influenced the dopamine-firing responses that assigned value to the culinary options available to her. Her hippocampus helped recall episodic memories about the novelty of the decor and the creative menu and recipes. Her reward circuit used this information to switch on dopamine release. The feedback to her prefrontal cortex and hippocampus ensured that she maintained access to information about all the rewards she most desired. The bottom line is that the same regions of the brain that provide inputs to the dopamine neurons also receive their outputs. In other words, she learned that rewards are often more valuable if she inhibits the temptation for immediate gratification and acts toward achieving longer-term goals.
Beginning in the early 1970s, a well-known series of studies referred to as the marshmallow test illustrated how the early ability to defer gratification predicts much about a person’s later success. Children were presented with one marshmallow and told that if they waited a little while they could have two marshmallows, but only if they did not eat the one in hand immediately. Follow-up studies, years later, revealed that those who were more able to defer gratification (i.e., wait patiently with a marshmallow in hand) grew up to be more successful adults. Prefrontal cortex development facilitates the capacity to defer immediate gratification in the service of long-term planning and goal-directed behaviors. Prefrontal deficits, on the other hand, are associated with impulse-control problems and affect-regulation problems such as anger, substance abuse, and relationship difficulties.
Of course, a person’s success throughout life necessitates more than deferring gratification. The aspects of the salience network that register personal relevance, in terms of emotions, gratification, and meaning, are intertwined. The nucleus accumbens and the amygdala, which lie in close proximity, play a significant role in the motivational system. The nucleus accumbens integrates incoming sensory information, evaluates information coming from the amygdala, and influences decisions made by the prefrontal cortex regarding whether to “go” or “stop” on a possible motivated behavior (Hoebel et al., 1999).
Because the success of psychotherapy depends on clients’ motivation to pursue long-term goals over short-term gratifications, the incentives must be clear, as must the personal relevance of the new behaviors you encourage. If clients feel that their behavior results in positive outcomes, they will tend to repeat the behavior, which further strengthens the synapses driving dopamine firing and their motivation to engage in that behavior again.
The capacity to respond to potentially pleasurable experiences that go beyond those immediately obtainable requires more than top-down control—bottom-up changes must take place, too. In other words, being able to defer short-term pleasure for long-term gains necessitates not only a well-functioning prefrontal cortex but also a refined subcortical reward system. These bottom-up changes involve the medium spiny neurons of the nucleus accumbens, so named for their spiny dendrites. Medium spiny neurons are part of two distinct circuits that respond to dopamine: the direct pathway, which includes neurons with D1 dopamine receptors, and the indirect pathway, which includes neurons with D2 dopamine receptors. The direct pathway acts quickly to motivate Ted to act now to gain immediate gratification: to stop at the casino on the way home instead of showing up for dinner with his family. The indirect pathway helps him resist the temptation to behave impulsively for short-term gain and immediate gratification. By toning down immediate gratification seeking, his indirect pathway enables him to consider his options and make better choices for long-term gain. In short, the indirect pathway puts the brakes on the quick-acting direct pathway.
The indirect pathway’s ability to slow down immediate gratification-seeking decisions is experience dependent and takes learning. So how could Ted strengthen his indirect pathway? The indirect pathway grows stronger as a person learns more options for achieving quick wins in everyday life. Ted expanded the range of pleasurable activities in his life and learned effective, healthful ways to find comfort or relief in the face of problems, stress, and negative affect. The combination of increasing self-care, coping skills, and healthy pleasurable activities is exactly what rehabilitation programs and Alcoholics Anonymous and Narcotics Anonymous support groups encourage, and for good reason. Increasing exposure to small, natural rewards in day-to-day life leads to frequent activation of the D2 dopamine receptors in the indirect pathway, reshaping their responses so that they increasingly slow the reward-seeking decisions of the direct pathway. As a result, his indirect pathway brought his direct pathway under control, giving the prefrontal cortex and other brain regions time to weigh in on reward-seeking decisions. With this shift in decision-making control from short-term-focused direct-pathway circuits to long-term-focused prefrontal circuits, he sought gains when they were practical and in the interest of his long-term health.
Yet increasing the range of pleasurable options and developing coping and problem-solving skills in the service of strengthening his indirect-pathway (D2) neurons took time. Ted discovered that progressively shaping goal-directed behaviors was easier when he broke goals into small, obtainable successes. Ted’s exposure to a greater variety of pleasurable options made him less vulnerable to immediate gratification. The more rewarding and varied his opportunities, the greater the strength of the indirect pathway, allowing his dopamine system to better weigh the overall value of all options—both immediate and long-term—and put the brakes on his impulsivity (Trafton & Gifford, 2008).
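To make the go/stop balance concrete, here is a toy sketch. It is not a physiological simulation and not drawn from Trafton and Gifford; it simply contrasts a fast “go” push with a learned “stop” brake whose strength grows with the variety of rewarding alternatives a person has developed. All names and numbers are illustrative assumptions.

```python
# Toy model, not a physiological simulation: a fast "go" (direct, D1-like)
# signal versus a slower "stop" (indirect, D2-like) brake whose strength grows
# with the variety of rewarding options a person has learned.
# All names and numbers are illustrative assumptions.

def acts_on_impulse(cue_strength, rewarding_alternatives, brake_gain=0.3):
    """Return True if the quick 'go' push outweighs the learned 'stop' brake."""
    go_signal = cue_strength                            # push toward immediate gratification
    stop_signal = brake_gain * rewarding_alternatives   # brake strengthens with a richer life
    return go_signal > stop_signal

cue = 2.0  # e.g., driving past the casino on the way home

for alternatives in (1, 4, 8):
    print(f"{alternatives} rewarding alternatives -> acts on impulse: "
          f"{acts_on_impulse(cue, alternatives)}")

# With few alternatives the direct pathway "wins" before slower prefrontal input
# arrives; with many, the indirect brake buys time for deliberation.
```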
When Sylvia knew few ways to obtain immediate gratification, such as when she could recognize no benefit to attending a birthday party other than the cake, her indirect-pathway, D2 receptor-containing neurons remained weak, and old habits were difficult to inhibit. However, once she learned social skills, so that the people at the party became valued sources of approval and support, and problem-solving skills, so that games and other party events became valued opportunities to overcome challenges, a party shifted from a bare environment, with nothing but the quick fix of cake to offer, to a rich environment filled with opportunities for near-term benefit. As illustrated in Chapter 3, genetic endowment does not determine behavior. What we have learned from the fields of epigenetics and neuroscience is that lifestyle and learning matter immensely. Behavioral health represents a continuum.
No Blank Slate
We do not start with the same vulnerability or potentiality. Neither do similar attachment experiences result in exactly the same styles of relating. Essentially, no one starts with a blank slate, contrary to what John Locke argued several centuries ago. People born with the A1 allele of the D2 dopamine receptor (with low D2 receptor availability) are more likely to develop addictions and have more trouble quitting once they become addicted (Comings & Blum, 2000). People with the A1 allele of the D2 dopamine receptor gene are thus more likely to become obese, develop gambling problems, and have trouble controlling urges for immediate gratification. They are also at greater risk of developing posttraumatic stress disorder after a traumatic experience. Because they tend to express fewer D2 dopamine receptors, those with the A1 allele of the D2 dopamine receptor have difficulty slowing the tendency toward immediate gratification seeking, including the drive to escape even mildly stressful situations.
When Sylvia’s or Ted’s D2 dopamine receptors (their indirect pathways) received increased levels of dopamine, they became stronger and could restrain automatic habits such as overspending for Sylvia and gambling for Ted. Strong indirect pathways supported control over their impulsive bad habits, so that newly formed good habits could dominate. However, when they encountered overly tempting cues, such as shopping malls or casinos, the direct pathways were still strongly activated and could overwhelm the indirect pathways. This can prevent a client from inhibiting old habits.
With conscious and sustained attention, the development of new pleasurable habits blocked old habits stored in striatal systems. When more options were available, their D2 receptors were activated by dopamine, priming them to fire more easily and better inhibit reward seeking driven by the direct pathway (Dong et al., 2006). The total value of the rewarding opportunities they experienced was translated into the strength and excitability of the indirect neurons (Trafton & Gifford, 2008). The richer their environment became, the more the indirect pathway slowed reward-seeking decisions, leading to greater consideration of long-term consequences. As they learned to recognize more rewarding opportunities, their indirect neurons were increasingly able to resist their old, habitual immediate gratification-seeking behaviors. Indirect neurons are inhibitory and disfavor quick decisions to seek short-term gains regardless of long-term consequences by putting the brakes on striatal circuits and their automatic habits. Essentially, by strengthening their indirect neurons through engaging in more varied beneficial activities, Sylvia and Ted reduced the tendency to act impulsively and repeat bad habits.
When the direct pathway is dominant, the salience network becomes more hedonistic and activity in the executive network is ignored. When executive network decisions are hijacked, they serve to explain reward-seeking behaviors that have already occurred rather than to provide strategic planning that guides good decisions. The brain’s executive systems are relegated to making excuses for what the person has done. In doing so, the executive network rationalizes why, for example, gambling or overeating is not so bad. And the default-mode network repeats the story line of these rationalizations, perhaps even fantasies about the big win at the blackjack table. Through relapse prevention strategies that included contingency management, Ted was able to plan what he would do for pleasure instead of gambling. By developing stronger indirect pathways through D2 dopamine receptor activation, his executive network was able to switch from reactive crisis management to proactive strategic planning.
Many clients state that they have a limited number of activities they enjoy. They report feeling stuck in bad habits and claim that their habits are hardwired. You can help them understand that habits and the appreciation of pleasure are soft-wired by practice. From this understanding the client can build motivation to seek a wider range of gratifying experiences.
Without knowing the neuroscience that underlies habits, twelve-step programs have promoted expanding the range of pleasurable and healthy activities available to participants that do not involve alcohol or drugs. When participants cultivate a greater range of pleasurable activities, they become enriched not only by a variety of life experiences but also by building the capacity to inhibit the drive toward immediate gratification. Where old habits of drinking or drugging once offered the only pleasurable experiences, a newly broadened range of go-to positive and healthy habits provides a new landscape of opportunity. The brain responds by recognizing this new wealth of possibilities, slowing decisions in order to consider all of its newly recognized options. As engaging in productive new habits becomes increasingly enjoyable, so does the motivation to continue a healthy course. However, breaking bad habits and establishing new habits requires moderating expectations.
Everything in moderation, nothing in excess
—Aristotle
Optimizing motivation necessitates managing expectations. Because our brain develops expectations and predictions about rewards by varying dopamine activity, kindling motivation is not simply a matter of finding a greater range of pleasurable behaviors. Dopamine activity changes constantly as we learn and update expectations based on the near-term outcomes of our behavior. Our brain recalibrates dopamine activity based on whether our decisions yield the rewards we expect. If the rewards are exactly as predicted, our dopamine neurons have done their job. If our choices do not result in the expected reward, our dopamine neurons slow down to indicate that we have made a mistake. Thus, our dopamine neurons both predict potential rewards and provide feedback about the accuracy of these predictions.
The intensity and rate of dopamine firing both shape our behavior and guide decision making, especially when we encounter unanticipated rewards. For example, Pete had maintained sobriety for about a year when he met an old friend and his wife for lunch. As he walked into a café, he was elated to see an attractive woman named Sophia sitting with them, which spiked his dopamine activity. The lunch was far more enjoyable than anticipated. A few days later he e-mailed his friends, saying, “I hope to see you all at the café tomorrow.” Of course he meant that they include Sophia. After his friend responded by saying, “Great, see you there,” his dopamine neuron activity shot up again. However, the next evening, when he entered the café and saw only his friend and his wife, his dopamine neurons dramatically slowed. His salience network dominated the other networks, not with positive feelings but with gut-level disappointment. He felt immediately like ordering a drink. As he turned to the wine list on the menu, his friend gave him a knowing and empathetic head shake. That social support was enough for Pete to feel temporarily soothed. Meanwhile, his executive network regained balance with his salience network.
A few days later, his friend e-mailed to invite him to lunch at the café and noted that Sophia was invited too. Upon reading the e-mail his dopamine neurons fired up again. And when he walked into the café his dopamine neurons spiked again. As they all engaged in a lively discussion about traveling to Greece, he drifted briefly into his default-mode network, fantasizing about traveling with Sophia. Everything changed abruptly when she mentioned that she was excited about convincing her husband to go to Greece and take their kids there on a family trip. His dopamine neuron activity plummeted.
Our brain is continually shaped by a process of prediction and correction based on the accuracy of its predictions. Through trial and error, learning errors signal the difference between the prediction and the actual reward as we get feedback about what is really possible. Initially for Pete, dopamine neurons fired in anticipation of the potential reward of an intimate relationship with Sophia. As he learned to enjoy the relationship for what it actually offered, his dopamine firing rate became more successful at predicting the realistic value of the friendship. After moderating his expectations, his motivation slowly increased to encourage the new friendship and trigger thoughts, emotions, and behaviors to cultivate it. This moderated effort allowed his reward-learning system to encourage reactions that supported enjoying the friendship with Sophia. By going beyond the search for highs and instead exploring the many gradations of positive experiences, he could enjoy everyday experiences. Durable and sustainable motivation is particularly important when working to heal from addictions.
Pete had been struggling with alcoholism for several years. By the time he met Sophia he had convinced himself that it was time to get sober. His “bottom” amounted to losing his wife and being demoted at work with a stern warning. Those losses initially pushed him into recovery but were not enough on their own to keep him there. Prior to recovery he had not developed regular pleasurable activities in his life. During stressful times, the instant gratification of alcohol was hard to resist. Alcohol had come to represent reward and relief from stress. The cues and reward-seeking responses occurred so quickly that they were largely nonconscious. Dopamine-containing neurons in the ventral tegmental area fired whenever he was tempted by the possibility of quick relief from alcohol, even though drinking was self-destructive in the long term.
Alcohol had previously tricked Pete’s brain into valuing it more than other, healthier opportunities for reward. Alcohol had hijacked his dopamine reward systems and his nucleus accumbens to drive drinking behaviors while encouraging his prefrontal cortex to come up with rationalizations about its worth as a method of pleasure and coping.
Because our brain is essentially an organ of adaptation that addresses survival needs, living in a resource-deprived environment promotes brain activity that responds quickly to opportunities for rewards. This makes for an overly impulse-driven, reward-seeking brain that jumps at the chance for immediate rewards, especially when there are few to be had. When we enjoy a resource-rich life, we have less need to act on every immediate reward and can focus on long-term goals.
The classic example of this scenario occurred in the 1980s in the inner cities of the United States, when cocaine in the form of “rock” became cheap and easy to score. Neighborhoods crumbled around crack houses. More recently in the United States, the opioid epidemic was primed first by the overprescription of synthetic opioid pain medications and then by cheap and easy-to-obtain heroin. The resulting addictions have devastated many people and families.
The method of taking a drug affects the likelihood of developing an addiction: methods that deliver the drug to the brain quickly tend to be particularly overvalued by the reward system. For example, injecting an addictive drug delivers it directly into the blood supply and to the brain almost immediately. Conversely, drugs that pass through the gastrointestinal tract and interact with gut bacteria before being absorbed lead to a slower, graded brain response. Of course, there are many variations between these extremes, such as drinking on an empty stomach or snorting drugs like methamphetamine or cocaine.
How often the drug is taken also affects the likelihood of developing an addiction. Repeated use of addictive drugs leads to a loss of D2 dopamine receptors in the indirect pathway of the nucleus accumbens. A similar reduction of these receptors is seen with nondrug addictions: chronic consumption of junk food high in fat, salt, and sugar has been shown to lead to the loss of these receptors (Adams et al., 2015).
Suffering from the effects of chronic illness also leads to a variety of brain changes. People with chronic pain show a loss of D2 dopamine receptors in the indirect pathway (Martikainen et al., 2015), and the use of pain medications may further reduce the number of these receptors. This downregulation of D2 dopamine receptors in the indirect-pathway neurons occurs for a variety of reasons. Fortunately, the sum total of Pete’s rewarding new experiences helped modify activity through the remaining D2 dopamine receptor neurons and lessened his vulnerability to opportunities for immediate gratification. He built sustainable motivation by cultivating a wide range of healthy go-to behaviors that offered gratifying feelings.
When dopamine activity is moderate, the brain has time to recycle D2 dopamine receptors after use, and downregulation tends not to occur. In fact, there may be an upregulation if gene expression shifts to make more receptors. However, when dopamine release is extreme, such as that caused by addictive drugs, the brain does not have time to recover. Extreme activation of all the D2 dopamine receptors at once leaves the neurons with no unused receptors available to respond. In contrast, with moderate activity neurons have time to recycle dopamine receptors. Normally, when dopamine binds to D2 dopamine receptors, the receptors change shape and cannot send another signal until they go through a recycling process: the receptor is taken inside the neuron and chemically treated so that it can return to a functional state. This recycling process is messy, and some receptors are lost along the way. If the loss of receptors outpaces the rate at which the neuron makes new ones, D2 dopamine receptor levels will decline. Moderate-size rewards stimulate moderate dopamine release, and a relatively small portion of the receptors go through this recycling process, leaving a large population of D2 dopamine receptors available to put on the indirect-pathway brakes. In contrast, drug use drives dopamine release to an extreme; with overwhelming dopamine release the D2 dopamine receptor population becomes depleted, and the person becomes less able to put the brakes on habits. In recovery those receptors come back over a period of weeks and months (Rominger et al., 2012).
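The recycling logic can be pictured with a rough bookkeeping sketch. This is purely illustrative, assuming made-up loss and replacement rates rather than real receptor kinetics, but it shows why moderate surges leave the receptor pool intact while extreme, drug-level surges deplete it.

```python
# Rough bookkeeping sketch of the recycling idea described above; not a model
# of real receptor kinetics. The parameters (loss_per_recycle, replacement_rate)
# are made-up illustrative values.

def simulate_receptor_pool(surge_fraction, days=30, pool=100.0,
                           loss_per_recycle=0.05, replacement_rate=1.0):
    """Track an available D2 receptor pool under repeated dopamine surges.

    surge_fraction: share of the pool pulled into recycling by each surge
    loss_per_recycle: share of recycled receptors lost during recycling
    replacement_rate: new receptors the neuron makes per day
    """
    for _ in range(days):
        recycled = pool * surge_fraction
        pool -= recycled * loss_per_recycle  # some receptors are lost in recycling
        pool += replacement_rate             # the neuron slowly makes new ones
    return pool

print("moderate rewards:  ", round(simulate_receptor_pool(surge_fraction=0.2), 1))
print("drug-level surges: ", round(simulate_receptor_pool(surge_fraction=1.0), 1))

# Moderate surges leave the pool roughly stable; extreme surges that engage
# nearly all receptors at once let losses outpace replacement, depleting it.
```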
The archaic tendency to segregate addictions from other psychological disorders fails to appreciate the habit circuits in the brain. People who suffer from anxiety and depression are more vulnerable to the development of addictions. In the 1980s there was a campaign to combat addiction with the simple suggestion to “just say no.” Top-down willpower alone is not enough for most people to change bad habits into good ones. Certainly the prefrontal cortex networks are critical, but psychotherapy theorists have long identified the paradox that when people apply their complete attention to trying to stop a behavior, thought, or emotion, they may actually intensify it. Similarly, when they try hard not to have a panic attack, their mind paradoxically is on the lookout for any hint of a panic attack coming on. Any sensation, however subtle, such as a fluctuation in heartbeat or quickness of breath, tends to be amplified. Similarly, trying hard not to crave often leads to craving.
There are a variety of reasons why willpower alone tends to be inadequate. Though the prefrontal cortex circuits play major roles in initiating and inhibiting behaviors, there are limits to what a person can focus on at any one time. As the executive network, using working memory, attempts to focus on not using drugs, thoughts of drugs nevertheless intermingle with decisions that may not be related to the substance the person is trying to avoid. This is because working memory is an active process; you cannot intentionally not do something without thinking intently about the thing you are trying not to do. With all that thought about drug use, it takes only a minor distraction to forget why you were thinking so much about drugs.
Multitasking, which produces working memory load, increases the activity in the anterior cingulate cortex to monitor and assess errors in learned behaviors (Weinberg & Hajcak, 2011). But addiction can cause trouble activating the anterior cingulate cortex. This is why many recovery programs incorporate contingency planning and relapse prevention, so that problem solving ahead of time can minimize working memory load.
Developing healthy habits, especially during recovery, can be modeled, monitored, and reinforced by contingency management programs (Petry, 2000). The following are crucial to the success of monitoring and feedback:
Changing bad habits into productive habits is not only challenging but also can be confusing. It is difficult to develop motivation to change because people typically feel worse before they feel better. For example, though Sylvia knew that she suffered from overeating, she gained temporary relief from stress by briefly activating her reward circuits. Through her therapists’ encouragement she had practiced eating nutritious food, but she felt more anxious before she felt less anxious. When I explained that she would feel worse before she felt better, she shifted her expectations and gained motivation to persevere (Arden, 2015).
Comprising various aspects of the executive, salience, and default-mode networks, the prefrontal cortex forms and modifies habits through three loops with subcortical areas, as shown in Figure 6.2 (Trafton, Gordon, & Misra, 2016). In addition to the planning, focus, and emotional involvement necessary to replace an old habit with a new one, repeating and refining the new habit is critical for the development of healthy habits.
The lower loop includes circuits of the salience and default-mode networks, including the orbitofrontal cortex, with projections to the caudate nucleus and back from the thalamus. This loop is critically involved in habits related to reward seeking and social restraint, affecting appetite and craving for food, drugs, and sex. Underdevelopment of or damage to this area can result in impaired empathy, distorted moral judgment, and risky decision making (Fuster, 2008). In fact, research has shown reductions in gray matter in the orbitofrontal cortex in drug-dependent people (Goldstein, Peretz, Johnsen, & Adolphs, 2007). When this loop is impaired, a person tends not to benefit from accurate gut-level information.
Figure 6.2: The habit circuits.
The middle loop comprises the anterior cingulate cortex, dorsal striatum, globus pallidus, and thalamus, and then projects back to the anterior cingulate cortex. Underdevelopment of or damage to this loop can result in reduced motivation and goal-directed activity, loss of curiosity, and lack of interest in new experiences. This spectrum of deficits, along with the loss of concern for others, makes kindling this circuit important for people with depressive syndromes (addressed in Chapter 9).
Finally, the upper loop comprises circuits of the executive network, consisting of projections from the dorsolateral prefrontal cortex to the head of the caudate nucleus and then back to the dorsolateral prefrontal cortex through the thalamus. This executive-network circuit is involved in planning and decision making. Underdevelopment of or impairment in this circuit causes loss of a sense of memory for the future, failure to foresee contingencies, and an inability to plan, prioritize, and engage in behavior that is flexible and preventative. Given that this loop is integrally involved in working memory, disruption here can result in losing the ability to organize the activities necessary to break bad habits.
Together, these three loops represent interrelated systems that can be strengthened to transform bad habits into healthy habits. The lower loop is involved in exclusion, the ability to suppress extraneous thoughts and manage affect; the middle loop maintains the intention to focus; and the upper loop maintains the object of intention.
Most social-cognitive theories assume that intention to change is the best predictor of actual change. But as we all know, people do not necessarily behave in accordance with their intentions. For example, how people with an addiction respond to cognitive distortions may or may not influence their feelings and behavior. Changing how they think does not necessarily determine whether they will enter treatment or be successful in recovery.
There are thus limitations to a top-down approach. Cognitive reappraisal is a top-down strategy in which the prefrontal cortex attempts to modify activity in habit-based circuits, including the amygdala (Goldin et al., 2012). Reappraisal engages the dorsolateral prefrontal cortex and the executive network to dispute negative thoughts, and ideally people develop adaptive ways of responding that result in decreased amygdala and insular activity (Goldin et al., 2012).
Because direct top-down approaches can be complicated by bottom-up feedback loops, working with addiction requires accessing the reward circuits. A multilevel approach may kindle activity in the salience network, where there are both bottom-up and top-down circuits. When balanced by activity in the executive network that keeps attention in the present, emotions, thoughts, and motivation can remain in sync. In this way, the prefrontal cortex moderates the enticement of the reward circuits with motivation for sobriety.
Helping people find the motivation to change has always been a challenge in psychotherapy. Some recent approaches attempt to relink wanting with liking. For example, motivational interviewing attempts to expose the incongruity between what a person wanted and what came to be. It highlights the discrepancy between actual behaviors and stated goals (Miller & Rollnick, 2012). Consistent with the acceptance and commitment therapy concept of revealing the conflict between values and goals, these approaches attempt to motivate behavioral change by accessing emotionally relevant circuitry as well as the reward system.
Change talk (“I need to stop using opioids”) inhibits activation in brain regions that respond to the salience of addiction cues. Counterchange talk (“I can’t live without my drug”) activates multiple areas, including the insula and striatum, involved in the circuitry of alcohol dependence. Motivational interviewing may work by engaging and changing the neural networks that support the maladaptive habits. Perhaps for this reason, a meta-analysis of controlled clinical trials showed motivational interviewing to be a promising approach for treating disorders related to alcohol, drugs, diet, and exercise (Burke et al., 2003).
Motivational interviewing proposes four critical elements:
•Expressing empathy
•Developing discrepancy
•Rolling with resistance
•Supporting self-efficacy
Activating intrinsic motivation involves focusing on enjoyable and satisfying sources of motivation that do not depend on immediate gratification. Intrinsic motivation, the natural tendency to seek out novelty and challenges and the capacity to explore and learn, can form the foundation for sustainable change. By contrast, extrinsic motivation is characterized by secondary reinforcers, such as money, prestige, and praise.