CHAPTER TWO

STONED AGAIN

The psychoactive drugs of my people are transcendent and sublime, while those of yours are base, crude, and sinful. My drugs give me wisdom and foster creativity and spiritual insight; yours are merely feeble crutches that reveal your lack of fortitude and willpower. They transform you into a lazy and repulsive creature. They make you behave like a beast.

 

All cultures use drugs that influence the brain. They range from mild stimulants like caffeine to drugs with potent euphoric effects, like morphine. Some carry a high risk of addiction, some do not. Some alter perception, others mood, and some affect both. A few can kill when used to excess. The specific attitudes and laws relating to psychoactive drug use vary widely among cultures, however, and as such reinforce a narrative—summarized above—in which those drugs used by insiders are considered acceptable, while the use of drugs by outsiders is condemned and confirms their status as somewhat less than human. This attitude is particularly evident in the course of the building of empires, with its attendant subjugation of other cultures. Mordecai Cooke, an English naturalist, writing in 1860, was, for his time, unusually insightful in this regard:

Opium indulgence is, after all, very un-English, and if we smoke our pipes of tobacco ourselves, while in the midst of the clouds, we cannot forbear expressing our astonishment at the Chinese and others who indulge in opium… . [A]ll who have a predilection for other narcotics than those which Johnny Englishman delights in, come in for his share of contempt.1

These ideas about the psychoactive drugs of other cultures are not just a curiosity of Victorian England but are widespread and persist to this day. Nearly a century after Cooke’s observation, the American author and drug enthusiast William S. Burroughs expressed a very similar notion when he wrote in his novel Naked Lunch, “Our national drug is alcohol. We tend to regard the use of any other drug with special horror.”

Before we begin to examine the biology of psychoactive drugs—before we open the hood, so to speak—let’s consider a few examples of drug use by different cultures in a variety of historical periods. This will serve to calibrate our thinking by offering a somewhat broader view as we attempt to formulate cross-cultural, biological explanations of drug use.

 

Rome, AD 170. It’s a fine time to be a Roman nobleman. The empire is large and thriving, and the military seems invincible. Marcus Aurelius, later to be called “the last of the five good Roman emperors,” is on the throne, attended by Galen, the famous Greek physician. And opium is freely available. Marcus Aurelius is best known today for his Meditations, a classic work of Late Stoic philosophy that holds that the denial of emotion is the key to transcending the troubles and pains of the material world.2 Perhaps it was easier to be a Stoic while stoned: The emperor was a notorious opium user, starting each day, even while on military campaigns, by downing a nubbin of the stuff dissolved in his morning cup of wine. Writings by Galen suggest that Marcus Aurelius was indeed an addict, and his accounts of the emperor’s brief periods without opium, as occurred during a campaign on the Danube, provide an accurate description of the symptoms of opiate withdrawal.

Opium, prepared from the poppy plant, Papaver somniferum, was in use long before the time of imperial Rome. Evidence from the archeological record places it in Mesopotamia (present-day Iraq) around 3000 BC. Opium was widely consumed—either by being eaten, dissolved in wine, or inserted in the rectum—for both medical and ritual purposes by the ancient Egyptians and by the Greeks soon thereafter. The Ebers Papyrus, an ancient Egyptian medical text from the year 1552 BC, even recommends opium as an aid to help small children sleep. One method to achieve this was to smear the drug on the nipples of the nursing mother.

While Galen popularized the use of opium, and it became widely consumed among the Roman nobility, it was not until a few decades later, during the reign of Septimius Severus, that the last legal restrictions on distribution of the substance were removed. This triggered the widespread adoption of opium as a Roman recreational drug. In the years that followed, the poppy plant became a symbol of Rome, stamped upon its coins, inscribed upon its temples, and woven into its religious practice. By the census of AD 312, opium could be procured in 793 different shops in Rome, and the taxes on its sale constituted a substantial fraction of total receipts for the emperor.

 

County Derry, Ireland, 1880. During the 1830s Ireland was awash in alcohol, much of it produced locally in response to high alcohol import taxes imposed by the ruling British government. While many locals were assiduously distilling illegal poitín from potatoes or malted barley, a backlash against alcohol was also growing. The leading figure in this Irish temperance movement was a Catholic priest named Father Theobald Mathew, who in 1838 established the Total Abstinence Society. Its credo was simple: People who joined did not merely promise to consume in moderation, but took The Pledge, a commitment to complete abstinence from alcohol from that day forward. This simple approach was remarkably effective: In a single day more than twenty thousand drinkers were reported to have taken an oath of total abstinence at Nenagh, in County Tipperary. In fact, it is estimated that by 1844 roughly three million people, or about half the adult population of Ireland, had taken The Pledge.

Not surprisingly, some people looked for a way to keep to the letter of The Pledge while violating its spirit. One of these was a Dr. Kelly of Draperstown, County Derry, who realized that as a nonalcoholic tipple, ether filled the bill nicely. Ether is a highly volatile liquid that may be produced by mixing sulfuric acid with alcohol, as discovered by the German chemist Valerius Cordus around 1540.3 The inhalation of ether vapors leads to effects that range from euphoria to stupor to unconsciousness. In fact, ether was the first drug ever to be used for general anesthesia when in 1842 Dr. Crawford Long of Jefferson, Georgia, employed it during the removal of a tumor from the neck of a patient. Dr. Long had been introduced to ether as a recreational drug during “ether frolic” parties while a medical student at the University of Pennsylvania and had the insight to imagine its practical use during surgery.

Dr. Kelly, desperate to become intoxicated while maintaining The Pledge, realized that not only could ether vapors be inhaled, but liquid ether could be swallowed. Around 1845 he began consuming tiny glasses of ether, and then started dispensing these to his patients and friends as a nonalcoholic libation. It wasn’t long before it became a popular beverage, with one priest going so far as to declare that ether was “a liquor on which a man could get drunk with a clean conscience.” In some respects ingesting ether is less damaging to the system than severe alcohol intoxication. Ether’s volatility—it’s a liquid at room temperature but a gas at body temperature—dramatically speeds its effects. Dr. Ernest Hart wrote that “the immediate effects of drinking ether are similar to those produced by alcohol, but everything takes place more rapidly; the stages of excitement, mental confusion, loss of muscular control, and loss of consciousness follow each other so quickly that they cannot be clearly separated.” Recovery is similarly rapid. Not only were ether drunks who were picked up by the police on the street often completely sober by the time they reached the station, but they suffered no hangovers.

Ether drinking spread rapidly throughout Ireland, particularly in the north, and the substance soon could be purchased from grocers, druggists, publicans, and even traveling salesmen. Because ether was produced in bulk for certain industrial uses, it could also be obtained quite inexpensively. Its low price and rapid action meant that even the poorest could afford to get drunk several times a day on it. By the 1880s ether, distilled in England or Scotland, was being imported and widely distributed to even the smallest villages. Many Irish market towns would “reek of the mawkish fumes of the drug” on fair days when “its odor seems to cling to the very hedges and houses for some time.” In 1891, Norman Kerr, writing in the Journal of the American Medical Association, painted a vivid picture of pervasive ether intoxication:

Sturdy Irish lads and beautiful Irish lasses, brimful of Hibernian wit, … are slaves to ether drunkenness. The mother may be seen with her daughters and maybe a neighboring Irishwoman or two at a friendly ether “bee.” The habit has become so general that small shopkeepers treat the children who have been sent to purchase some article, with a small dose of ether, and schoolmasters have detected ether on the breaths of children from 10 to 14 (or even younger) on their arrival at school.4

It is interesting to note that, even at the peak of the Irish ether-drinking craze, the possession, sale, and private use of ether remained legal. The first attempt to control the problem involved adulterating industrial ether with naphtha, which has an odor and taste even more offensive than ether itself. This was an utter failure—people just blended it with sugar and spices to mask the taste, held their noses, and tossed it back. Ether drinking in Ireland was finally curtailed in 1891 when the British government classified ether as a poison and enforced strict controls on its sale and possession, thus dramatically restricting its distribution and use. The practice lingered for a few years longer but appeared to be completely abolished by the 1920s.

Cheap, quick, and no hangover afterward? No wonder ether was so popular. However, before you head out the door to score some, it’s worth mentioning a few of the downsides. These include a truly awful smell and taste, coupled with a strong burning sensation while the foul stuff is going down. Plus, it makes you drool like a Saint Bernard dog on a hot summer day, not to mention stimulating truly monumental burps and farts. These aren’t normal emissions—they are laden with highly flammable ether vapors. You can imagine what happened when an ether drinker would light up a pipe and belch or sit down by an open fire and break wind. Severe burns at either end of the alimentary canal were a common hazard.5

 

Iquitos, Peru, 1932. Emilio Andrade Gomez was born in the Peruvian Amazon, the son of a white father and an Amerindian mother. In 1932, at the age of fourteen, he was given the herbal hallucinogenic drink called ayahuasca by local shamans in order to recover his strength following a period of illness.6 He saw visions that the shamans explained were revelations that he was chosen by the plants in the ayahuasca brew to receive knowledge from them. He was to learn traditional medicine and become a shaman himself. This was an elaborate and extended process that required him to live in near isolation in the jungle for a period of three years. During this time he was provided a strict traditional diet, consisting mostly of plantains and fish. He could eat some jungle fowl, but only the left breast—no other portion of the meat was allowed. Alcohol and sexual contact were strictly prohibited. His food was prepared and delivered to him by either a young girl or a postmenopausal woman, and whatever portion remained uneaten was carefully collected and destroyed so that no other man or animal might consume it.

During the period of this ritual diet, Don Emilio wandered in the jungle to study plants, and about every two weeks he would prepare and drink ayahuasca. The substance revealed to him a supernatural world, filled with spirits of both malign and benevolent character. The malign spirits were those of evil shamans who sought to pierce his body with magical darts that would sap his strength or even kill him. The benevolent spirits were sometimes those of good shamans, but more commonly were the spirits of jungle plants. In his ayahuasca dreams of those years, Don Emilio learned about sixty different songs from the plant spirits. When the initiation period was complete, he began his own practice of medicine, under the tutelage of an elder shaman. He used the icaros, the magic songs taught by the plants, to reinforce the effectiveness of his own herbal preparations and thereby help others to cure particular diseases, attract game or fish, repel the attack of evil shamans, or win the love of another.

It is not known when or exactly where the ritual use of ayahuasca began, but it is likely to have been hundreds of years before European colonization. Ayahuasca first became known to Europeans following the 1851 Amazon expedition of the English botanist Richard Spruce, who observed it being consumed by the Tukano people living on the Rio Uaupés in Brazil. Since then, ayahuasca use has been found to be widespread among Amerindian groups in the upper Amazon basin, with practitioners in present-day Peru, Ecuador, Brazil, and Colombia.


Figure 2.1 Don Emilio Andrade Gomez preparing ayahuasca near Iquitos, Peru, in 1981. This photo shows the finished product being decanted. Photograph by Dr. Luis Eduardo Luna. Used with permission.

The recipe for ayahuasca varies across different groups, but the most common preparation is as follows: A shaman gathers a particular type of liana vine (Banisteriopsis caapi) and prepares about thirty pieces of the crushed stem, each about a foot long, in about four gallons of water. To this are added about two hundred leaves of the chacruna bush (Psychotria viridis), following which the woody, leafy stew is boiled slowly for about twelve hours, until the volume of the liquid is reduced to about a quart and has become an oily, gritty brown syrup (see Figure 2.1). This is sufficient for about twelve foul-smelling doses. About a half hour after swallowing the brew, hallucinations begin, and typically last for three to six hours. These are primarily visual, sometimes auditory, and frequently terrifying. For most, it’s a rather inward-turning, paranoiac, fearful experience, but one that can yield insight and self-knowledge. Ayahuasca drinking is almost always accompanied by vomiting, and examination of the vomit is said to yield clues as to the efficacy and nature of the treatment.

While plant-based psychoactive drugs are common throughout the world, ayahuasca is unusual in that its action specifically requires the activity of two different classes of compound from two different species of plant. The hallucinations are the result of the dimethyltryptamine (DMT) in the chacruna leaves. DMT is a molecule with a chemical structure similar to that of the well-known synthetic hallucinogen LSD. However, unlike LSD, DMT taken by mouth is completely broken down in the digestive tract by the enzyme monoamine oxidase, and so none reaches the brain to produce its psychoactive effects. The liana contains a group of related beta-carboline compounds, one of which is called harmaline. Harmaline and its relatives are potent monoamine oxidase inhibitors. When taken alone at the doses typically found in ayahuasca preparations, harmaline does not lead to hallucinations. (It does, however, produce a strong tremor and discoordination of movement.) But when liana and chacruna extracts are swallowed together, as in ayahuasca, the harmaline blocks the action of monoamine oxidase in the gut, allowing the DMT from the chacruna to escape breakdown and reach the brain. This is an interesting finding, because it suggests that the Amerindian discoverers of the ayahuasca preparation were unlikely to have simply stumbled upon the properties of the mixture during food preparation. It’s more likely that ancient traditional healers of the Amazon basin had made a systematic study of the effects of specific combinations of plant extracts.
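To make that gating logic concrete, here is a minimal sketch (the dose and enzyme-activity numbers are placeholders chosen for illustration, not pharmacological measurements): DMT swallowed by mouth reaches the brain only to the extent that gut monoamine oxidase fails to destroy it, and harmaline’s sole job in the model is to knock that enzyme activity down.

```python
def oral_dmt_reaching_brain(dmt_dose_mg, mao_activity):
    """Toy first-pass model: how much of an oral DMT dose survives breakdown
    by monoamine oxidase (MAO) in the gut. mao_activity is the fraction of
    the dose destroyed (1.0 = fully active enzyme, 0.0 = fully inhibited)."""
    return dmt_dose_mg * (1.0 - mao_activity)

# Chacruna alone: gut MAO is essentially fully active, so almost no DMT survives.
print(oral_dmt_reaching_brain(dmt_dose_mg=30.0, mao_activity=0.99))   # -> ~0.3 mg

# Chacruna plus liana: harmaline inhibits MAO, and most of the DMT gets through.
print(oral_dmt_reaching_brain(dmt_dose_mg=30.0, mao_activity=0.10))   # -> 27.0 mg
```

With the enzyme active, essentially nothing gets through; with it inhibited, most of the dose does, which is the whole trick of the two-plant recipe.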

 

Berkeley, 1981. When I was in college, I knew some guys who had devised a unique and very dangerous form of Friday night recreation. After each had consumed about ten beers, they would gather around a huge fishbowl that had been filled about half full with many types of prescription pills, mostly psychoactive drugs. Quaaludes, Valium, amyl nitrite, Dexedrine, Percodan, and Nembutal were all in the mix, as were antihistamines, laxatives, over-the-counter painkillers, and God knows what else. The game was to reach into the bowl and randomly grab two different pills, make note of their color and shape, and swallow them immediately. Then, while waiting for the buzz to kick in, each celebrant would open the huge book next to the fishbowl (the Physicians’ Desk Reference, which lists all the pills produced by drug companies, together with a set of color photographs to aid in their identification) to learn about what he had just ingested, reading aloud to the group. I have a strong memory of a huge, shaggy blond kid (imagine Jeff Spicoli from Fast Times at Ridgemont High writ large) working his way doggedly through a section on potential side effects and mumbling in his surfer-stoner monotone, “Whoa …cerebral hemorrhaging …cool.”

 

So what can we conclude from these four examples? First, psychoactive drugs can be used in many different social contexts: as medicine, as religious sacrament, as pure recreation, or to define oneself as part of a subgroup (elite, outsider, rebel, etc.). Second, these contexts can change and overlap. Opium in ancient Rome and Quaalude pills in the United States in the 1980s were both initially used for medicinal purposes but rapidly became mostly recreational; ayahuasca in the Amazon basin is used as both a medicine and a religious sacrament. Third, religious edicts and laws of the state can have a profound effect on drug use, often in unexpected ways. The nineteenth-century Irish ether-drinking craze resulted in large part from high taxes on ethanol imposed by the British government, combined with the influence of Father Mathew’s Total Abstinence movement. The use of opium didn’t really explode in ancient Rome until Emperor Septimius Severus lifted the last of the restrictions on its sale, in large part to increase tax revenues and thereby fund his military exploits. However, the most important lesson to take from these examples is the simplest one: Across cultures and over thousands of years of human history, people have consistently found ways to alter the function of their brains, while cultural enforcers such as governments and religious institutions have sometimes, but not always, sought to regulate the use of these substances.

Lord Byron, the British romantic and satiric poet of the early nineteenth century, wrote, “Man, being reasonable, must get drunk; the best of life is but intoxication.” While Byron was describing the effects of alcohol, the larger truth applies to psychoactive drugs generally. Because most are derived from plant extracts (cannabis, cocaine, caffeine, ibogaine, khat, opium, mescaline, nicotine), from simple recipes applied to plant products (alcohol, heroin, amphetamines), or from fungi (psilocybin), they are widely available and widely used.

In fact, intoxication with psychoactive drugs is not an exclusively human proclivity. Animals in the wild will also voluntarily and repeatedly consume psychoactive plants and fungi. Birds, elephants, and monkeys have all been reported to enthusiastically seek out fruits and berries that have fallen to the ground and undergone natural fermentation to produce alcohol. In Gabon, which lies in the western equatorial region of Africa, boars, elephants, porcupines, and gorillas have all been reported to consume the intoxicating, hallucinogenic iboga plant (Tabernanthe iboga). There is even some evidence that young elephants learn to eat iboga from observing the actions of their elders in the social group. In the highlands of Ethiopia, goats cut the middleman out of the Starbucks business model by munching wild coffee berries and catching a caffeine buzz.7

But do we really know whether these animals like the psychoactive effects of the drug or are just willing to put up with them as a side effect of consuming a valuable food source? After all, fermented fruit is a tasty and nutritious meal. While it’s hard to dissociate these motivations in animals, many cases suggest that the psychoactive effect is the primary motivator for consumption. Often, only a tiny amount of plant or fungus is consumed, so while its nutritional effect is minuscule, its psychoactive effect is large.

Perhaps the most dramatic example of nonnutritive animal intoxication is found among domesticated reindeer. The Chukchi people of Siberia, who are reindeer herders, consume the bright red hallucinogenic mushroom Amanita muscaria as a ritual sacrament. Their reindeer also indulge. On discovering the mushrooms growing wild under the birch trees, they gobble them up and then stagger around in a disoriented state, twitching their heads repeatedly as they wander off from the rest of the herd for hours at a time. The active ingredient of the Amanita mushroom is ibotenic acid, a portion of which is converted in the body into another compound called muscimol—the substance that actually produces the hallucinations.8 What’s interesting about ibotenic acid is that only a fraction is metabolized in the body to form muscimol, while the rest—about 80 percent of that consumed—is passed in the urine. The reindeer have learned that licking ibotenic acid–laden urine will produce as much of a high as eating the mushroom itself. In fact, this drugged urine will attract reindeer from far and wide, and they will even fight over access to a particularly attractive patch of yellow snow. All of this has not gone unnoticed by the Chukchi, who collect the urine of their Amanita-eating shamans for two reasons. The first is simple thriftiness: Amanita mushrooms are often scarce, so urine recycling can provide about five doses for the cost of one fresh one, albeit with a rather severe aesthetic penalty. The second is that the reindeer are just as enthusiastic about human Amanita-tainted urine as they are about their own, and so they can be effectively rounded up with a bit of the stuff sprinkled in a corral. Clearly, Siberian reindeer are not fighting over drugged urine for its nutritive value.9
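For readers who like to see the arithmetic, one back-of-the-envelope reading of the “five doses” figure (the repeated-recycling assumption here is mine, not the text’s) is a geometric series: if roughly 80 percent of each pass is excreted unchanged and the urine can itself be recycled, one fresh dose yields a total exposure of

\[ 1 + 0.8 + 0.8^{2} + 0.8^{3} + \cdots = \frac{1}{1 - 0.8} = 5 \]

doses, about five for the price of one.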

 

All this raises the question: Why is the use of psychoactive drugs so widespread? For simple pleasure? For brief spurts of energy? To reduce anxiety and foster relaxation and forgetting of one’s troubles? To excuse behavior that would not otherwise be socially tolerated? To stimulate creativity and explore new forms of perception? To augment ritual practice? The answer, of course, is all of the above. The psychopharmacologist Ronald K. Siegel holds that all creatures, from insects munching psychoactive plants to human children playing spinning games to get dizzy, have an inborn need for intoxication. He writes, “This behavior has so much force and persistence that it functions like a drive, just like our drives of hunger, thirst, and sex.”10 Do we, in fact, have an innate drive to alter the function of our brains? And if so, why?

 

Humans seek out drugs with a wide variety of psychoactive effects. A rough taxonomy would include stimulants, sedatives, hallucinogens, opiates, and drugs with mixed actions. The stimulants, which comprise a wide range of compounds that increase wakefulness and generally up-regulate mental function, include cocaine, khat, amphetamines (including Adderall and Ritalin), and caffeine. Stimulants generally have positive effects on mood, but can sometimes cause anxiety and agitation. The sedatives, of course, produce the opposite effects: They are calming and sleep-inducing, and cause discoordination and slow reaction times. Sedatives include alcohol, ether, barbiturates, the benzodiazepine tranquilizers (such as Halcion, Xanax, Rohypnol, and Ativan) and gamma-hydroxybutyrate (GHB). The hallucinogens (substances like LSD, mescaline, PCP, ketamine, and ayahuasca) have as their primary action the disruption of perception—distorting vision, hearing, and the other senses. They also produce complex alterations of cognition and mood, often involving an interesting sensation of “oneness with the universe.” Opiates (including plant-derived compounds like opium, morphine, and heroin as well as synthetic opiates like OxyContin and fentanyl) are sedatives, but ones that deserve their own category because they produce a unique and potent euphoria (and capacity for pain relief), effects that are not shared by other sedatives with a different chemical action.

Of course, we know from our own experience that this kind of drug taxonomy is a blunt instrument. For example, cocaine is typically not considered a hallucinogen, but it can occasionally produce hallucinations at very high doses. Similar blurring of these categories of action comes when considering some of the world’s most popular drugs. While alcohol at high doses is always a sedative (to the point where it can be lethal), at lower doses it has a stimulating effect, particularly in certain social contexts. This stimulation can lead to a range of outcomes—from animated conversation to sloppy karaoke to a bar fight. Nicotine has a complex and subtle psychoactive effect, with mixed actions of a stimulant, a sedative, and a mild euphoric. The popular club drug ecstasy (methylenedioxymethamphetamine, or MDMA) is both a stimulant and a weak hallucinogen that has the additional quality of inducing a sense of intimacy with others. Cannabis is a sedative but also has mild euphoric properties (more than nicotine but much less than heroin). Antidepressant drugs, like the serotonin-specific reuptake inhibitors, or SSRIs (Prozac, Zoloft, Celexa, and others), or the dual-action antidepressants (such as Effexor), will lighten the mood of many people, whether or not they suffer from depression, but they don’t easily fall into one of our five categories.

Perhaps the most important aspect of psychoactive drug action that is not captured by our simple taxonomy is social context. While a certain drug will always have the same chemical action, that action is influenced by one’s ongoing brain state in ways that can modulate its effects. People who are given morphine for pain relief typically report a lot of pain abatement and only a mild euphoria. Others taking the same dose of morphine recreationally will report a much higher degree of euphoria. In a recent laboratory study, one group of experimental subjects was told that they were smoking an unusually potent strain of cannabis while another was told that they were smoking an unusually weak strain; in fact, both were given the same average-potency strain. The two groups smoked the same amount over roughly the same time period. The individuals who were told that they were smoking the super-potent cannabis not only reported significantly higher subjective ratings of euphoria (which is perhaps not so surprising), but they also had slower reaction times and greater discoordination of movement as measured with a precision reaching task. As the British addiction expert Griffith Edwards says, “A lot of what drugs do to the mind is in the mind.”11

Many years ago this interaction between mental state, social context, and drug effect was brought home to me in dramatic fashion. Two college friends of mine who decided to take LSD together during finals week asked me to “babysit” them. One fellow, “Ned,” had just finished his exams and was feeling as if the weight of the world had been lifted from his shoulders. The other (we’ll call him “Fred”) had one exam remaining a few days hence—in physics, a class in which he had struggled. Ned and Fred swallowed their doses, put on a Pink Floyd record (it was 1979, after all), and settled in on the sofa for their trips. Ned had a typical happy LSD experience, watching the colors shifting on the ceiling, laughing, and feeling generally blissed out. Fred, on the other hand, had a taste of hell. He became first withdrawn and then deeply paranoid. Soon he was weeping, thrashing around, and screaming about physics equations, Kirchhoff’s circuit laws, and how he would never understand the weak nuclear force. He imagined being attacked by a monstrous Niels Bohr with blood-dripping fangs. It was the classic bad trip, and he never used LSD again.

 

Let’s now return to the issue of pleasure. We know that experiences that cause the dopamine-containing neurons of the VTA to be active and thereby release dopamine in their target regions will be felt as pleasurable and that this process can be co-opted by direct activation using implanted electrodes. One simple hypothesis regarding drug use is that the various psychoactive substances that we humans seek out, whether they are stimulants, sedatives, opiates, hallucinogens, or drugs of mixed action, all activate the medial forebrain pleasure circuit. We’ve already discussed how the stimulants cocaine and amphetamines enhance dopamine signaling from VTA neurons by blocking the reuptake of dopamine into axon terminals, thereby prolonging dopamine action in the VTA target regions and stimulating the pleasure circuit (Figure 1.4). What about other drugs that don’t target the dopamine system? For example, we know that morphine and morphine-related drugs (like heroin and fentanyl) can produce potent euphoria yet have no direct effects on dopamine signaling.

In order to explain the action of opiate drugs, we’ll need to pause for a bit and consider a few more general issues of drug action. Some psychoactive drugs have widespread effects. Both alcohol and ether, for example, act in nonspecific ways that impact many different chemical and electrical functions of neurons. Similarly, caffeine has a wide range of effects on neurons, and its action as a stimulant cannot be traced to a single one of these. However, most drugs, both natural and synthetic, work by acting on specific neurotransmitter systems of the brain. For example, cocaine and amphetamines block the reuptake of dopamine, benzodiazepine tranquilizers like Ativan work by binding and augmenting the natural action of receptors for the inhibitory neurotransmitter GABA, and SSRI antidepressants like Prozac work by inhibiting the reuptake of released serotonin.

In many cases, the action of a drug is well known long before the identification of its target neurotransmitter system in the brain. When Sol Snyder and Candace Pert first demonstrated the biochemical function of the morphine receptor in 1973, this receptor  had no known natural activator within the body. This was puzzling—it seemed unlikely that evolution would produce receptors in the brain that would be inactive until the animal consumed a particular species of poppy plant. Sure enough, two years later John Hughes and Hans Kosterlitz were the first to identify chemicals present in the brain that bind and activate morphine receptors. These natural analogs of morphine are called endorphins. Since then, a large family of opioid receptors with different biochemical actions has been discovered, accompanied by the description of a large number of endorphins. The role of the endorphin/opioid system is multifaceted, being implicated in a variety of functions including pain perception, mood, memory, appetite, and neural control of the digestive system.

A similar story has emerged for cannabis and tobacco. The main psychoactive ingredient in cannabis is the compound tetrahydrocannabinol, or THC, which binds and activates specific and unique receptors in the brain. These receptors, which are called CB1 and CB2, are naturally activated by the brain’s own THC-like molecules. “Endocannabinoids” are the brain’s own cannabis in the same sense that the endorphins are the brain’s own morphine. To date, two endocannabinoids have been identified: 2-arachidonylglycerol and anandamide (the latter from the Sanskrit word ananda, meaning “bliss”). In tobacco the key psychoactive ingredient is nicotine, which activates a subset of the receptors for the endogenous neurotransmitter called acetylcholine.12

The euphoric portion of cannabis intoxication operates via an indirect signaling scheme (Figure 2.2, top). THC binds and activates CB1 endocannabinoid receptors on the presynaptic terminals that release GABA onto VTA dopamine neurons, and activation of these receptors suppresses GABA release. With ongoing GABA release reduced, VTA neurons are disinhibited and dopamine release is increased in the VTA target regions. Alcohol has an even more convoluted mode of action. It increases the secretion of both endorphins and endocannabinoids (through mechanisms that are not well understood) and thereby disinhibits VTA dopamine neurons.13


Figure 2.2 Heroin and related drugs (morphine, OxyContin, methadone) produce indirect activation of the pleasure circuit by reducing release of the inhibitory neurotransmitter GABA, resulting in disinhibition of VTA dopamine neurons. THC, the main psychoactive ingredient in cannabis, acts in a similar fashion (top panel). Conversely, nicotine evokes indirect activation of the pleasure circuit by increasing the release of the excitatory neurotransmitter glutamate, resulting in excitation of VTA dopamine neurons (bottom panel). Illustration by Joan M. K. Tycko.

Nicotine produces a similar end result to morphine and THC, but uses the opposite logic: It binds and activates receptors on the glutamate-containing axon terminals that contact VTA dopamine neurons (Figure 2.2, bottom). When nicotine activates these specialized receptors (called alpha-7-containing nicotinic acetylcholine receptors), this action increases glutamate release, producing greater excitation of VTA neurons and, of course, increased dopamine release.
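To see the circuit logic of Figure 2.2 in one place, here is a minimal rate-model sketch (the function, its linear form, and all of its numbers are illustrative assumptions, not measured values): the dopamine neuron’s output rises either when excitatory glutamate drive goes up or when inhibitory GABA drive goes down.

```python
def vta_dopamine_output(glutamate_drive, gaba_drive, baseline=1.0):
    """Toy linear rate model of a VTA dopamine neuron: output (arbitrary
    units) grows with excitatory glutamate input, shrinks with inhibitory
    GABA input, and cannot fall below zero."""
    return max(0.0, baseline + glutamate_drive - gaba_drive)

baseline = vta_dopamine_output(glutamate_drive=1.0, gaba_drive=1.0)

# Opiates or THC: receptors on the GABA terminals cut GABA release
# (disinhibition), so dopamine output rises.
disinhibited = vta_dopamine_output(glutamate_drive=1.0, gaba_drive=0.4)

# Nicotine: alpha-7 nicotinic receptors on the glutamate terminals boost
# glutamate release (excitation), so dopamine output also rises.
excited = vta_dopamine_output(glutamate_drive=1.6, gaba_drive=1.0)

print(baseline, disinhibited, excited)   # -> 1.0 1.6 1.6
```

Disinhibition and excitation are different levers, but in this toy model, as in the circuit of Figure 2.2, they converge on the same outcome: more dopamine released in the VTA target regions.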

 

So we’ve seen that a whole range of psychoactive drugs up-regulate dopamine action in VTA target regions. Interestingly, these compounds span a broad segment of our drug taxonomy: from stimulants like cocaine and amphetamines, to sedatives like alcohol, to opiates like heroin, and to drugs of mixed action like nicotine and cannabis. That’s all very well, but couldn’t dopamine release from VTA neurons merely be roughly correlated with the actions of these drugs but not have a direct role in their psychoactive effects? Probably not. When human subjects are placed in a brain scanner and given an intravenous hit of cocaine or amphetamines or heroin, strong activation of the VTA and dopamine release in the VTA target regions are seen, and these events peak precisely when the subjects report that their pleasure rush is strongest.

Our original hypothesis was that the various drugs that we humans seek out all activate the medial forebrain pleasure circuit. It seems to hold for the substances we’ve just discussed, but is the impulse universal? Do we always seek out drugs for pleasure? Well, no. For example, most hallucinogens—drugs like LSD, ayahuasca, and mescaline—don’t activate the medial forebrain pleasure circuit. Many sedatives, like the barbiturates and benzodiazepines, fail to do so as well. The widespread cross-cultural (and even cross-species) drive to tamper with our brain function cannot be entirely accounted for by activation of the pleasure circuit. Pleasure is central to some—but not all—psychoactive drugs.

If some psychoactive drugs activate the pleasure circuit while others don’t, what does this mean for the many users of these various drugs? Here’s the news: Those psychoactive drugs that strongly activate the dopamine-using medial forebrain pleasure circuit (like heroin, cocaine, and amphetamines) are the very ones that carry a substantial risk of addiction, while the drugs that weakly activate the pleasure circuit (like alcohol and cannabis) carry a smaller risk of addiction.14 Drugs that don’t activate the pleasure circuit at all (like LSD, mescaline, benzodiazepines, and SSRI antidepressants) carry little or no risk of addiction. This pleasure gradient also correlates strongly with the willingness of animals to work for these drugs. Rats will perform hundreds of lever presses for a single tiny injection of cocaine but only a few for intravenous alcohol, and they are completely uninterested in performing lever presses for LSD or benzodiazepines or SSRIs.

Speaking of the “risk of addiction” associated with various psychoactive drugs is a common practice, and while the expression rolls off the tongue easily, it’s admittedly a very crude construction. Sociocultural factors have a huge impact on the risk of addiction associated with a given drug. Obviously, if a drug isn’t easily available to you, you’re unlikely to use it. Thus legal drugs like alcohol and nicotine are broadly available, semilegal drugs like benzodiazepines, prescription amphetamines, and cannabis are somewhat less so, and illegal drugs like heroin and cocaine are difficult to procure and carry the most legal risk. In the United States many psychoactive drugs have become so inexpensive—a dose of LSD, ecstasy, or cannabis often costs the same as a large cappuccino at the mall—that purely economic considerations are minimized. Of course, the attitudes of one’s peers, family, and faith will also have an impact on an individual’s drug use.

For a particularly addictive drug like heroin, cocaine, or nicotine, it seems that the exact mode of intake is likewise crucial in determining its risk for addiction. For example, cocaine may be injected, smoked, snorted, or ingested, and it’s well established that it is more addictive when it is smoked or injected than when it is snorted. This is the basis of the crack cocaine epidemic that devastated many communities in the late 1980s and that continues to be a scourge to this day. Smoked or injected cocaine is more addictive because it reaches the target neurons in the brain with a rapid onset, while snorted cocaine produces a pleasure rush that comes on somewhat more slowly. Ingested cocaine from coca leaf chewing—a traditional practice in the Andes Mountains of Peru and Bolivia—has an even slower time course of onset and is by far the least addictive route of administration.15

The situation regarding opiates is similar. When opium was prepared as a crude extract of the poppy plant, it could either be eaten, inserted rectally, or smoked. Of course, smoking produced a highly effective delivery of morphine to the brain and was much more addictive. However, a set of innovations in the nineteenth century set the stage for even more rapid delivery. First was the purification of morphine from opium by the German chemist Friedrich Sertürner in 1805. Second was the invention of the hypodermic syringe, which allowed for the injection of pure morphine solution into the bloodstream. The first widespread use of injectable morphine came during the American Civil War and was a boon for battlefield pain control in wounded soldiers. But there was a heavy price for such treatment, as many veterans, particularly on the Union side, returned home addicted to injected morphine. The third factor was an invention by the Bayer drug company, which in 1898 introduced heroin, a simple chemical derivative of morphine (with two added acetyl groups). Heroin had the advantage of being able to cross cell membranes more easily than morphine, producing an even faster onset of action in the brain and an even more pronounced pleasure rush.


Figure 2.3 Cannabis delivered rectally would produce a psychoactive effect with very slow onset. Smoking coffee, on the other hand, might well deliver caffeine to the brain much more rapidly than drinking it. “Five minute comic” by Joey Alison Sayers (www.jsayers.com), from her book I’m Gonna Rip Yer Face Off! Used with permission.

It’s important to note that even injecting heroin does not inevitably result in addiction. A recent study of drug use in the United States estimates that about 35 percent of all people who have tried injected heroin have become heroin addicts. While that’s a very high percentage relative to addiction rates of 22 percent for smoked or injected cocaine, about 8 percent for cannabis, and about 4 percent for alcohol, consider this shocking statistic: 80 percent of all the people who try cigarettes become addicted. In part, that remarkably high number reflects the fact that tobacco is legal and that the health and lifestyle penalties for smoking cigarettes, while significant, are much less than those related to heroin and often take many years to manifest.16

Why is cigarette smoking so addictive when its psychoactive effect is comparatively so subtle? The reason is that the cigarette is the Galil assault rifle of the nicotine delivery world: fast and reliable. Consider that while a heroin user injects a hit and feels a potent euphoric rush about fifteen seconds later, he is not going to inject again for many hours. The cigarette smoker, on the other hand, will typically take ten puffs from a single cigarette and will often smoke many cigarettes in the course of a day. Each puff will deliver nicotine to the pleasure circuit about fifteen seconds later, approximately the same delay as for intravenous heroin. So while a typical heroin addict may get two strong, rapidly delivered hits per day, the pack-a-day cigarette smoker will get two hundred weak, rapidly delivered hits per day. But why does the nearly instant delivery of a drug to the brain, as with smoking cigarettes or injecting heroin, carry a higher risk of addiction than slow delivery of the same drug, say, by chewing tobacco or eating opium?

One way to think about this is to consider that addiction is a form of learning. When someone uses a drug, associations are made between a particular act (injecting the drug or chewing the tobacco) and the pleasure that follows. Imagine that you have a dog that you’re trying to train to come when called, using a tasty morsel of food as a reward. If you want to create a learned association, you’ll call the dog, and when it comes you’ll immediately give it the treat. Now imagine that instead of presenting the reward immediately (as with injected heroin), you wait thirty minutes and then offer the reward (as with ingested opium). In the latter case, the connection between the behavior (coming when called) and the reward is quite weak, and the association is less likely to be learned. The same dog-training analogy holds true for injected heroin (one big pleasure rush) and cigarette smoking (many tiny pleasure rushes). If you call the dog once a day, and then immediately reward its compliance with a ten-ounce steak, it will eventually learn to come when called. If you call the dog twenty times per day and immediately reward each correct behavior with a small chunk of meat, the dog will learn much more quickly. So when we smoke cigarettes, we are being very effective trainers of our inner dog, creating a strong association between puffing and pleasure.17
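Here is the dog-training argument as a toy model in code (a sketch only; the learning rate, the eligibility-trace time constant, and the reward sizes are invented for illustration): each rewarded trial strengthens the association a little, a long delay between act and reward discounts the learning, and many small immediate rewards pile up faster than a few large ones.

```python
import math

def association_after(days, rewards_per_day, delay_s, reward_size,
                      learning_rate=0.2, trace_tau_s=30.0):
    """Toy associative learning: a Rescorla-Wagner-style update scaled by an
    exponentially decaying eligibility trace, so that delayed rewards teach
    less than immediate ones. All parameter values are illustrative."""
    w = 0.0                                         # association strength, 0 to 1
    eligibility = math.exp(-delay_s / trace_tau_s)  # delay discounts each update
    for _ in range(days * rewards_per_day):
        w += learning_rate * reward_size * eligibility * (1.0 - w)
    return w

# A couple of big, nearly immediate rewards per day (injected heroin):
print(association_after(days=7, rewards_per_day=2, delay_s=15, reward_size=1.0))
# Two hundred small, nearly immediate rewards per day (cigarette puffs):
print(association_after(days=7, rewards_per_day=200, delay_s=15, reward_size=0.05))
# A couple of big but long-delayed rewards per day (eaten opium):
print(association_after(days=7, rewards_per_day=2, delay_s=1800, reward_size=1.0))
```

Run as written, the long-delay case barely learns at all, and the two-hundred-small-hits case ends up with the strongest association of the three, which is the sense in which the pack-a-day smoker is such an effective trainer of the inner dog.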

 

Addiction can be defined as persistent, compulsive drug use in the face of increasingly negative life consequences. Addicts typically risk their health, families, careers, and friendships as they pursue their drugs of choice. Addiction doesn’t develop all at once, however, but proceeds in stages. While some drugs, like heroin, carry a high risk for addiction, not even heroin produces addiction with a single dose. When a drug user initially gets high on cocaine or heroin or amphetamines, the experience produces an intense euphoric pleasure and sense of well-being. It is repeated doses, particularly if strung closely together in a binge, that will often trigger the dark side. This is first manifested as drug tolerance: Soon after a binge, the drug user will need a higher dose to achieve the same level of euphoria, and if drug-taking continues regularly, this tolerance will become greater and greater. As tolerance to the drug develops, so does dependence, which means that the addicted person not only needs more of the drug to get high, but also will feel bad in its absence. Dependence can be experienced as both mental symptoms (such as depression, irritability, or inability to concentrate in the absence of the drug) and physical ones (such as nausea, cramps, chills, and sweats).

In the later stages of addiction, users feel strong cravings for the drug, which are often triggered by drug-associated stimuli. A crack cocaine addict may be feeling relatively stable but will have an intense desire for the drug at the sight of a pipe. The cravings of an amphetamine addict who often gets high in the bathroom of a club can be triggered by dance music or even the sound of a toilet flushing. Odors, like the musty smell of heroin cooking in a spoon prior to injection, are particularly evocative. In his moving autobiography of teenage heroin addiction, The Basketball Diaries, Jim Carroll writes of a friend who tried to kick his heroin habit by seeking spiritual solace in the Catholic church of his youth. However, the smell of the church incense reminded him so much of the musty-sweet odor of bubbling heroin that he felt an overwhelming craving and rushed home from Mass to shoot up once again.

As addiction develops and tolerance, dependence, and cravings emerge, the euphoria produced by the drug gradually drains away. Pleasure is replaced by desire; liking becomes wanting. In everyday speech, we may say of an alcoholic, “She really loves to drink,” or of a cocaine addict, “He must love to get high.” We imagine that drug addicts experience more pleasure from their drug of choice than others and that this motivates their compulsive drug-seeking. However, most active addicts report that they no longer derive much pleasure from their drug of choice. Accumulating evidence indicates that once the trajectory of addiction is under way, pleasure is suppressed, and it is wanting that comes to the fore. Unfortunately, pleasure in the drug itself isn’t the only sensation diminished in addicts, for addiction produces a broad change in the pleasure circuit that also affects the enjoyment of other experiences, like sex, food, and exercise.

Drug addiction, whether to cocaine, heroin, alcohol, or nicotine, is notoriously difficult to break. Relapses, even after months or years of drug-free living, are common, and most abstaining addicts have had to make multiple attempts to get clean and stay clean. It is well known that relapse can be triggered not only by sensory cues that are associated with past drug use (like particular people, odors, music, rooms, etc.) but also by emotional or physical stress. A central insight in recent years has been that the later phases of addiction, characterized by cravings and relapse, are associated with strong and persistent memories of the drug-taking experience. Addictive drugs, by co-opting the pleasure circuitry and activating it more strongly than any natural reward, create deeply ingrained memories that are bound up in a network of associations. Later, these memories are strongly activated and linked to emotional centers by drug-associated external cues and internal mental states. If that weren’t enough to battle, addicts who relapse and take even a small dose of drug after a period of abstinence will get a pleasure rush that’s much stronger than that felt by a first-time user, an effect called drug sensitization.

Habitual drug use produces a long-lasting rewiring of the addict’s brain, which is manifested at the level of biochemistry, electrical function, and even neuronal structure. If we want to understand and treat addiction at the level of molecules and cells and develop therapies to help people break their addiction and stay drug-free, then we need to understand how drugs produce persistent cellular and molecular changes in the brain. Of course, the first place to focus such efforts is in the medial forebrain pleasure circuitry, and the good news is that we don’t have to start from zero. Neuroscientists have already worked out some aspects of how memory is stored in the brain through experience-driven cellular and molecular changes, and these insights can be applied to the brain’s pleasure circuits and the problem of addiction.

 

Oslo, Norway, 1964. A malaise had settled over the community of neurobiologists investigating the biological basis of memory. Because memories obviously could last for the lifetime of an animal, they had expected that experience should produce long-lasting changes in neuronal function to underlie memory traces. Their best guess for the aspect of neuronal function changed by experience was synaptic transmission, the process by which a spike enters an axon terminal and triggers the release of neurotransmitter molecules, which then diffuse across the synaptic cleft to bind receptors and thereby activate the information-receiving “postsynaptic” neuron. Synaptic transmission is the fundamental mode of rapid communication between neurons and is central to information processing in the brain. The dominant hypothesis was that particular patterns of neuronal stimulation delivered to neurons via electrodes (thereby mimicking actual experience in the world) would produce long-lasting changes in the strength of synaptic transmission. The problem at the root of the neurobiologists’ malaise was that no evidence whatsoever existed to support this postulated mechanism. The longest-lasting changes that had been recorded persisted for only a minute or two—a time scale that was totally insufficient for memory storage.

In 1964, Terje Lømo was a doctor in the Norwegian navy, soon to be discharged. On leave in Oslo to look for a job, he bumped into the neurophysiologist Per Andersen while walking down the street. After an animated conversation about synapses and neurons, he agreed to join Andersen’s laboratory as a Ph.D. student. At that time, recordings of synaptic function in the brain were, for a variety of technical reasons, very difficult to make. Most recordings of neuron-to-neuron synapses had been performed in the spinal cord. (These yielded the brief synaptic facilitation mentioned earlier.) Andersen had developed techniques to record synaptic transmission in anesthetized rabbits in a brain region called the hippocampus, buried deep within the temporal lobe. Lømo took up these techniques and began to probe the properties of hippocampal synapses. In 1965 he obtained the first hints that repeated stimulation (120 pulses at 12 pulses/second) could cause synapses to persistently increase their strength, that is, to produce a larger excitatory electrical effect in the information-receiving (postsynaptic) cell.


Figure 2.4 Long-term synaptic potentiation (LTP) is a long-lasting, use-dependent increase in the strength of synaptic transmission that can be triggered by particular patterns of activity. The top panel shows a time course in which the amplitude of synaptic strength is plotted on the y-axis and time on the x-axis. LTP is induced by brief high-frequency stimulation (100 pulses, each delivered every 10 milliseconds) at t = 0 min. The middle panel shows electrical traces representing the activation of an excitatory synapse that uses the neurotransmitter glutamate. The amplitude of the deflection in membrane voltage indicates the strength of the synapse, which increases after induction of LTP. The bottom panel shows a schematic diagram of some of the changes that can underlie LTP: increased neurotransmitter release, increased density of neurotransmitter receptors on the postsynaptic neuron, and growth of both the axon terminal and the postsynaptic region, which is called the “dendritic spine.” Illustration by Joan M. K. Tycko.

However, it was not until the fall of 1968, when Lømo was joined by Tim Bliss, a visiting British scientist with an interest in memory storage, that the research really took off. In their first experiment together they used a design in which a single test pulse was delivered to measure synaptic strength. After recording a series of stable baseline responses, a “conditioning stimulus” consisting of 300 pulses at 20 pulses/second was delivered. Following several repetitions of this conditioning stimulus, the response to the test pulse was larger, reflecting an increase in synaptic strength. Most important, this increase persisted not just for a minute or two, but for many hours—as long as the recording could be maintained (Figure 2.4). That day in 1968 marked the first real glimpse of a memory storage mechanism in the brain and began the modern era of memory research, in which memory is analyzed at a cellular and molecular level.18
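As a schematic of that experimental design in code (the numbers are illustrative, not Bliss and Lømo’s data): the synapse is probed with single test pulses at a fixed rate, a brief high-frequency conditioning train is delivered at t = 0, and the probed response jumps and stays elevated, which is the shape of the time course in Figure 2.4.

```python
def ltp_time_course(baseline_minutes=10, followup_minutes=60,
                    baseline_amplitude=1.0, potentiation_factor=1.5):
    """Toy LTP experiment: test-pulse response amplitude (arbitrary units)
    sampled once per minute, before and after a high-frequency conditioning
    train delivered at t = 0. Values are placeholders, not real data."""
    samples = []
    amplitude = baseline_amplitude
    for minute in range(-baseline_minutes, followup_minutes + 1):
        if minute == 0:
            # The conditioning train (e.g., a brief 100 Hz tetanus) triggers a
            # persistent step increase in synaptic strength.
            amplitude = baseline_amplitude * potentiation_factor
        samples.append((minute, amplitude))
    return samples

for minute, amplitude in ltp_time_course():
    if minute in (-10, -1, 0, 30, 60):
        print(f"t = {minute:+4d} min   test-pulse response = {amplitude:.2f}")
```

The only point of the sketch is the shape of the curve: a flat baseline, a step up at the conditioning train, and no decay over the hour that follows.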

Lømo and Bliss called their new phenomenon long-lasting potentiation, or LLP. However, as often happens in science, this name didn’t stick, and it is now known as long-term synaptic potentiation, or LTP. (Bliss once quipped that the acronym LLP didn’t catch on because it made the speaker seem as if he were in need of urgent assistance.) Starting in the 1970s, LTP created tremendous excitement among memory researchers not only because of its duration, but also due to its relevance to a well-known neurological case.

H.M. was a patient who had undergone surgery to control otherwise intractable epilepsy. The operation, which involved bilateral resection of his hippocampus and some surrounding tissue, brought the epilepsy under control, but left him with two profound memory impairments: He could no longer recall facts and events from a period of one to two years prior to the surgery, and, even stranger, he could no longer form any new memories for facts and events, a phenomenon called anterograde amnesia. These symptoms implicated the hippocampus in memory storage. At that time, the prevailing theory was that LTP and its later-discovered mirror twin, long-term synaptic depression, or LTD, were rare phenomena that would be found only at a few specialized synapses in the brain that had particular roles in memory storage. That has turned out not to be the case at all: LTP and LTD are nearly universal properties of synapses and can be found everywhere from the spinal cord to the most recently evolved portions of the frontal cortex, and almost every brain region in between. Even the most ancient parts of our brain—the parts we share with fish and lizards, regions that control basic functions like spinal reflexes, breathing, temperature control, and the sleep/wake cycle—have LTP and LTD and hence the capacity to be modified by experience. So, it turns out, do our pleasure circuits.

 

Let’s briefly recap what we’ve learned in this chapter in order to formulate some ideas about drug addiction and lasting changes in brain circuitry. We know that addictive drugs activate the medial forebrain pleasure circuit—in particular, the dopamine neurons of the VTA—and that this activation is central to the euphoria that these drugs produce. We also know that sensory experience can write memories into brain circuitry. These memory traces are formed, at least in part, at synapses, by means of LTP and LTD. Finally, we know that there is a time course of addiction in which initial euphoria becomes overlaid with drug tolerance and dependence and intense cravings that can last for years after drug-taking ceases. Continued cravings lead to a high incidence of relapse, often triggered by stress. These facts together suggest an interesting and straightforward general hypothesis: Repeated exposure to addictive drugs produces persistent changes in the function of the medial forebrain pleasure circuit and its targets, including (but not limited to) LTP and LTD, and these long-lasting changes can underlie certain aspects of the trajectory of addiction, in particular, tolerance, dependence, cravings, and relapse.

LTP and LTD are most commonly studied at excitatory synapses that use the neurotransmitter glutamate, like those between the axons of the prefrontal cortex or the amygdala and the dopamine neurons of the VTA. In 1993 C. McNamara and colleagues from Texas A&M University prepared rats in a Skinner box so that lever presses delivered tiny intravenous injections of cocaine. As we’ve discussed, normal rats will rapidly learn to press the lever at a furious pace to get cocaine (or other addictive drugs). However, the researchers found that when rats were injected, before being placed in the Skinner box, with a compound called MK-801, which blocks the induction of the most common forms of LTP and LTD, they were indifferent to cocaine. They pressed the lever to self-administer the drug only at chance levels.19 Likewise, when an experimenter manually controlled cocaine infusion, the rats would return to the portion of the cage where the drug was administered because they had formed an associative memory relating that particular location to the pleasure produced by the cocaine hit. However, when rats were pretreated with MK-801, this association was not formed, and the rats continued to explore the cage freely after a cocaine infusion. When MK-801 is administered systemically, via an injection in the abdomen, it acts everywhere in the brain, not just at the glutamate-using synapses of the VTA. It is therefore important that a tiny MK-801 injection delivered through a needle targeting the VTA specifically (and not other regions, like the nucleus accumbens) can also block cocaine-evoked place preference.

These findings with MK-801 pretreatment suggested that taking cocaine induces LTP and/or LTD at the excitatory synapses received by the neurons of the VTA. To test this idea, rats were given a single large dose of cocaine, and following a twenty-four-hour waiting period, the strength of the synaptic connections between the excitatory glutamate-containing axons and the neurons of the VTA was measured. Remarkably, a single cocaine dose produced robust LTP, rendering those synapses stronger. Cocaine-evoked LTP was also prevented by prior treatment with MK-801. Later experiments showed that this LTP was still going strong when the VTA glutamate synapses were measured three months after that single dose.

Other work has shown that drug-evoked LTP in the VTA is not limited to cocaine; it can also be produced by doses of amphetamines, morphine, nicotine, and alcohol. Importantly, non-addictive drugs like the antidepressant fluoxetine or the mood stabilizer carbamazepine did not evoke LTP, demonstrating that LTP in the VTA is not a general effect of all drugs that act in the brain. Another key finding is that these addictive drugs did not produce LTP at synapses throughout the brain, or even at all of those that use glutamate as a neurotransmitter; glutamate-using synapses in the hippocampus, for example, are not persistently altered by cocaine or morphine treatment.

So what does the discovery of drug-evoked LTP of glutamate-using synapses in the VTA mean for behavior? Recall that these synapses convey information from the prefrontal cortex, which is important for converting sensory information into plans and judgment, and from the amygdala, which processes emotional information. When these excitatory synapses are made stronger by drug-evoked LTP, subsequent sensory cues and emotions will more easily activate the VTA neurons, causing them to release dopamine in their target regions. One reasonable hypothesis is that drug-evoked LTP in the VTA is necessary for learning the association between the pleasure evoked by the drug and the sensory cues and emotional state that accompanied it.

These recent findings are very exciting, in that we now actually have explanations for some aspects of addiction at the level of cells and molecules in the brain. But we need to put them in perspective: Drug-evoked LTP in the VTA cannot account for the entire biological basis of addiction. After all, this LTP is produced by a single dose of a drug (even a single dose of a less dangerous drug like alcohol or nicotine), and a single dose is not sufficient to produce addiction. So, then, what happens to the brain circuitry of rats when they are exposed to cocaine repeatedly? Surprisingly, there’s no further LTP of the VTA glutamate synapses: A single dose already produces maximum potentiation. However, repeated doses of cocaine do produce an additional form of plasticity that is not seen following a single dose: LTD of inhibitory synapses in the VTA that use the neurotransmitter GABA. Because the action of GABA is opposite to that of glutamate, LTD of GABA-using synapses reduces the inhibition of VTA neurons, leaving them more easily excited, thereby further turning up the gain on the pleasure circuit and producing even more dopamine release in the VTA target regions (see Figure 2.2). This synergistic effect on the activation of VTA dopamine neurons (LTP of excitatory synapses combined with LTD of inhibitory synapses) is likely to underlie at least a portion of the drug craving seen in the later stages of addiction.
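
The arithmetic behind this “turning up the gain” argument is simple enough to sketch. In the toy Python model below, the output of a VTA dopamine neuron is just its excitatory drive minus its inhibitory drive; all of the weights are invented for illustration (real neurons are far more complicated), but they show why LTP of excitation and LTD of inhibition push in the same direction.

    # Illustrative numbers only: a VTA dopamine neuron's output is treated as
    # excitatory drive minus inhibitory drive, floored at zero.

    def vta_output(exc_weight, inh_weight, exc_input=1.0, inh_input=1.0):
        """Net drive onto a dopamine neuron given its synaptic weights and inputs."""
        return max(0.0, exc_weight * exc_input - inh_weight * inh_input)

    baseline    = vta_output(exc_weight=1.0, inh_weight=0.6)   # drug-naive state
    single_dose = vta_output(exc_weight=1.5, inh_weight=0.6)   # LTP of glutamate synapses only
    repeated    = vta_output(exc_weight=1.5, inh_weight=0.3)   # plus LTD of GABA-using synapses

    print(baseline, single_dose, repeated)   # roughly 0.4, 0.9, and 1.2

Strengthening excitation alone raises the output; weakening inhibition on top of that raises it further, which is the synergy described above.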

In habitual drug use the repeated barrage of the nucleus accumbens, the dorsal striatum, and the prefrontal cortex by the VTA dopamine neurons produces changes in these target structures as well. After five days of repeated cocaine administration, the nucleus accumbens undergoes a series of alterations. One of these is an increase in the level of the neurotransmitter dynorphin, one of a class of naturally occurring opioid molecules (a group that also includes the endorphins) with morphine-like effects. Increased dynorphin release in the nucleus accumbens dampens the electrical activity in this portion of the pleasure circuit (and thereby overrides activity in “upstream” structures like the VTA). Activity in the nucleus accumbens is further suppressed through another mechanism: LTD of the glutamate-using synapses that convey information to the nucleus accumbens from the hippocampus, the prefrontal cortex, and the amygdala. Both of these changes in the nucleus accumbens turn down the gain on the pleasure circuit and are likely to underlie early features of addiction: tolerance and dependence. At this stage, in the absence of more cocaine, the action of the pleasure circuit is chronically suppressed, leading to depression, lethargy, irritability, and the inability to derive pleasure from other activities: the mental symptoms of drug dependence/withdrawal.

When rats are given five days of cocaine treatment and are then kept drug-free for days or weeks thereafter, mimicking the conditions faced by an abstaining addict, still further neuronal changes occur. One of the most noticeable is in the fine structure of the main class of neurons in the nucleus accumbens. These are called medium spiny neurons, because their long, branching dendrites (the treelike structures where most synaptic contacts from other neurons are received) are covered with tiny nubbins called dendritic spines. The spines aren’t just ornamental; they are where the glutamate- and dopamine-using axons from other brain regions form their synapses. In rats that become addicted to cocaine there is an overgrowth of dendritic spines, so that these medium spiny neurons become super-spiny, allowing for increased excitation (Figure 2.5). In addition, each individual synapse received by a medium spiny neuron undergoes LTP. This LTP doesn’t just counteract the LTD seen immediately after five days of cocaine—it overshoots it, leaving the synapse stronger than it was in the pre-drug state. These two persistent changes in the nucleus accumbens following a period of abstinence have been suggested to underlie drug sensitization, the final hurdle for an addict trying to stay clean.20
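
To make the overshoot concrete, here is a purely illustrative back-of-the-envelope calculation in Python: the total excitatory drive onto a medium spiny neuron is approximated as the number of spines times the average strength of each synapse. The specific numbers are assumptions chosen only to mirror the trajectory described above, not measured values.

    # Purely illustrative arithmetic: total excitatory drive onto a medium spiny
    # neuron approximated as (number of dendritic spines) x (average synapse strength).

    def total_drive(n_spines, strength_per_synapse):
        return n_spines * strength_per_synapse

    pre_drug         = total_drive(n_spines=100, strength_per_synapse=1.0)   # baseline
    after_five_days  = total_drive(n_spines=100, strength_per_synapse=0.7)   # LTD: gain turned down
    after_abstinence = total_drive(n_spines=130, strength_per_synapse=1.2)   # more spines plus LTP overshoot

    print(pre_drug, after_five_days, after_abstinence)   # roughly 100, 70, and 156

The point is only the direction of the changes: suppression during repeated use, then an overshoot above the pre-drug baseline after abstinence.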

Figure 2.5 Repeated cocaine doses rewire the brain’s pleasure circuits. In one study, rats were given either cocaine or saline solution every day for twenty-eight days and then sacrificed two days later. Brain sections were prepared, stained, and observed with a microscope to reveal the spine-covered dendrites of nucleus accumbens neurons. Each tiny nubbin in the images above is a dendritic spine, the point where an excitatory synaptic contact is received. You can see that the dendrites of the cocaine-treated rats appear bushier, showing a higher density of dendritic spines. Reprinted from S. D. Norrholm, J. A. Bibb, E. J. Nestler, C. C. Ouimet, J. R. Taylor, and P. Greengard, “Cocaine-induced proliferation of dendritic spines in nucleus accumbens is dependent on the activity of cyclin-dependent kinase-5,” Neuroscience 116 (2003): 19–22, with permission from Elsevier.

There’s still a lot we don’t know about the neurobiology of addiction, but exciting progress has been made that enables us to construct useful models that are subject to experimental testing. We now have a foundation that can be built upon. At present this research is mostly conducted with rats and mice, as we can’t yet measure LTP or LTD noninvasively, as would be required in human experiments. (Today’s brain scanners aren’t up to this task.) Nevertheless, there’s great promise for developing therapies to help people break addictions. The first generation of drugs directed against neurotransmitter receptors in the VTA and its target regions is already in use or in clinical trials, and more are on the way. (In chapter 7, “The Future of Pleasure,” we’ll consider present and future anti-addiction treatments in more detail.) The hope is that these treatments will suppress cravings, prevent sensitization and relapse, and help recovering addicts stay drug-free. The challenge will be to do so without compromising other aspects of the pleasure circuit involved in crucial behaviors like eating and sex.

 

These days most of us are willing to believe that drug addiction—including alcoholism—is a disease. Still, we harbor a sneaking suspicion that it’s really a disease of the weak-willed, the spiritually unfit, or people who are not quite like us. The comedian Mitch Hedberg understood this when he riffed:

Alcoholism is a disease, but it’s the only one you can get yelled at for having.

“Goddamn it, Otto, you’re an alcoholic!”

“Goddamn it, Otto, you have lupus!”

One of those two doesn’t sound right.

Whatever our prejudices, the truth is, given the right circumstances (which can include factors like high stress, early drug exposure or childhood abuse, poor social support, or genetic predisposition), anyone can become a drug addict. Addiction is not just a disease of weak-willed losers. Indeed, many of our most important historical figures have been drug addicts—not only the creative, arty types like Charles Baudelaire (hashish and opium) and Aldous Huxley (alcohol, mescaline, LSD), but also scientists like Sigmund Freud (cocaine) and hard-charging military leaders and heads of state from Alexander the Great (a massive alcoholic) to Prince Otto von Bismarck (who typically drank two bottles of wine with lunch and topped it off with a little morphine in the evening).21

From studies comparing identical and fraternal twins, it is estimated that 40 to 60 percent of the variation in the risk for addiction is contributed by genetic factors. That said, we are only in the early stages of understanding genetic contributions to addiction. There is no single “addiction gene,” and it is likely that a large number of genes are involved in this complex trait.22 One tantalizing observation concerns the gene for the D2 subtype of dopamine receptor, a crucial component of the pleasure circuit. A particular form of this gene, called the A1 variant, results in reduced expression of D2 dopamine receptors within the nucleus accumbens and the dorsal striatum. Carriers of the A1 variant are, as a result, significantly more likely to become addicted to alcohol, cocaine, or nicotine. Furthermore, among alcoholics, those with the A1 variant tend to be more severely affected, with an earlier age of drinking onset, more severe episodes of intoxication, and more unsuccessful attempts to quit. In families with a strong history of alcoholism, brain scanning has revealed that those family members who were not alcoholics had more D2 receptors in the nucleus accumbens and the dorsal striatum than those who were. Taken together, these studies suggest that elevated levels of the D2 receptor may be protective against certain forms of drug addiction. Indeed, in rats trained to self-administer alcohol, injection of a genetically engineered virus into the striatum to increase D2 receptor expression caused them to reduce their alcohol intake considerably. (A control group, which received an inactive virus, did not show this effect.) We won’t be injecting engineered viruses into the brains of human drug addicts anytime soon, but these findings do suggest the D2 receptor as one target for new addiction therapies.
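
For readers curious where a figure like “40 to 60 percent” comes from, the classical twin-study logic can be sketched in a few lines. Falconer’s formula estimates heritability as twice the difference between the trait correlations of identical and fraternal twins; the correlations in the Python snippet below are invented for illustration and are not the values from the studies cited here.

    # A back-of-the-envelope illustration of classical twin-study logic
    # (Falconer's formula). The correlations below are invented, not real data.

    def falconer_heritability(r_identical, r_fraternal):
        """Estimate the fraction of trait variance attributable to genes: h2 = 2 * (r_MZ - r_DZ)."""
        return 2.0 * (r_identical - r_fraternal)

    h2 = falconer_heritability(r_identical=0.55, r_fraternal=0.30)
    print(f"estimated heritability: {h2:.2f}")   # about 0.50, i.e., half the variation

Real studies add corrections for shared environment and measurement error, but the core comparison between identical and fraternal twins is the same.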

 

If these biological processes are so important in addiction, then where do social and experiential factors come into play in recovery? Do talk therapy or twelve-step groups or prayer or meditation really have a significant function? Given what we know about the genetic predispositions for and biological substrates of addiction, it’s easy to conclude that we’re all slaves to our genes and our brain chemistry. That’s simply not true. The long-lasting changes in neural circuits that result from repeated drug use, such as LTP and LTD and structural changes in neurons, can all be produced by one’s experience in the world as well. Indeed, these are some of the very mechanisms that enable us to write our experiences into memory and hence confer our individuality. In the brain, causality is a two-way street. Yes, our genes and our neural circuits predispose us to certain behaviors, but our brains are malleable, and we can alter their neural circuits with experience. When an addict goes to talk therapy or engages in mindful meditation to reduce stress or to create associations between drug use and negative life consequences, these actions don’t just occur in some airy-fairy nonbiological realm. They create changes in the pleasure circuitry that reverse or otherwise counteract the rewiring produced during addiction. This is the biological basis of social and experiential therapy.23

 

When we say that addiction is a disease, aren’t we just letting addicts off the hook for their antisocial choices and behaviors? Not at all. A disease model of addiction holds that the development of addiction is not the addict’s responsibility. However, crucially, recovery from addiction is. We don’t blame someone with heart disease for the development of his condition. Yet once the disease is diagnosed, we do expect him to be responsible for his recovery by eating a healthy diet, exercising regularly, taking his medication, and so on. Similarly, believing that addiction is a disease does not absolve addicts from their responsibility for their own recovery and everything that entails. It’s not a free ride.