14
MOLECULAR CULPRITS
BY NOW WE should finally be ready to spend some time talking about drugs themselves. I have outlined much of the brain machinery that addictive substances hook into, and the inherited vulnerabilities that lead to different levels of risk between people. I have also described enough of the behavioral phenomena that result. That allows me to now shift focus and let a few of the molecules that people actually become addicted to take center stage. In doing so, it is my intention to be quite selective. This is not primarily a book about drugs as such—excellent descriptions of those can be found elsewhere.1 This is first and foremost a book about people, their brains, how they become affected by addiction, and how they can be treated. The study of addictive substances as pharmacological agents is a field that obviously intersects in a major way with those topics, but it has its own perspective and a somewhat different flavor.
However, to fully understand the dynamics of drug addiction, both in populations and in individual people, it is helpful and almost necessary to discuss some real-world prototypes of addictive drugs. Along the way it is worthwhile to consider the history of these substances and how their impact on human lives has evolved over that history. That perspective should help us capture some salient characteristics of the interaction between molecules and brains and allow some themes to emerge. One of those themes is that availability is a major determinant of drug use, and that drug use for that reason spreads in ways that are similar to epidemics of infectious disease. Another theme is that purity and route of administration, trivial though they may sound, are major determinants of addictive potential. That is because the rate with which drug effects hit the brain is a major determinant of the high people experience. As a corollary of that rule, we will see that the long-term changes that usher in the switch to the “dark side of addiction” are also most effectively driven when the initial, rewarding hit of drug high is the most intense. A final theme will be that addictive properties of drugs are to varying degrees influenced by characteristics of the individual. Nowhere is that more pronounced than in the case of alcohol. I will discuss some of the reasons for that.
As examples I will use three categories of drugs that most people are familiar with. The first of these, morphine-like substances, or “opioids,” are probably the psychoactive drugs whose use goes the furthest back in the history of mankind. As an aside, the history of these substances themselves is much longer than that. Receptors for morphine-like substances were present as early in evolution as the amoeba. Applying morphine to a bath in which these unicellular organisms are swimming around will make them move toward the drug and also alter the cellular processes through which they ingest nutrients, or “eat.”2 Clearly systems that guide approach behaviors in humans build on elements that evolution developed long before our species appeared on the planet. The next drug group, cocaine and amphetamine-like drugs, which despite some important differences we can bundle under the label “psychostimulants,” is important because it hooks right into the core of the brain systems thought to mediate approach behaviors and reward. Alcohol, finally, makes up a substance category of its own but can probably fill a catalog of actions more extensive than that of any other drug category. It is also, for that very reason, the addictive drug where individual differences in drug effects are the greatest.
“Opium” is in many ways a prototype for all other addictive drugs. Its name refers to the dried, milky sap of the opium poppy, Papaver somniferum.3 Opiates are the naturally occurring substances that can be extracted from opium, with morphine being the most important among them. In modern times a number of manufactured opioids, a term that literally means “opium-like,” have been added to the formulary. For instance, a painkiller such as fentanyl is entirely man-made but acts in ways that are in principle at least the same as morphine. These drugs are therefore appropriately called opioids. To the human brain, it really doesn’t matter whether a molecule happened to be extracted from a plant or was made in a laboratory before it hits the brain receptors. And sometimes it is not even possible to know whether it was one or the other—the painkiller codeine, for instance, can either be extracted from the opium poppy or be made in a chemistry lab. Out of deference to proper nomenclature, I will keep using both terms, “opiate” and “opioid,” and try to be as precise as possible. But from a clinical perspective, we could just as well go with the broader term “opioids” for all drugs that act in ways similar to morphine. This broader label also covers the opiate-like substances, or endogenous opioids, that are made by our own brains and that were discussed in a prior chapter.
Inscriptions on clay tablets tell us that the opium poppy was grown as early as 5,500 years ago by the Sumerians of Mesopotamia, the “land between the rivers” that was the cradle of Western civilization. The Sumerians called their poppy the joy plant, so they must have been aware of its psychotropic properties. From here it took the plant some 2,000 years to reach Egypt. The people of Thebes, the Egyptian capital at the time, became the first large-scale commercial opium growers, giving the plant the name “opium thebaicum.” This name still echoes through the millennia in thebaine, a constituent of the opium poppy from which the common painkiller oxycodone can be made. Seafaring Phoenician and Cretan merchants then brought the desirable opium plant from Egypt to ancient Greece, from where it spread to the rest of Europe and, carried by Alexander the Great’s armies, to Persia and India. Ironically, although today associated with the Far East, opium did not arrive in China until the fifth century A.D. Only a few centuries later, however, it was so strongly associated with “Eastern magic” that it was condemned by the Catholic Church, and the Inquisition made sure that its use fell out of fashion in Europe. When opium returned to the West in the sixteenth century, it was as a medicine to suppress pain, called laudanum.
Originally opium was dried and eaten. Taken in this manner, much of the active ingredient is broken down in the gut. Perhaps even more important, taking opium this way makes for a rather slow uptake of what active ingredient remains. Since the high from a drug is related both to the amount of drug that reaches the brain and the speed with which it does so, the slow uptake of opiates taken by mouth limited their addictive potential. Then, in the sixteenth century, Portuguese marine merchants sailing off the coast of China figured out that opium could be mixed with tobacco and smoked. All of a sudden, uptake into the brain was nearly complete and almost instantaneous—truly a “high.” Used this way, the drug offered intense rewarding effects that quickly took control of the people using it. The Chinese, with their Confucian culture valuing self-control, considered this practice barbaric. But over the next two centuries, the Portuguese, British, and Dutch all used their colonial powers to promote the trade of opium, grown in India, into the Chinese mainland. The consequences were so disastrous that three Chinese emperors attempted to outlaw the use of opium in the eighteenth century. But the trade was too profitable for the British Empire to give up. Instead the Opium Wars secured the rights for the East India Company to continue the trade, resulting in the misery of opium dens reaching epidemic proportions. To this day the cultivation of opium in the Golden Triangle (Burma, Thailand, Laos, and Vietnam) is a gift to our times from the colonialism of the British Empire. About a century later the other major opium plantation of the world, the Golden Crescent that spans much of Afghanistan and northern Pakistan, is a product of another great empire and its conquests, that of the late Soviet Union.
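Because the chapter will lean on this amount-and-rate principle again and again, a toy calculation may help fix it in mind. What follows is purely my own illustration, not data: a standard one-compartment pharmacokinetic model with first-order absorption and elimination, in which the dose is identical and only the hypothetical absorption rate constant differs between a slow route, like eating opium, and a fast one, like smoking it.

```python
import math

# Toy sketch: same dose, different absorption speed; all rate constants are
# hypothetical, chosen only to contrast routes of administration.
def peak(dose, ka, ke):
    """Time and height of the concentration peak under first-order
    absorption (ka) and elimination (ke), per the Bateman equation."""
    t_max = math.log(ka / ke) / (ka - ke)
    c_max = dose * ka / (ka - ke) * (math.exp(-ke * t_max) - math.exp(-ka * t_max))
    return t_max, c_max

ke = 0.5  # elimination rate, per hour (hypothetical)
for route, ka in [("eaten (slow uptake)", 0.3), ("smoked (fast uptake)", 30.0)]:
    t_max, c_max = peak(dose=1.0, ka=ka, ke=ke)
    print(f"{route:20s}: peak {c_max:.2f} of dose, reached after {60 * t_max:.0f} min")
```

With these made-up numbers, the fast route produces a peak more than three times higher than the slow one, arriving within minutes rather than hours—the quantitative shape of why smoking opium changed everything.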
In 1803 or possibly 1804, the German pharmacist Friedrich Sertuerner mixed opium with an acid and then neutralized the mix with ammonia. Through this procedure he became the first person on record to extract in pure form the active ingredient of the opium poppy or, for that matter, of any medicinal plant. Principium somniferum, or morphine,4 had arrived. Physicians were somehow convinced that the substance could now finally be used in a precisely controlled and therefore safe fashion. In the late 1820s E. Merck of Darmstadt, Germany, started large-scale commercial manufacturing of the new miracle medicine. In 1843 it was discovered that the effects of morphine were both more immediate and more potent if the drug was injected, increasing its utility in controlling severe pain. The American Civil War saw a great breakthrough for the hypodermic needle as a technology, and its use allowed effective morphine delivery to the massive numbers of wounded soldiers. The rapid pain-relieving actions must have seemed a blessing. But it was soon clear that what had been viewed as a major medical advance was turning into a disaster. After the war large numbers of demobilized soldiers were unable to wean themselves off morphine. By the end of the nineteenth century, there were a quarter million morphine addicts in the United States.
A search for remedies was on. In 1897 Heinrich Dreser, who headed up the pharmacology laboratories of Bayer Industries in Germany, found that a simple chemical modification turned morphine into a drug called diacetylmorphine. Today this chemical is of course better known by the trade name given to it by Bayer on commercially launching it in a cough syrup—“Heroin.”5 The new miracle drug was promoted for many ills. With tuberculosis rampant and no antibiotics yet available to treat pneumonia, the ability of heroin to suppress cough was heaven-sent. It was also a more potent analgesic than morphine. Its ability to stop diarrhea was useful, too. But the most important application was thought to be as a cure for morphine addiction. This time, however, it did not take long before the addictive potential of the proposed “cure” itself was realized. Bayer’s new wonder drug had an unusually short career. Its provision without a prescription was stopped in the United States in 1913, and all medical use was outlawed in 1919.
Today morphine remains the prototypical medical opiate, while heroin is the prototypical addictive member of the class. There are of course many other opiates and opioids—too many to be discussed here. When teaching these things, my approach is always to first help people understand one or two prototypical drugs in a class, the profile of their actions, and the mechanism through which these actions are produced. This provides a mental structure from which to start. After that, other, related substances can more easily be understood as variations on the theme. For instance, oxycodone is much like morphine but is a bit less potent and is more reliably absorbed when taken by mouth, lasting a bit longer and not causing itching. Fentanyl, on the other hand, once again is similar in its actions to morphine but is much more potent and extremely fast acting. And heroin really is morphine as far as the brain is concerned. The small chemical tweak introduced by Wright and Dreser just speeds up its entry into the brain. The chemical additions are then cut off by the body, and the brain is left with morphine that is now free to do its job. So in a way, the development of heroin was just another step in a progression, from slowly raising opiate concentrations in the brain by eating opium, through a faster uptake achieved by smoking opium or injecting morphine, to the ultimate hit—so far—achieved either by vaporizing and inhaling brown heroin or injecting the fully purified, white product. Either of the latter ways leads to an intense hit of euphoria.
Or, I should say, may lead to an intense hit of euphoria. Right here we can see how drug effects are a true interaction between the molecules and the brains they hit. Not everybody is prone to euphoria from opiates or opioids. In fact, the people who do respond with pleasurable effects to these drugs are probably a minority. For instance, large numbers of people receive morphine for postoperative pain. If asked, only a few of those report pleasurable effects. Brought up with the commanding, objective, and seemingly unequivocal dose-response curves of my pharmacology training, for years I did not know what to think of this variation. Maybe people were just embarrassed to reveal those pleasurable effects of morphine? That, for many reasons, turns out not to be an explanation. Maybe it simply had something to do with the setting in which opiates were given? There was in those days a widely held perception, since then long shattered by the recent epidemic of prescription opioid abuse, that when opioids are given in a medical setting, the addictive potential just isn’t there. It didn’t sound unreasonable at the time—after all, environment does matter. Maybe this was just an illustration of that fact?
Then, after having practiced and taught addiction medicine for some fifteen years, having written a textbook for medical students on the topic, and having begun to think that I knew a thing or two, I had my first surgery myself. My colleague Jan-Erik Juto of the Karolinska Hospital, a quiet, highly respected ENT surgery virtuoso, was simply wonderful. But boy, was I in pain after he had messed around inside my face. As I woke up and was wheeled back to the ward, I kept getting morphine. There was some bleeding, too. Nevertheless, around 8 p.m. I rang the bell. When the charge nurse came, her face expressing concern about my well-being, I asked if I could please sneak down to my office, just two floors down in the hospital, and check on some work. She looked at me in disbelief, shook her head, and gave me a lecture on postoperative complications. I let her finish the speech, waited half an hour until the shifts had their handoff at the nurse’s station, then quietly left, still in my hospital gown. Once I got to my office, I was able to quickly finish up a paper that at the time seemed important. As I came to my senses the next day, I was equally embarrassed, awestruck, and frightened. The simple fact was, I had probably never felt that good in my whole life. I recall telling my wife about the experience after coming home. She tried to listen, but it was clear that she just didn’t understand. After a couple of cesarean sections, morphine to her was just good pain suppression, obtained at the cost of unpleasant nausea. Individual differences in drug responses are, to a large extent, about just that—the individuals involved. And the most permanent ways in which we as individuals differ are of course in our genetics. Personalized medicine is rearing its head, sometimes ugly, sometimes beautiful, almost everywhere we look.
For people who end up taking opiates or opioids for their addictive properties, there is typically a set sequence of events that follows after taking the drug. After injecting or inhaling heroin comes a brief rush, an intense feeling of intoxication that lasts only a few minutes and is accompanied by a visceral feeling commonly described by patients as similar to that of an orgasm. This is followed by a high of quiet euphoria that lasts perhaps half an hour. After that, people feel “straight” for a few hours and can seem quite unaffected. But then withdrawal symptoms kick in, and the person starts feeling sick. Other actions of heroin and morphine affect people in a more consistent fashion than the high. Perhaps most important is a triad of effects. First, there is of course a prominent suppression of pain. Second, and perhaps less widely known, secretion decreases or stops from all kinds of glands in the body, such as tear and sweat glands, and the water-excreting glands that line the gut. Third, the centers in the brain stem that provide the pacemaker signals for breathing are suppressed, ultimately resulting in death if respiration is not maintained by some external force.
As already mentioned, only a minority of people experience a high from opioids. We don’t yet know exactly why that is, but there is a hint that this is determined by our genes. That hint is provided by another opioid effect, the suppression of pain. People differ in how effectively morphine does that, and those individual differences have been clearly tied to variants of the mu-opioid receptor gene. We will examine those differences and some of their consequences when we come back to the potential of the opioid antagonist naltrexone as an addiction treatment.
A unique feature of opiates and opioids is that unusually high and rapid tolerance develops both for the high they give rise to and for their other actions. It is actually still not exactly clear how that happens in the brain, either. My textbook in medical school once had all the answers to this, but it has been downhill ever since. From a clinical perspective, however, the high degree of tolerance that develops means that with continued use, people will need to drastically increase their doses just to maintain a certain effect. When they stop using, tolerance typically wears off within about a week.6 While tolerance wears off, there will in the absence of treatment be a constellation of much feared and unpleasant withdrawal symptoms. These are best thought of as the opposite of the drug effects, so you should know by now what to expect. In a mirror image of the high, mood will be depressed. Analgesia will turn into a flu-like ache throughout the body and goose pimples all over the skin. All glands will start secreting fluid, so that eyes will tear, noses will run, and guts will discharge. Actually, as unpleasant as it is to go “cold turkey” like this, it is with rare exceptions entirely without risk, which is more than can be said about withdrawal from the much less feared drug alcohol.7 But as cycles of intoxication and withdrawal are repeated, there is with opiates a very powerful allostatic shift, of the kind described in a prior chapter. This syndrome is usually present in fully developed form about two years into heavy use. The patient moves to the dark side of addiction. In the absence of drug, mood now remains low long beyond acute withdrawal, and reactions to stress remain high. This creates a powerful incentive for resumption of drug taking, signaled by cravings that are perhaps more powerful than for any other class of addictive substances. Somehow these cravings often persist for decades and can even be experienced in dreams.
Treatment of opioid withdrawal is commonly called detoxification. Although often thought to require highly specialized skills, it is actually very simple, at least in principle. Tapered doses of almost any opioid given over about five to seven days will allow tolerance to wear off without too much pain. A lot of treatment resources and efforts are devoted to detox. It may seem good to reduce the discomfort of a suffering patient, and to hope that this will also be a first step toward a drug-free life. But as we will see in a coming chapter, unless given effective treatments such as methadone or buprenorphine maintenance, about 90 percent of heroin addicts will relapse within a year. And when they relapse, having rid themselves of their tolerance is not necessarily a good thing. Out of old habit, patients will frequently go back to taking heroin at doses they were used to before entering detox. The difference is that now, with tolerance gone, those doses are enough to kill them, by shutting off respiration in an overdose. As an example, in the first couple of weeks after leaving prison, where they are typically forced to detox, heroin addicts have an up to eightfold increased risk of dying from an overdose.8
This brings us to a final point. Heroin dependence is the deadliest of addictions. At any given time, a heroin addict runs a twenty to fifty times higher risk of dying than a normal, healthy person of the same sex and age. The corresponding number for alcohol is about fivefold. People with heroin addiction die when they stop breathing following an overdose, but also through infections they get by sharing needles, such as bacterial growth on the heart valves, HIV, or hepatitis B and C. There is also a lot of “violent death.” All in all, nowhere is the need to provide treatment as much a matter of life and death as in the case of heroin addiction.
“The Divine Plant of the Incas,” Erythroxylon coca, grows as a bush or a small tree on the slopes of the Andes in present-day Peru, Chile, Colombia, and Bolivia. Its use by early humans of South America is not documented in writing the way we find a written record of early opium use. But radiocarbon dating of archeological findings provides evidence of human use of coca leaves, and of primitive extraction of their contents, in this region as early as about eight thousand years ago. By the sixth century A.D., pottery frequently depicted people chewing coca, and supplies of coca leaves were buried together with mummified bodies. At the peak of the Inca Empire, in the fifteenth century A.D., the coca bush was considered to be divine in origin, and growing it was subject to a monopoly restricted to the state and the aristocracy.
After the conquest of the Inca Empire by Francisco Pizarro in 1535, the first references to the coca bush started appearing in print in Spain around 1570. These accounts described how Inca “runners” chewed the coca leaves, mixed with the alkaline ashes of another plant to release the active ingredient, on long, harsh journeys across the mountain ranges. From the leaves, the runners got the endurance needed for their long, steep climbs and the feeling of satiety that allowed them to travel with only a minimum of food. Later the new European masters would find these same effects useful for extracting the most effort possible from their indigenous workers. A Dr. Poeppig gave the following account in his 1835 report from journeys in Chile:
The miner will perform, for twelve long hours, the formidably heavy work of the mine, and, sometimes, even doubles that period, without taking any further sustenance than a handful of parched maize, but every three hours he makes a pause for the purpose of chewing Coca…. The same holds good with the Indian, who, as a porter, messenger, or vender of his own productions, traverses the Andes on foot. Merely chewing Coca from time to time, he travels with a load weighing one hundredweight, on his back, over indescribably rough roads and accomplishes frequently ten leagues in eight hours.9
But in addition to endurance and satiety, chewing coca leaves was known to give a euphoric pleasure. The Incas used them to celebrate holidays because of these uplifting properties. In fact, the leaves were thought to have powers so magical that they were considered a divine gift, as expressed in this prayer:
Oh, mighty lord, son of the Sun and of the Incas, thy fathers, thou who knoweth of the bounties which have been granted thy people, let me recall the blessings of the divine Coca which thy privileged subjects are permitted to enjoy through thy progenitors, the sun, the moon, the earth, and the boundless hills.10
Right here are captured some of the key facts worth noting. First, it is clear that the active ingredient of coca leaves acts to stimulate the person, counteracting feelings of fatigue, eliminating hunger, and producing an elevation of mood. This justifies the name psychostimulant. Second, in a parallel to early use of opium, the content of active ingredient in the leaves is relatively low—in the case of coca, on average perhaps only around 1 percent. Its absorption when the leaves are chewed is also rather slow. Under these conditions the addictive potential appears to be more limited than will be observed later in history. Today’s crack users would hardly be able to hold a job that involves carrying weights of about 50 kilograms over distances of 50 kilometers a day.
Although coca was imported to Europe shortly after the Spanish conquest of the Inca Empire, it did not become popular until the middle of the nineteenth century. At that time the Italian neurologist Paolo Mantegazza experimented with coca after returning home from travels in South America. Based largely on his personal experience, he published a paper in 1859 that would become highly influential. It was entitled “On the Hygienic and Medicinal Properties of Coca and on Nervous Nourishment in General” and gave Mantegazza’s account of his experience with effects of coca leaves on cognition:
I sneered at the poor mortals condemned to live in this valley of tears while I, carried on the wings of two leaves of coca, went flying through the spaces of 77,438 words, each more splendid than the one before…. An hour later, I was sufficiently calm to write these words in a steady hand: God is unjust because he made man incapable of sustaining the effect of coca all lifelong. I would rather have a life span of ten years with coca than one of 10 000 000 000 000 000 000 000 centuries without coca.11
By the time Mantegazza’s paper was published, the method for extracting chemically pure constituents from medicinal herbs that had been developed by Sertuerner had been around for over fifty years and was well known among chemists. With an endorsement like that provided by Mantegazza, the extraction method was quickly applied to coca leaves. The same year Mantegazza’s paper was published, the German chemist Albert Niemann, working out of the University of Göttingen, became the first person to isolate the main active ingredient of coca and named it cocaine. It did not take long before the new miracle drug was mixed into all kinds of tonics, elixirs, and patent medicines. But there was also a simpler, if somewhat less effective, way to extract cocaine. If coca leaves were mixed with alcohol and left for a time, much of the cocaine was eluted into the fluid, and the leaves themselves could be discarded. Among several coca wines produced in this manner, Vin Mariani, a red Bordeaux, became the most successful. It contained about 250 mg of cocaine per liter, was awarded a gold medal by the Vatican, and was happily consumed by the royals of the time. It also inspired a Civil War veteran, John Pemberton of Atlanta, to create his own competing recipe in 1885. His Pemberton’s French Wine Coca was initially a success, but shortly thereafter Atlanta and the surrounding county outlawed alcohol. Faced with the need to adapt to the new market, Pemberton replaced his original recipe with one that was free of alcohol. Instead, it was sweetened and carbonated. It was promoted as a cure for many ills, including the morphine addiction suffered by countless veterans of the Civil War, including Pemberton himself. Its name was Coca-Cola. It would contain cocaine until shortly after the turn of the century. To this day it contains an extract of coca leaves, but now with the cocaine removed.
At this point it is tempting to quip about characteristic differences between Europe and the United States. Around the time Pemberton was making money off the coca leaf extract, a Viennese physician was writing to his future wife, expressing hopes that writing learned papers about cocaine would bring him academic recognition. Instead he ended up building an influential school of psychology while high on the drug. That, however, is a story better told by others.12 From our perspective, what followed is more important. By the first decades of the twentieth century, the addictive potential of cocaine was widely recognized, and its use outlawed almost everywhere outside of South America. Over the following decades there were recurring flare-ups of illicit cocaine use. But during this time the drug was used as the cocaine salt. In this form it has to be insufflated, that is, “snorted” or “sniffed.” The efficiency and rate of cocaine uptake are certainly greater when snorting cocaine powder than is possible to achieve by chewing coca leaves. But they are still quite limited. This kind of use is expensive and for the most part is confined to economic and artistic elites. Under these conditions, most users are able to control their use and ultimately discontinue it.
If, however, the cocaine salt is reacted with ammonium bicarbonate or otherwise made to form the free base, it is possible to vaporize it and inhale it through the lungs, something that cannot be done with the salt. This gets the drug into the bloodstream as fast as an intravenous injection. After that, transport into the brain is almost instantaneous. Cocaine has a somewhat unusual chemistry, which helps it pass with unusually high speed through the protective barrier in which the brain is wrapped. By now the pattern should be familiar. With the almost complete and near instantaneous uptake after the vapor is inhaled, the high reaches unprecedented levels. Along with it, so does the addictive potential. In 1984 a crude formulation of cocaine base hit the streets of American cities on a large scale. Because of impurities, it made a crackling sound as it was vaporized and inhaled. Imitating that sound, the name “crack” was born. With it came the well-known epidemic that ravaged entire inner cities. What had been a recreational drug of elites became the curse of poor black communities. Over the coming years, a generation of “crack babies” was born, many of them with irreparable developmental defects from intrauterine cocaine exposure.
Cocaine targets the very core of the brain’s reward and approach circuitry. As discussed in a prior chapter, mesolimbic dopamine nerve cells can be thought of as a final common pathway whose activation energizes approach behavior. After dopamine is released from these nerve cells and has transmitted its signal, it is for the most part sucked back into the cells that released it. If you think about it, removing dopamine that has been released from the synapse is a necessity. After you’ve sent a signal, you must somehow make sure that the signal also comes to an end. How else would you be able to send the next one? You can only send Morse code if the longs and shorts come to distinct stops. The clearance of dopamine is that stop. It is accomplished by a molecular pump that is present in the nerve endings of dopamine cells. Called the dopamine transporter, this protein sits embedded in the nerve ending. It catches a dopamine molecule on the outside of the cell, makes a flip worthy of a freestyle swimmer, and is all of a sudden able to let go of its molecular cargo on the inside. Then it flips back and is ready for the next one. The existence of the transporter was established long before the era of gene cloning. It was well known, for instance, that if you made a soup of nerve endings in the lab and added to it dopamine with a radioactive label on it, the labeled dopamine would quickly be sucked into the nerve endings. An important role of the transporter in the addictive properties of cocaine had also been suspected since at least the 1980s. But cocaine binds to more than one place in the brain. For a long time it was not completely clear which effect was most important.
The critical role of the dopamine transporter for the psychostimulant properties of cocaine as well as amphetamine was nailed down only in 1996. By the early 1990s Susan Amara, then at Yale University, had cloned the dopamine transporter. That allowed Marc Caron and his trainee Bruno Giros at Duke University to make a genetically modified mouse that lacked the transporter gene.13 I still remember my feeling of awe during the poster presentation at the Society for Neuroscience meeting that year. Posters usually show bar graphs, curves, tables. In the crowded halls of a meeting that already in those days attracted close to twenty thousand people, it usually took an effort to concentrate on the figures. This one had them, too, but I actually don’t remember any of them. What I do remember is that once I got through the crowd, I saw the little television monitor that Giros had brought with him. It was showing a video of a mouse, filmed while moving freely around in a box, without any drug treatment. It was constantly running around in a way otherwise seen only after a high cocaine or amphetamine dose. Giving those mice cocaine or amphetamine didn’t make any difference. I remember thinking, “Missing one gene did all that.” I thought of the many people I had met through the years who had always managed to sound wise by saying, “It can’t be that simple.” Well, I told myself, sometimes maybe it can.
But I get carried away, which easily happens with science that is this exciting. I’m giving you the story backward. Caron and Giros’s mouse confirmed beyond reasonable doubt what pharmacologists had long suspected. Cocaine does to the mouse or the human who takes it the same thing that happens when the dopamine transporter gene is knocked out. The drug binds to the transporter and prevents it from sucking dopamine back out of the synapse after it has done its job of transmitting the signal. The result is that excessive amounts of dopamine build up in the synapse. With each pulse of dopamine release, the signal onto dopamine receptors is magnified. To a lesser extent, cocaine also blocks similar molecular pumps that exist for the transmitters serotonin and noradrenaline. But it is the elevation of dopamine levels that is behind the addictive potential both in animals and in humans.
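A minimal simulation, entirely my own sketch with arbitrary numbers, shows why a blocked pump changes the character of the signal. Dopamine release arrives as pulses of “Morse code”; the transporter clears a fixed fraction of synaptic dopamine at every time step; and “cocaine” is modeled as nothing more than cutting that fraction down.

```python
# Toy model of synaptic dopamine; all parameters are arbitrary illustrations.
def simulate(clearance, steps=200, pulse_every=20):
    dopamine, peak_level = 0.0, 0.0
    for t in range(steps):
        if t % pulse_every == 0:
            dopamine += 1.0               # a pulse of dopamine release
        dopamine *= (1.0 - clearance)     # the transporter pumps dopamine back
        peak_level = max(peak_level, dopamine)
    return peak_level

print(f"intact transporter:  peak synaptic dopamine {simulate(clearance=0.30):.2f}")
print(f"blocked transporter: peak synaptic dopamine {simulate(clearance=0.03):.2f}")
```

With the pump intact, each pulse dies away before the next one arrives; with the pump mostly blocked, dopamine accumulates to roughly three times the normal peak, and the crisp stops between the longs and shorts are lost.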
Maybe because cocaine acts so directly on a core element of reward and approach circuitry, the high it produces is much more uniform than that from heroin or morphine. There are other important differences, too. Some of the clinically important differences concern what happens as people increase the doses they take. For starters, there are all the psychostimulant effects I have described—a high, increased movement, and so on. But as doses are increased further, the movement effects change in flavor. Rather than running around enthusiastic about everything, people will be locked into stereotypic movements that get repeated over and over again. This is even more pronounced with chronic amphetamine use, where it has earned a specific name: “punding.” As doses are increased further still, people become psychotic. In a strictly psychiatric sense, this means that they lose touch with reality and start having delusions, hallucinations, or both. In fact, because of that, dopamine overactivity was long thought to be important for schizophrenia. That turns out to be complicated, however, and the similarities may well be more superficial than we thought. Meanwhile, however, the delusions that come with the use of high psychostimulant doses have a particular flavor. Often they are what a psychiatrist would call persecutory: the patient is convinced that someone is out to get him or her.14 Of course, with this patient group, it is not always easy to know if this is just a delusion. Just because someone is paranoid does not mean no one is out to get him or her. Another psychotic symptom is a characteristic hallucinatory perception of insects crawling under the skin, the “cocaine bugs” that can make a patient rip through the skin of the forearms to get out the insects. The name is not the best one, since these hallucinations are not unique to cocaine; they are equally common with other psychostimulants, such as methamphetamine.
It was once written in medical textbooks that there is no withdrawal syndrome when people stop taking psychostimulants. In those days the concept of withdrawal was to a large extent shaped by what we knew about opiates and alcohol. The focus was on bodily symptoms—the shakes, running fluids, and aches of the withdrawing morphine or heroin addict, or the tremors, seizures, and delirium of the alcoholic. There is none of that after stopping heavy cocaine use. There is no increase in heart rate or blood pressure either. There is, in fact, not much that can be objectively measured, or heard with the stethoscope. But an important lesson we have learned since those early days is that the really important elements of withdrawal are psychological. And with psychostimulants, just like with other drugs, these follow the opponent process rule. Each binge cycle activates opponent processes that promote low mood. This pushes the patient toward the dark side of addiction. After a binge and its repeated highs, there is a crash. During that withdrawal period, the lingering activity of the opponent processes will increasingly result in symptoms that are opposite to those experienced when using the drug. Perhaps most important when it comes to psychostimulants, the high of intoxication will be followed by an increasingly depressed mood. At times people will be suicidal during this phase. Other withdrawal phenomena follow a similar pattern. The suppressed appetite rebounds, and people eat to make up for the deficits they have accumulated. After staying awake for days during the binge, they sleep a lot. But most important, they shift their affective baseline. In the absence of drug, there is misery. A new cycle is waiting to begin.
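The opponent-process logic lends itself to a back-of-the-envelope simulation. What follows is a bare-bones sketch in the spirit of Solomon and Corbit’s classic model, with every constant invented for illustration: a fast “a-process” tracks the drug and carries the high, a slow “b-process” trails it with opposite sign, and net mood is simply their difference.

```python
# Toy opponent-process model; all constants are invented for illustration.
def binge(n_doses, cycle=50, on=5):
    a, b, mood = 0.0, 0.0, []
    for t in range(n_doses * cycle + cycle):        # extra drug-free tail at the end
        drug = 1.0 if t < n_doses * cycle and t % cycle < on else 0.0
        a += 0.8 * (drug - a)                       # fast process: the high itself
        b += 0.02 * (a - b)                         # slow, lingering opponent process
        mood.append(a - b)                          # net affect at this moment
    return mood

mood = binge(n_doses=5)
print(f"first high:              {max(mood[:50]):+.2f}")
print(f"deepest drug-free crash: {min(mood):+.2f}")
```

Because the slow process is still elevated once the drug is gone, mood dips below zero after every dose, and each repetition starts the next cycle from a slightly stronger opponent process. In the allostatic version described in a prior chapter, that opponent process also strengthens and lingers more with every cycle, which is what shifts the affective baseline itself.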
Fast-forward to the present, and the number of cocaine users has dropped to half of what it was at the height of the epidemic. To some extent cocaine use has been replaced by an increased use of other psychostimulants, perhaps most important among them methamphetamine, or “meth.” If cocaine was a drug of the cities, many of the meth kitchens are found in rural regions such as Appalachia and Missouri. Many other differences exist. Although I could write a whole chapter about amphetamines, the information about cocaine has provided a solid foundation and a mental structure into which it is relatively easy to insert these drugs. These drugs, too, affect the dopamine transporter in ways that result in synapses being flooded by excessive amounts of dopamine. In the case of amphetamines, however, the transporter is not only blocked; these drugs also have the ability to make the pump run in reverse, so that it transports dopamine out of the cell. A body of neuroscience has identified additional differences between the molecular and cellular effects of cocaine and the other psychostimulants. Maybe some of these differences will one day turn out to be very important. For now, however, understanding cocaine as the prototype psychostimulant and then learning a few of the key distinguishing characteristics of the other members in this class goes a long way. The effects of amphetamine are a lot like those of cocaine, although amphetamine does not get into the brain quite as fast. It is, on the other hand, more potent and longer lasting. Methamphetamine, in turn, is much like amphetamine except that it gets into the brain faster. Both are easier to make and cheaper than cocaine.
Psychostimulant addiction is an area of tremendous unmet medical needs. Right now not much in the way of treatment has any convincing scientific data to support its efficacy.
After all the years in my field, I still keep getting in trouble with the word “alcohol.” Alcohols really are a whole group of chemicals. The active ingredient in alcoholic drinks is one very specific member of the group—ethyl alcohol, or ethanol, for short. Other alcohols are present in our everyday lives as well, so the distinction is not purely academic. Methanol and butanol, for instance, are often used as fuels or solvents. The sugar alcohol xylitol is used in the food-processing industry and can be found in sugar-free chewing gum. When discussing the drug used and abused by humans, it would therefore perhaps be best to consistently use the proper term, “ethanol.” But then again, for consistency, would we perhaps have to say “ethanolism” instead of alcoholism? I am not sure anyone is ready for that. So I will stick with “alcohol,” though unless I say otherwise, what I refer to by that name really is ethanol.
Humans have known how to make alcohol for quite some time. It is generally held that our species settled down and embarked on the transition from hunter-gatherers to cultivating the soil some ten thousand years ago. After that it still took a while to figure out how to use yeast to ferment sugars obtained from the crops. But by about six thousand years ago, hieroglyphic inscriptions from ancient Egypt were describing the production and consumption of wine. Recipes for making wine or beer are also found on clay tablets from Mesopotamia, dated about a millennium later. In ancient Greece, fermented honey—mead—seems to have been the original alcoholic beverage, but wine-producing skills reached Greece about 2000 B.C. Alcoholic beverages were in those days not necessarily consumed only, or even primarily, for their taste or intoxicating properties. Fermented beverages provided a certain degree of protection against growth of bacteria, such as those that cause cholera, in drinking supplies. Because of this, fluid consumption in areas where freshwater was not readily available could to a large extent consist of beer or wine. Likewise, when ships went to sea, drinking supplies brought onboard were often not in the form of water but rather as beer. The alcohol content of these beverages was typically relatively low, probably not exceeding 4 percent, so the protection against growth of bacteria was not great—alcohol is best at killing bacteria at concentrations around 70 percent. But I guess a little was better than nothing.
Fermented beverages typically cannot reach an alcohol content beyond 15 percent or so. At higher concentrations the yeast that make the alcohol from sugar poison themselves and die, or at least become unable to keep up production. As already discussed, the rate with which the concentration of a drug rises in the brain is important for the intensity of the high, and therefore for the addictive potential of the drug. It requires quite a bit of determination to quickly work up high blood alcohol concentrations using beverages with low alcohol content. Producing beverages with higher alcohol concentrations became possible only once people learned to make distilled spirits. There are historical accounts of small-scale distillation from ancient Babylonia and Egypt, but successful distillation of quantities useful for human consumption was only achieved by Arab chemists around A.D. 800–900. By the thirteenth century this expertise had spread to Europe. By the fourteenth century the use of distilled spirits as medicinal elixirs was widespread, for instance, to provide protection or cure from the Black Death. Somewhat ironically, today the number one vodka brand in Iceland is Black Death. I guess the name may still be appropriate given the mortality caused by alcohol.
Alcohol is unique among addictive drugs in the way it produces its brain actions. Understanding that basic difference helps to explain why actions of alcohol are more complex and vary between people more than effects of most other addictive drugs. In contrast to heroin or cocaine, alcohol does not have a unique molecular target in the nervous system. It does not bind to, and activate or block, any specific neurotransmitter receptor or transporter. It was once thought that the actions of alcohol on the brain were caused through effects on the sheets of fat that make up the outer membranes of all nerve cells. As an organic solvent, alcohol was in those days thought to become dissolved in those membranes and change their properties. Because neurotransmitter receptors and other signaling molecules are embedded in the cell membranes, changes in membrane properties were somehow thought to change the communication between nerve cells. That does not seem to be the way it works, however. In fact, it is still not entirely clear what the very first steps are in the actions of alcohol. One possibility that is supported by good data is that alcohol enters pockets found in a large family of very important proteins. A prototype for these, called LUSH, is found in the fruit fly, which uses it to recognize certain odors.
Two of the most fundamental transmitter systems of the brain rely for much of their function on receptor proteins that are distant relatives of LUSH and have similar pockets. These systems use for their signals the molecules glutamate and gamma-aminobutyric acid, or GABA for short. Each of them has multiple receptors. Within each receptor group, some members send their signal into the receiving cell through a type of mechanism that is different from that described for dopamine or endorphins. The receptors that trigger the most immediate consequences of a glutamate or GABA signal work as molecular channels that are controlled by their respective transmitter. They run from the outside to the inside of the fat sheets that surround the nerve cell. When no transmitter signal has been received, the channels are closed. When the signal arrives, they briefly open. When open, however, these channels are picky about what they let through. In the case of glutamate receptors, they will let through sodium or calcium. In the case of GABA receptors, passage will instead be provided to chloride. In the first case, the result is that the nerve cell will become activated. Chances are it will start firing away, sending signals to other nerve cells. In neuroscience this is referred to as becoming excited, and glutamate is simply the number one “excitatory” transmitter of the brain. When a GABA signal is received and chloride ions are allowed passage into the cell, the result is just the opposite. The activity of the nerve cell that received the signal will be dampened, and the cell will also be less likely to respond to a signal from other nerve cells. GABA is thus the most important inhibitory transmitter of the brain.
When alcohol is taken, the “first hit” is a set of actions on glutamate and GABA systems. In the short term, alcohol intake dials down glutamatergic communication. With alcohol onboard, less glutamate comes out from nerve cells that normally send out this signal, and the response by the neurons that receive it is also smaller for each signal that is received. Since the job of glutamate signals is to activate nerve cells, this set of actions is one way by which alcohol broadly dampens brain activity. But alcohol also strengthens GABA signals, by allowing more chloride to flow into cells that receive them,15 and probably also by increasing the amount of GABA that is being sent out. Knowing that GABA is the major dampening nerve transmitter of the nervous system, we should expect these actions to contribute to the broad dampening of brain activity produced by alcohol. Together these are indeed the mechanisms through which alcohol is a general brain depressant. The end result is that if alcohol is taken in limited quantities, it will lead to dampening of anxiety, which is all about being excited in the wrong brain areas. At higher levels the depressant actions will also lead to impairment of movement, balance, and thinking. Through the same broadly dampening mechanisms, consumption of higher amounts still will induce sleep, unconsciousness, and ultimately death. As discussed, people vary in their sensitivity to the nervous system dampening actions of alcohol, and this variation is to a large extent genetic. More on that in a bit.
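To make the arithmetic of this first hit concrete, here is a deliberately crude sketch of my own, with invented scaling factors: alcohol is modeled as nothing more than turning glutamate signaling down and GABA signaling up, and the receiving neuron simply feels the difference between the two.

```python
# Toy excitation/inhibition balance; the scaling factors are invented.
def net_drive(glutamate, gaba, alcohol):
    excitation = glutamate * (1.0 - 0.3 * alcohol)   # weaker glutamate signaling
    inhibition = gaba * (1.0 + 0.5 * alcohol)        # stronger GABA signaling
    return excitation - inhibition                   # net push on the neuron

for alcohol in (0.0, 0.5, 1.0):                      # none, moderate, heavy
    print(f"alcohol level {alcohol:.1f}: net drive {net_drive(1.0, 0.6, alcohol):+.2f}")
```

Because both knobs move the balance in the same direction, the net drive falls steadily as the dose rises—the general brain depressant action in miniature, from dampened anxiety through impairment to sleep.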
Because glutamate and GABA systems are so fundamental for the millisecond-to-millisecond communication between nerve cells throughout the brain, the first hit of alcohol on these primary targets results in an unusually broad range of secondary effects. Among those are actions on systems we are already familiar with, because we have encountered them as key players in reward from addictive drugs. Perhaps most important among these are the dopamine and endorphin systems. Alcohol intake leads to release of both endorphins and dopamine. These secondary hits, although a bit more specific, still happen in several places in the brain. For instance, endorphins are released by alcohol in the ventral tegmental area, the origin of the classic dopamine cells thought to make up a final common reward pathway. In that case the endorphin release triggered by alcohol is what activates the dopamine cells. But when alcohol is ingested, endorphins are also sent out at the brain site where this dopamine pathway ends, the nucleus accumbens, where they can activate that structure directly. There is also endorphin release in parts of the frontal lobes. These mechanisms, and then some, act together to let alcohol activate classical brain reward circuitry. The end results may be a bit counterintuitive. Despite being a brain depressant, alcohol can produce reward, positive reinforcement, and also a general stimulation of movement, or “psychomotor stimulation.” But as already mentioned, it will ultimately also depress a wide range of brain processes.
These multifaceted effects of alcohol follow a two-phase progression. At the lower end of the intake spectrum, the rewarding and motor-stimulating effects tend to dominate, together with a dampening of anxiety that leads to behavioral disinhibition and a further accentuated, stimulant-like net result. The dampening effects begin to take over as higher amounts are consumed, leading to impairment, loss of balance, sleepiness, and ultimately unconsciousness. But because so many different actions combine to produce the net effect of alcohol on the brain, there is great potential for different people to be affected differently. Say, for instance, that for some reason, the cascade alcohol → endorphin → dopamine → reward and movement happens to be particularly responsive in your case. At the same time, perhaps your sensitivity to the dampening actions is low. In this scenario the rewarding and stimulant-like actions of alcohol will dominate up to very high intake levels. For someone else maybe a mild reduction of tension and anxiety will directly be followed by becoming tired and sleepy. Genetics, sex, age, tolerance, and other “host factors” will in each case determine the relative contributions of the different alcohol effects to the net outcome. Perhaps most important, they will determine the balance between rewarding and motor-stimulating actions, on one hand, and dampening and sleep-inducing effects, on the other. As we will see when we come to medications for alcoholism, these distinctions are becoming increasingly important to understand.
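One way to see how such host factors could tilt the outcome is another toy model, again entirely illustrative: a reward and stimulation component that saturates as doses climb competes with a sedation component that grows steeply with dose, and two hypothetical drinkers differ only in the gain of each component.

```python
# Toy two-component model of alcohol's net effect; all gains are hypothetical.
def net_effect(dose, reward_gain, sedation_gain):
    stimulation = reward_gain * dose / (1.0 + dose)  # saturates at higher doses
    sedation = sedation_gain * dose ** 2             # grows steeply with dose
    return stimulation - sedation

drinkers = {"reward-dominant host": (2.0, 0.1), "sedation-prone host": (0.8, 0.4)}
for name, (reward_gain, sedation_gain) in drinkers.items():
    doses = [d / 2 for d in range(13)]               # doses from 0.0 to 6.0
    flip = next(d for d in doses if net_effect(d, reward_gain, sedation_gain) < 0)
    print(f"{name}: net effect turns sedative around dose {flip}")
```

With these made-up gains, the reward-dominant drinker stays on the stimulant side of the curve up to roughly three times the dose at which the sedation-prone drinker has already turned sleepy—the same drug, two very different experiences.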
And then, after exposing the brain for a long time to cycles of intoxication and withdrawal, the now familiar allostatic shift will occur. Alcohol may not be as intensely rewarding as other addictive drugs initially, but the allostatic process is probably more pronounced with this drug than with most if not all other addictive substances. In the presence of alcohol, opponent processes are progressively recruited that counteract acute alcohol actions. When alcohol is withdrawn, these opponent processes become predominant. For instance, processes opposing the dampening and sleep-inducing actions of alcohol, when uncovered, will result in the sometimes dramatic increase in nervous system excitability seen during alcohol withdrawal. This makes people jittery or anxious and can in more severe cases lead to withdrawal seizures and delirium tremens. Dialing up of processes that oppose the rewarding and motor-stimulating actions of alcohol will lead to a progressively dampened reward-system function: less dopamine, higher intracranial self-stimulation thresholds, all signs of what can be called a reward deficit syndrome. Meanwhile, turned-up activity of corticotropin-releasing factor will further increase negative emotions and stress reactions. In the vicious circle characteristic of the dark side of addiction, this will set the scene for increased negative reinforcement through renewed alcohol intake.