Even centuries ago people wrote about the enslaving properties of both opium and alcohol, so in a sense the concept of addiction has existed for centuries. Rather surprisingly, however, the use of these drugs did not seem to carry any moral condemnation, nor did an awareness of their non-medical effects have any real social or political significance until the 1800s, although the public generally recognized the potential for withdrawal symptoms and the risk of toxicity when these substances were taken in large doses. For example, before 1868, when the Pharmacy Act was passed in Great Britain, opium was readily available over the counter in any kind of shop (Berridge, 1997). However, these liberal ideas changed quite markedly during the Victorian era. By the end of the nineteenth century, both in Great Britain and the United States, heavy use of opium and alcohol was labelled inebriation, and this condition began to carry with it the stigma of moral corruption – even mental illness. We can probably say, therefore, that addiction, as a tangible construct, really emerged with the nineteenth-century notion of inebriety, since formal government legislation and drug policy followed soon afterwards. Nevertheless, for most of the twentieth century the issue of addiction and dependence was still largely restricted to alcohol and the opiates. Although certain other recreational substances became illegal during this period – for example, cocaine was banned in the United States in 1914 – this happened mainly because of evidence that cocaine could produce serious medical complications (Dackis & Gold, 1985). Marijuana was similarly criminalized in the United States in the 1930s. Again, this was not because of an association with addiction, but rather because of research linking it to violence and other socially deviant behaviours – a reaction that many believe was spawned by public resentment of the influx of Mexican immigrants during the Depression, and of their recreational use of the marijuana leaf.
It was not until the post-World War II rise of psychiatry, and the ‘rediscovery of addiction’ (Berridge, 1997), that substances other than alcohol and the opiates were more formally added to the list of addictive drugs. Nicotine, surprisingly, was remarkably slow to join and be fully integrated within the addiction model. In fact, public concern about smoking did not develop because of any real association with dependence, but rather because accumulating epidemiological evidence linked it to lung cancer. Indeed, some have argued that it was not until the 1980s, with the advent of nicotine replacement therapy, that difficulty in quitting smoking came to be framed, unequivocally, as nicotine addiction.
The field of addiction has been dominated for many years by the view that psychomotor stimulant drugs (see below) are at the heart of all addictive behaviours – indeed, so much so that the phrase ‘drug dependence’ has become almost synonymous with the term addiction.
Psychomotor stimulant drugs
Psychomotor stimulant drugs are different from the general central nervous system (CNS) stimulants, such as caffeine, in that their action is more localized and evokes a particular set of psychological responses. In humans, and at moderate doses, these drugs tend to increase physical arousal and to improve performance on tasks demanding quick reaction times, vigilance, and stamina. In animals, they increase locomotion at moderate doses, but at high doses animals display ‘stereotypic’ behaviour, an exaggerated form of behaviours that are part of their natural repertoire. For example, laboratory rats will engage in excessive grooming behaviours or exaggerated exploratory rituals.
It is now commonly believed that it is the psychomotor stimulant property of a substance (or activity) that determines whether it is potentially addictive. The most commonly used psychomotor stimulants are crack/cocaine and amphetamines. However, even drugs which are mostly known for their CNS depressant effects, such as alcohol and the opiates (morphine and heroin), have psychomotor stimulant properties, and evidence suggests this is principally what makes them so addictive. At high doses, the sedating effects of the CNS depressants are the most pronounced. Interestingly, however, tolerance develops more quickly to their sedative effects than to their psychomotor stimulant effects. Alcohol is a good example. People who drink alcohol regularly often experience an enlivening effect from the first few drinks, rather than any feelings of sleepiness.
However, in recent years there has been an obvious paradigm shift, and many now believe that addiction extends beyond the use of conventional substances – those that can be injected, ingested or inhaled, such as heroin, amphetamine and marijuana – and includes certain compulsive activities such as gambling, internet use, and shopping (Griffiths, 2001; Orford, 2001). Others have also argued that even natural rewards like eating and sex can be just as addictive as pharmacologic agents (Holden, 2001). Therefore, in our discussion of addiction throughout the remainder of this chapter, we will adopt the perspective that addiction is not just about drug-taking, but rather that it comprises any behaviour that has the potential to become excessive and that satisfies the set of commonly agreed criteria for addiction. Nevertheless, by adopting this broader viewpoint, we risk another difficulty – that of knowing where to draw the line between those behaviours that often cause addiction and those that seldom do. For example, some have argued that it is possible to become addicted to work and to television – even to various forms of exercise, such as rock climbing (McIlwraith, 1998; Burke, 2000)! Although that may be true at one level, it is also clear that these activities do not share the same qualities as the core addictions, such as drug-taking, alcohol consumption, smoking, and gambling, either in their reinforcing properties or in the strength of their addictive potential (see Orford, 2001).
Although the experts have not reached complete consensus on the defining characteristics of addiction – nor on how they should be ordered by importance – most at least agree that certain factors should be included. Foremost is the progressively compulsive nature of the behaviour, pursued even in the face of adverse consequences to health, safety, and social relations (Berridge & Robinson, 1995). In other words, to the serious detriment of conventional daily pursuits, such as looking after the needs of one’s family, functioning well at one’s job, and attending to normal social obligations, the addict spends more and more time carrying out the behaviour or seeking opportunities to do so. Nor is the addict deterred by the knowledge that the behaviour can cause serious medical problems, as in the case of smoking.
Tolerance is also a defining characteristic of addiction and describes the experiential changes that occur as a result of chronic overindulgence. With continual and frequent exposure, the individual typically requires more of the behaviour to produce the same pleasurable or reinforcing effects. For instance, cocaine addicts repeatedly explain that they are never able to achieve the extremely euphoric feeling that came with the first few ‘hits’ of the drug. Tolerance effects have also been demonstrated in drug-administration experiments with animals. Initially, and at moderate doses, addictive drugs cause the animal to increase its locomotion. However, after repeated exposure to the drug, the animal no longer responds with the same degree of heightened activity.
In the early theories of addiction, and for many years to follow, withdrawal was seen as its core and quintessential feature; indeed, fundamentally necessary for its diagnosis. At that time, withdrawal had a very precise meaning and was inextricably linked to physical dependence and the symptoms of illness. Only if cessation of the behaviour resulted in severe bodily discomfort was the substance thought to be ‘addictive’ and the person experiencing these symptoms, ‘dependent’. However, among all the drugs that are now commonly abused, severe physical withdrawal symptoms only occur with the chronic and excessive use of alcohol and opium (and its derivatives, heroin and morphine). This fact probably accounts, at least in part, for the early belief that these were the only truly addictive substances.
Viewpoints about withdrawal changed quite dramatically in the 1980s when Western countries experienced the widespread and illicit use of cocaine and ‘crack’ – the latter being an affordable and arguably more addictive form of the powder. Currently, a vial of crack sufficient to produce an intense euphoria costs about $5 in most American cities, making it easily available to all members of our society, including adolescents. We now know that crack and cocaine are amongst the most potent and reinforcing of the euphoriant drugs, and yet abstinence, even after heavy and long-term use, is not associated with any pronounced physical withdrawal symptoms. This fact has forced researchers and clinicians to rethink the central and prominent role that physical withdrawal has played historically in definitions of addiction. And when nicotine eventually joined the ranks of addictive substances, the evidence against the centrality of withdrawal became overwhelming, since two of the most addictive substances in common use are virtually devoid of withdrawal symptoms in the conventional physical sense. At this point, the World Health Organization declared that withdrawal was ‘neither necessary nor sufficient’ for a diagnosis of addiction.
However, in recent years, views on withdrawal have changed again, moving beyond a simple focus on physical effects to include unpleasant feelings and emotional states. In other words, there has been a shift away from physical dependence towards an emphasis on motivational dependence, and on the importance of withdrawal-induced dysphoria as a powerful incentive for the continuation of addictive behaviours (Di Chiara, 1999). It is now abundantly clear that a pleasureless state and blunted affect – a condition similar to the anhedonic state that defines melancholy (see Chapter 5) – occurs when any addictive behaviour is discontinued, or its ‘dose’ is suddenly reduced. Anhedonic effects have also been seen in animals’ behaviour after chronic drug exposure. Numerous studies have shown that if animals are allowed to self-administer large and intoxicating doses of any addictive drug over long periods of time, their threshold for natural reinforcers is elevated. For example, compared to baseline levels, the animal will ‘work’ (for example, press a lever or run through a maze) for food, water, or access to a mate at a significantly reduced rate after drug administration. Clearly, the drugs have perturbed the animal’s system in such a way that it is no longer as motivated to engage in previously rewarding activities. Later in this chapter we will explain more about the neurophysiological changes that cause this effect.
It seems, however, that the experience of withdrawal, either in its physical or its psychological form, is probably not the greatest impediment to successful treatment for addiction, no matter how intense the symptoms may be. A more potent influence is the very strong urge that the addict experiences to continue or resume the behaviour. This ‘craving’ for the behaviour is often experienced as overpowering, and can persist even after a long period of abstinence. Indeed, according to the reports of addicts themselves, these cravings account, in large part, for their failure to stay ‘clean’ after treatment, and they explain why some have suggested that addictions should be called ‘chronic relapsing disorders’ (Leshner, 1997) – despite the obvious pessimism that such a term conveys. What also makes relapse so difficult to overcome is that the cravings and feelings of ‘loss of control’ may be elicited by so many diverse factors. For example, external cues or triggers, such as a stressful situation, as well as internal factors, such as negative mood states, can give rise to cravings (el-Guebaly & Hodgins, 1998).
However, the most difficult to manage – because of their ubiquity and their relative uncontrollability – are the classically conditioned environmental cues. Objects or events which were repeatedly paired with the behaviour during the time the addiction was developing begin to function with the same strength, and in the same consistent way, as Pavlov’s bell did for his salivating dogs (see Chapter 5). Their power is that they signal the approach of the rewarding event – food for Pavlov’s dogs; drugs for the addict. For example, the environmental context in which the behaviour typically occurred – the neighbourhood bar for the alcoholic, the street where the drugs were obtained for the heroin addict, the tea break for the smoker – becomes the conditioned stimulus, and therefore has the ability to evoke the same intense cravings as those fostered by the addictive behaviour itself. In a series of elegant experiments, Childress and colleagues (1999) demonstrated the strong physiological effects of cue-induced craving. Using positron emission tomography (PET) to measure regional blood flow, these researchers found that certain areas of the brain were activated to a very high level among cocaine abusers simply by showing them videotaped scenes of cocaine paraphernalia. The specific brain regions that are activated will be discussed in the next section of this chapter.
These and other similar studies have demonstrated the strong motivational properties that conditioned cues can have in pulling the addict back towards destructive engagement in the addictive behaviour. Cue-induced craving may also explain why relapse is less likely to occur during periods of detoxification, when addicts are usually in an institutional setting, than after they are rehabilitated and have returned to their familiar home environment. Even though the withdrawal symptoms may be most severe during the detox period, the addicts are effectively removed from most of the conditioned environmental cues which tend to elicit potent cravings.
A glimpse at the current ‘hot’ topics in addiction research shows a clear bias favouring the view that addiction is a ‘brain disease’ (Leshner & Koob, 1999). There are probably many reasons for this emphasis, but one of the most important is the recent development of sophisticated brain-imaging techniques. Researchers now have an unprecedented opportunity to examine human brain-functioning in synchrony with subjective experiential reporting – a methodology that provides a clearer window than ever before into the brain–behaviour–motivation synthesis underlying addictions. Notwithstanding the apparent enthusiasm for a ‘disease model’ of addiction, most experts readily acknowledge that a multitude of other events influence the uptake, and abuse, of addictive substances and activities. These include factors such as availability, cost, peer pressure, and social tolerance. Therefore, any balanced and comprehensive view of addiction must take account of a combination of environmental factors, biological vulnerabilities, and a psychologically susceptible disposition. Leshner (1999) articulated this viewpoint by calling addiction ‘a quintessentially biobehavioural disorder’. It arises not only from the effects of prolonged exposure on brain structure and function, but also involves critical behavioural and social-context components.
But we also want to emphasize that addiction is not a condition that is passively conferred upon us; it does not suddenly overcome us like a major depressive episode or a panic attack might – without us taking any conscious or overt part in its development. Nor is it simply due to the concatenation of various risk factors that reflect our past history or our biological make-up. What makes addiction different from other psychological disturbances (except the eating disorders, which we discuss in the next chapter) is that the individual is an active participant in the process. No one will ever become an alcoholic unless they take the first drink, nor a compulsive gambler if they never place a bet. And addiction is a process that takes time to develop. With some substances, such as crack/cocaine, this can happen fairly rapidly. For others, such as cigarettes and alcohol, it generally happens over a period of years. In the following sections, we will briefly describe the brain mechanisms underlying addiction, including the acute and long-term effects of addictive behaviours on brain structure and function.
Some have used the term ‘hijacked’ to describe the effect of addictive substances on normal brain functioning. Although this may sound like a rather dramatic turn of phrase, its appropriateness stems from the fact that cocaine and other drugs – indeed, all potentially addictive behaviours – activate and can subvert areas of our brain which evolved to regulate and sustain the most basic aspects of our existence. In other words, the same brain circuitry which subserves feeding, sex, and other essential survival behaviours, also underlies the development and maintenance of substance abuse. And, because these non-natural behaviours ‘trick’ the brain into thinking that a survival need has been met, it is not surprising that addicts typically have a diminished sexual libido and appetite for food.
The mesocorticolimbic dopamine pathway is involved in the pleasure and reinforcement associated with natural reward states, such as eating, drinking, mating, and maternal behaviour, as well as with less basic needs, such as social interaction and novelty. As an illustration, activation of the dopamine pathway was clearly seen in response to a palatable food in a recent brain-imaging study. When subjects ate a piece of chocolate and rated the experience as ‘pleasurable’, there was increased regional blood flow in the striatum (Small et al., 2001). Moreover, when subjects felt highly motivated to eat more chocolate, the brain regions that became active (viz. the caudomedial orbitofrontal cortex) were the same as those implicated in the experience of drug cravings. As an aside, it is interesting to note that chocolate has been identified as the single most craved food in studies of food preferences.
In another interesting study, investigators found that when men were exposed to pictures of female faces and asked to give preference ratings, only the ‘attractive’ female faces activated the brain’s reward circuitry, not the faces which had been rated as ‘neutral’. The men also took longer to rate the attractive faces, which suggests that they looked at them longer because doing so was a more pleasurable experience. By contrast, when attractive male faces were shown to these men, they evoked what could be considered an aversive reaction (Aharon et al., 2001). How does one explain that? Evolutionary psychologists would probably say that the male aversion to other attractive males occurs because the latter represent a threat in the competition for accessible females.
Neuroimaging techniques have contributed greatly to our understanding of the biology of addiction. However, this is a relatively recent technology, and before we had regular access to these procedures, we relied mostly on what we could glean from animal research, making inferences about the rewarding properties of drugs from the way the animals behaved. Researchers have demonstrated repeatedly that cocaine, heroin, and a host of other addictive drugs are readily self-administered by several species of experimental animals (for example, laboratory rats or monkeys). The relationship between the human abuse potential of a particular drug and the degree to which animals will self-administer it is so strong and positive that this experimental paradigm continues to serve as an excellent tool for exploring the neurobiology of drug reinforcement (Withers et al., 1995). One disadvantage of this experimental approach, however, is that the animals cannot tell us anything about how they ‘feel’ when they take drugs!
Another sort of animal model – the ‘knock-out’ mouse (see Chapter 5) – has also proved useful in studying the role of dopamine in the reinforcing and addictive properties of drugs of abuse. For example, one strain of mice has been genetically modified to lack the dopamine transporter (DAT), a membrane-bound protein found on the terminals of mesolimbic neurons, whose role is to remove extracellular dopamine and carry it back into the presynaptic neuron for reuse. These knock-out mice (DAT-KO) show a 300-fold increase in the amount of time that dopamine spends in the extracellular space of their brains compared to wild-type mice and, not surprisingly, they demonstrate the marked hyperactivity we would expect in the case of such hyperdopaminergic tone (Gainetdinov et al., 1999). Also, as we would expect, a dose of cocaine given to DAT-KO mice produced no further increase in their locomotion. It is interesting, however, that cocaine still maintains its rewarding properties in these mice – a fact that seems rather counterintuitive since we know conclusively that cocaine produces its euphoric effects by binding to the DAT and rapidly increasing synaptic mesolimbic dopamine availability. Part of the explanation relates to another neurotransmitter, glutamate, which excites dopamine neurons and is also involved in memory. Support for this notion comes from another strain of knock-out mice that lack a particular glutamate receptor, and that do not become dependent on cocaine, no matter how much they take. And the action of this glutamate receptor seems to be very specific to drugs like cocaine, since the glutamate knock-out mice are just as motivated to approach natural rewards, such as food and water, as any other mice.
Although several anatomical structures and neuronal projections comprise the mesocorticolimbic dopamine system, and are implicated in the biology of natural and pharmacologic reward, the function of four of these regions is most clearly understood. The ventral tegmentum is an area in the midbrain, rich in dopamine neurons, which sends projections through the medial forebrain bundle to a set of limbic brain regions, including the nucleus accumbens and amygdala, and to the prefrontal cortex. Together these, and related structures, are known as the ‘common reward pathway’ because their activation or stimulation is experienced as pleasurable and reinforcing. In fact, if animals are given unimpeded access to self-stimulation of this circuit (via an electrode implanted in the brain which sends an electric current whenever the animal presses a lever), they will lever-press excessively and to the point of death from self-starvation.
Whereas several areas of the brain are associated with the subjective feelings of pleasure – or the euphoric ‘rush’ one gets from drugs like cocaine – increased dopamine transmission in the nucleus accumbens seems to play the most central role in mediating reinforcement. Indeed, the nucleus accumbens has been called the ‘Universal Addiction Site’ (Leshner & Koob, 1999) because most, if not all, drugs (or activities) of abuse stimulate extracellular dopamine in this area, albeit through interactions with different proteins and receptors (Gamberino & Gold, 1999). Some have also described the nucleus accumbens as a limbic–motor interface because increased dopamine release in this area seems to have a pivotal role in providing certain stimuli with the incentive qualities needed to increase appetitive behaviour. The neurobiological mechanisms by which drugs increase extracellular dopamine in the reward pathway are considerably varied. Some, like amphetamine, stimulate the synthesis and release of dopamine from the presynaptic terminal, others inhibit intra- and extracellular metabolism of dopamine, and drugs like cocaine block the synaptic clearance of dopamine via the dopamine transporter (reuptake pump).
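To make the contrast between these mechanisms concrete, here is a minimal, purely illustrative sketch (not taken from the sources cited above) of a one-compartment model of extracellular dopamine, in which release adds dopamine at a constant rate and reuptake clears it in proportion to its concentration. In this toy model an amphetamine-like drug is represented as an increase in the release rate and a cocaine-like drug as a reduction in the reuptake rate constant; all parameter values are arbitrary assumptions chosen only to show the qualitative effect.

```python
# Toy, illustrative model only: parameters are arbitrary assumptions,
# not physiological values.

def simulate_dopamine(release_rate, reuptake_rate, steps=200, dt=0.1, da0=1.0):
    """Euler integration of d[DA]/dt = release - reuptake * [DA]."""
    da = da0
    trace = []
    for _ in range(steps):
        da += dt * (release_rate - reuptake_rate * da)
        trace.append(da)
    return trace

baseline = simulate_dopamine(release_rate=1.0, reuptake_rate=1.0)

# 'Amphetamine-like' mechanism: increased release of dopamine.
more_release = simulate_dopamine(release_rate=3.0, reuptake_rate=1.0)

# 'Cocaine-like' mechanism: blocked reuptake (less efficient clearance).
blocked_reuptake = simulate_dopamine(release_rate=1.0, reuptake_rate=0.33)

print("steady-state extracellular dopamine (arbitrary units)")
print("baseline:         ", round(baseline[-1], 2))
print("increased release:", round(more_release[-1], 2))
print("blocked reuptake: ", round(blocked_reuptake[-1], 2))
```

In both manipulations the steady-state concentration of extracellular dopamine rises, which is the shared endpoint described in the paragraph above, even though the parameter being altered differs from drug to drug.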
Interestingly, burst firing activity of the dopamine reward neurons has been observed with great consistency not only during the consummatory phase of rewarding activities but also well before the consumption begins. This strongly suggests that the common reward pathway is also involved in associative learning; that is, in establishing the conditioned reinforcement of environmental cues that signal the approach or the onset of the natural reward state. From an evolutionary perspective, this is clearly a very adaptive function since the organism will fare much better if it is able to discriminate between stimuli that predict when a rewarding event is likely to happen, and those that do not. A relatively large body of research supports the role of the amygdala in this process because it maintains a representation of the affective or emotional value of the conditioned stimulus (Jentsch & Taylor, 1999). One type of supporting evidence comes from studies of second-order schedules of reinforcement where it has been observed that experimental animals continue to respond to the presentation of a stimulus (the CS) that has been paired with a primary reward. We also know that the strength of conditioned reinforcement is greatly enhanced when the CS is paired with pharmacologic reward, such as cocaine, instead of a natural reward, such as food. However, when the amygdala is lesioned, animals show a clear and progressive impairment of responding under these second-order schedules of reinforcement (see Parkinson et al., 2001). As we saw in Chapter 5, the amygdala operates as strongly for aversive and fear-provoking stimuli as for rewarding stimuli.
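The idea that a cue paired with reward acquires predictive, and hence motivational, value can be illustrated with a standard associative-learning rule such as the Rescorla–Wagner model. This is a generic textbook model, not an analysis used in the studies cited here, and the learning-rate and reward values below are arbitrary assumptions.

```python
# Rescorla-Wagner style update: the cue's associative strength V moves
# a fraction (alpha) of the way towards the reward actually received.

def condition_cue(reward_magnitude, trials=30, alpha=0.2):
    v = 0.0                      # associative strength of the cue (CS)
    for _ in range(trials):
        prediction_error = reward_magnitude - v
        v += alpha * prediction_error
    return v

food_cue = condition_cue(reward_magnitude=1.0)   # natural reward
drug_cue = condition_cue(reward_magnitude=5.0)   # assumed larger 'reward' signal

print(f"cue paired with food reward: V = {food_cue:.2f}")
print(f"cue paired with drug reward: V = {drug_cue:.2f}")
```

In this sketch the cue paired with the larger reward signal ends up with the greater associative strength, mirroring the observation that conditioned reinforcement is stronger when the CS is paired with a pharmacologic rather than a natural reward.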
A final aspect of the functional neurobiology of addiction involves the prefrontal cortex. This area is thought to serve an ‘executive’ function in the brain by acting as a gating mechanism that suppresses or moderates limbic impulses. One method that scientists have used to study this area of the brain is to lesion it or block its function pharmacologically with antagonist drugs. Another is to examine the behaviour of patients who have suffered frontal lobe damage. In studies where the function of the prefrontal cortex has been disrupted in one way or another, we see an inability to suppress inappropriate responses – in other words, a diminution in the ability to self-regulate one’s behaviour and a frank loss of inhibitory control. In these cases, the individual’s behaviour seems to be largely guided by previously conditioned responses which are not suited to the current situation (Jentsch & Taylor, 1999).
In recent years, the role of the prefrontal cortex has gained increasing prominence in our understanding of the addiction process, especially through our growing awareness of its function in decision-making and in controlling behaviour that entails the risk of punishment (Bechara et al., 2001). Currently, the evidence indicates that specific regions of the prefrontal cortex are responsible for regulating behaviour – specifically, for inhibiting the drive to respond to immediate reinforcement if the long-term consequences are likely to be negative. Some have suggested that impairments in decision-making stand at the core of addictive behaviours (for example, Grant et al., 2000). In other words, addicts tend to choose immediate rewards even if they result in long-term negative consequences. However, what is not entirely clear is whether this impairment is a consequence of drug-taking and overactivation of the reward circuitry, or whether there is a premorbid tendency or handicap in the adaptive functioning of the frontal cortex that increases the risk for addiction. We will return to this issue in a later section of this chapter.
When the brain is activated excessively, and chronically moved beyond its natural or homeostatic state, neurochemical changes or alterations begin to occur. What is insidious about the overuse of addictive activities or substances, however, is that they change the brain in ways that contribute to further seeking and further use – a process that, over time, creates a vicious downwardly spiralling cycle of behaviours that are difficult to resist and highly prone to relapse if abstinence is attempted. Although the neuroadaptive responses that occur are complex, and vary from one substance (or activity) to another, there are some general adaptations that are common to all addictive behaviours. Paradoxically, the two most pronounced changes operate in virtually opposite directions – desensitization, on the one hand, and sensitization on the other.
Earlier in this chapter we discussed the phenomenon of tolerance and the fact that after prolonged and intense exposure to an addictive behaviour, more of it is required to produce the same subjective effect; that is, the same feelings of pleasure or reward. Desensitization is the neural mechanism underlying this aspect of addiction. When there are prolonged and repeated elevations of extracellular dopamine (hyperdopaminergia), the brain attempts to compensate for the excessive stimulation by changing its function in some way. We now have a fairly clear idea that long-lasting alterations take place at the post-synaptic level, where a downregulation, or reduced sensitivity, of the dopamine receptors occurs. When trying to understand how this happens, it may be helpful to use a rather ‘domestic’ metaphor. Think of a house with a lot of open windows when the temperature outside suddenly drops and becomes very cold. The occupants cannot change the climate; all they can do to restore a comfortable and normal temperature in the house is to close some of the windows. That seems to be the strategy our brain adopts when it becomes chronically overstimulated. Regrettably, one of the behavioural consequences of this neuroadaptation is that it seems to foster a desire for more extensive drug-taking, since the individual needs to take larger and/or more frequent doses to achieve the initial or desired effect of the behaviour. Recent evidence suggests that downregulation also operates at the level of the prefrontal cortex, and that reduced activity in this area (hypofrontality) underlies the difficulty that addicts have in resisting impulses to use their drug. In other words, the executive function of the prefrontal cortex is compromised.
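The ‘closing the windows’ metaphor can be expressed as a simple homeostatic feedback rule: when stimulation exceeds a set-point, receptor density is dialled down, so the same dose produces a progressively smaller response. The sketch below is purely illustrative and uses arbitrary values; it is not a model taken from the literature cited here.

```python
# Toy homeostatic model of receptor downregulation (tolerance).
# 'response' is dose * receptor_density; density drifts down whenever
# the response exceeds a homeostatic set-point. All numbers are arbitrary.

SET_POINT = 1.0
ADAPTATION_RATE = 0.15

def repeated_doses(dose, n_doses=10):
    density = 1.0
    responses = []
    for _ in range(n_doses):
        response = dose * density
        responses.append(response)
        # Down-regulate in proportion to how far the response overshoots.
        density -= ADAPTATION_RATE * (response - SET_POINT)
        density = max(density, 0.1)          # density cannot fall below a floor
    return responses

for i, r in enumerate(repeated_doses(dose=3.0), start=1):
    print(f"dose {i:2d}: response = {r:.2f}")
```

The response to an identical dose shrinks across administrations, so a larger dose would be needed to recover the original effect, which is the behavioural signature of tolerance described above.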
Researchers have studied the neurobiology of desensitization in a variety of ways. Some have examined the cadaver brains of former drug addicts and found a reduced density of dopamine receptors compared to the cadaver brains of non-addicts. However, a problem with this type of research is the question of causality. Did the drug addiction cause the reduced receptor densities (via downregulation), or were the addicts’ brains like this before the addiction? And if so, did this factor contribute in some way to their addiction? Less confounded research has come from studies with animals. In one PET-imaging study with non-human primates, there was an observable downregulation of dopamine D2 receptor density after animals had been chronically exposed to amphetamine stimulation (Ginovart et al., 1999).
Sensitization also plays a significant role in supporting addictive behaviours. Repeated but intermittent exposure to psychomotor stimulant drugs seems to produce heightened or increased behavioural and neurochemical responses to subsequent drug exposure. In animals, one way this can be seen is by increases in their activity levels – a behavioural marker of enhanced arousal or activation to the drug. Sensitization also appears to function at the level of stimulus-reward or associative learning. After an addictive substance has been used many times, we see an enhancement of its incentive value and that of its conditioned stimuli. The ‘fatal flaw’ of this neuroadaptation is that the addict’s behaviour becomes more and more under the control of these conditioned reinforcers. Sensitization also controls behaviour because over time it increases the attention-getting properties of conditioned cues that reliably predict reward. In other words, addictive behaviours seem to enhance the attentional bias to stimuli that are associated with the addictive behaviour, thereby contributing to its increasingly compulsive use.
From an historical perspective, the emphasis on personality and personality pathology in the development of addictive behaviours has fluctuated considerably. In early theories, during the first half of the twentieth century, a disordered or maladjusted personality was believed to be the root cause of all addictions. However, by the 1970s, this perspective was mostly abandoned because a large body of research had failed to find one consistent pre-addictive personality (Verheul & van den Brink, 2000). In recent years, the tide has turned again, and personality pathology has now regained its prominence in the addiction risk profile. Presently, the most prominent aetiological viewpoint is that of a stress-diathesis model whereby addictions develop from a reciprocal interaction between the psychological and biological vulnerability of the individual, and their environmental circumstances. Even the most extreme environmentalists in psychology have been forced to acknowledge that genes contribute to individual differences in behaviour. However, behavioural traits are highly complex and therefore rarely affected by a single gene. Indeed, they have been characterized as polygenic, meaning that any given gene is likely to contribute only a small portion to the phenotypic (behavioural) variance (Crabbe, 2002).
It is now generally agreed that addictive behaviours can begin through two motivational routes: either the seeking of positive sensations, or the self-medicating of painful affective states. In recent years, perhaps more researchers have favoured the latter perspective – that disturbed affect and difficulty with the regulation of unpleasant emotions are at the heart of most addictive behaviours (Khantzian, 1997; Leshner & Koob, 1999). In the remainder of this chapter we shall review the causal evidence linking certain personality factors to the development of addictions. However, when studying the personality aetiology of this disorder, we need to consider that the traits predicting who might experiment with substances (and other addictive behaviours) – even use them on a regular basis – may be quite different from those that influence who will abuse these behaviours.
Information about the role of personality in the aetiology of addiction comes from three primary sources:
In this last regard, the evidence is somewhat compromised by the ongoing debate about whether, or to what extent, Axis II diagnoses in addicts are merely substance-related artefacts reflecting conditions created by the addiction rather than ‘true’ personality disorders with onset prior to, and independent of, the addiction (Verheul & van den Brink, 2000).
In current formulations about vulnerability to addiction, three primary causal or developmental pathways have emerged from the search for personality risk factors. The first body of research has focused on a construct we shall call sensitivity to reward, the second on impulsive behaviour, and the third on proneness to anxiety and negative mood. Although these have mostly been studied as independent factors, we will see a certain amount of overlap among them, both theoretically and in their underlying biological mechanisms. Interestingly, these domains of emotional experience are remarkably in step with the personality taxonomies of Eysenck, Gray, Zuckerman, and Cloninger, which were discussed in Chapter 3. By factor analysing the trait scores from all these measures of personality, Zelenski and Larsen (1999) identified three factors, which they named ‘reward sensitivity’, ‘impulsivity/thrill-seeking’, and ‘punishment sensitivity’. In a second phase of the study, these authors found that the three factors predicted different sensitivities to emotional states: reward sensitivity predicted only positive mood (or its absence at the low end of the dimension) and punishment sensitivity predicted only negative mood. Impulsivity/thrill-seeking, on the other hand, seemed to predict few emotions in either context.
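As a rough illustration of the kind of analysis Zelenski and Larsen performed, the sketch below runs an exploratory factor analysis on synthetic questionnaire scores. The data, scale groupings, and three-factor structure are fabricated for illustration; only the general technique, factor analysing trait scores to recover a smaller set of dimensions, corresponds to the study described.

```python
# Illustrative only: synthetic 'trait scores' with a built-in
# three-factor structure, recovered with exploratory factor analysis.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_people = 500

# Three hypothetical latent dimensions per person.
latent = rng.normal(size=(n_people, 3))   # reward, impulsivity, punishment

# Nine observed scales, each loading mainly on one latent factor (assumed loadings).
loadings = np.zeros((3, 9))
loadings[0, 0:3] = 0.8    # scales 1-3 tap reward sensitivity
loadings[1, 3:6] = 0.8    # scales 4-6 tap impulsivity/thrill-seeking
loadings[2, 6:9] = 0.8    # scales 7-9 tap punishment sensitivity
scores = latent @ loadings + rng.normal(scale=0.4, size=(n_people, 9))

fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(scores)

# Each row of components_ gives one factor's loadings on the nine scales.
print(np.round(fa.components_, 2))
```

With data like these, each recovered factor loads heavily on its own cluster of scales, which is the pattern that justifies naming the factors after the trait domains they summarize.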
In the following sections, we will summarize the findings that have emerged from the personality categories described above, and explain how each relates to the onset and progression of addictive behaviours.
Meehl (1975) was one of the first to suggest that within the general population the capacity for pleasure or reward exists as a normally distributed and biologically based dimension. Subsequent research has firmly rooted this personality construct in the neurobiology of the mesolimbic dopamine reward system. The sensitivity to reward dimension has also been associated with the motivation to approach rewarding stimuli, as well as with the ability to experience reward from engagement in these behaviours. In other words, the simple expectation of reward (usually triggered by some signal of forthcoming reward, such as the smell of cookies baking in the oven) tends to produce a feeling of pleasure and increases motivation to engage in the rewarding behaviour (see Germans & Kring, 2000). In the context of addiction research, we shall see that traits located at both ends of the sensitivity to reward continuum have been implicated in the development of addictions – albeit for different reasons.
The term anhedonia was coined to describe the low end of the sensitivity to reward dimension. As we learned in Chapter 5, this term refers to the diminished ability to experience pleasure and reward from natural reinforcers, and is thought to reflect compromised, hyposensitive, or sluggish dopamine availability. Indeed, neuroscientific research has strongly supported the idea that hypodopaminergic tone underlies anhedonia. For example, Breier et al. (1998) found that personal detachment and indifference to other people were associated with reduced density of striatal dopamine D2 receptors and reduced dopamine availability in the brain reward areas. Also, a distinctive personality type characterized by introversion, apathy, and low preference for novelty has been associated with Parkinson’s disease – a degenerative condition caused by a diminution of dopamine neurons in the substantia nigra and ventral tegmentum areas of the brain (Slaughter et al., 2001). There is also some animal evidence for the notion that anhedonic characteristics may reflect a deficit in approach motivation more than an actual deficit in the ability to experience pleasure, although the human evidence on this point remains tentative (Heinz, 1999).
The high end of the sensitivity to reward continuum describes an enhanced motivation to engage in natural rewards, such as eating, mating, and maternal behaviour. Novelty is also a state that both human and non-human animals tend to find rewarding. For example, in place-conditioning experiments, rats chose to spend more time in the environment that had previously been paired with a novel stimulus than in the non-conditioned area. However, recent physiological evidence suggests that the mechanisms of novelty reward are rather different from those underlying novelty seeking – the former, but not the latter, involving the dopamine system (Bevins et al., 2002). Another finding, highly relevant to the subject of addiction, is that novelty seems to potentiate the reward of other pleasurable stimuli. In one study, the striatum was activated more in rats that received a combination of amphetamine and novelty than in control rats that received either a novel stimulus or a dose of amphetamine on its own (Badiani et al., 1998). But perhaps that should not really surprise us, as most of us would agree that food tastes much better in an interesting restaurant when dining with friends than it does when we eat roughly the same thing at home by ourselves.
While there are clearly inherent aspects to the variability in sensitivity to reward, anhedonia may also be induced by environmental and behavioural factors. In this context, Wise (1982) was the first to introduce the term ‘anhedonia’ to the field of addiction research, after it had become clear that activation of mesocorticolimbic dopamine played a central role in animals’ responses to a variety of reinforcing stimuli, such as food and water, intracranial self-stimulation, the opiates, and a variety of other psychomotor stimulant drugs, and when it was found that neuroleptic drugs (that is, dopamine receptor antagonists) blocked the positive reinforcement associated with these stimuli. Since then, many other studies have demonstrated that a relatively long-lasting anhedonic state may also be induced by prolonged exposure to psychomotor stimulant drugs (see Gamberino & Gold, 1999) and to chronic mild stress (Zacharko, 1994). We now have a fairly clear idea that state-induced anhedonia is primarily mediated by downregulation of postsynaptic dopamine receptors. Although repeated administration of addictive substances results in synaptic deficits in two brain neurotransmitter systems – dopamine and serotonin – the former seems to underlie the anhedonia associated with withdrawal, whereas the latter is associated with negativity and poor impulse control (Rothman et al., 2000).
As we said earlier, there is accumulating evidence that individual differences in sensitivity to reward are related to susceptibility to addiction, with risk being conferred from both ends of the continuum. We shall begin by citing some evidence related to the low end of the continuum. We know, for instance, that a strain of alcohol-preferring rats has lower dopamine concentrations in the nucleus accumbens compared to non-preferring rats (see Cloninger, 1987 for a review). Also, genes of the dopamine system have been studied as candidates for risk. For example, certain forms of the dopamine transporter (DAT1) gene (viz. the 9-repeat allele and rare shorter alleles) are considered ‘low risk alleles’ for addiction because functionally they are associated with a less efficient DAT and therefore greater availability of extracellular dopamine. On the other hand, the 10-repeat (and rarer longer) alleles are considered ‘high risk alleles’ for the opposite reason – a more efficient DAT and therefore less dopamine availability (Rowe et al., 1998; Waldman et al., 1998). In addition, a form of the dopamine D2 receptor gene (the A1 allele) has been associated with reduced density of dopamine receptors, and those with this genotype (compared to the A2 allele, which has been associated with increased receptor density) are more likely to exhibit compulsive and addictive behaviours, such as alcoholism (Noble et al., 1991). In an interesting and relevant study of gene–gene interactions, a 50 per cent reduction in smoking risk was found for those carrying both the 9-repeat DAT1 allele and the dopamine D2 A2 allele (Lerman et al., 1999). This pronounced effect was attributed to the combination of greater availability of synaptic dopamine and higher functioning of the dopamine receptors.
Lastly, in an elegant study highlighting the risk potential conferred by individual differences in dopamine function, Volkow and colleagues reported that subjects who experienced the effects of a cocaine-equivalent drug as ‘pleasurable’ had significantly lower dopamine D2 receptor levels than those who found the effects ‘unpleasant’, suggesting that potent stimulating drugs may exert their positive subjective effects because they boost a sluggish dopamine system in those less sensitive to reward (Volkow et al., 1999). To explain these findings, Volkow and her colleagues have proposed a particularly intriguing hypothesis related to risk for addiction. They suggest an optimal level, inverted-U, relationship between hedonic tone and dopamine activation, in which too little or too much of the latter is subjectively aversive (Figure 7.1). For those with high D2 receptor levels (more hedonic individuals), a normal increase in dopamine stimulation – like that produced by natural rewards (for example, food and social interaction) – is likely to be perceived as pleasant, whilst a larger increase, such as that created by potent drugs like cocaine, would be experienced as unpleasant (as the study by Volkow et al., 1999, demonstrated). Alternatively, low D2 receptor levels (associated with anhedonia) could predispose to addiction by favouring initially pleasant responses to drugs and other potent dopamine agonists, since this activation does not take such individuals beyond their optimal level of dopamine activation.
Figure 7.1 The relationship between hedonic tone and dopamine activation.
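A minimal sketch of the inverted-U idea in Figure 7.1, assuming an arbitrary quadratic ‘hedonic response’ curve: the subjective effect of an increment in dopamine stimulation depends on where a person’s baseline tone places them on the curve. The baseline values and the functional form are assumptions made purely for illustration, not quantities taken from Volkow et al.

```python
# Illustrative inverted-U: hedonic tone peaks at an 'optimal' level of
# dopamine activation and falls off on either side. All values arbitrary.

OPTIMAL = 5.0

def hedonic_tone(dopamine_activation):
    return -(dopamine_activation - OPTIMAL) ** 2

def subjective_effect(baseline, increment):
    """Change in hedonic tone produced by a drug-sized increment."""
    return hedonic_tone(baseline + increment) - hedonic_tone(baseline)

drug_increment = 4.0     # a large, cocaine-like rise in stimulation

low_baseline = subjective_effect(baseline=2.0, increment=drug_increment)    # 'anhedonic'
high_baseline = subjective_effect(baseline=4.5, increment=drug_increment)   # 'hedonic'

print(f"low-baseline (low D2) individual:   change = {low_baseline:+.1f} -> pleasant")
print(f"high-baseline (high D2) individual: change = {high_baseline:+.1f} -> unpleasant")
```

The same drug-sized increment moves the low-baseline individual towards the peak of the curve (a pleasant change) but pushes the high-baseline individual past it (an unpleasant change), which is the asymmetry the hypothesis uses to explain differential risk.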
There is good support for the notion that some anhedonic individuals engage in arousing behaviours as a form of compensation for their blunted affect and their inability to experience activation from weak levels of stimulation. For instance, one study found that skydivers were more anhedonic (independent of any depressive episode) than control subjects, suggesting that this highly arousing behaviour may partly serve a mood-enhancing role (Pierson et al., 1999). Anhedonia has also been associated with nicotine dependence, especially among depressed and schizophrenic patients.
It is easy to see that certain people make better decisions than others in their choice of self-regulating behaviours. The factors that move some people towards adaptive options, such as exhilarating sports, while others turn to drugs, are not entirely obvious, but they must clearly involve a host of environmental influences like peer pressure, cost, and the opportunity for experimentation. Other personality factors are also likely to be influential. Indeed, epigenetic models propose that temperament traits like sensitivity to reward do not exert direct and invariant effects but, rather, depend on a relationship with other traits, such as self-control. Some have also suggested that temperamental traits may become amplified over time through a ‘chain of failures’ in the development of self-regulation (Wills et al., 1998).
In the preceding account, we have seen how anhedonic traits may foster addictive behaviours because they serve a compensatory function for blunted affect and motivation. However, an argument may be made – and the evidence is supportive – that individuals whose personality locates them at the high end of the sensitivity to reward continuum are also more likely to engage in addictive behaviours, but for very different reasons. Those high in sensitivity to reward tend to be more motivated to approach, and more easily pleased by, natural rewards, the foremost of which is eating. Indeed, research has supported the relationship between eating and sensitivity to reward, as we shall see in the following chapter on eating disorders. From an evolutionary perspective, there is good reason why our genetic legacy has favoured high hedonic reward from eating. In times of famine and seasonal food shortages, an inherent love of eating was clearly adaptive. However, this same capacity has a very obvious disadvantage in environments such as ours, where highly palatable and calorically dense food is all too readily available. One result of this clash between our environment and our biology is the staggering percentage of overweight and obese individuals in most Western countries. Current estimates from the UK and North America indicate that more than 50 per cent of the adult population is overweight. Not surprisingly, the emerging viewpoint, as we saw earlier, is that eating can be just as addictive as snorting cocaine or drinking alcohol (Holden, 2001).
There are other characteristics of those high on sensitivity to reward that may contribute to their involvement in addictive behaviours. By definition, hedonic individuals tend to be more extraverted, more sociable, and more attracted by novelty. This is especially relevant because many addictive behaviours occur in social settings, or are likely to be initiated in the presence of other people. Think for a minute about social drinking, the influence of peers on smoking, or the club scene and illicit drugs like Ecstasy. Even caffeine use is something we mostly do with others, as seen in such social idioms as the ‘coffee break’, and the current popularity of the coffee shop as a daytime venue for meeting friends. It is also the case that many addictive behaviours occur in the presence of some novelty, and this is especially true for adolescent experimentation with illicit drugs and sex. For the more sociable and extraverted individuals, the addictive behaviours themselves – the cigarettes, the alcohol, or the casino – may not be the primary appeal. What they may find more rewarding is the social contact and the novel experiences that are so often a part of engaging in these behaviours, especially in the early stages. In other words, for some individuals, addictions may be a secondary effect, developing as a by-product of their participation in social events. However, over time, and after repeated exposure, the behaviours themselves take on primary appeal.
Impulsivity is one of the most elusive personality constructs in the field of addiction research, probably because it has attracted so many definitions and such subtleties of meaning. Some see restlessness and the tendency to be easily distracted as essential elements of impulsivity. Others have described impulsivity as poor tolerance of frustration, which drives the individual to act spontaneously, whereas others still have focused on the disinhibition of responding at times when inhibition is the appropriate response in a particular situation. In other words, impulsivity is about poor self-regulation, whereby behaviour is predominantly and inappropriately controlled by appetitive stimuli. According to Gray’s and Cloninger’s theories, impulsivity is a component of both novelty seeking and reward sensitivity. From a more cognitive perspective, some believe that impulsive behaviour is primarily about poor decision-making, exemplified by the tendency to choose small or poor rewards that are immediately available in preference to larger but delayed rewards. And lastly, among those who focus on the seriously disordered end of this trait, impulsivity is often used to describe strong drives or temptations to perform acts that are risky or even harmful to oneself or others. In this context, acts of self-mutilation and aggressive behaviour are subsumed in the DSM category of Impulse-Control Disorders.
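The ‘small immediate versus larger delayed reward’ view of impulsive decision-making is often formalized with a hyperbolic discounting rule, V = A / (1 + kD), where a larger discounting parameter k corresponds to a steeper devaluation of delayed rewards. The formula and the numbers below are a standard textbook illustration, not something taken from the sources cited in this section.

```python
# Hyperbolic discounting: the subjective value of a reward of amount A
# delivered after delay D is V = A / (1 + k * D). A higher k means
# delayed rewards lose value faster (one common operationalization of
# impulsive choice). Amounts, delays, and k values are arbitrary.

def discounted_value(amount, delay, k):
    return amount / (1.0 + k * delay)

small_amount, small_delay = 10.0, 0.0    # 10 units available immediately
large_amount, large_delay = 50.0, 30.0   # 50 units after a 30-day delay

for label, k in [("low-impulsivity agent (k = 0.05)", 0.05),
                 ("high-impulsivity agent (k = 1.00)", 1.00)]:
    v_now = discounted_value(small_amount, small_delay, k)
    v_later = discounted_value(large_amount, large_delay, k)
    choice = ("takes the small immediate reward" if v_now > v_later
              else "waits for the larger delayed reward")
    print(f"{label}: V_now = {v_now:.1f}, V_later = {v_later:.1f} -> {choice}")
```

With a steep discount rate the delayed reward's subjective value collapses below that of the immediate one, reproducing the choice pattern described in the cognitive account above.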
One theory brings together many of these rather loosely connected ideas by proposing that impulsivity is at the low end of a dimension of ‘behavioural self-regulation’. Other components of poor behavioural regulation include inattention, hyperactivity, and aggression. It is also interesting to speculate whether intelligence is an independent factor, or whether it plays a role in determining where any given individual falls along this continuum since we know from studies with children that those who show good self-control are more competent socially and intellectually (see Blair, 2002).
Considerable research has also investigated the biological basis of impulsivity. However, the general ambiguity of its meaning has also contributed to measurement confusion, and a certain inconsistency in the findings. Most agree that aggressive impulsivity is related to lower serotonergic activity, perhaps at the level of the amygdala (Oquendo & Mann, 2000). On the other hand, those studying the more cognitive elements of impulsivity tend to implicate the prefrontal cortex in this process. For example, an interesting series of studies over the past 30 years has shown that frontal lobe brain lesions in animals enhance the degree to which the animal’s behaviour is controlled by conditioned reinforcing stimuli – or what has been called ‘Pavlovian approach behaviour’. In other words, after the lesion, the animal is less able to shift its behaviour from conditioned reinforcing stimuli to a new task. As we have seen in an earlier section of this chapter, the strength of conditioned responses is correlated with the degree of dopamine activation in the nucleus accumbens – a process that occurs when natural or pharmacological rewards, such as food or addictive drugs, respectively, are present. We will also recall that prefrontal input to the midbrain is the mechanism by which cortical structures can modulate or gate conditioned response tendencies. Therefore, inherent individual differences in the functioning of this dopamine-based regulator system – either a hypersensitive tendency to respond to appetitive stimuli, or hypoactive cortical modulation of these impulses (or both) – appear to form, at least in part, the biological explanation of some aspects of impulsivity (Jentsch et al., 2000).
The insidious nature of potent rewarding behaviours (like taking addictive drugs) is that they tend to create a snowball effect by fostering the drive to do more of the same. One way this seems to happen is that repeated exposure to drugs of abuse can produce reductions in cortical dopamine, which in turn diminishes the efficiency of the executive function of the prefrontal cortex, resulting in the exaggerated responding to pleasurable stimuli and the thrill or sensation seeking that is typically associated with impulsivity (Taylor & Jentsch, 2001). In summary, drugs of abuse can affect behavioural processes that contribute to addiction, including enhanced stimulus-reward learning and Pavlovian approach behaviour, and decreased behavioural inhibition.
For many years, addiction research focused on the compulsive aspects of this disorder, and the prominent role that cravings seem to play in its astonishing resistance to treatment and its high rate of relapse. However, in recent years there has been increased interest in the part that impulsivity plays in this process. This change in focus was prompted partly by clinical evidence that not all drug use occurs in response to overwhelming cravings; some happens in a rather spontaneous, impetuous, and unplanned manner (Moeller et al., 2001). Many studies have also shown a strong relationship between impulsivity and addictive behaviours (see Brady et al., 1998 for a review). For example, impulsivity has been associated with greater experimentation with drugs, greater frequency and severity of use, and poorer treatment outcome. Aggression is also strongly associated with drug-taking, both preceding and following drug use, and one characteristic linking the two behaviours is impulsivity (Allen et al., 1998). In addition, a number of studies have shown a higher than expected prevalence of personality disorders, especially antisocial and borderline, among groups of addicts (Clark et al., 1997; Grilo et al., 1997). However, in much of this research, the causal association between impulsivity and addiction is open to question, since many studies are correlational and/or have tested groups of drug-dependent subjects. The major problem, as we have seen earlier, is that chronic drug-taking can directly cause the behavioural characteristics that are tapped by most measures of impulsivity.
More compelling evidence of the link between impulsivity and addiction comes from longitudinal studies, and from studies that track the chronology of comorbid disorders. For instance, Verheul and colleagues studied a large group of mixed substance abusers and found that remission of the addiction was not significantly associated with remission of their personality disorder – an outcome which suggests that personality pathology and addiction tend to follow an independent course. Another study compared adolescent sons of substance-abusing fathers, over a two-year period, with an age-equivalent group of boys whose fathers had no history of addiction or other psychiatric disorders, and found a higher prevalence of poor behavioural self-regulation among the former (Dawes et al., 1997). This characteristic also predicted more deviant peer affiliations and poorer school performance – characteristics which frequently precede substance abuse. What was also interesting was that the factors that comprise poor behavioural regulation in the boys (such as impulsivity and inattention) were largely present in their respective fathers.
A number of irrefutable facts, such as the high comorbidity between substance abuse disorders and a variety of anxiety disorders, have spawned the search for causal links between anxiety and addiction. Studies have found, for example, that the lifetime prevalence of substance abuse is 300 per cent greater in those with generalized anxiety disorder, and 200 per cent greater in those with panic disorder, compared to rates in the general population (DeHaas et al., 2001). There is also good evidence that a personality profile whose main component is neuroticism – one the Eysencks have called the ‘addictive personality’ – is significantly more pronounced in all addict groups, including substance abusers, compulsive gamblers, and those with eating disorders.
However, because many of the studies supporting the association between anxiety and addiction have recruited addicts from treatment and rehabilitation centres, some have disputed the aetiological role of anxiety, arguing that these studies are simply measuring the distress of withdrawal rather than a premorbid, causally prior risk factor. While this may be true in some cases, longitudinal studies of personality, and retrospective accounts of the order of onset of comorbid anxiety disorders in addiction, offer fairly strong support for a causal model. For example, teachers’ ratings of high harm avoidance in school children predicted their subsequent substance abuse in adolescence and early adulthood (Wills et al., 1998). Several studies have also shown that anxiety disorders precede the substance use disorders in a large percentage of comorbid individuals (Merikangas et al., 1998).
One factor motivating addictive behaviours is simply their powerful ability to reduce the painful emotional consequences of stress. The stress-reduction or self-medication pathway to addiction has received a great deal of research attention in recent years. It predicts that individuals who are high on traits such as anxiety and neuroticism are more reactive to stressful life events than more stable individuals and, in turn, that this reactivity provides the motivation to seek quick and effective psychological relief from distress in the form of drugs. Consistent with this hypothesis is Khantzian’s (1997) argument that the drugs addicts select are not chosen randomly, but are instead the result of an interaction between the psychopharmacological action of the drug and the form of the individual’s distress. For example, he argues that heroin addicts prefer opiates because their powerful muting action subdues the rage and aggression these individuals experience, while cocaine has its appeal because it can relieve the distress of depression. It follows, then, that the stress-reduction pathway is perhaps more relevant for addiction to alcohol, tranquillizers, and the opiates than for popular stimulant drugs such as crack/cocaine.
In recent years, cognitive theorists investigating the links between anxiety and addictive behaviours have proposed that anxiety sensitivity is an important variable in the development and maintenance of addiction, and various models have been put forward to explain how this might occur. Although empirical support for any of these models is inconsistent or lacking, some do have a certain intuitive appeal. One ‘moderator’ model proposes that the association between anxiety and substance use will be greater in those who are also high in anxiety sensitivity: because the symptoms of stress will be more extreme in these individuals, they may be more likely than others to self-medicate with anxiolytic substances. Indeed, the evidence linking anxiety sensitivity and addiction is most convincing in the area of alcohol abuse and, to a lesser degree, nicotine addiction (Norton et al., 1997; Norton, 2001). A second moderator model, coming from the opposite direction, specifies that the anxiety-dampening effects of substances such as alcohol will be greater in those with high anxiety sensitivity, thereby reinforcing their use. Mediational models could also explain the links between anxiety and substance abuse. For example, it could be that high anxiety sensitivity intensifies drug withdrawal symptoms, especially those related to alcohol and smoking, because of their similarity to anxiety symptoms – putting these individuals at greater risk of continuing the behaviours (Stewart & Kushner, 2001).
The role of stress in vulnerability to addiction has been approached from at least three angles: the neurobiology of the stress response itself, individual differences in reactivity to stress, and the lasting effects of early environmental stress.
Although stress affects practically all physiological systems, one of its most important effects is activation of the limbic-hypothalamic-pituitary-adrenal (LHPA) axis. Briefly, the hypothalamus secretes corticotropin releasing factor (CRF), which leads to the release of adrenocorticotropic hormone (ACTH) from the pituitary gland, culminating in the secretion of cortisol and other hormones from the adrenal glands. While the acute actions of these stress hormones are life-saving, because they mobilize the resources we need to respond quickly and efficiently when a situation is threatening (as we saw in Chapter 5), their protracted effects can be seriously detrimental to one’s health (Majewska, 2002). It is relevant that drugs of abuse also activate the LHPA axis. However, the causal mechanisms linking stress and addiction are not fully understood, beyond the consistent acknowledgement that the relationship is multidimensional and depends on the neurobiological, genetic, and developmental make-up of each individual.
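For readers who find a compact summary helpful, the cascade just described can be sketched schematically; this is a deliberately simplified outline in which negative feedback loops and other hormones are omitted:

\[
\text{stressor} \;\longrightarrow\; \text{hypothalamus (CRF)} \;\longrightarrow\; \text{pituitary (ACTH)} \;\longrightarrow\; \text{adrenal glands (cortisol)}
\]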
Although we could cite many examples, a few taken from animal research will highlight our limited understanding of the complex relationship between LHPA axis function and drug effects. Many studies have shown that laboratory rats that respond with increased locomotion when placed in an inescapable novel environment such as the ‘open-field’ box (high reactives) are more likely to self-administer a wide range of addictive drugs than their less active counterparts (low reactives) (Piazza et al., 1989). The best explanation for these findings is that the high locomotor activity reflects a heightened sensitivity to stress, because these animals also show a stronger and longer-lasting corticosterone (a stress hormone in rats) response to this environment. Furthermore, other research has shown that the high reactives do not differ from the low reactives in their sensitivity to a novel environment that is freely chosen (Robinet et al., 1998). Puzzlingly, however, another animal paradigm produces exactly the opposite results. Two inbred strains of rats (Lewis and F344) were compared on their susceptibility to drug self-administration. The Lewis rats acquired drug self-administration significantly more rapidly than the F344 rats, and although they also displayed a high locomotor response to novelty, their HPA axis response to stress was more blunted than that of the F344 rats. Kosten and Ambrosio (2002) proposed a rather elegant fusion of these two seemingly contradictory findings by suggesting a non-linear, inverted-U relationship between HPA activation and behavioural sensitivity.
Extrapolating from their theory, some individuals (like the Lewis rats) may have low responsiveness to stress and to drugs, and may therefore seek out drugs to increase their arousal. Here we can see parallels to the ideas we presented in the earlier section on Sensitivity to Reward. Other individuals who are highly responsive to stress (like the high reactive rats) may also seek out drugs to ameliorate their hyperresponsiveness to stress. In other words, both those who are not easily stressed and those who are very easily stressed may find drugs rewarding, but for different reasons.
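One way to picture Kosten and Ambrosio’s non-linear, inverted-U proposal is as a simple quadratic. The functional form and the symbols below are purely illustrative assumptions on our part, not a fitted model from their work:

\[
S(h) \;=\; S_{\max} - k\,(h - h^{*})^{2}, \qquad k > 0,
\]

where \(h\) denotes the level of HPA activation, \(S(h)\) the behavioural sensitivity hypothesized to accompany it, \(h^{*}\) the intermediate level at which sensitivity peaks, and \(k\) how steeply sensitivity falls away towards either extreme. On this reading, both the hyporesponsive Lewis rats and the hyperresponsive high reactives sit some distance from \(h^{*}\), which is consistent with drug-taking serving opposite functions at the two ends of the curve.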
Other interesting research linking anxiety-induced stress to addiction was carried out by Volkow and colleagues (1996a; 1996b). In a double-blind study, healthy male subjects were given either an intravenous dose of methylphenidate (a stimulant with cocaine-like pharmacological actions) or a placebo injection. At the same time, PET images of the brain were taken and subjects were asked to rate how they felt on an adjective checklist (for example, anxious, restless, happy, etc.). Subjects’ ‘anxious’ ratings during the placebo condition were positively correlated with the dopamine concentration measured in the mesolimbic area during the drug condition. From these results, the authors concluded that a proneness to anxiety may be associated with a more reactive dopaminergic system. In other words, addictive behaviours may have a greater reward potential for anxiety-prone individuals than for those who are less anxious, increasing the likelihood of their use.
Finally in this section, we shall examine the role that early environmental stress may play in increasing vulnerability to addiction. A substantial body of research has documented the relationship between addiction and trauma. For example, victims of childhood physical and sexual abuse consistently report greater use of alcohol and of illicit drugs. Other studies have shown high rates of victimization among female substance abusers, and rates of post-traumatic stress disorder as high as 50 per cent in clinical samples of substance abusers (Gordon, 2002). Although causation is always difficult to establish – and, indeed, may be bidirectional – one obvious explanation is that drugs and other addictive activities are used to provide relief or escape from the stress of ongoing abuse, or from its painful memories. On the other hand, there is also clear evidence that early aversive life events can modify the neurophysiological development of the LHPA axis. Animal research likewise indicates that early developmental stress leads to enhanced self-administration of drugs when these animals reach adulthood (Kosten et al., 2000). There is some indication that downregulation of certain serotonin receptors may be one of the biological mechanisms linking early life stress to an increased risk of addiction; other evidence has implicated reduced cortisol levels (see Chapter 5 for a more detailed discussion of developmental influences on fear and anxiety). Although the neurochemical correlates of early developmental stress are still only vaguely understood, the behavioural correlates are quite consistent. Animals raised under stressful conditions tend to be shyer, to explore less, and to occupy lower positions in the social hierarchy. In humans, it is more difficult to untangle environmental stress, innate temperament, and the interaction between the two, especially because this appears to be an evolving relationship. Nevertheless, observations of orphaned children show behaviours comparable to those seen in animals, such as a greater occurrence of emotional disturbances relating to sleep and feeding, and a substantially higher incidence of anxiety and depression in later life.
In summary, there is substantial evidence that personality and personality pathology are involved both in the aetiology of addiction and in the course the disorder takes. There is even some tentative suggestion that personality factors may play a role in the type of addiction one develops – whether to alcohol, nicotine, or food. A clearer understanding of the personality pathways has considerable practical importance and is crucial to the development of improved treatment methods. If we can understand the psychology of those at high risk for addiction we can also improve our ability to target prevention efforts. A recognition of the heterogeneity of personality risk factors is also fundamental to the development of strategies to predict relapse since certain treatments may be more effective with some patients than with others.