12

THE NEW NORMAL

We all have labels. There are the ones that make us part of something bigger than ourselves: daughter, aunty, sister, friend, ally, godmother. There are those that put us into tribes: Scot, Aussie, journo, leftie, Hawks fan, Gen X. Then there are the ones that only we can see. They’re usually the labels that constrain us: crazy person, procrastinator, depressive, fuck-up, defective, fraud.

There is one label I can’t seem to peel off: problem child. Its adhesive is watertight. I don’t know exactly when the term first came into my mind, but I remember feeling that I’d been marked from the day I was born. Wednesday’s child is full of woe, and there was nothing I could do about it.

There’s a story my family tells about an incident in my childhood. On holiday in France when I was about six years old, we visited a park with a map of the world carved into turf in the middle of a lake, allowing visitors to hop from country to country. Neil and Dad had gone back to the grassy bank by the side of the lake, while Mum and I were still navigating our way through the nations of South America. At the southern tip of the continent I attempted to make the leap across to Antarctica, but misjudged the gap and plunged into the murky depths off the coast of Argentina, thrashing around in what was only knee-deep water but to me might as well have been the bottom of the ocean. My brother and Dad burst into laughter, while Mum helped fish me out, dripping wet as I cried in abject mortification at the public spectacle I’d become. Every time Dad retells the story he sings ‘Don’t Cry for Me Argentina’, and we all laugh at the hilarious scene I caused. But I can remember even in that moment thinking, Why do I always stuff things up?

When my adolescent angst stretched on for longer than it should have, the problem-child label really took hold. It was a bewildering time, and I remember feeling ashamed of the way I couldn’t get over what Dad discreetly called my ‘troubles’. I felt guilty for the distress I was causing my parents. Every family has an oddball, and I was it. A red wine stain on a crisp white tablecloth. I visited a succession of GPs, psychiatrists, and psychologists, who made diagnoses and predictions that served to confirm what I had thought for so long: I was not normal. One doctor told me I might have to take antidepressants for the rest of my life. Just as my brother needed medication to manage his epilepsy, I would require pills to keep me sane. They said I had a chemical imbalance that needed correcting.

I remember my psychiatrist — an elegant, grey-haired man in a three-piece suit who practised from a grand Victorian house with huge bay windows and antique wooden furniture in Edinburgh’s southside — offering me, at 19, an interesting take on my future. ‘You’re going to have a very successful life. You will achieve anything you put your mind to. The challenge for you will be whether you can overcome your anxiety enough to enjoy it.’

As carpe diems go, it was a bit shit. But the words stayed with me. Anxiety was going to be my meal ticket or the ball and chain around my ankle. In some ways it was both. He was right that my life, by any objective measure, has been a success thus far. I’ve enjoyed a great professional life and have reached many of the life goals that make up the formulaic equation for happiness. I’ve jumped out of planes (twice), swum with sharks, appeared on live radio and television, and passionately spoken my mind on more occasions than has probably been good for my career. But it hasn’t made the worrying go away. And through it all, I struggled to shake the belief that I was abnormal. My fate had been sealed by the prognosis I’d been given.

Labels can be informative and necessary. The skull and crossbones on a bottle of bleach warns kids it’s not for drinking. But they can also pose their own risk: they can be limiting and self-defeating. I often wonder what would have happened if I hadn’t been told at an early age that anxiety and depression would be with me for life. Would I have still viewed myself as damaged goods? Or perceived every bad day as the start of an impending breakdown? Maybe, like so many teenagers struggling with a developing mind, a changing body, peer pressure, and the desperate need to belong, given time, with the right professional who would offer the space to really listen, I would have been okay. Instead, I came to believe the narrative that I was an abnormality to be fixed.

My struggles evolved during a dramatic period of change in the way society perceives and responds to our emotional health and the vagaries of the mind. Stigma is slowly being reduced, encouraging more people to seek help. But the rising prevalence of psychological disorders has coincided with a broadening definition of mental illness. Experiences that might once have been considered transient, normal facets of the human condition can now be classified as disorders. The figures say that one in five of us will experience a mental disorder in any given year, and almost half will be afflicted in our lifetimes. It’s hard to know if mental illness is more prevalent than it once was, or if medicine has simply blurred the boundaries between normality and disease. One person’s eccentric is another’s senile. There’s a fine line between creative visionary and deranged heretic.

Over the last decade in particular, there has been a growing unease within mental-health circles about the medicalisation of the human experience. This shift, according to some experts, has sparked false epidemics of psychiatric disorders and led to more people being unnecessarily medicated. They believe the diagnostic bar has been set so low that everyday sadness and personality quirks are being pathologised. In a culture that places such a premium on happiness, any deviation from this expected default emotional position is viewed as aberrant.

At the heart of the debate is the changing definition of mental illness and the contentious document used globally to diagnose disorders. The Diagnostic and Statistical Manual of Mental Disorders (DSM), produced by the American Psychiatric Association, has grown from a 130-page booklet of around 60 broad illnesses in its first edition in 1952 to a 947-page tome listing almost 300 disorders. Its fifth and most recent edition, published in 2013, lowered the threshold for many existing conditions and introduced a range of new disorders, including ‘disruptive mood dysregulation disorder’, which critics say essentially turns children’s tantrums into an illness, and ‘mild neurocognitive disorder’, which some say makes the natural forgetfulness of age a treatable disease. It’s also now easier for a boisterous child to meet the diagnostic criteria for attention-deficit/hyperactivity disorder (ADHD), as the number of symptoms required has been halved. There’s already been a global epidemic of the condition and a subsequent spike in kids being medicated since it was added to the DSM in 1987.

These revisions to the manual’s fifth edition caused an international outcry, with 51 health groups, including the American Counseling Association, the American Psychological Association, and the British Psychological Society, calling for an independent review. It also sparked a bitter divide within the psychiatric community, with the debate as much about the ideological future of the profession as about the nature of mental illness. Perhaps the most controversial decision was the removal of the bereavement exclusion when diagnosing major depression. Previously, doctors had been urged to refrain from giving a diagnosis of clinical depression within the first few months of a patient experiencing grief. While the symptoms may be similar to those of a major depressive disorder, bereavement is not a sickness, critics argued. But under the new DSM, a person can be classified as having a depressive illness after just two weeks of grieving. Two weeks. We have medicalised a normal, albeit intensely painful, part of the human experience. Anyone who has experienced grief knows all too well that no pill can ease the heartache of losing a loved one.

When I was diagnosed with depression in my teens, it coincided not only with several years of bullying but also with the death of four people who were close to me. I lost my gran, my great-aunt (who was like a grandmother to me), my uncle, and, finally, the five-year-old daughter of one of my mum’s oldest and closest friends. By the time I was 18, I’d been to more funerals than most people twice my age. For a pessimistic individual with a natural propensity to catastrophise and obsess about death, this heightened my angst at the fragility of the world. I struggled to make sense of my loss. And yet, when my parents sent me for help, the professionals showed little interest in discussing the bereavements that had taken such a toll on our family. Their silence on the subject indicated that they thought this wasn’t enough to explain my mood. I don’t know if they were wrong. I was definitely terribly unhappy and I needed help. But there was something about the label depressive that cemented my problem-child status. It was a self-fulfilling prophecy as I came to believe that I was helpless, that being mentally defective was an integral part of my identity.

I didn’t want to be crazy. It wasn’t that long ago that people displaying signs of mental ill-health were locked up in austere, high-walled asylums, as governments attempted to shield polite society from their madness. In Victoria, suicide was considered a crime until 1958. Sixty years later, and we still routinely talk about people ‘committing’ suicide. For many years I was an advisory group member of the Australian government’s Mindframe National Media Initiative — set up in 2002 to promote responsible, accurate, and sensitive representation of mental illness and suicide in the media. Our job was to educate journalists about the stigma created by using words such as psycho, lunatic, or maniac. We helped foster shifts in editorial policies such as the now-standard practice of including links to Lifeline at the ends of stories dealing with suicide, and of avoiding reporting on the details or method in order to prevent a suicide contagion effect. We also tried to bust the inaccurate stereotype that people with mental illness are dangerous, pointing out that they’re more likely to be a victim of crime than a perpetrator. It might have changed some of the reporting in The Age newsroom, but as I was trying to return to work, I learned that we still have a long way to go to shift cultural attitudes. The management executive who liaised with my boss, my GP, and my psychologist on my ‘transition plan’ was well-meaning and kind, but when we met, she looked at me as if I was a live bomb that could blow up in her face at any moment. Some of the paperwork I was made to fill out did not inspire confidence. One form — which asked a series of questions including which duties I could and could not carry out — stated that this was to secure a ‘safe working environment’ for me and my colleagues. I can’t be sure what risk they thought I posed, but I couldn’t help but think this paperwork was less about looking after my wellbeing and more an arse-covering exercise to protect the company from litigation should I turn out to have homicidal tendencies and shoot up the newsroom.

Much of the shame and fear that remains around mental illness stems from the labels we use to describe people who are different, unconventional, or just having a bad day. I’ll hear a friend joke about someone being ‘completely mental’ and think, Yeah, but that could equally apply to me. How mental do you have to be to be ‘completely mental’? Does believing your loved ones are dead if they don’t reply to an email get you there? Or not being able to drive a car because the panic makes you feel like the steering wheel’s dissolving and your limbs are turning to liquid nitrogen?

Sometimes I’ll self-identify as ‘a total crazy person’ after an episode that has seen me behave in a way that seemed entirely rational at the time but through clarity’s rear-view mirror reveals itself to be comical. Like the occasion I was sitting in the forecourt of a service station in Jason’s car, waiting for him to pay for petrol, and he took longer than expected — but because I wasn’t wearing my glasses I couldn’t see there was a long queue and only one attendant, so naturally I assumed he was lying on the floor having a seizure as fellow drivers delivered mouth-to-mouth, and life seeped from his limp body. Or the time that Dr Fiona’s receptionist unexpectedly brought my pap smear follow-up appointment forward by a day and for three agonising hours I sat rigid at work, utterly convinced I had end-stage cervical cancer. Or the fact that despite my period being an event I have lived with for more than 25 years, every month without fail it takes two days of spontaneously weeping at videos of soldiers being welcomed home from combat by their dogs before I remember I’m not on the verge of a breakdown but am just premenstrual.

Finding humour in the absurd things my brain tricks me into believing allows me to view the anxiety as something that is happening to me rather than a condition that defines who I am. But the mockery is still a judgement. There’s an element of self-criticism in quantifying my emotional state. People undergoing chemotherapy don’t describe themselves as ‘totally cancer-ridden’ when things are bad. And yet the way I talk about my emotional health can often be disparaging. It has a history that goes all the way back to adolescence, when I was given a label and began to view myself as damaged. And, of course, this was a script that the child part of me that already felt broken and bad was only too willing to recite.

In some ways, I’m a product of the psychiatric trends of my time. The diagnostic categories of mental illness have expanded with every edition of the DSM, and the threshold for being diagnosed with a depressive illness has gradually dropped. While once depression was broadly divided into two types — melancholic depression, which was seen as a disease and had no obvious cause, and reactive depression, sparked by stressful life events — DSM-3, released in 1980, four years after I was born, essentially created one condition that varies by severity. Mild or moderate sadness was lumped into the same category as what was once considered clinical depression. GPs now grade patients against a checklist of symptoms — the Kessler ‘exactly how fucked up are you’ scale I have come to dread so much — to arrive at a diagnosis of mild, moderate, or severe depression. It’s a blunt instrument that doesn’t take into consideration life circumstances or underlying drivers. It’s a snapshot of a person’s emotional state at a specific moment in time. It is little more than a number on a chart. A number that can assign a label that sticks for life.

In a system set up to deliver the magic bullet, GPs — who prescribe 85 per cent of all psychiatric drugs in Australia — simply don’t have time to investigate the many potential causes of psychological distress. And when people are at their lowest point, a ‘happy’ pill is often welcomed as a short, sharp salve for their pain. It’s been a boon for the pharmaceutical industry. The most recent figures on the global psychiatric-drug market value the sector at around $88 billion. It’s a long way from the 1950s, when the world’s first antidepressant, imipramine, was invented, and manufacturer Geigy worried there weren’t enough depressed people for it to generate a profit.

In Australia, the market is flourishing. A 2013 report providing a health overview of the 33 Organisation for Economic Co-operation and Development nations found that Australia was the second-highest prescriber of antidepressant medications, with the rate of use doubling in a decade. It showed that 89 Australians in every 1,000 were taking antidepressants, compared to 45 in 2000. The following year, researchers from the University of Sydney discovered that the number of children aged 10 to 14 being prescribed the drugs had jumped by more than a third between 2009 and 2012.

When antidepressants first hit the market, the thinking was that they helped correct malfunctioning neurotransmitters in the brain by strengthening the serotonin signal between nerve cells. I’ve lost count of how many doctors told me that depleted levels of serotonin — one of the mind’s ‘happy’ chemicals — were the reason I was so sad. This chemical imbalance theory became firmly rooted in the public consciousness from 1987, when Prozac — one of the first of a new generation of antidepressants known as selective serotonin reuptake inhibitors (SSRIs) — went on sale. It became the most widely prescribed antidepressant in history, a potent symbol of our culture’s quest for the happiness quick-fix. By 2007, 54 million people worldwide were taking it.

I’ve been on and off various antidepressant medications since I was 16. I took them because the doctors told me if I didn’t, my ‘imbalanced’ brain would continue to malfunction. If I wanted my happy-ever-after, it would come in a little white pill taken every morning with breakfast.

There have been times when I’ve felt the drugs have helped me function again. Other times, they’ve barely touched the sides of my pain. Most recently, I took them, reluctantly, because I’d reached the point where each day was a slow dance with suicide. I couldn’t expect my friends to keep putting their lives on hold to watch over me. I either tried medication or I went to hospital.

Things slowly improved, but the improvement coincided with profound breakthroughs in therapy with Veronica that untangled so much of my past and helped me rely on myself more. I can’t know for sure whether it was the medication, the therapy, or a combination of both. But the weeks of debilitating physical and psychological withdrawal symptoms I’ve suffered every time I’ve come off these drugs leave me in no doubt that they are incredibly powerful chemicals I’d rather not take unless left with no other options.

And I’m not the only one having doubts. An increasing number of clinicians and researchers now argue that the case for antidepressants as an effective and widespread treatment option for depression and anxiety has been based on a lie. The debate was blown open in 2012 when Harvard Medical School scientist and psychologist Irving Kirsch told the US 60 Minutes program that his extensive research revealed the difference between drug and placebo is very small, and in half the studies non-existent. While he didn’t deny many people improved after taking antidepressants, he concluded, ‘It’s not the chemical ingredients of the drug that’s making them better. It is largely the placebo effect.’

Around the same time, in a series of articles in The Age looking at the pathologising of the human condition — which in hindsight was perhaps an attempt to untangle my own emotional problems — I spoke to Professor Michael Baigent, a psychiatrist and a director of beyondblue, who echoed Kirsch’s views. He stressed that for those with severe and otherwise untreated depression, not taking antidepressants could be life-threatening, but people with more moderate symptoms are far less responsive to medication. ‘The chemical imbalance explanation is an oversimplification of a very complex picture,’ he said. ‘There’s a tendency to want to dumb it down and say people are depressed and all depression needs an antidepressant, but when you look at the research you see that their effect is really greatest with the more severe forms of depression. With the less severe forms they’re often no better than a placebo.’

Kirsch’s findings were backed up by two studies co-authored by Dr Walter Brown, clinical professor of psychiatry at Brown University’s Warren Alpert Medical School. He said that it was the mildly depressed who largely accounted for the huge increase in antidepressant prescriptions over the previous decade and yet they were the ones least likely to benefit from them.

This is not something you hear very often when you visit a doctor to discuss your emotional health. I can’t remember a GP ever telling me that the chemical imbalance theory was flawed or that taking antidepressants might be no more useful than swallowing a sugar pill. Perhaps when I’ve felt they worked for me it was the placebo effect. Maybe it doesn’t matter. If the end result was me feeling better, why should I care? But looking back, I can’t help wondering what effect these pills may have had on my developing teenage brain. According to a 2016 Lancet study, the majority of antidepressants given to children and teenagers are ineffective, and some are potentially dangerous, increasing the risk of suicidal thoughts. Of the 14 types of antidepressant taken by the 5,000 children aged nine to 18 in the study, only Prozac was found to be statistically more effective than a placebo. Among those shown to be ineffective were the two drugs I was prescribed at 16 and at 18.

Millions of people across the globe continue to take antidepressants. For some, there is no doubt they can be life-saving. Medication is an important part of the treatment regimen for many conditions, and people should have access without judgement or shame. But how many millions of people are taking pills in an attempt to ‘correct’ a brain imbalance that simply doesn’t exist for them? Their suffering is real, but the way out of it might not come in a blister pack of daily tablets. Perhaps we don’t really want to hear that the drugs don’t work — we just want the magic pill that will take the pain away.

The broadening of categories in the DSM is in some ways a reflection of our culture’s inability to sit with our discomfort. We are so focused on the expectation of happiness we don’t want to feel loss or anger or the myriad shades of profound sadness life can throw at us, so we label it as sickness. If there’s a malady, there must be a remedy. We want the fairytale ending, however we can get it.

The most strident critic of the pathologising of the human condition has been American psychiatrist Allen Frances, who chaired the expert committee that developed the DSM-4 in 1994. He has labelled DSM-5 a ‘dangerous public health experiment’ that will inappropriately inflict the ‘mental disorder’ label on millions of people previously considered normal.

Frances believes the system is now set up to misdiagnose or unnecessarily medicate people who are not mentally ill, while at the other end of the spectrum, people are told they’re not sick enough for support and only receive care when they reach crisis point. His critics paint him as a disgruntled malcontent seeking relevance after his moment in the spotlight has passed. I disagree. We’ve spoken many times and I’ve always found him to be informed, considered, and congenial. During a trip to Australia as part of a speaking tour, we met in person and it was clear that he finds the explosion of psychiatric diagnoses genuinely troubling. Permanently tanned, with a full head of thick white hair, the 75-year-old told me there has been an ‘imperial, wholesale takeover of normality’ and that the ‘pool of normal has shrunk to a puddle’.

In 2013, he documented these concerns in Saving Normal: an insider’s revolt against out-of-control psychiatric diagnosis, DSM-5, Big Pharma, and the medicalization of ordinary life, which became an international bestseller. We reconnected as I was writing this book, and while previously I’d interviewed him for his views on over-diagnosis, this time I laid bare my own history of mental-health problems.

He has retired from practice, but remains Professor Emeritus of Psychiatry and Behavioral Sciences at Duke University. Speaking from his home in California, he’d lost none of his passion for the cause. There was an audible gasp when I mentioned I’d been told as a teenager I might need to take antidepressants for the rest of my life. ‘That’s terrible misinformation. Very often the symptoms that present in youngsters are transitory and related to adolescence,’ he said. ‘There’s no evidence that antidepressants work very well for teenagers, and they have considerable risk for that age group. Similarly, ADHD drugs and antipsychotics are widely overused. We’re conducting a population-wide experiment in bathing immature brains in powerful neurotransmitters, not knowing what the long-term impact is, and there’s very little evidence that for most kids they’re helpful.’

For teenagers, Frances said, a ‘watchful waiting’ approach was best. Family stress, developmental issues, peer pressure, exams, and relationship problems can cause acute distress for a short period of time, but that distress doesn’t necessarily constitute a mental disorder, much less one that will last a lifetime. Given what was going on in my life at the time, he said it was possible my distress would have been short-lived if I’d had access to the appropriate counselling. Just because I’d been a chronic worrier since I was a child didn’t necessarily mean I had a psychiatric disorder.

‘Lots of people have anxiety; it can either be crippling or it’s something that you can manage very well, that much is true,’ he told me. ‘The part I object to is the prediction that it looms over your future as a haunting black cloud. We can make prognostic guesses, but there’s tremendous variability, even without treatment, especially in young people, and to make these magisterial comments as if we know exactly how things are going to turn out is just unsupported by facts. It’s arrogant and it can be very harmful.’

When I asked him about Australia’s rising youth suicide rate and how he can reconcile this with the idea that there is a false epidemic of psychiatric diagnoses, he pointed out that not everyone who takes their own life has a mental disorder, particularly impulsive teenagers who can pursue drastically permanent solutions to temporary problems. ‘Often they’re experiencing bullying or have stress at school or they lose a romantic relationship and that can be the trigger. Part of why we haven’t had that much effect on suicide rates is because not all suicides are related to the conditions we treat.’

Frances believes adolescents should be given more time with GPs and access to psychotherapy before drugs are considered. He stresses that he does not doubt that many people — including some children and teenagers — suffer from diagnosable mental disorders that can be incredibly debilitating and may require medication. But emotional distress should only be considered a disorder, he argues, if the symptoms are ‘classic, severe, persistent, and cause considerable distress and impairment’. This means monitoring patients to ensure their emotional state is not transient. ‘People coming in during the worst day of their life often will feel much better in a very short period of time. If they get medicine during their first visit, they’ll be convinced it was the medicine that made them better and maybe feel compelled to stay on it for a long duration when it’s really just placebo.’

He partly blames flawed epidemiology for over-inflating the prevalence of psychiatric disorders, but saves his greatest ire for pharmaceutical companies, which he claims are only slightly higher up the moral food chain than illegal drug cartels. It’s been in their interests, he says, to turn every facet of the human condition into a problem requiring a pill. If people aren’t really sick, it works to their advantage.

‘People with very severe conditions have a placebo response rate under 10 per cent. People with mild conditions have a placebo response rate of over 50 per cent,’ he said. ‘The person taking the pill doesn’t know whether they’re a placebo responder or not, but the most satisfied customers will be those people who are taking a pill they don’t need. The drug companies could not have gotten as rich as they did treating severe disorders because it’s a small market.’

Given what we now know about neuroplasticity and the ability to change our brains through psychotherapy rather than psychotropic drugs, it seems clear the chemical imbalance theory is an unhelpful, catch-all notion for many of the people who have been medicated and given labels that can impact on their whole lives. As Norman Doidge, the psychiatrist and author, discovered, our brains are not fixed. A tough time doesn’t necessarily mean we will be sick forever. We might not be sick at all. When we view our suffering solely through the prism of a biological malfunction, we ignore the underlying drivers of that pain. Feeling the full depth of life’s low points is not something you often see depicted in the neat Hollywood ending, but it does not make you mad or weird or broken. Frances points out that the capacity to experience suffering and pain is a necessary part of the human condition. ‘Evolution doesn’t have sadness and grief built into us for no reason — these are normal, useful emotions. People who don’t feel anxiety get into terrible trouble. The essence of being a mammal is being able to love and attach, and the price of that is a sense of loss.’

I am by no means anti-medication. But what would have happened if I’d been offered more of a choice when I was young and at a crossroads? Perhaps I still would have struggled with anxiety throughout adulthood. Perhaps I wouldn’t. I just wish there was better research on the therapeutic responses to the emotional distress so many of us experience. I wish we had a system that provides the time and space for people to have those conversations — for their life stories to be heard. A system that doesn’t cap therapy at ten hours a year, and that offers solutions beyond the prescription pad. Most of all, I wish we had a system that sees the person, not the label. As Frances told me, it’s easy to give a diagnosis, but it’s very hard to take it away.

The work I’ve done in therapy has helped me see that I am much more than a label. It’s also allowed me to start accepting that the challenges I’ve faced — as tough as they’ve been — do not necessarily make me abnormal. In a culture that puts happiness and perfection on a pedestal, everyone has issues they secretly worry make them a bit weird. So many of us view ourselves as damaged, defective, not quite right. And yet, clearly, most of us are not living out the happiness fairytale that we’ve been sold as the norm. It’s time to flip the script. If everyone struggles, we can’t all be mad. Or if we are, maybe that’s okay. As Veronica pointed out one day in a statement I found strangely comforting, ‘We’re all fucked up. Every single one of us.’ Perhaps, in these challenging times, ‘not quite right’ is the new normal.

As I look back, I realise that in some ways the problem-child label stuck because part of me needed to believe it. Being defective had become my identity. Who would I be without it? I’m reminded of the episode of The Simpsons where Homer gets his hand stuck in a vending machine. Waiting for firefighters to rescue him after being trapped for hours, he is eventually asked by a co-worker, ‘Homer, are you just holding on to the can?’ He sheepishly lets go, removes his arm, and walks free.