22

Alzheimer’s Legacy

Nothing in life is to be feared, it is only to be understood. Now is the time to understand more, so that we may fear less.

Marie Curie, attributed

‘WE’VE BEEN MARRIED fifty-one years. We got married in 1947.’ The old man’s voice echoed from the speakers in the dimly lit studio. ‘The first symptoms were memory loss. She seemed to forget things…’ He looked down, confused. ‘Gradually things got worse and worse… I really reached the end of the road, with having to get up in the morning, bathe her, dress her, deciding what clothes she’s going to put on, and then of course there are the meals to get ready. But the unfortunate thing is…’ He paused, voice crumbling, before letting the grief sink in, ‘is her not being able to even talk to me… It’s worse than anything I’ve come across.’

Professor Nick Fox stopped the video. ‘It’s a dreadful disease,’ he said solemnly, turning to his audience. It was 24 April 2016. I was standing among a crowd of 100 preternaturally quiet Londoners in the Science Museum, watching a clip of one of Fox’s patients. It was 8 p.m.; the museum was holding an evening called ‘Lost in Thought’, a rallying cry for anyone who still hadn’t got the memo. Now, on the screen behind Fox, enormous photographs of Alois Alzheimer and Auguste Deter stood like columns, their story enshrined in the monolith of neuroscience.

Fox has been an integral character in my story. It was he, and his protégée Natalie Ryan, both stellar neurologists at London’s National Hospital for Neurology and Neurosurgery, who had put me in touch with some of the Alzheimer’s families in this book. In addition to seeing patients, Fox now spends considerable time raising public awareness. He recently gave a talk preceding the performance of Nicola Wilson’s play Plaques and Tangles–portraying the life of a family affected by early-onset dementia–and had, in the past year, released simple online courses that the public could access to learn more about the condition.

‘One in three people in this room will get Alzheimer’s,’ Fox continued. ‘One in two will look after someone with Alzheimer’s.’ He paused and turned again. ‘As a society we’re sleepwalking into this.’

And so it’s time to wake up. We can start by reassessing our priorities. Even though the disease is now heavily ingrained in the public’s collective consciousness, Alzheimer’s is still woefully underfunded. In the UK alone, the total cost of Alzheimer’s is £23 billion a year–more than cancer, heart disease and stroke combined–yet, incredibly, a mere 0.2 per cent of that is actually spent on research. John Trojanowski, a researcher at the University of Pennsylvania, has pointed out that the US spends more money on popcorn, Viagra and anti-ageing creams than Alzheimer’s research. This is astounding considering that an additional 7.7 million cases (that we know of) are reported every year. It seems a more horrifying kind of forgetting is taking place in our world. We are forgetting them.

If things continue this way, epidemiologists estimate that the total number of Alzheimer’s cases will double every twenty years, making dementia the next global pandemic. In that event, the current 46 million patients would represent no more than the tip of a vast, society-crippling iceberg.
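For readers who want to see the arithmetic behind that projection, here is a minimal back-of-the-envelope sketch in Python. The 46 million baseline and the twenty-year doubling period come from the figures above; the start year and the smooth exponential curve are simplifying assumptions of my own, not the epidemiologists’ actual age-structured models.

```python
# Back-of-the-envelope projection, for illustration only; not the
# epidemiologists' age-structured models cited in the text.

BASELINE_YEAR = 2015            # assumed start year (roughly 'now' in the text)
BASELINE_CASES_MILLIONS = 46    # ~46 million people currently living with dementia
DOUBLING_PERIOD_YEARS = 20      # 'cases will double every twenty years'

def projected_cases_millions(year: int) -> float:
    """Crude global dementia prevalence for a given year, in millions."""
    doublings = (year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASELINE_CASES_MILLIONS * 2 ** doublings

for year in (2035, 2055):
    print(f"{year}: ~{projected_cases_millions(year):.0f} million cases")
# Prints roughly 92 million for 2035 and 184 million for 2055.
# Estimating the benefit of delaying onset (the figures quoted below)
# requires age-structured modelling, which this simple curve cannot capture.
```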

So after a century of Alzheimer’s research, a journey that’s spanned the globe and brought with it a kaleidoscope of blind alleys, high hopes and stark tragedy, the final question is one that’s been with us from the very beginning. What is the future for this ‘peculiar’ disease?

The answer might strike you as somewhat modest, but the hope, for the time being, is to reach where we currently are with diabetes. Diabetes was often a death sentence a century ago, but the advent of molecular biology, and the subsequent creation of synthetic human insulin from genetically engineered bacteria, has effectively reduced it to a phantom of its former self. Given how late in life Alzheimer’s usually strikes, even a modest delay in onset could do the same for dementia. Consider this: if Alzheimer’s could be delayed by only one year, there would be 9 million fewer people with the disease by 2050.1 A five-year delay, some scientists predict, would effectively halve the globe’s 46 million sufferers, saving healthcare services approximately $600 billion a year.2

So what would this treatment look like? Based on everything we’ve seen in this book, in the future doctors may not need to administer extensive memory tests or rigorous brain imaging. A drop of blood or a strand of hair might foretell when our minds are set to unravel, and a doctor, with a host of genetically tailored pills to choose from, might know precisely which one to give to ensure lifelong brain health. No longer would we be at the mercy of what is written in our DNA, nor feel compelled to live a spartan existence. We would then be facing some new challenge, safe in the knowledge that memory is untouchable.

In the meantime, here is what we know. We know Alzheimer’s is by and large an age-related disorder, but that ageing, in and of itself, is not the root cause. Alzheimer’s is, rather, the product of processes common to normal ageing that have tipped into pathology. We know that plaques and tangles, for instance, are found in elderly people without causing disease; it’s only when they reach a certain threshold that they trigger full-blown dementia. We know that both are necessary–although perhaps not sufficient–to kill neurons, and we know that plaques appear before tangles and are therefore the most attractive target for drug developers.

We know that genetics is central to understanding this disease, and that while only a small minority of people inherit genes for early-onset Alzheimer’s–genes such as APP and PSEN1–many of us possess genetic risk factors that can tip the balance towards dementia, with the APOE4 gene being the strongest risk factor. Undoubtedly, the genetics of the twenty-first century will add a great deal to this story.

But we also know that our understanding of Alzheimer’s is going through a drastic reformation of its underlying principles. In the past decade, as methods to conceptualise brain function have advanced, a more holistic understanding of the disease has been taking shape. Perhaps the most insightful thesis on this so far–‘The cellular phase of Alzheimer’s disease’, by Bart De Strooper and Eric Karran in 2016–argues that the advent of ‘systems biology’, a method using sophisticated statistics and clever computers, should soon provide the means to generate ‘a comprehensive cellular theory of the disorder’.3 By drawing on individual discoveries from the broad canvas of cell biology, biochemistry, molecular genetics and neuroimaging, this approach may have the power to piece the entire puzzle together. Just as the Human Connectome Project seeks to map all the neural connections in the brain, systems biology may yield an ‘atlas that describes the evolution of Alzheimer’s disease’, they write.

Detection is another transformative development. Rather than focusing on the ultimate destruction of the brain, our lens has swung 180 degrees, pointing instead at the early, telltale signs of a brain in decline. Even as I write this, a study showing that proteins in human tear fluid might indicate Alzheimer’s has reached my desk.4 Biomarker findings such as these will continue to move the bull’s-eye for treatment to middle age, if not younger. And while lifestyle countermeasures have yet to be unanimously approved, we do know that an increasing number of experts are encouraging their use every day. I could probably fill a textbook with all the neurochemical processes we’ve had to reorganise and reimagine. Alzheimer’s research, the scientific field once viewed as a fool’s errand, has transmogrified into the grandest of pursuits.

Attitudes still need to change too, of course, for an ageing society can engender as much defeatism as it can defiance. There’s an uncomfortable comparison worth mentioning here, which is that cancer, while causing a similar number of deaths each year, receives on average ten times more funding than Alzheimer’s. Eliminating cancer is vital, but we shouldn’t pour all our efforts into one pandemic only then to be met by another–vanquishing the devil only to meet the deep blue sea. The situation is starker still when comparing Alzheimer’s funding to that of HIV/AIDS. In the early 1990s the public lobbied US Congress to allocate 10 per cent of its research budget to help those with the infection. Consequently, it’s now a manageable disease and the number of deaths has fallen from 45,000 per year in 1995 to 7,000 in 2013. And yet, the US hasn’t readjusted its budget; Alzheimer’s receives less funding even though it’s now a much bigger problem.5 This makes no sense. Research funding should reflect disease burden.

On a simpler note, we need more research into the biology of healthy brains. We all marvel at our ability to build cities and skyscrapers, to appreciate art and music, to comprehend planetary motion and organic evolution, to explore the world’s oceans and send objects into space, but for all our accomplishments we still don’t understand how the organ responsible for them works–or why, with time, it breaks down. As the line between Alzheimer’s and ageing is drawn ever more sharply, a better understanding of the brain’s normal inner workings will be crucial to stop the latter spawning the former.

The good news is that people are waking up. In the UK in 2012 the then Prime Minister, David Cameron, launched the ‘Dementia Challenge’, a government initiative committed to more than doubling research spending on Alzheimer’s, from £26.6 million in 2010 to £66.3 million in 2014. In America, Congress has also agreed to boost Alzheimer’s funding by 50 per cent, approving a $350 million increase in its 2016 budget. In Europe, private industry is joining public–private schemes such as the European Prevention of Alzheimer’s Disease (EPAD) initiative, which aims to create a register of 24,000 people for longitudinal studies and clinical trials. And around the world, big drug companies like Johnson & Johnson, Roche and Novartis are coming back to the table, investing $3.3 billion in research in 2014, according to Forbes magazine, more private funding than in any of the preceding ten years.6 In the end, it all comes down to money.

Perceptions are also changing. Listening to Fox, I couldn’t help noticing the age group of the audience. I’d expected to see mostly middle-aged people, mothers and fathers caring for their elderly parents, perhaps. But these people looked to be between eighteen and thirty. When I asked some of them why they’d come, it turned out it was for the same reason I decided to write this book: to learn what happened to their grandparents. Fox’s campaign was working. A new generation of neurodetectives was here, curious to find the truth.

It is surely this kind of curiosity that lies behind a critical moment in not just Alzheimer’s but all realms of biology: CRISPR (or Clustered Regularly Interspaced Short Palindromic Repeats) was little more than a repetitive sequence of bacterial DNA when it was first spotted by Japanese scientists in the 1980s. But in 2007 it was discovered that CRISPR is in fact a clever molecular defence system protecting bacteria from viruses: when a virus attacks, the bacterium stores a fragment of the virus’s DNA in its own genome, recording the threat, and CRISPR then uses that information to recognise and cut up any DNA with the same sequence.

Over the past few years, scientists have figured out that CRISPR can also be used as a gene-editing tool for humans. That’s because the CRISPR system has two key parts: an enzyme called Cas9, which cuts the target DNA, and a ‘guide’ molecule, which ferries Cas9 to the correct location in the genome. By artificially altering the guide molecule, scientists can in principle add or remove any DNA sequences they want. The technology is in its early days, but the transformative effect it will have on medicine is staggering. Picture it: you go to your doctor and are then referred to a geneticist, who informs you that your son has cystic fibrosis but that they can simply correct the causative mutation and restore the healthy sequence. It will be as if he never had the disease. Or you walk into a clinic with an untreatable and inoperable type of cancer, but the geneticist can use CRISPR to edit your immune cells’ DNA so that they spot and destroy the malignant cells. And of course, the geneticist could explain that you have the APOE4 gene, that it’s a strong risk factor for Alzheimer’s, and offer to swap it for APOE2 instead. While discussing that, they could even offer to screen and edit any other genetic risk factors for Alzheimer’s as well.
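To make that two-part mechanism concrete, here is a purely illustrative toy sketch in Python: a ‘guide’ sequence locates its matching stretch of DNA, the site is cut, and a replacement sequence is pasted in. The sequences below are invented for the example (they are not real APOE DNA), and real CRISPR editing involves delivery into cells, off-target effects and the cell’s own repair machinery, none of which appears here.

```python
# Toy model of guide-directed gene editing, for illustration only.
# Real CRISPR-Cas9 acts on double-stranded DNA inside living cells,
# uses an RNA guide roughly 20 bases long, and depends on the cell's
# own repair machinery; none of that complexity is captured here.

def edit_sequence(genome: str, guide: str, replacement: str) -> str:
    """Find the stretch matching the guide and swap it for the replacement."""
    site = genome.find(guide)      # the 'guide' locates the target site
    if site == -1:
        return genome              # no match found: nothing is cut
    # Conceptually, Cas9 cuts at the matched site and the repair step
    # pastes in the supplied template instead of the original sequence.
    return genome[:site] + replacement + genome[site + len(guide):]

# Invented example sequences; these are not real APOE gene sequences.
genome = "ATGGCCTTTGCACGTAGCTTAA"
guide = "TTTGCACGT"        # stands in for a stretch unique to APOE4
replacement = "TTTCCACGT"  # stands in for the corresponding APOE2 stretch
print(edit_sequence(genome, guide, replacement))
# Prints ATGGCCTTTCCACGTAGCTTAA: the 'risk' stretch has been swapped out.
```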

Film enthusiasts will have already drawn parallels with the 1997 film Gattaca, a futuristic drama set in a world where genetic engineering has reached its zenith and, as a consequence, children routinely have their genomes screened and optimised before birth to ensure a long and disease-free life. I am not overstating matters when I say that CRISPR is the first inroad towards such a future.

When this day comes, and it will come, there will doubtless be ethical issues concerning designer babies: parents may want to modify genes linked to intelligence, physical strength, behavioural traits, even sexual preference–conceivably leading to the other strand of Gattaca’s narrative, in which the technology inadvertently triggers a new form of genetic prejudice and career discrimination, what the philosopher Philip Kitcher calls ‘laissez-faire eugenics’. It won’t be easy to determine where to draw the lines, but as long as we proceed with caution and integrity, that reality need never come to pass.

When Fox’s speech was over, I wandered through the rest of the exhibit. There were scores of cheerful, energetic researchers showcasing Alzheimer’s-themed activities and games for the public. One activity was a giant ‘memory wall’, on which people had written their most enduring memory. I glanced over some of them. ‘Getting engaged in the Sahara desert,’ one person had written. ‘Burning my feet on a beach in Greece,’ wrote another. ‘Having my hair brushed by my mum whilst looking at a blue car.’

In a room next door people were playing a retro-style Alzheimer’s arcade game, saving virtual brain cells by shooting the killer proteins beta-amyloid and tau. There was a game called ‘Operation Alzheimer’s’, a huge plastic mould of DNA that people had to repair by switching genes on and off. There were stem cell scientists displaying pictures of brains in a dish, physiotherapists discussing the impact of sports and head injuries, neuropsychologists waxing lyrical on Terry Pratchett and visual Alzheimer’s. And on and on. Alois Alzheimer would have been stunned.

The most encouraging research right now involves the amyloid-based treatments. In August 2016 the US biotech firm Biogen released early clinical trial results for their new antibody drug, aducanumab, designed to clear beta-amyloid using the brain’s immune cells. In 165 patients with mild Alzheimer’s, a year of monthly injections reduced beta-amyloid levels and slowed cognitive decline. Such news also bodes well for Big Pharma’s new BACE inhibitor drugs, designed to stop amyloid accumulating in the first place. The challenge now is to recreate such victories in larger trials. As for the other research areas quickly gathering pace–tangle research, neurogenetics, stem cell technology, young blood, prion biology, off-label cancer drugs, visual Alzheimer’s, lifestyle influences–the outlook is good, in that they will either contribute key insights to sharpen the blade of amyloid therapeutics, or launch a volley of specialised medications for those who don’t respond to such treatment.

I hadn’t planned to write a book that involved so many separate strands of thought and areas of research, or one that ended so open-endedly. The pragmatist in me believed there was only one true path on the road to a cure, and that if I could only find it, I would have the answers I so desperately sought. But now, walking through the exhibit, I suddenly understood why it had to be this way. There was no single path, no one idea to pursue indefinitely. The march of each idea provided the footing for another. And only when enough ideas converge shall we ever reach the summit. As Sir Edmund Hillary once said, it is not the mountain we conquer ‘but ourselves’.