Introduction

You have to begin to lose your memory, if only in bits and pieces, to realize that memory is what makes our lives. Life without memory is no life at all. Our memory is our coherence, our reason, our feeling, even our action. Without it, we are nothing.

—LUIS BUÑUEL

It is singular how soon we lose the impression of what ceases to be constantly before us. A year impairs, a luster obliterates. There is little distinct left without an effort of memory, then indeed the lights are rekindled for a moment—but who can be sure that the Imagination is not the torch-bearer?

—LORD BYRON

THE PROCESS OF WRITING THIS BOOK, THE PHYSICAL act of putting it together from diaries, scribbled notes, books about the mind, and concentrated bouts of introspection, has proven an illuminating exercise for me, demonstrating just what it is that dementia takes away. (Answer: everything; every last thing we reassure ourselves that nothing could take away from us.) The way the brain works, the supercomputer folded modestly into every human head, marshaling its forces, making connections, prompting and synthesizing, is dazzling and extraordinary and yet seems every day perfectly unexceptional and ordinary to us. There’s nothing we take more for granted. In recording the decline of somebody with dementia, and seeing her preoccupations grow narrower and narrower, and her intellectual pathways block off, I’ve found myself preoccupied with unexpected things, more and wider things, my mind disappearing down all kinds of unforeseen alleys, which has been exhilarating but also poignant. I’m left feeling a profound gratitude to the life of the mind, how associative it is and how rich, in its leading on from one thing to another, into that whole interior landscape of yoked-together and often incongruous thoughts that adds up to a self. This book has turned out to be as much about the unraveling of a caregiver as it is about the person cared for, but its starting point was wanting to write about Alzheimer’s and about life with an Alzheimer’s sufferer, my mother-in-law, Nancy.

We spent many years looking after Nancy at one remove from us, a responsibility made more stressful by distance, and then at closer range—in a big Victorian house in a remote part of Scotland, with Nancy and her disabled husband, Morris, living with us and our three children. The house was not an ordinary one and, in a way I didn’t anticipate, became another character in the story. It was an imposing, drafty mansion on a wild, near-treeless headland. We moved there specifically to attempt an extended family; when that failed, we had little choice but to leave. The official gloss put upon this exit is of the “phases of life” sort: job done, time to go. The private verdict is soaked marrow-deep in defeat.

I’m aware that in many ways this is a story about privilege. We could afford (could convince the bank we could afford) the big house and the part-time help, and when push came to shove (and it did, literally), my in-laws could come up with the fees for a good nursing home. But there are monetary consequences to caregiving, above and beyond the obvious weekly bills, and there has been a real financial hangover that we’re still working through, brought on by months and years of having no choice but to put work second.

Well, so what, you may be thinking. You took in your husband’s parents. Boo-hoo. Big deal. Across other, more populous continents, three-generation households are the norm, after all (the Asian three-generational photograph is lodged reprovingly in my brain), and they will likely become more commonplace here, as the care crisis bites harder. It’s pretty clear that it will bite. The world seems to be in the grip of a dementia epidemic. Here in Britain, there are 820,000 people who’ve been diagnosed with dementia, two-thirds of them women, and the figure is rising sharply. In the United States, it’s more than ten times that number. Of these, according to the Alzheimer’s Association, 5.3 million have Alzheimer’s disease. There are estimated to be more than 35 million dementia sufferers across the globe, with over 65 million forecast for 2030 and more than 115 million for 2050: the figures near doubling every twenty years. That’s why the phrase “dementia time bomb” is beginning to be used. The devastating extra sting of dementia is that, unlike heart disease and cancer, it doesn’t shorten life. It’s a cruelly lengthy business. The changes in the brain can begin twenty years before a formal diagnosis, and the average life expectancy afterward is eight years.

Alzheimer’s disease is only one of many varieties of dementia, though by far the commonest. Over 60 percent of diagnosed dementia sufferers have Alzheimer’s disease. Back in 2002, BBC News reported that more than 40 percent of UK home caregivers of someone with Alzheimer’s had been forced to give up work in order to look after the person. In the United States, 10 million people act as caregivers to someone with dementia and millions more offer support. About the same percentage of American caregivers are not employed, and two-thirds of those who can manage to hold down a job report major disruption to the workweek. I quote these statistics as a roundabout way of answering my own question: Why write this book at all? There were several reasons. One of these was to share in my own revelation, hard-earned, that Alzheimer’s isn’t just about memory loss; that memory loss isn’t just about memory loss, but leads to disintegration. I wanted also to kick the system ineffectually in the shins; to give a glimpse into the dementia abyss; to show that for every “client” in the statistics there are one, two, four, six others (aka the family) whose lives are blighted in addition; in short, to give a little insight into the reality that ensues from the apparently noble idea (the noble, and for the country’s financial bottom line, far preferable idea) that the elderly ill should stay at home whenever possible.

Question: Do governments understand just how dehumanizing Alzheimer’s is? (A rhetorical question. Answer: no, or they wouldn’t withhold good drug treatments or limit research programs on grounds of cost.) Question: Does anybody who hasn’t been through it understand just how dehumanizing caregiving can be? (A rhetorical question. Answer: no, or there would be proper nursing home provision and it would be free.) As things stand in the United Kingdom, dementia patients in nursing homes, unlike cancer patients in hospitals, are regarded as “social care clients” and charged hotel rates, and if they have savings and houses must give them up to pay the bills. We British may regard ourselves as two steps ahead of the United States in the matter of universal rights to health care, but when it comes to dementia, the two systems are very alike. Medicaid will step in and pay for residence in a nursing home only if the ill person’s own assets have dwindled away almost to nothing, and it’s pretty much an identical situation in the United Kingdom. Once the money runs out, the ill person’s house is likely to be sold to pay for care, unless a spouse or dependent is still living in it. Even if American houses can be placed out of risk in the short term, certainly they are at risk after the owner with dementia has died, via estate recovery (the rebate of nursing home fees in arrears to Medicaid—a policy that’s pursued energetically in most states). Advice about loopholes in the system that allow a family to hang on to a loved one’s home long term has grown into an industry, and almost every American Web site that talks about costs and rights to do with dementia suggests consulting an attorney. It’s a system that’s good for lawyers: in other words, bad law.

There’s also a selfish answer to the why-write-the-book question. I’m one of those who have found work incompatible with caregiving, even work that I have always done at home, sitting at a table by a window, or slouched uncomfortably on a sofa, laptop at a precarious angle, mediating children’s interruptions—work that you might assume would be ideal in the circumstances. It’s more than economics in my case. Writing is more in the way of a compulsion. It may even be a psychiatric disorder. If days pass dryly—that is, without sentences being made and remade—I find that I begin to drift into the arena of the unwell. Throughout my years of caring for Nancy, the drive was there to produce something salable, but other than the occasional article, the content wouldn’t follow the impulse. Following an early career producing sensible nonfiction and then a long hiatus while having and raising children, I was supposed to be cutting loose and writing a novel—and, on the face of it, I was immensely productive, almost manically so. I wrote two and a half novels. I wrote them in a rush, thinking, I can make some money at this (almost a guarantee of failure). The two finished ones were bad, superficial, studded with frustrations like cloves in an orange. The half is still a half, stopped, stalled. The muse left me. She did it quite abruptly, though things had been sticky between us for a while. After that, all I could seem to write about with any passion or conviction was my mother-in-law. Writing about her was sustaining through the dark days of creative roadblock. It was, to be blunt, a way of not cracking up.

This might also be the moment to tell you that names in the account that follows have been changed. Nancy is beyond minding or even registering the fact that she’s the subject of what you might call an unauthorized biography, and changing names provides only a tissue-paper-thin layer of anonymity, but it feels right, nonetheless.

A lot of what follows is taken from unedited diaries, which accounts for the use of the present tense and also for the emotional rawness of some passages. While filling the diaries, I used some of the entries in a newspaper piece I wrote about Nancy. It was straightforward and at moments graphic about her problems (and ours), and this didn't go down well with online commentators. Their chief complaint had to do with my having written intrusively about my mother-in-law without her consent. Even by then Nancy was long past the point of being able to consent to anything; she found the choice of Weetabix or cornflakes baffling enough. Intellectual competence aside, the argument remains that whatever the truth about rights, it's in bad taste to write in such unsparing detail about another's decline. The daughter of former British prime minister Margaret Thatcher has been pelted with rebukes since disclosing her mother's dementia. Her critics insist that the disease should be "kept in the family," which is only a short hop from suggesting that it's stigmatizing and shameful. Tony Robinson, the actor who played Baldrick on the popular BBC show Blackadder, was accused of something similar when he let UK Channel 4 make a documentary about his mother's last weeks. His response was robust: no, quite the opposite; he was proud of the program. There's a public service element to allowing media access, even if it might appear to the viewer to be cloaked in voyeurism. Those of us who have loved ones embarked on the dementia journey—and it is a journey, with clearly defined stages—publicize the details of their decline not despite our love, but in large part because of it. In some cases, the love is shared with the nation at large. In the case of Ronald Reagan, the announcement of his Alzheimer's disease by means of an open letter in 1994 (a poignant and brave last message that marked the end of his public life) prompted a whole new national surge of affection for the former U.S. president.

SCIENCE STILL ISN’T sure precisely what triggers Alzheimer’s, though things are moving so fast that the mystery may be solved by the time you get to read this. (In fact, the pattern in the last few years has been that they move fast and get nowhere much.) What’s uncontroversial is that Alzheimer’s brains show the presence of two weird and provocative things: (1) a wild overproduction of beta-amyloid, a naturally produced and usually soluble protein, contributing to sticky blobs called plaques and (2) the knotting and snagging of the tau protein that forms the “rungs” in the communication ladders within brain cells into tangles. The race is still on to determine what the definitive cause is.

An adult brain has about 100,000 million nerve cells, individual neurons that each look rather like the branching root of a tuber pulled out of the ground—tubers of different shapes according to flavor. A good analogy, put forward by the Oxford professor Susan Greenfield, is to think of the brain as the Amazon rain forest inside your head. In the Amazon rain forest’s 2.7 million square miles, she says, there are about 100,000 million trees. Imagine all that foliage condensed into the size of a cauliflower within your skull: 100,000 million tiny trees, making a dense neuron forest. Our memories and our thoughts travel through the forest as encoded electrical signals. The “roots” of the neuron are called dendrites (from the Greek for treelike), and its stalk (trunk) is called an axon. The information comes in to the neuron via the dendrites, into the soma (cell body)—that’s the front door—and then goes out the back door, travels up the axon, along parallel lines of communication called microtubules, and out the other end at branches called synaptic terminals. This information moves, in tiny leaps, from axon to dendrite, from one neuron to the next. How does it do that? For a while there were two camps of conjecture, spark versus soup. The sparkers, who believed in an electrical leap, lost out in the end to the soupers, who thought that the composition of the soup was key. The spaces at which the crossing is made are called synapses, though they’re more like ports than spaces—ports at which clusters of neurotransmitters are waiting as a chemical transport system. Subsequent research has shown that there are indeed electrical as well as chemical synapses in the brain, though the electrical ones are heavily outnumbered. The number of dendrites and synapses varies hugely according to the neuron’s function, but on average a neuron is thought to have around 7,000 synaptic terminals. Multiply that by 100,000 million and the mind begins to boggle.

In photographic comparison, a normal brain resembles a freshly peeled chestnut, pale and fat and glistening, and a brain with advanced Alzheimer’s disease looks rather like a walnut, shrunken and shriveled with bits apparently eaten away. The disease takes place as a physical invasion, involving the progressive destruction of the neuron forest. Under the microscope, the damage is theatrically obvious: there are plaques—fuzzy, rust-colored accretions of protein fragments—which interfere with the transport network, and tangles, which look rather like strands that have grown over the neurons, like bindweed in a garden, though in fact they’re a distortion of the neuron wall itself, its microtubules having collapsed into knots. As cells wither and die, gaps form in the tissues, leaving characteristic holes. American researchers working with the new generation of scanners, and thus able for the first time to look into the brains of living Alzheimer’s patients, have found that the disease starts in or adjacent to the hippocampus (the memory-processing zone) and moves farther into the limbic system (our emotional nerve center); around eighteen months later, it has crept into the frontal lobe (site of the thinking, reflecting self). The disease always starts in the same place and takes the same general route, but proceeds unevenly in its spread. Some sections of the brain will be decimated, but neighboring ones might be unaffected and normal. It’s rather like a forest fire in which clumps of blackened stumps sit adjacent to trees that seem oblivious to the disaster, untouched, their green canopies intact.

The term dementia (from de mentis, “out of the mind”) was coined in 1801 in the asylums of Paris. Today it is used to mean brain failure, and in just the same way that heart failure is a condition caused by a whole host of problems, brain failure has many sponsors. One in fourteen UK citizens over sixty-five has some form of dementia and one in six over eighty, but for UK citizens reaching the age of sixty-five in 2010, the risk of developing dementia is one in three. Almost one in six Americans aged sixty-five will go on to develop dementia, and more than one in five aged eighty-five. And that’s the trouble with it, in terms of PR. It’s an old person’s disease, by and large, and elderly ill people aren’t easy to “sell.” The issue is confused by our muddle about what’s normal in old age—the idea that senility is an ordinary part of the human condition, that it is aging itself made manifest, and thus can’t be cured. Progress is slow.

Research funds aren’t generous, despite the fact that currently dementia costs the United Kingdom about £23 billion a year and the United States a staggering $148 billion just to deal with damage limitation and long-term care. Unpaid caregivers, their lives transformed into a round-the-clock vigil, are saving the British government about £12.4 billion. In just one year (2008), the economic value of unpaid caregiving in the United States was estimated to be $94 billion. Two-thirds of UK citizens with late-onset dementia are living in a family home; about 70 percent in the United States. Both figures are probably higher when undiagnosed cases are taken into account.

In the United Kingdom, only £61 is spent on research per Alzheimer’s victim, though the amount is £295 per patient for cancer. In the United States in 2008, $5.6 billion was spent on cancer research, but only $0.4 billion on dementia science. Cancer has higher cultural status, even, perversely, a twisted, dark kind of glamour. Plucky young people get it, pop stars battle it, pretty wives and dashing young husbands die of it, and their pictures are spread across the newspapers. Cancer is a disease that journalists get and write about on the premise that if life hands you lemons, make lemonade. People with dementia don’t write about it much because writing isn’t something they do—or wasn’t, until recently, when the very-early-diagnosed patient lobby sprang into being and people like the writer Terry Pratchett began speaking out. The much-loved author of the Discworld novels, a man who’s sold 55 million books worldwide, allowed a BBC TV crew to follow him for twelve months. The resulting television program (Living with Alzheimer’s) charted unsentimentally the beginnings of his decline, his defeat by the attempt to tie a knot in his tie, his having to pause in giving a reading because he found that a “shadow” was falling repeatedly across the page. This is the kind of cultural event that introduces people to the idea that dementia has something to do with them. It will be a long road. In general, the Alzheimer’s demographic and its symptoms have meant that it’s very low caste—something that, even now, we associate with decay and the cabbage-and-disinfectant scent of the geriatric ward.

There are widespread misconceptions about the disease. Uncertainty is the midwife of misconception. The trouble is, nobody knows for sure what triggers Alzheimer’s. All we can hope for is that keeping fit, doing crosswords, and eating well will spare us. They don’t, necessarily. The illness of the writer and philosopher Iris Murdoch attracted so much interest because people were amazed that someone like that could fall prey to Alzheimer’s, someone so clever, articulate, affluent. We live in an age-defying, mortality-denying culture. We don’t believe in ourselves as elderly. We’re interested in cancer and the carcinogenic because those are words that might turn out to apply to the thirty-eight-year-old as much as the seventy-eight-year-old; cancer afflicts the young and rich and fit. If Alzheimer’s equals old age, then that’s something we’ll deal with later … though we’ll be fine, because we drink soy milk and do Sudoku and play tennis on the weekend. The most widespread misconception is that dementia’s a good way to go: “They’re in their own little world and pretty happy,” the misconception goes, and “they’ve no idea they’re going to die of it right up to the very end, which doesn’t sound too bad to me.” Very occasionally and exceptionally, in the online Alzheimer’s community, sweet-tempered-to-the-last is reported: the slow-fade sweetie who was never any trouble and died smiling in bed before indignity could take hold. But that isn’t the norm. That hasn’t been Nancy’s fate, alas.

IF I HAD to pick one catchall descriptor for Nancy’s life in the last few years it would be misery. Profound misery, unceasing and insoluble. She knows that something is wrong, very wrong, but what is it? She’s had a series of terrible daily encounters with herself and her environment that might have come directly from an amnesiac thriller: waking to find she has aged fifty years overnight, that her parents have disappeared, that she doesn’t know the woman in the mirror, nor the people who claim to be her husband and children, and has never seen the rooms and furnishings that everyone around her claims insistently are her home. Time has slipped, gone seriously askew. Every day for her is spent in an ongoing quest to put things right. The trouble is, she can’t seem to concentrate on the question or on possible clues to it. She can’t navigate the problem. When she left us for the nursing home, she was daily engaged in a very protracted, slow-motion form of panic. It’s been over eight years now since the formal diagnosis and eleven years at least since symptoms began, but even after all this time, she’s only at stage 6 of the disease. Stage 7 looms, the cruelest and last phase, with its loss of continence, motor control, speech, and ability to swallow. Eventually her lungs will forget how to breathe, her heart forget how to beat, and her quest will come to an end.

I have thought, said, and probably even written in here somewhere that Nancy has lost her self. That at least is the impression anyone who knew Nancy twenty years ago would have if they spent a weekend with her. The things that made her herself are all but gone now, I say, but what does it mean to say that? Obviously she is still herself, isn’t she? She isn’t anyone else. It’s just that the self is changed. Disease has changed it, or else, in some vaguely science-fictional manner, overlaid it with something new. But what exactly is the self, anyway? Must it have unity, continuity, in order to be authentic? Does it exist beyond and beneath the health or otherwise of 100,000 million neurons? Is there something else that encapsulates the self, something extra, indefinable, that we call the soul? If, as some philosophers of the mind argue, being conscious can’t be said to be without content, that it has to do with being aware of not only your own person but also your past and future, your place in the world, your culture and context, your hopes and fears, then where does that leave Nancy? John Locke may have come up with the notion of “consciousness” specifically to spike Descartes’s idea that we are thinking all the time, even when sleeping, but Locke also thought that we are only ourselves in having our memories, and defined personhood accordingly. Locke’s definition, being antique, is easy to forgive. It’s surprising, though, to find much more recent definitions that agree broadly with his. As late as 1973 an American philosopher named Mary Anne Warren demanded of persons that they be conscious, rational, capable of abstract thought, able to communicate, able to exercise free will, and have self-awareness. Under this severity, nobody with brain damage is a person, and Alzheimer’s, so often misreferred to as a mental illness, involves a catastrophic form of brain damage.

Materialists would contend that there is no soul, that we are only a kind of organic machine, our notion of a unique self misguided. It’s difficult not to be convinced by this idea, seeing Nancy’s selfhood warp and flicker and wane as the disease colonizes her. It’s not good—not even for privileged bystanders, counting their blessings—to see a self under attack. We prefer to think of our selves as something original in the world, inviolate, independent of our physical bodies. The idea that we are biochemistry, and that’s all, that thoughts and feelings are produced by neurons, that neurons can die and our selves die with them … that’s a deeply undermining idea. It’s far more comforting to contend that Nancy’s soul, her essential self, remains intact beyond the reach of her struggle to think and express herself, and will be liberated and restored by immortality. I try hard to believe this when I see her, alone in the dayroom in the nursing home, sitting rubbing her hands together and muttering. I can’t help wondering what she’s thinking. Is she thinking? Is she having a dialogue with her disease, negotiating with it in some way, aware of the great buried store of memory, her past, her self, glimpsed under the tangles of Alzheimer’s like a ruined house under the suffocating grip of ivy?

Now that she’s at one remove from us again, it’s easy to love her, and where love falters, guilt is primed and ready to fill its place.