In the eight hundred years before Christ and the hundred or so after, leading Greco-Roman and Egyptian thinkers put forth an array of sometimes contradictory ideas about aging. Hippocrates cataloged ailments particular to old people and believed medicine had little to offer them, while a key medical text from Egypt around 600 B.C. included “the book for the transformation of an old man into a youth of 20.” Plato’s Republic opened with the elderly Cephalus describing the variability of old age and how often people blamed their problems on old age even though most older people didn’t have those problems. Aristotle advanced his theory of pneuma, in which a finite life force decreased over time, taking with it vitality and the ability to fend off disease and death. In De Senectute, Cicero noted that “since [Nature] has fitly planned the other acts of life’s drama, it is not likely that she has neglected the final act.” The older man, he argued, “does not do those things that the young men do, but in truth he does much greater and better things … by talent, authority, judgement.” Galen asserted that aging was a natural process and only disease counted as pathology. He taught that self-care through diet and behavior could slow aging.
Two millennia later, our response to old age isn’t so different. Google, the National Academy of Medicine, and a host of other public and private researchers, echoing the Egyptians, have launched campaigns to “end aging forever.” Perhaps in agreement with Hippocrates, in 2018 the UK appointed a minister of loneliness instructed to pay special attention to the elderly, and the United States passed the RAISE Act to support family caregivers. Recapitulating Aristotelian fatalism, the medical care of older adults is often unstandardized and unpopular. Although researchers have been required to include women and people of color in their studies for decades, similar stipulations for older adults, a group that uses health services at much higher levels than the young, were only passed in 2018. Meanwhile, in a blending of Cicero and Galen, the terms healthy or successful aging have become the catchwords for acceptable old age, and thought leaders are competing to coin a word to distinguish the younger, fitter old from the truly old and frail.
And that’s only the beginning of the story.
From earliest recorded history, even those who agreed on pathological mechanisms of old age adopted different interpretations of the same findings. Greek doctors considered old people members of the adult group and their age just one factor among many with relevance to diagnosis and treatment. At the same time, while recognizing them as different, they lumped old people of both genders—because they were “too cold”—with children and women, who “exhibited excessive dampness.” Everyone but adult men was assigned a status somewhere between health and illness (and a legal status of less than full competence) as a result of their inherent “pathological dyscrasias,” or bad mixtures of elements. From earliest recorded history, many societies have considered their oldest citizens less than fully human.
After the fall of Greece, numerous advances in scholarship about the care of older adults came from the Middle East. In Arabia in the tenth century A.D., Algizar detailed ailments of aging, including insomnia and forgetfulness, and wrote books on maintaining health in old age. In the eleventh century, a Persian polymath named Avicenna, often described as the father of early modern medicine, published The Canon of Medicine. He advocated health through exercise, diet, sleep, and management of constipation, echoing Galen’s Hygiene. That classic from over a millennium earlier was rediscovered and became a bestseller in the late twelfth century. It went through 240 printings in European and Middle Eastern languages under the title Regimen Sanitatis. Building on Avicenna and Regimen, the Franciscan friar and thirteenth-century physician Roger Bacon revived Galen’s idea of aging as heat loss, also suggested the still-popular wear-and-tear theory, and advanced the Christian idea that behavior determined longevity. His book The Cure of Old Age and the Preservation of Youth was translated into English four hundred years later and well circulated. Little changed over those centuries in Europe, where understanding of illness and aging came primarily from the religious view of humans as immortal and death as the wages of sin.
In the fifteenth and sixteenth centuries, Europeans began taking inductive and empirical approaches to medicine. By observing a range of older adults, philosophers and clinicians concluded that behaviors and interventions could delay and improve but not prevent old age, and also that aging and death were inevitable. In Italy, Gabriele Zerbi’s 1489 book Gerontocomia described physiological changes of age from skin wrinkles to shortness of breath, illustrating that aging was a physical and physiological process. The Italian octogenarian businessman and philosopher Luigi Cornaro, the “Apostle of Senescence,” based his work on self-observation. Although senescence refers to age-related cellular damage and biological old age, Cornaro saw old age as a time of promise and fulfillment. He advocated moderation and personal responsibility for health so people could experience its rewards. His Discorsi della vita sobria was first published in the 1540s, translated into English in the 1630s, and came out in fifty editions through the 1700s and 1800s. That Cornaro lived to one hundred suggests he was onto something.
In Britain, Francis Bacon studied long-lived people and observed that multiple factors, including diet, environment, temperament, and heredity, influence aging and longevity. Studies in recent years have proved time and again that he was right on all counts. The French physician André du Laurens’s 1594 text, Discourse of the Preservation of the Sight; of Melancholic Diseases; of Rheumes and of Old Age, also went through many editions and translations. Its title alone provides insight, calling out vision loss, depression, and arthritis in old age. At regular intervals throughout the nineteenth century, popular books offered rules for extending life, while others, notably William Thoms’s Human Longevity: Its Facts and Its Fictions, provoked controversy by questioning whether anybody had ever really lived past one hundred.
With the Scientific Revolution in the sixteenth and seventeenth centuries, when physicians began dissecting and analyzing the anatomy and pathology of subjects both living and dead, increasingly accurate specifics about the aging body emerged. The preeminent philosophers of the time, including René Descartes and Francis Bacon, like the scientists of today, believed that humans could prolong life and cure disease through healthy living and interventions uncovered through medical research. The Marquis de Condorcet correctly predicted that science would improve the physical health of populations, while Napoléon believed humankind would eventually be able to engineer its own immortality. A few thinkers questioned this goal in various ways. Thomas Malthus raised concerns about overpopulation, and, in Gulliver’s Travels, Jonathan Swift imagined the people he called Struldbruggs living the dispirited, purposeless lives of people absent the ticking clock of their own mortality.
Over those centuries, too, some saw old age as a disease in the Galenic tradition, an intermediate condition between health and illness. In France, François Ranchin’s 1627 Opuscula Medica distinguished between “natural senescence” because of waning heat and “accidental senescence” as a result of disease. In Germany, Jakob Hutter gave away his main point in the title of his 1732 book, That Senescence Itself Is an Illness. According to him, people died of old age itself, and he developed a theory to explain the underlying pathology. With aging, he wrote, people developed a “progressive hardening of all fibres of the body,” which eventually obstructed blood flow and led to “fatal putrefaction.”
Beginning in the eighteenth century, European understanding of the biology of aging rapidly advanced, distinguishing normal aging from disease and recognizing apparently symptomless diseases and organ pathologies in people who appeared to be experiencing healthy old age. This led to the recognition of chronic diseases and of the different presentations of disease in old age. It became clear that death in old age was due not to the waning of invisible humors or heat but to one or many diseases; in other words, old age was not itself a disease. The accumulation of chronic diseases, which might remain silently asymptomatic for years, was documented in 1761 by Giovanni Morgagni in De sedibus, et causis morborum. In 1892 Heinrich Rosin, a German professor (of law, not medicine), wrote, “Extreme old age, with its natural degeneration of resources and the natural decline of organs, is a condition of development of the human body; old-age infirmity is no illness.” Meanwhile, in the United States, Benjamin Rush’s 1793 Account of the State of the Body and Mind in Old Age, with Observations on Its Diseases and Their Remedies noted that old age was only rarely the sole cause of death. Yet even then the messages were mixed. Long-standing preventive regimens for optimal aging persisted even as new ones emerged. Rush also discussed the influence of genetics on aging and the benefits of being married and having a calm temperament.
The nineteenth century brought a significant reconceptualization of aging. This occurred partly as a result of scientific advances, but larger social forces also played a role. Increasingly, the impact of poverty and social policies on health became apparent, and communities and the state were seen as having social responsibilities for their older citizens. In the final decades of that century, the Victorian focus on the behavior of individuals and notions of life as a journey were denounced by modernists as “creating respectable cowards rather than morally empowered individuals.” By the early twentieth century, Americans rejected the earlier religious, metaphysical, and cosmologic explanations of aging and began putting their faith in the biological sciences to explain not so much why we age but how. Understand the how, they reasoned, and you could control it. If you could control it, the why was irrelevant.
Despite these scientific and societal changes, the medical care of the elderly garnered relatively little attention. This was largely due to a belief that older adults were doomed and incurable. The focus on pathological changes with age in the nineteenth century and the emphasis on cures in the twentieth put the needs of many old people at odds with the goals of medicine. There were exceptions to this inattention, particularly by German researchers, including Alois Alzheimer and Emil Kraepelin (who named Alzheimer’s disease after his colleague), and British clinicians. They produced descriptions of dementia and the impact of early life habits on health in old age, as well as an elaboration of the challenges posed by the coexistence of several diseases in older patients, now known as multimorbidity. Still, by the early twentieth century the lines between normal and pathological aging remained unclear.
Most physicians at the time (as now) deemed old people less worthy of medical attention than younger adults who were easier to treat and more fixable. The common approach to their care was neglect, a relatively inexpensive strategy that required little from doctors and had the added advantage of being a disincentive to malingerers. Old patients were confined to beds in dismal surroundings with few activities and scant stimulation, and provided with little more than food and shelter. This approach bred depression, obesity, muscle atrophy, and pressure ulcers, and it persisted until the 1930s, when the surgeon Marjory Warren, “the mother of British geriatrics,” began advocating for the physical rehabilitation of the sick elderly.
Fresh from her residency and given responsibility for 714 patients in a West Middlesex hospital unit, Warren found her new patients “unclassified and ill-assorted.” She created the first “block” of exclusively older patients in the UK and began an innovative approach to their rehabilitation with a multidisciplinary team. Very quickly, she noted that even within groupings of older patients, people of the same age might be radically different in function. They did best, she found, when “nursed with those of equal mental capacity.” She also insisted that “nothing that a patient can do for himself should be done for him,” essentially advocating against the behaviors—still so common today—that breed hopelessness and dependency in the name of expediency, a process so ubiquitous it has a name: “learned helplessness.”
Warren modeled her approach on the already fairly well-established rehabilitation of patients after strokes and found that with appealing surroundings, hope, and help, many older patients could return to regular lives: “The number of patients able to leave [geriatrics] wards varies, I think, immensely with the time available and the work done. Many of the so-called ‘incurable’ cases only need the patience, tact and quiet energy of a staff trained to work with this type of patient to show a considerable measure of improvement.”
In that regard, not much has changed over the last century. Our health system penalizes hospitals if they don’t fix people and quickly send them home, designates just fifteen to twenty minutes for clinic appointments, and doesn’t provide most nursing facility staff with the time, the training, or both needed to help people in ways appropriate to their life stage. This sets up a vicious circle, as age-blind systems lead to bad outcomes for old people, which in turn reinforce people’s sense that they are not worth treating.
Although not an official specialty in the United States until the 1970s, medical interest in the care of older adults periodically surged during the twentieth century, sometimes as a result of medical progress and other times in response to social forces. Advances in understanding of pathological anatomy fueled the first surge in the 1910s and early twenties when medicine began its shift from prevention to treatment. In those years, physicians interested in geriatrics began writing popular press articles about health and aging. With these articles describing new therapies, older people turned to doctors to help them with age-related challenges. The second surge began in the 1960s and eventually allowed the doctors interested in caring for old patients to join a legitimate specialty.
Even a brief glimpse at the long history of old age shows that scientists and philosophers have debated the same questions about aging for over five thousand years: throughout history, the experience of being old has been shaped by economics, social priorities, medical knowledge and technology, and our beliefs about life and health. We continue to try to understand aging scientifically and existentially. There are still people trying to find the fountain of youth and others striving to make the most of life within the constraints that have defined it since the beginning of time. The lines between normal and pathological aging and whether science can “cure” aging remain unclear. What is clear is that the history of medicine illuminates the history of old age, and the history of old age shows that approaches currently touted as innovative or transformational are novel only in the specifics of their how and who and not in their what or why.
SICK
When I was nine and a half, doctors saved my life … twice—I have the scars on my abdomen to prove it.
I was at a sleepaway camp in Colorado when I developed a stomachache. It was my first summer without my parents, so the nurse’s initial diagnosis was homesickness. When I didn’t get better with attention and reassurance, she thought I might have the stomach flu. Finally, after I could barely eat or walk, my cousins—ages ten, twelve, and fourteen—tearfully made the case that there was something very wrong. The nurse took me to a doctor. After examining my belly, he gave her an intense, concerned look. I remember wanting my mother.
With the nurse riding shotgun, the angel-voiced, no-nonsense camp director’s wife drove me and my ruptured appendix over the Rocky Mountains to the local hospital during what would turn out to be that summer’s worst heat wave. This took hours. I lay on a navy-striped camp mattress in the back of a station wagon. It was 1972, and the car and its shocks were old at the time, older than me and definitely not what I needed, with my infected, pus-filled abdomen.
Despite its open windows, the wagon was a box of motionless hot air. The outside temperature that day was 102 degrees, and mine was several points higher. I knew the heat-blurred palm trees, pools of shimmering water, elephants, dogs, and green-blue snakes that paraded across the station wagon’s faded tan ceiling weren’t real and also that there wasn’t much point in telling the adults about them. Though the nurse chattered companionably and regularly checked on me, tension hung in the car as heavy and omnipresent as the heat. The director’s wife drove as quickly as one can on narrow, curvy mountain roads with a sick child on a mattress in the back.
There are many other moments and “facts” I remember, which might be only memories of the stories I later told about that day. For a while, they made me a more interesting nine-year-old.
I don’t recall why the director’s wife drove me, a critically ill child, over the mountains to the nearest hospital, but in those days, in medical crises, men made decisions and incisions, and women provided care.
Each time the station wagon drove over a twig or stone or fissure, it felt as if someone had thrust a burning log into my belly. I tried but sometimes failed not to moan or scream. When a mewl slipped out, I pressed my lips together. I felt so hot. It hurt so much.
They said my mother would be there when I woke up after surgery, but she wasn’t. There were visiting hours then, even for sick children’s parents, and she couldn’t get there from California in time. All that night, as I intermittently awoke in my hospital room, scared and sore and wishing she were there, my mother lay awake in a nearby motel, scared and worried too.
Days later, when a kindly nurse announced that she would help me walk, I laughed and said, “I’m nine, I already know how to walk!” Then I stood and my legs gave way and she caught me. That day and the next, she helped me to learn, again, how to walk.
On the Fourth of July, after the sky darkened and most of the doctors went home, the nurses put some of us in wheelchairs and—against the rules—wheeled us out the small hospital’s sliding front doors to watch. The summer’s fresh night air felt like a balm, and the fireworks seemed as much a celebration of my own brief freedom from the hospital as of our nation’s independence.
At the end of that week, people stared at us as my mother wheeled me through the Denver airport. I insisted on walking into the bathroom stall so at least some of those strangers would understand that I wasn’t the damaged child I appeared to be, someone they obviously found different and upsetting. I wanted to make clear that I was actually an entirely normal child experiencing a temporary setback. I thought, too, how hard it must be to be a child who would never get up from their wheelchair, who knew people would always stare and that their sympathy and horror would be permanent.
Back in San Francisco, I was happier than I’d ever been to see my younger sister and our house. And I was outraged, fifteen minutes later, when my parents said that I had to go back to a hospital. Apparently it had all been arranged; I wasn’t so much going home as moving to a hospital closer to home.
My last two memories of that summer are from the second hospital, ones I wouldn’t fully understand until fifteen years later as a medical student.
In the first, I am on a stretcher on my way to surgery. The gurney is surrounded by people in medical clothes: green scrubs, white lab coats, paper hats, face masks. IV bags dangle over my head, and at the foot of the bed, machines flash numbers with squiggly lines.
We are set to go down to the operating rooms. The elevator doors close, and then, before we go anywhere—or so I think now, looking back—there is shouting and the doors open again, and I’m not sure if we have gone up or down or wherever we are supposed to be going, or why we are stopped, and why everyone is so frantic.
After that, there was the hallway outside the operating room and the white ceiling sliding by above. I was given, through an IV, medicine that instantly rendered me and the surrounding world light, wondrous, beautiful, and unbound in ways that might help a person who is not a drug addict understand why somebody could become one.
I didn’t recognize that elevator scene for what it was until I was a medical student in a hospital seeing someone else try to die. On that day in the summer of ’72, I “coded” in the elevator and was resuscitated. I made it to the operating room, where the surgeon opened my abdomen, cleaned out the pockets of pus that hadn’t been adequately cleaned the first time, and put in drains to ensure that this time I got better.
In the second memorable scene, it is evening and my parents are in my hospital room, keeping me company. Although my covers are up, I look pregnant and my stomach hurts with a severity that makes the belly pain I had on the trip across the Rockies seem like a welcome alternative. I cannot get comfortable. I moan and cry. Nurses come in and out, doling out medications. I cry more, though it makes the pain worse.
The hospital and city are dark and quiet when the surgeon appears, wearing neither scrubs nor suit and tie but slacks and a sport shirt. He talks to my parents and me before examining me. The surgeon says my gut is paralyzed. We must get it moving again, he says, and to do that I will need to walk and move around.
I protest, cry more. My mother cajoles. The surgeon insists. He starts me off slowly, has me lie first on one side, then the other. He positions me atop the bed on all fours like a toddler on the verge of pulling up. Then he has me walk the halls with one of the nurses. I begin to fart. I fart and fart. At some point, the adults go home and I go to sleep.
Is it possible that a few key scenes over a month or two in the summer of 1972 made me a certain sort of doctor? I think so. When I first walked into a pediatric ward as a medical student, a body blow of sensory memories time-traveled me from almost-doctor to small-sick-child and that world of cold walls, tall strangers, acrid odors of medications, antiseptics, and bodies, the endless refrains of beeps, moans, whispers, pain, and unknowing and wordless long, lonely nights. My summer of sickness taught me things a doctor needs to know about what care is and is not, and what it’s like to be sick and disabled in our ability-obsessed world, and how pain can be so bad that you would do anything to make it go away. It taught me what it’s like to be frail and small and vulnerable, and what kindness looks like, and cruelty, and how very much parents love their children, and what medicine can do when its tools fit a problem. And it taught me how wonderful it is to be alive and healthy, and that a great trauma can be transformative in ways both good and bad.
As director of human resources for a large San Francisco medical center, Veronica Hoffman knew a thing or two about doctors. They got up early and they were compulsive. Most likely, she reasoned, they’d get up early on Sundays too.
She waited as long as she could. A little before eight, she looked over at her seventy-nine-year-old mother, Lynne, and picked up the phone.
Across town, I refilled my coffee cup and checked my pager to make sure its batteries were charged. So far, my on-call weekend had been eerily quiet, and I wondered if I’d missed calls. The pager was in my hand when it alarmed, its glowing green display showing a patient’s name, record number and contact information, the caller’s name, and the reason for the call. In most practices, the caller is the patient. In geriatrics, it’s not uncommonly someone else—an adult child, hired caregiver, friend, or visiting nurse. On this particular message, in addition to a daughter caller, the most notable words were: mother not herself and worried.
After I dialed, Veronica answered almost immediately.
“Thank you for calling so quickly.” Her words, although measured and polite, carried an unmistakable undertone of urgency and concern. “I don’t know if it’s anything. The paramedics were here last night—early this morning—and they didn’t think so. I’m probably overreacting.”
I said that if something was worrying her, that was a good reason to call, and I asked her to tell me what was going on.
“My mother and I had a special event planned for yesterday. She was looking forward to it all week. We kept talking about it. We’d even discussed what she’d wear and what time we’d leave. And then yesterday morning she didn’t get up. She just didn’t seem interested in going. It was really strange.”
I murmured to signal that I was paying attention and grabbed a pen and scratch paper while launching our electronic medical record on my computer. Already, I knew Veronica was right to have called.
“She did get up eventually, but all day she just didn’t seem like herself. I asked if she felt sick but she said no. She didn’t even mention the thing we were supposed to be doing. I told her I thought we should go to emergency, but she said no, that there was no need, and she didn’t want to go.”
At this point, I couldn’t resist the urge to interrupt. “What’s she like usually? Is she healthy?”
In particular, I wanted to know whether Lynne had any medical conditions that might offer clues to whatever was going on. Depending on Veronica’s response, not only might the potential explanations for her mother’s changed behavior differ but so might the treatment options and my follow-up queries and next steps.
“Oh, she’s great,” Veronica said. “She has some problems but nothing too serious—blood pressure, heart disease, arthritis, that sort of thing. That’s why she sees Dr. P., but she’s in good shape, mentally and physically. We just live together because we like each other.”
That made me smile. Meanwhile, I debated whether to ask for a few more specifics. Sometimes families and doctors have different notions of what constitutes minor and major health issues. In theory, I could get the essentials of her medical history from the electronic record. In practice, getting into the record from afar required making my way through a series of password-protected firewalls, and I wasn’t in yet. Still, something new was going on with Lynne, and I needed to prioritize determining its urgency.
I asked Veronica to tell me more. “Is she doing basic things the way she usually does them? Eating? Walking? Talking …?”
“She had a good breakfast and some lunch yesterday. Not too much dinner, but she was tired. We did go out for a walk in the afternoon. I thought it was strange that she wanted to walk but didn’t want to go to the event.”
I relaxed a little. If Lynne was seriously ill, she probably wouldn’t have much appetite or enough strength to go out and walk.
“She’s just kind of slow,” Veronica added. “And vague.”
I unrelaxed. “Has she ever been like that before?”
“Never.”
At this point I needed to figure out whether there was a brain problem specifically or if the change more likely represented delirium. Although in everyday life delirium is used to mean a state of self-delusion or ecstatic excitement, it has a completely different and very specific meaning in medicine. It refers to a syndrome of mental confusion that is always costly (to both the patient and the medical system), usually leads to other complications, and diminishes the patient’s chances of a complete recovery. It prolongs the time a person spends in the hospital, leads to permanent declines in general health and mental function, and increases their risk of nursing home placement and death. A seriously ill person of any age can develop delirium, but it occurs most commonly in older patients, especially if they have underlying dementia. Delirium can develop as a result of something as seemingly minor as catching a cold or taking an over-the-counter allergy or sleeping pill. It can happen because of major or minor infections, surgery, a broken bone, a medication, a new environment, and almost anything else.
I asked Veronica whether her mother’s speech made sense.
“The paramedics asked her a bunch of questions about her name, my name, the date and year, how old she is, where we live, and she got them all right. She’s communicating, just slowly. Our walk was slow too.”
Because of age-related differences in disease presentation, doctors often get more relevant and helpful information from old patients when we ask about changes in basic activities. Such questions resemble those used by pediatricians about eating, sleeping, peeing, pooping, and playing. In geriatrics, we use the same first four and add to them questions about mobility, pain, mood, behavior, and how they spend their days. The point is not to infantilize old people but to recognize the biological reality that at both ends of the life span, diseases are less likely to manifest in the ways we define as “standard” and more likely to show up as a change in basic function.
At seventy-nine, Lynne wasn’t in the old-old category chronologically or functionally, but her daughter’s answers confirmed a problem in need of attention. Whatever it was, it appeared to be serious and slowly progressing. We needed to act before it became catastrophic.
“Mom got worse last night,” said Veronica. “She was getting ready for bed and not wearing her pajama bottoms. She never walks around naked. At around ten, before I went to sleep, I saw the light in the bathroom. When I got up a little before one, it was still on and Mom was just standing there. She seemed disoriented and I thought maybe she’d been there that whole time. That’s when I called 911.”
She paused as if expecting me to admonish her.
I stopped typing her words into our record system. “You did just the right thing.”
“The paramedics didn’t think so.”
“They were mistaken.”
“They checked her out and said they couldn’t find anything wrong with her. When I said she seemed confused, they said, ‘Your mother’s almost eighty years old, it’s the middle of the night, what do you expect from her?’ I was devastated. They made me feel like I had done something wrong by calling 911.”
I had to try and behave professionally, even if what I wanted to do was curse and lament, not just those particular paramedics but our entire health care system with its devastating ignorance about old age. I was also already thinking about who we’d need to contact to provide geriatrics training to our city paramedics and all the reasons they would give for not needing it or not having time for it.
“You weren’t wrong.”
Shaming a patient’s concerned relative is, under any circumstance, unacceptable professional behavior. “What happened next?”
She said they looked through her mother’s medications.
That was a good next step. Of the many conditions that can cause delirium, drugs are among the most common offenders. Starting a new medication can do it, as can stopping certain types of medicines abruptly without tapering the dose. Occasionally, people can react to something they’ve been on for years as their body’s ability to process the drug changes.
I scrolled down Lynne’s medication list in our electronic record. But what appears on our official logs and what a patient is actually taking often differ, so I also asked Veronica what her mother was on and if there had been any recent changes.
Veronica read me the list, then said, “The paramedics checked all the bottles and said she seemed to be taking everything except the antidepressant. I guess she stopped that three weeks ago, though I didn’t know about it. They said that was it. That was the problem. They gave her a pill at two thirty this morning and told us both to go back to bed.”
I took a deep breath. Suddenly stopping an antidepressant can cause a bad reaction, but withdrawal symptoms would have appeared within days, not suddenly three weeks later.
“How is she this morning?” I asked.
“Sleepy. And still not at all herself.”
Since Lynne was a fairly healthy seventy-nine-year-old and because the basic activities questions hadn’t led me to any specific diagnosis, it now made sense to run through the review of symptoms, or ROS. After discussing the patient’s primary concerns, doctors use the ROS to fill in details and ensure nothing else is missed. The questions progress from head to toe, grouping organs by location, such as eyes, ears, nose and throat, and physiological role, such as cardiovascular or nervous system.
Veronica answered some questions herself but mostly she repeated my queries to her mother. I could hear faint nos periodically in the background. No fever, cough, shortness of breath. No increased urination or new incontinence. No chest pain. No weakness in a limb or changed vision, speech, or swallowing. No belly pain, nausea, or vomiting. Some diarrhea for a few weeks, but that happened. No blood in her bowel movements.
But then, there it was: Lynne had a headache, and not just any headache. The worst headache of her life. She described it as a ten on a scale of one to ten, where ten is the most severe pain imaginable. That she said she felt fine while experiencing that kind of pain was another worrisome clue.
I told Veronica that we needed to get her mother to the emergency department right away.
“I wanted to take her yesterday and earlier but she wouldn’t go.”
“She’s not herself and she can’t make good decisions. Just tell her what’s happening.”
We agreed that Veronica would phone for the ambulance and I’d call ahead to the hospital. Before we hung up, I told her how glad I was that she had called, that her instincts were just right, and that her mother was lucky to have such a perceptive and persistent daughter. I also promised to give the paramedics feedback on their care of her mother. At the very least, they needed to know that most older adults are not confused, that sudden confusion always indicates a problem in need of medical attention, and that ignoring a family member’s concerns is bad medicine.
After a few hours, I logged back into the medical record. On the CAT scan of Lynne’s brain, it looked as if someone had spilled white ink onto a gray picture. Sometime between when she’d gone to bed Friday night and when her daughter had gone into her bedroom on Saturday morning, she’d begun bleeding into her head. By Sunday morning, she’d had a large hemorrhagic stroke.
Three months later, I recognized Veronica’s name in my e-mail inbox. In her message, she apologized for not writing sooner and thanked me for my help and support. She then gave me an update on her mother, who was finally back home: “It has been a difficult journey, but I was able to come home this afternoon and give my Mom a hug. I was able to ask her what she might like for dinner. I was able to plan her eightieth birthday celebration in September.” Lynne was changed by the stroke, but her life still provided pleasure and meaning to them both.
Veronica admitted she was still stunned by the paramedics’ comments about her mother. The reality is that they probably meant well. They might even have been following procedures. Across the country, the police are called and often make arrests of people with dementia who get lost and trespass or who fight back when “a stranger” (a caregiver they don’t recognize) appears to be trying to take off their clothes or make them go somewhere they don’t want to go. In cities and prisons, older people are shot when they are “not cooperating” because they cannot hear commands or can no longer fall to their knees. In some cases, as with Lynne, they are assumed to have dementia when they do not, and in others they are assumed to be fully responsible for their actions when they are not because of dementia. Yet when e-mailed about a course designed to fill in geriatrics education and training gaps among practicing health professionals, a locally and nationally prominent physician leader said he couldn’t think of anyone who would benefit. In medicine, clinicians often believe that taking care of older patients is the only necessary qualification for taking care of older patients, a logic they would never consider applying to the care of children or people with cancer.
But change is coming. Many police departments, including ours in San Francisco, increasingly recognize the unintended harms to older adults from their lack of geriatric knowledge and have begun training programs. Their efforts may be paying off. In a recent viral news story, a Southern California police officer called by a bank to arrest an upset nonagenarian instead took the man to renew his expired driver’s license, then brought him back to the bank, where he successfully cashed his check.