The Ponce de León Thing
PRIOR TO 1940, PHIL explains, the primary function of medicine was the diagnosis of disease. Prior to 1940, in fact, we were helpless to ameliorate most conditions, especially the many lethal and debilitating childhood illnesses. Before 1940, what doctors refer to as their “therapeutic armamentarium” was hardly impressive. Except for a very few medications (digoxin, thyroxine, insulin), immunization for a handful of infectious diseases, and some surgical procedures, doctors had little understanding of the causes of most diseases, and little wherewithal to alleviate their effects. “Indeed,” as Gerald Grob notes in the prologue to The Deadly Truth: A History of Disease in America, “some of the therapies deployed before 1940 were very likely harmful.”*
Sixty years later, however, there are few diseases for which something cannot be done to alleviate symptoms and to prolong life. Although we have made extraordinary advances in the treatment of diseases such as AIDS, of childhood illnesses, and, as in my case, of some of what were formerly life-ending conditions, when it comes to those diseases that currently account for the greatest portion of our morbidity and mortality—heart disease, cancer, diabetes, the major mental illnesses (schizophrenia, depression, bipolar disorder), many infectious diseases, and the vast proportion of other long-duration illnesses—we remain largely ignorant of their etiologies.
The essential point, my friends emphasize, is that while we have become wonderfully efficient at controlling the symptoms and progression of the diseases they commonly encounter in their clinical practice—heart disease, AIDS, stroke, depression—we have made much less progress in understanding the underlying causes of these diseases and, therefore, in being able to prevent or to cure them.
“The great problem with neurology, for example,” Phil says, “is that we don’t know how to help the brain heal itself. All the other organs and body parts can heal themselves, and we can, to varying degrees, aid these processes better than we used to. But not the brain. Our knowledge of the brain is way behind our knowledge of, say, the heart and the circulatory system.”
My readings in the history of disease, and of medicine, confirm what my friends tell me. David Weatherall, professor of medicine at Oxford and a specialist in human genetics, puts matters this way: “While genuine progress has been made in our understanding of how to manage heart disease, stroke, rheumatism, the major psychiatric disorders, cancer, and so on, we have only reached the stage at which we can control our patients’ symptoms or temporarily patch them up.”
Reiterating what Lewis Thomas wrote about halfway technologies more than a quarter century ago, Weatherall continues: “Our lack of success over the last fifty years, in getting to grips with the basic causes of these diseases, combined with our increasing understanding of the pathological consequences of diseased organs, has bred modern high-technology medicine.* Much of it is very sophisticated and effective at prolonging life, but it neither prevents nor cures many diseases.”
Like my friends, Weatherall is acutely aware of the many gains we have made. “Our ability to patch up patients and to prolong their lives seems to be almost limitless,” he states.* Still, like my friends, he also notes a curious anomaly—that the more we can do to enhance and prolong life for those suffering from diseases whose underlying causes we do not understand, the more this seems to lead to both a “dramatic increase in the cost of medical care” and “a dehumanizing effect on its practitioners and the hospitals in which they work.”
This dehumanizing effect on doctors and the institutions in which they work—the devaluation of those practices that enable a doctor to know and understand patients in their uniqueness, and, when treating their patients, to have the wherewithal and time to be thoughtful in their judgments—is not only lamentable for reasons we usually term personal or humanistic but, as my friends contend, profoundly inimical to the practice and efficacy of medicine itself.
“Insurance companies don’t blink much when it comes to lab tests or procedures for my patients that cost hundreds of dollars,” Jerry explains. “But they won’t pay, or will pay only minimally, for me to sit with a patient for a half-hour or an hour and take a thorough history, or to sit with a patient and talk about the patient’s problems—or the patient’s progress. And if I don’t know my patient well, and my patient doesn’t trust that I know him and care about him, then I can’t be the kind of doctor I want to be and should be.”
Although taking a thorough history and listening attentively to a patient, like caring for people with chronic and disabling diseases, or providing public health measures that help prevent disease, may not seem as heroic or glamorous—as sexy—as a bypass, a transplant, an artificial heart, a reengineered gene, a new “miracle” drug, or some other biotechnological innovation, it is prevention and rehabilitation that, these past hundred years, have made and continue to make the greatest difference in terms of both morbidity (how sick we are) and mortality (how long we live).
Before the advent of the Salk vaccine in 1955, the word polio would spread in whispers through our world each summer: news would reach us that a child we knew—a cousin or distant relative, the son or daughter of a friend, a boy or girl with whom we went to school or to summer camp—had contracted the disease, and would most likely be crippled for life.
In my memory, the polio scare (as in my parents’ phrasing: “There’s a polio scare—let’s just hope it doesn’t lead to an epidemic!”) came regularly each year after school was out in June, and when it did, beaches and swimming pools were closed, I was warned to stay out of crowds, to wash my hands frequently, to be careful about the water I drank, and never to swallow water from a lake or stream. This happened both in Brooklyn and in upstate New York, where my mother, brother, and I either went to sleepaway summer camps (my mother, in exchange for tuition for me and Robert, worked as camp nurse) or rented rooms in a large old farmhouse where there was a communal kitchen, and where my mother’s brother and four sisters, with their children, also rented rooms.
I recall, too, an earlier time—I was perhaps nine or ten years old—when, in my elementary school, P.S. 246, we lined up in the auditorium, the boys in white shirts and red ties, the girls in white blouses and dark skirts, to be given vaccinations. Class by class, we filed down to the front of the auditorium, and one by one we rolled up our sleeves, stepped forward, and received our shots. This occurred a few years after World War II, and we were told that by receiving these injections without complaint or tears we were heroes too: young Americans mobilizing against treacherous enemies—disease, disability, and epidemic—in order to keep our nation healthy, strong, and free.
I remember watching my friend Ronald Granberg—a tall, broad-shouldered, red-headed boy chosen to lead the Color Guard and carry the American flag down the center aisle at the start of assembly each Friday morning—get his shot, take a drink from the water fountain to the right of the stage, and faint straightaway into the spigot, chipping a sizeable triangle from his right front tooth. Twenty years later, our teacher, Mrs. Demetri (who lived around the corner from me, and gave me oil painting lessons in her apartment at night), told me she met Ronald in the street one day when he was a grown man, and that they recognized each other immediately. “Open your mouth, Ronald—” Mrs. Demetri told me she commanded him first thing “—and show me your tooth!”
My friends and I grew up and came of age in a time when, as David Weatherall writes, “it appeared that medical science was capable of almost anything”—in a time when the diseases that throughout our parents’ and grandparents’ lifetimes had been the chief instruments of infant and childhood death, and of crippling lifelong disabilities, were disappearing.
In these pre-AIDS years, Jerry explains, citing the success, among other things, of the worldwide program to eliminate smallpox, the medical community seemed to believe that infectious disease was, by and large, a thing of the past.
“When I was doing my internship, I was one of the few young doctors choosing to go into infectious disease,” he says. “I did it because I wanted to work in areas of our country and the Third World—poor areas—where there was still work to do that might make a real difference, and where these diseases were still taking an enormous toll. For the most part, however, I was anomalous in my choice of specialty. Back then, infectious disease was certainly not considered a promising specialty for medical students and young doctors, either clinically or in terms of research.”
Optimism about the “conquest” of disease—not only the infectious diseases, but all diseases—was widespread. The surgeon general of the United States, William H. Stewart, was frequently quoted as having declared, in 1967, that “it was time to close the book on infectious disease,” and the sentiment was, Jerry confirms, widely accepted as a truism (even though the surgeon general, it turns out, never said it!) and has continued, despite the AIDS pandemic and the emergence—and reemergence—of other infectious diseases, to prevail.*
In a more recent example, we have Dr. William B. Schwartz, in Life Without Disease: The Pursuit of Medical Utopia (1998), asserting that if “developments in research maintain their current pace, it seems likely that a combination of improved attention to dietary and environmental factors along with advances in gene therapy and protein-targeted drugs will have virtually eliminated most major classes of disease” (italics added).* More: a molecular understanding of the process of aging, he predicts, may lead to ways of controlling the process so that “by 2050, aging may in fact prove to be simply another disease to be treated.”
“The virtual disappearance overnight of scourges like smallpox, diphtheria, poliomyelitis, and other infectious killers, at least from the more advanced countries,” Weatherall writes about the post-World War II period, “led to the expectation that spectacular progress of this kind would continue.”*
“But this did not happen,” he explains. “The diseases that took their place—heart attacks, strokes, cancer, rheumatism, and psychiatric disorders—turned out to be much more intractable.”
The more we were able to eliminate many of the infectious diseases that led to premature death, that is, the more chronic and degenerative diseases such as cancer and heart disease replaced them as our leading causes of sickness and death. In the 1880 federal census, for example, neither cancer nor heart disease—our major killers a hundred years later—was listed among the ten leading causes of death.
Throughout the nineteenth century, gastrointestinal diseases, especially among infants and children (manifested largely as diarrheal diseases), were the leading causes of death. By the end of the nineteenth century, in large part because of public health and public works projects (clean water, sewage, sanitation), deaths from gastrointestinal diseases had declined, and tuberculosis and respiratory disorders (influenza, pneumonia) emerged as the major causes of death.
In 1900, neoplasms (cancer) accounted for less than 4 percent of all deaths and ranked sixth as a cause of mortality, while diseases of the heart accounted for slightly more than 6 percent and ranked fourth.* Eleven years later, in 1911—the year of my mother’s birth (she was one of eight children, two of whom died in infancy)—when respiratory diseases and tuberculosis were still the primary causes of death, heart disease and cancer accounted for nearly 17 percent of total mortality.
From 1911 through 1935, mortality from tuberculosis declined steadily, and influenza and pneumonia became, and remained, the two leading causes of death, taking their highest toll among people forty-five years and older, while the figure for heart disease and cancer, combined, rose to 30.4 percent.
By 1998, however, cancer and heart disease had replaced pneumonia and influenza as our leading causes of death, diseases of the heart accounting for 31 percent and malignant neoplasms for 23.2 percent. Of the fifteen leading causes of death, only pneumonia and influenza (3.6 percent, combined) now fell directly into the infectious group, and they took their greatest toll largely from individuals afflicted with a variety of other health problems, many of them deriving from what epidemiologists call “insult accumulation”—the long-term effects of organ damage caused by the childhood illnesses these individuals had survived.*
But we should note that diagnostic categories and criteria are, now as then—especially with respect to heart disease—ever changing. “We didn’t even know what a heart attack was until some time in the early years of the twentieth century,” Rich says. “It hadn’t really been invented yet—not until James Herrick discovered and wrote about it, and it took a while for the medical community to believe him.”
Until 1912, when Herrick published a five-and-a-half-page paper in the Journal of the American Medical Association, “Clinical Features of Sudden Obstruction of the Coronary Arteries,” the conventional wisdom was that heart attacks were undiagnosable, fatal events that could only be identified on autopsy. Although Herrick did not claim he was discovering anything new, his conclusions represented a paradigm shift—a radically new way of thinking about old problems that called conventional beliefs into question.
By comparing symptoms of living patients to those who, after death, were found to have had blocked arteries, Herrick demonstrated that coronary artery disease was recognizable in living patients. At the same time, he offered evidence suggesting that a totally blocked major coronary artery, as in my case, need not cause death, or even a heart attack. He concluded that heart attacks were most likely caused by blood clots in the coronary arteries, and that some heart attacks were survivable.
“Unsurprisingly,” Stephen Klaidman writes, “no one believed him.* The old paradigm was not ready to topple. Herrick said that when he delivered the paper, ‘It fell like a dud.’”
Six years later, in 1918, Herrick provided additional evidence to support his theory, including comparative animal and electrocardiograph tracings that identified the existence of blocked coronary arteries, and this time, Klaidman writes, “the livelier minds in the medical profession finally began to take notice.”
Although Herrick’s theory remained the conventional wisdom from 1920 to 1960, at which time it began to be questioned, it was not until 1980 that another American physician, Marcus DeWood, using a technique unavailable to Herrick—selective coronary angiography—proved that it was, in fact, blood clots within the coronary arteries, and not the slow accretion of atherosclerotic plaque, that caused most heart attacks. Thus was Herrick’s theory, nearly seventy years after he first proposed it, fully confirmed.
Thus, too, Rich contends, do we see how slowly and indirectly it is that we often arrive, in medicine, at the knowledge that allows physicians to be useful to their patients.
“And the most important element in our ability to be useful,” Rich says, “and to continue to test old and new hypotheses, and so discover those things that, as with Herrick, allow us to be increasingly useful, remains what it has been since I began as a medical student: listening.
“Listening to the patient has been, is, and will continue to be, I believe, the hallmark of medical diagnosis, the most fundamental element in the practice of good medicine. Wasn’t it Osler who said, ‘Listen to the patient—and the patient will give you the diagnosis’? Well, he was right. For it is the careful taking of a history—and the active listening and observing that accompanies this—that enables doctors such as Herrick to see what’s really there and what others, alas, too often do not see.
“This,” Rich says, “is what I continue to believe is and should be at the true heart of medicine—the time-honored art of medicine—and, alas, it is fast disappearing.”
In the years before Rich and I were born, and before cancer and heart disease had become our major killers—in the years when infectious and respiratory diseases were still the primary causes of death, and when doctors often had few resources at their disposal other than listening and consoling—the deaths of infants and children were grimly commonplace, and rates of infant and child mortality substantially, grievously higher than they are now.
In 1900, of the fifteen leading causes of death, infectious diseases accounted for 56 percent of the total.* When total mortality from all causes was taken into account, the three cardiovascular-renal conditions—heart disease, cerebral hemorrhage, and chronic nephritis—came to only 18.4 percent.
Between 1900 and 1904—the year my father was born—death rates per thousand for white males and females under the age of one were 154.7 and 124.8. (Comparable rates during these years for non-white Americans—mostly blacks—were more than twice as high.) The mortality rates for white males and females between the ages of one and four during these same years were 17.2 and 15.9, and for nonwhites 40.3 and 30.6.* However, by 1940—two years after I was born—the infant mortality rate had fallen by nearly 75 percent, while in the one-to-four-year age group, the figures had fallen even more dramatically (to 3.1 per thousand for males and to 2.7 for females).* Moreover, infectious disease had become a minor cause of mortality.* Whereas mortality rates for measles, whooping cough, and scarlet fever, for example, were 13.3, 12.2, and 9.6 per hundred thousand in 1900, in 1940 they were, respectively, 0.5, 2.2, and 0.5.
During the first half of the twentieth century, average life expectancy for Americans rose by more than 40 percent, from 47.3 years in 1900 to 68.2 in 1950 (comparable figures for blacks were 33.0 and 60.7). In the second half of the century, figures for average life expectancy continued to rise, and infant and child mortality rates continued to decline, but they did so to a much lesser extent. From 1950 to 1998, life expectancy rose by only slightly more than 10 percent—from 68.2 to 76.5 for the total population, and from 60.7 to 71.1 for blacks, while infant mortality declined from 29.2 deaths per thousand live births in 1950 to 7.2 in 1998. And while, in 1900, more than 3 out of every 100 children died between their first and twentieth birthday, today fewer than 2 in 1,000 do. Moreover, the American Academy of Pediatrics reports, “nearly 85% of this decline took place before World War II, a period when few antibiotics or modern vaccines and medications were available.”* (Note, though, the unexpected finding that, based on 1998 figures, the United States had the slowest rate of improvement in life expectancy of any industrialized nation.)
Just as Rich catalogues the remarkable advances he has seen in the treatment of heart disease since 1959, when he began his medical studies—the advent of monitors that can detect potentially lethal heart arrhythmias, of the cardiac care unit, of medications that break up clots and prevent atherosclerosis, of pacemakers, ventricular assist devices, electronic defibrillators, and of various new surgical procedures (bypasses, transplants, angioplasties, stenting)—so my other friends list the new means they have at their disposal for treating disease and the symptoms of disease: drugs and regimens that control high blood pressure, effective analgesic medications for the management of rheumatic disorders, remarkable diagnostic aids such as MRIs and CAT-scans, powerful medications that can put diseases such as AIDS, depression, schizophrenia, Huntington’s chorea, multiple sclerosis, and various cancers into short- and long-term remission.
Not only can we now prolong life in ways that were previously not possible, but we have, especially in the last quarter century, developed effective ways to enhance the day-to-day quality of the lives being prolonged. Twenty years ago, as Rich and Dr. Hashim acknowledge, little could have been done for me. I would most probably have died, or if not, might well have been seriously disabled for the rest of my life.
But the optimism bred a half century ago by the elimination of many childhood diseases, and by the gains we have made since then, has also, in the practice of medicine, become responsible for dangerous illusions, false hopes, and wasteful policies.
The belief, for example, that all conditions are amenable to “cure”—the various “wars” against diseases that attempt to persuade us that we can “battle” and “conquer” diseases the way we battle and conquer wartime enemies—by “mobilizing” resources, and “attacking” alien invaders (bacteria, viruses)—tends to distort our medical and human priorities, and to show little insight into how the biological world actually works, and how scientific advances come into being.* It also elevates the seeming science of medicine above the art of medicine both by greatly exaggerating the power of technology (often mistaken for and confused with “science”) to improve and save lives, and by falsely dichotomizing the science of medicine and the art of medicine.
One effect of this is that we often begin and end by treating patients not as people—individual human beings with unique histories and identities—but as interchangeable humanoid vessels in which various diseases, along with treatments and cures for diseases, will interact in predictable, uniform ways. Such beliefs are championed by drug companies, medical groups, and hospitals in public relations and advertising campaigns that continually deluge the public with claims made no less dubious and misleading by their familiarity and vagueness.
“Discover the Only Cholesterol Medicine Proven to Do All This,” states a February 12, 2001, full-page ad in the New York Times for Pravachol. There follows a checklist contending that Pravachol will lower “bad” cholesterol, raise “good” cholesterol, “extend life by reducing the risk of a heart attack,” and also reduce the risk of first and second heart attacks, strokes, atherosclerosis, bypass surgery, and angioplasty. At the top of the page, this suggestion: “Clip this ad and bring it to your doctor.” (The United States remains the only industrialized nation that allows prescription drugs to be advertised directly to the public.)*
In widely dispersed print and television ads for Zocor, Dan Reeves, an NFL football coach, confides that “suddenly, lowering my high cholesterol became even more important than football.”* After undergoing emergency bypass surgery, Reeves reports he “had a full recovery, and was even able to coach [his] team in the biggest game of the season four weeks later.” Having learned to “take better care of [himself]” he advises the following: “When diet and exercise are not enough, ZOCOR can help people with high cholesterol and heart disease live a longer life by reducing the risk of a heart attack” (italics added).
Columbia Presbyterian and New York Weill Cornell Cancer Centers claim, in typically militaristic language, that they have been “at the forefront of the fight against cancer” and are now “working together to defeat this relentless disease.”* In “one of the boldest initiatives ever undertaken,” they offer “new hope that the fight will be won” because at these cancer centers “experts” are helping to “uncover genes that cause cancer—essential to conquering the disease.”
And America’s Pharmaceutical Companies, the public relations firm that represents the drug industry (“leading the way in the search for cures”), proclaims that “pharmaceutical company researchers are working hard to discover breakthroughs that will help to make many illnesses and diseases a thing of the past and bring more patients new hope for a better tomorrow.”
Phil is blunt concerning such seemingly unexceptional claims and the false hopes and illusions they inspire, as well as the fact that patients, with increasing frequency, are coming to their doctors and demanding the medications they have read and heard about: mostly what have become known as lifestyle medications (Viagra, Prozac, Paxil, Rogaine) and the statins (Lipitor, Mevacor, Pravachol, Zocor), whose ads repeatedly suggest, in addition to banalities about “new hope,” “new cures,” and “better tomorrows,” what has not been proven: that these drugs will “extend life” and enable us to “live longer.”*
“I call it the Ponce de León thing,” Phil says.* “Everybody’s selling you the fountain of youth—eat this and don’t eat that and you’ll live forever. Take this medication, or exercise so much and so much every day, or have your doctor test you for this and perform that procedure and prescribe this form of therapy or that regimen and you’ll feel better than ever, get rid of all your bad feelings, and live forever. And if these things aren’t enough for you, there’s always cryogenics. It’s insane.”
“The belief that disease can be conquered,” Gerald Grob comments, “reflects a fundamental conviction that all things are possible and that human beings have it within their power to control completely their own destiny.”*
“The faith that disease is unnatural and can be conquered,” he continues, “rests on a fundamental misunderstanding of the biological world. If cancer is the enemy, then the enemy is ourselves. Malignant cells, after all, are hardly aliens who invade our bodies; they grow from our own normal cells.”
“Inflated rhetorical claims to the contrary,” he insists, “the etiology of most of the diseases of our age—notably cardiovascular disease, cancer, diabetes, mental illnesses—still remains a mystery.”
Then too, as my friends explain, not only do most diseases—including those that, in terms of mortality, predominate in our time (cancer and heart disease)—appear to have multiple causes (very few diseases are genetic in origin, and of those that are, most are quite rare, and even fewer are caused by single genes), but they are intimately bound up with the simple fact of aging: that we are mortal, we grow old, and we die.*
Writing in the New England Journal of Medicine about ways publicity for medical research often encourages us to deny the reality of death and aging, Daniel Callahan, senior fellow at the Harvard Medical School and director of International Programs at the Hastings Center, a research institute that addresses ethical issues in health, medicine, and the environment, quotes William Haseltine, chairman and chief executive officer of Human Genome Sciences.* “Death,” Haseltine has proclaimed, “is a series of preventable diseases.”
“The tacit message of the research agenda is that if death itself cannot be eliminated,” Callahan comments, “then at least all the diseases that cause death can be done away with.
“From this perspective,” he continues, “the researcher is like a sharpshooter who will pick off the enemy one by one: cancer, then heart disease, then diabetes, then AIDS, then Alzheimer’s disease, and so on.”
The “thrust of the research imperative against death is to turn death itself into a contingent, accidental event,” Callahan submits, and one result of this way of thinking is that it “promotes the idea among the public and physicians that death represents a failure of medicine.”
“Since we are a self-replacing entity,” William Haseltine informs the New York Times, “and do so reasonably well for many decades, there is no reason we can’t go on forever.”* He explains: “The fundamental property of DNA is its immortality. The problem is to connect that immortality with human immortality and, for the first time, we see how that may be possible.”
When Phil and I discuss my mother, who has been diagnosed with Alzheimer’s disease and has been in a nursing home since 1992 (by which time she no longer knew who I was; for the last four or five years—I am writing this in the summer of 2002, shortly before her ninety-first birthday—she has not recognized even her regular nurses), Phil shakes his head.
“Sometimes I don’t understand why Alzheimer’s is such a big deal,” he says. “As we get older, lots of our systems begin to wear down, and that seems natural to me. In the old days, see, when her memory got bad and she couldn’t take care of herself, Aunt Edith would live with one of her children or a brother or sister, and when people got together she would usually sit quietly by herself, and if anybody asked about her, the family would say, ‘Oh that’s Tante Edith—she doesn’t remember things so well anymore, but she still bakes great strudel.’
“I mean, why are all these young people jogging and working out on treadmills and in health clubs all the time? Why is everyone on these diets all the time? Why do old men take up with young things, and women get boob-lifts and face-lifts? It’s the Ponce de León thing if you ask me—thinking we can cheat the angel of death and stay young forever.
“And the drug companies, with all their power, they take advantage. Sure. That’s the Willie Sutton thing. When he was asked why he robbed banks, he answered, ‘Because that’s where the money is.’ It’s the same with medicine—it goes where the money is. And these days the money’s in Prozac and Lipitor and Viagra. Did you know that nearly a third of all stents fail, and that new studies are telling us that all the chemotherapy we gave for cancer, with the enormous suffering it produced, probably didn’t make any difference in how long people lived? And as for all those cholesterol meds—for basically healthy guys like us, it’s a crock. What do we need to take that crap for, without any proof that it makes a difference, yet knowing for certain that somewhere down the road, as with most meds taken long-term, there are going to be unforeseen, nasty side effects? What’s wrong with growing old and dying is the question I ask.”
To which I reply: Believing what you do, and dealing on a daily basis with people who have migraines and headaches of unknown origin, who have suffered severe trauma and/or irreversible brain damage, have had strokes, and have been struck with fatal, debilitating diseases—why do you do it, and, as I’ve seen through the years when I’ve been with you, how do you maintain such an optimistic, hopeful attitude? What motivates you day after day?
“Okay,” Phil says. “I see it this way. In my specialty I’m always dealing with people who are sick. They’re not cured, because if they were, they wouldn’t be in my office or at the hospital. That’s the given. But the longer I do it, the more I know and the more I can be useful to people. Why be a doctor? Because you make a decent living, you satisfy yourself, and you do good in the world. That’s the beauty of it. Hopefully, you’re helping people—and we do help people much more than when I started out, when we didn’t know that a lot of what we did was harmful. The things we can do now for people are truly marvelous—but we’re often constrained, mostly by the insurance companies and medical groups that want us to spend less and less time with our patients, and to get them out of the hospital as quickly as possible.
“I want my patients to go home—if they have a home to go to—as soon as possible too, but I wind up spending more and more of my time fighting with insurance companies, especially for how much care my patients need after they leave. I mean, look at you: if you’d had a stroke during surgery and were incapacitated, who would have paid for people to be with you in ways essential to your day-to-day life—to your will to live?
“But you’re always learning, and that’s what I love—I wake up each morning knowing there are going to be new challenges, and new things to learn, and that I can be useful to other human beings.” Phil shrugs, says again what he has said before: “For me, that’s the beauty of it.”
While my other friends also talk about the beauty of a life in which they are constantly learning new things, and while they talk about the struggles and rewards they experience in trying to be useful to others, they also, like Phil, lament the devaluation of the doctor-patient relationship. They do so, not because they are nostalgic for some idealized and illusory golden era when family doctors with warm bedside manners made house calls and had their offices in their homes (as most of the doctors I knew did when I was growing up in Brooklyn, their wives often serving as their nurses or receptionists), but for decidedly practical reasons: because it is only by carefully listening to and examining a patient, by putting a patient’s symptoms and concerns into the larger context of the patient’s individuality and history, and by considering the individual patient in the context of their own knowledge and clinical experience, that they believe they have a good shot at an accurate diagnosis and a beneficial treatment plan.
Because Rich listened carefully to me over a period of time—because he knew me—he was, even though three thousand miles away, better able to gauge the exact nature and true gravity of my condition, and thus to urge me into treatment at once (and then, along with Jerry, to persist in choosing and getting the best possible care for me), than were the doctors who actually saw me and examined me in Northampton.
“But they weren’t seeing you,” Rich says. “Instead of seeing you and listening to you—and hearing what you said: the nature of your pain, its precise location, its comings and goings, its progress over time—they ran more tests. And tests have an aura of scientific certainty—especially if they come out of a computer, right? Oh there’s nothing ‘subjective’ there!
“But they weren’t seeing you, my friend,” he says again. “And the more our technologies evolve, and the more we rely on them—and they can be wonderfully useful, let me assure you—the more we’re in danger of not paying attention to the human being in front of us. So that if we think the machine knows more than we do—or rather, if we begin to think we can never know as much as the machine does—if we stop trusting those instincts and that knowledge based upon a lifetime of study and of seeing patients—then we are in real trouble.”