Health: The Unbearable Vampireness of Being
There is only one difference between a long life and a good dinner: that, in the dinner, the sweets come last.
—Robert Louis Stevenson
At seventy-six years, Louisiana has just about the worst life expectancy of any American state. Only Alabama, West Virginia, and Mississippi have it worse. People in the top twelve states typically can expect to live to age eighty, with Hawaiians topping the list of longevity at an average of eighty-three years.1 Louisiana’s substandard showing stems from high rates of poverty and crime and from poor education and health care. The opposite generally holds true in the longer-lived states. It doesn’t help that Louisiana has to endure one disaster after another. If it’s not suffering through some sort of pestilence—such as the yellow fever that raged through New Orleans for much of the nineteenth century or the bubonic plague that followed in the early twentieth century—then weather calamities like Hurricane Katrina are throttling it. Such is life in the “Big Easy” and its surroundings.
If the prevalence of death has had any positive side effect on the state, it’s that residents have attuned themselves to its context. “Early on, I got some sense of history and how ages compare and how one of the responsibilities we face in this age is to be conscious of what’s unique to it, insofar as we can, and make intelligent decisions as to what’s available to us,” says Anne Rice, one of New Orleans’s most famous daughters. “You can’t do that if you know nothing about ages past or if you believe lies about ages past. If you’re aware that in 1850 people starved to death in the middle of New Orleans or New York, that’s a dramatic difference between past and future. I’m fascinated by it. Why everybody isn’t, I don’t know, but I am.”2
Rice’s classic novels—Interview with the Vampire, The Vampire Lestat, Queen of the Damned, and many more—predate the current vampire craze. The undead monsters first inked into English by John Polidori, Lord Byron’s physician, now, unfortunately, pervade the culture: schmaltzy movies, teen-angst books, and soft-core porn TV shows. Rice’s oeuvre still stands above most of the genre, however, because it represents a unique approach not replicated even decades after many of the books first appeared. New Orleans, the veritable antithesis of the Anthropocene epoch, framed Rice’s perspective as she grew up there. Modern metropolises have transformed their environs into finely tuned systems of order, but the Crescent City teems with a charmingly antiquated natural chaos. Centuries-old buildings in the French Quarter creak and lean while roots of ancient oak trees burst from the sidewalks of the nearby Garden District. The residents prefer it this way, like listening to a vinyl record whose pops and hisses tell their own story. The city offers a living, breathing reminder of the past and therefore of how far humanity has come.
“The failure of most vampire literature is that the authors can’t successfully imagine what it’s like to be three hundred years old. I try really hard to get it right,” Rice says. “I really love taking Lestat”—her most famous character—“into an all-night drugstore and having him talk about how he remembers in 1789 that not a single product there existed in any form that was available to him as a young man in Paris. He marvels at the affluence and the wealth of the modern world.”
To our caveman friend Blarg, modern humans might appear not unlike Lestat and his vampire kin. We don’t necessarily consume blood to live, nor can we transform into bats, wolves, or mist, but we do have a host of seemingly superhuman powers, much like vampires. Chief among those, to the primitive human, would be our ability to live long lives.
If Blarg were exceptionally lucky, he might have made it to his forties, but he more than likely would have succumbed to pneumonia, starvation, or injury before his early twenties—if he survived infancy in the first place, that is. Life expectancy for humans more than ten thousand years ago was short and didn’t improve much for a long time. In ancient Rome, the average citizen lived to only about age twenty-four. But most counted themselves fortunate to get even that far; more than a third of children died before their first birthday. A thousand years later, expectations looked much the same.3
Over the course of the next eight hundred years, people in the more advanced parts of the world added only fifteen years to their life expectancy. An average American in 1820 could expect to see thirty-nine. Lifespans started to pick up in the early nineteenth century—around the same time that vampire myths were proliferating in Europe—and really sped up in the twentieth thanks to a decline in infant mortality but also because of improvements to health in general. By 2010, the average US life expectancy had doubled from two centuries prior, to seventy-eight years, with similar results in other developed countries. Today, people in Japan live longest, averaging about eighty-two years.4 To a caveman or an average Roman, that would seem like an eternity.
Rice and many others from the Pelican State recognize this perspective. Even with their state’s comparatively low life expectancy, they’re still far better off than most people at any point in history. “I would be dead if we were in the nineteenth century,” says the septuagenarian. “I’m a type-one diabetic; I’d be long dead. I probably would have died three times over from things that have happened to me. But we’re living in the most wonderful age, it’s just the most incredible age because never before has the world been the way it is for us. There’s never been this kind of longevity or good health. It’s incredible to have this many people living in harmony and peace with their fellow human beings, having so many choices of how they want to live and where they want to live, what they want to do with their lives.”
THE VAMPIRES OF OKINAWA
As Rice implies, life expectancy ties closely to economic prosperity, which we’ve seen relates to technological development. Human health, like economic growth, has seen dramatic improvements throughout history on account of technology. Longevity gains haven’t quite become exponential, but they have been profound in recent times and look likely to accelerate even more in the future, in step with economic advancement. We tend not to think about these facts, but they’re vital when assessing whether humanity is better off because of technology.
Countries with the biggest improvements participated in the Industrial Revolution early and saw better food production and fewer famines as well as stronger resistance to and treatment of diseases. As the Organization for Economic Cooperation and Development puts it, “Increases in life expectation are an important manifestation of improvement in human welfare . . . There has been significant congruence, over time and between regions, in the patterns of improvement in per capita income and life expectation.”5
Of course, this tremendous explosion hasn’t happened everywhere. Life expectancy in many African nations still hovers around forty, and in Swaziland a typical person can expect to make it to just thirty-one. The average for the continent as a whole is fifty-two, whereas Western Europe clocks in at seventy-eight.6 On the plus side, the projections for less developed parts of the world, just as with their economic prospects, are pointing in the right direction. Crude birth rates—the number of people born each year per thousand members of the existing population—are plummeting. In 1900, forty-two people were born in the developing world for every thousand already there; by 2050, that figure will drop by almost two-thirds to fifteen new births.7 This decline mirrors what has happened in every developed nation: As each country industrializes and its people rely less on agriculture and subsistence lifestyles, their economic fortunes improve, and families can have fewer children because infant mortality declines and they don’t need as many hands to support the household. In more advanced economies, couples focus more on their careers and providing greater resources to the fewer children they do have.8
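For readers inclined to check the arithmetic, the birth-rate decline is easy to verify from the definition above. Here is a minimal sketch in Python, using only the figures cited in this paragraph:

```python
def crude_birth_rate(births, population):
    """Births per year per thousand members of the existing population."""
    return births / population * 1_000

# Figures cited above for the developing world:
cbr_start = 42.0  # births per thousand
cbr_2050 = 15.0   # projected births per thousand

decline = (cbr_start - cbr_2050) / cbr_start
print(f"Projected decline: {decline:.0%}")  # 64%, i.e., almost two-thirds
```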
The unfeeling science of demographics does mask the unfortunate truth that life expectancy estimates often skew downward because of the large numbers of children who die early. Again, the trend here also looks positive: Infant mortality is declining dramatically in many developing nations. In some African countries, it’s falling by as much as 8 percent a year. The continent overall is experiencing a faster decline than anywhere at any point in history. Like the largely untold news about decreasing poverty in the developing world, this is “a tremendous success story that has only barely been recognized,” according to the World Bank.9 The life expectancy gap between advanced and developing nations is narrowing. In 1950, people in more developed regions could expect to live to around sixty-six, compared to forty-two in less advanced countries. By the end of 2015, that twenty-four-year gap will have declined by more than half, to eleven years.10 Slowly but surely, the world is equalizing in this measure as well.
On the other side of the equation, life expectancy is going up because people aren’t dying as early as they did, and that has almost everything to do with science and technology. Dietitians and food scientists continually uncover better ways to eat, like the traditional Okinawa diet. Even among their already long-lived Japanese compatriots, Okinawans stand out. Living on the group of islands rather than on mainland Japan makes a person five times more likely to hit the century mark because of a typical food regimen low on meat and calories but rich in nutrients. Scientists studying this diet have found that it significantly lowers the risk of heart disease and several of the most prevalent forms of cancer.11 Nutritionists and food producers around the world are absorbing those lessons and passing them on. Indeed, Trader Joe’s stocks seaweed snacks—coming soon to a table near you, seaweed sandwiches?
Conversely, the obesity epidemic is growing, and some demographers believe it could seriously affect life expectancy in many countries over the long haul. So far, it’s a relatively recent phenomenon that has yet to show up in overall numbers. Simple economic growth may end up countering at least some of the problem since plenty of evidence shows that more affluent people tend to eat better. They stop going to McDonald’s and start eating more wholesome, organic foods. A 2013 report from the Centers for Disease Control and Prevention (CDC), for example, found that “as income increased, the percentage of calories from fast food decreased.”12 The other half of obesity comes from a sedentary lifestyle, a trend that may worsen as automation and robots replace more forms of physical labor. One of the biggest challenges facing today’s governments lies in countering this natural drift toward inactivity and pushing for physical education.
Still, many of the top health-related causes of death in the world are declining steeply. Tuberculosis, for example, ranked among the top ten killers in the world in 2008, yet it had fallen off the list completely by 2011. In the United States, its incidence has plummeted since 1900. Developing countries, where the disease prevails now, will follow similar trajectories as their economies improve.13 Treatment of HIV/AIDS, also on the World Health Organization’s top killer list, is improving by leaps and bounds thanks to better drugs and education. The total number of people infected quadrupled from 1990 to 2010, but the epidemic’s growth has stabilized because of better antiretroviral therapies. Overall, new annual infections and AIDS-related deaths are declining.14 As a 2010 report put it, people in their mid-thirties living in developed countries who discover they are HIV-positive today can expect to live another thirty years.15 Even better news is the growing number of people apparently cured of the disease through bone-marrow stem cell transplants.16 The governments of advanced countries now need to figure out how to facilitate such treatments for developing nations, where they’re needed most.
Even certain forms of cancer are not necessarily the death sentences they used to be. Survival times for non-Hodgkin’s lymphoma, breast, and colon cancers have improved dramatically over the past forty years, to the point where scientists are saying they are at “an amazing watershed” in understanding cancer in general.17 There’s still much more work to do with other types of cancer, but the overall trajectory is pointing in the right direction.
MALTHUS VERSUS MOORE
The product of all these improvements—longer life expectancy—freaks out the Malthusians. The good reverend Thomas Malthus wrote in 1798 that exponential population growth would outstrip the world’s food production capabilities, and ever since then people have been predicting shortages resulting in famine, pestilence, and war. It hasn’t happened because improvements in food production technology have also turned out to be exponential rather than linear.
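Malthus’s logic, and the reason it keeps failing, comes down to the shape of two curves. A toy sketch makes the point; the growth rates here are arbitrary illustrations, not historical figures:

```python
# Malthus assumed population compounds (geometric growth) while food
# production grows by a fixed increment (arithmetic growth). If food
# production also compounds, as technology made it do, the predicted
# shortfall never arrives.
population = 100.0
food_malthus = 100.0  # Malthus's assumption: arithmetic growth
food_actual = 100.0   # roughly what happened: compounding growth

for generation in range(1, 6):
    population *= 1.5
    food_malthus += 50.0
    food_actual *= 1.5
    print(f"gen {generation}: people {population:6.0f}  "
          f"food (Malthus) {food_malthus:4.0f}  "
          f"food (compounding) {food_actual:6.0f}")
```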
World War II in particular ushered in a new era of mass food production as techniques and technologies developed to feed troops overseas came to benefit the general population. Freeze-dried foods, microwave ovens, refrigerators, plastic packaging, and other wartime innovations helped stabilize and boost production and distribution. Fast-food chains such as McDonald’s—despite all the negativity they attract—also contributed by standardizing systems and instituting stringent quality testing. If not for them, mass outbreaks of food poisoning likely would happen far more often. The Green Revolution of the 1960s and 1970s also helped: New chemical fertilizers and crop-breeding techniques dramatically increased production capabilities in countries such as Mexico, India, and the Philippines. Even space exploration has translated into better earthly food. Many low-sodium products owe their pedigrees to NASA because astronauts have to watch their salt intake in space. That humble tub of margarine in the fridge just happens to be an indirect byproduct of rocket fuel.
Today, food production is undergoing another technological boom with genetic engineering. It started in 1994 with the Flavr Savr tomato, engineered to ripen without quickly softening so it would last longer on the shelf, but the field has expanded to include insect-repelling corn and vitamin-rich rice. Many people fiercely resisted these genetically modified foods at first, especially in Europe, but acceptance among regulators and the public is growing as predicted doomsday scenarios fail to materialize. GM foods, as they are known collectively, hold the promise of applying technological stacking to food production because altered crops can have not just one but several new traits. SmartStax corn, created by chemical company Monsanto in 2009, offers a good example as it resists both insects and herbicides. Future versions could also require less water or sunlight. Like humans, food is gaining technological superpowers. Last but not least, the converse is also happening: organic foods are becoming more popular. At the moment, they are typically more expensive than heavily processed foods, but as higher prosperity levels allow more people access to them, economies of scale will start to kick in. Organic food, in the near future, will be more affordable and therefore more widespread.
The world’s population hit seven billion in 2011. Another two billion will come along by 2050, for a total of nine billion. Experts believe this growth, combined with better economic conditions in developing nations, means food production will need to increase by nearly three-quarters over current levels.18 That’s a tall order, but production has kept pace so far and stands poised to continue to do so. Moore’s Law is trumping Malthus’s dread.
That figure of nine billion doesn’t reflect just new people being born. It also reflects people living longer. Case in point: about 40 percent of the girls born in Great Britain in 2013 will live to age one hundred, with boys not far behind. By 2060, that proportion will increase to 60 percent.19
In 2003, the UN released a bold report, “World Population to 2300,” which sought to predict the next three centuries of global demographics. The authors stressed that it was difficult to guess past 2050, but they estimated that world population would peak at 9.2 billion in 2075, then decline after that.20 Why? Life expectancy is climbing, but crude birth rates are falling considerably faster. Families in many countries already are falling short of the population “replacement level” of 2.1 children each (just over two, to account for children who die before reaching reproductive age). America hit that exact number in 2010, while many other developed nations—including France, Germany, the United Kingdom, and Canada—already had fallen below it. Families aren’t making babies fast enough to account for the number of people keeling over. People may be living longer, but everyone dies in the end.
Life expectancy, meanwhile, is “assumed to rise continuously with no upper limit, though at a slowing pace dictated by recent country trends. By 2100, life expectancy is expected to vary across countries from sixty-six to ninety-seven years, and by 2300 from eighty-seven to 106 years.”21 Given the trajectory of the past two centuries, those estimates are conservative, which the report acknowledges. “The diffusion of knowledge and technology, which could narrow the gaps between countries, was not factored into the projection methodology. Such an implicit assumption of independent trends does not affect short-term projections, but it seems to affect long-range projections like this one.”22
Technology will affect the gaps and also extend life overall. Many scientists believe we’re on the cusp of discoveries that will add significant years to average expectancy. Harvard geneticist David Sinclair, who is researching resveratrol—an anti-aging compound found in red wine—thinks 150 years will be achievable soon. “We are going through a revolution,” he says. “We might have our first handle on the molecules that can improve health.”
Futurists, those who make a profession of predicting the future, are taking these advances to heart. Singularity prophet Ray Kurzweil, among others, believes that effective immortality lies around the corner. At the rate science is going, he figures that life expectancy will gain a year every year by the late 2020s, which means that if we can make it until then we’ll have a good shot at living forever.23 Kurzweil goes further in his predictions by arguing that the human brain eventually will be reverse engineered, which then will allow for the reconstruction and effective cloning of humans and their personalities. When that happens, we’ll be able to transfer our consciousness into machines and virtual worlds and live forever like the Cylons of Battlestar Galactica.
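Kurzweil’s threshold of a year gained per year lived, sometimes called longevity escape velocity, reduces to simple bookkeeping. The little simulation below is a toy model with made-up numbers, not a demographic forecast: each year a person spends one year of remaining life expectancy, and medical progress hands some fraction of a year back.

```python
def years_survived(remaining, annual_gain, horizon=200):
    """Toy model: each year costs one year of remaining life
    expectancy, while progress adds `annual_gain` years back.
    Returns years survived, or None if remaining expectancy
    never reaches zero within the horizon."""
    for year in range(1, horizon + 1):
        remaining += annual_gain - 1.0
        if remaining <= 0:
            return year
    return None

print(years_survived(30, 0.0))  # 30: no medical progress
print(years_survived(30, 0.5))  # 60: progress buys extra decades
print(years_survived(30, 1.0))  # None: "escape velocity"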
Kurzweil’s is a controversial view. Many neuroscientists believe the brain, personality, and consciousness may never be understood fully. On the other hand, Kurzweil seems right in suggesting that many of his critics can’t see the forest for the trees; specialists entrenched in their own fields often can’t take into account advances in other areas, as we saw with the Human Genome Project. His belief in replicated immortality may get a shot in the arm from efforts to map the brain. In 2013, President Obama launched Brain Research through Advancing Innovative Neurotechnologies—yes, BRAIN—while the European Union is funding the similar Human Brain Project. Doubts will arise about what results such efforts will net, but those results likely will be important and arrive more quickly than some observers expect.
Think of some of the past mind-blowing breakthroughs. In 2012, researchers at Wake Forest University and the University of Southern California developed “memory chips” to turn specific memories off and on in a mouse’s brain. An implanted mouse might remember how to navigate a maze perfectly one minute, but with the simple flick of a switch that same rodent instantly forgot everything it knew.24 Even more impressive was a successful experiment by UC Berkeley scientists to create videos from memories using an fMRI machine. After test subjects watched random YouTube videos, a computer reconstructed their visual recollections with a fair degree of accuracy.25 It’s the science fiction of Total Recall and Inception, and it’s already happening.
The near-term reality amid all these life-expectancy projections probably lies somewhere in the middle. Human longevity will stretch beyond the UN’s technology-blind prognostications and possibly improve significantly through coming advances. Either way, humans are living dramatically longer than even our forebears of only a century ago. This new reality is having and will have major repercussions not just on society and our cultural institutions but on who we are as people.
NETFLIX FOR THE HEART
One question that inevitably arises when talking about living longer is, are we living better? The last chapter answered that question on an economic level, but what about health-wise? A person might live to a hundred today, but what’s the quality of those latter years? Is it worth it if that means living in pain or with severe physical limitations? As Ciaran Devane of Macmillan Cancer Support puts it:
While it is wonderful news that more cancer patients are living longer overall, we also know they are not necessarily living well. Cancer treatment is the toughest fight many will ever face, and patients are often left with long-term health and emotional problems long after their treatment has ended. For instance, of those colorectal cancer patients still alive between five and seven years after their diagnosis, two thirds will have an ongoing health problem.26
At the very least, more of us can expect a daily cocktail of pills during our twilight years. Is that reality, in which we all become dependent on drugs, worth it? Kurzweil thinks so. He admits to taking 150 different pills—mainly vitamins, supplements, and preventative medications designed to slow down aging—so he can make it to the Singularity. Quality of life is purely subjective, he says, until you come face to face with mortality. “I’ve spoken to many 100-year-olds, and if you ask them if they want to live to a hundred and one they will tell you they do.”27
The question of living better can’t be answered empirically unless we consider what used to make us sick and kill us. The top three killers of Americans in 1900—pneumonia or influenza, tuberculosis, and gastrointestinal infections—don’t appear on the 2010 list, banished to manageability along with historic illnesses such as smallpox, scurvy, and rubella.28 Today’s top three—heart disease, cancer, and noninfectious airways disease—stand apart. Unlike their predecessors, they’re not infectious; instead they’re environmental, self-inflicted, or genetic. Some doctors believe that makes them far more treatable, perhaps with the sort of preventative self-care practiced by the likes of Kurzweil. Others think we’re entering a technology-driven health care revolution that not only will beat back some of the worst killers but also greatly improve the quality of life after illness.
Cardiologist Eric Topol researches genomics at Scripps Research Institute in La Jolla, California, and speaks regularly at events such as the annual Consumer Electronics Show (CES) in Las Vegas, Nevada; the Technology, Entertainment, and Design (TED) talks; and Wired magazine conferences. He often presents the same message that he did in his 2012 book, The Creative Destruction of Medicine: The health care industry is undergoing a bottom-up transformation thanks to digital technology that will result in longer and better life.
Topol hasn’t used a stethoscope in years. Instead, he uses a relatively inexpensive handheld ultrasound device, which obviates the need for a formal echocardiogram and lets him share results with patients right away. Nor does he prescribe a Holter monitor, a complicated and uncomfortable heart-monitoring contraption that patients need to wear for long periods of time. Instead, he gives them a much smaller and cheaper monitor that sticks to the skin like a bandage; patients then mail it back to his office. The monitor, he says, is like “the Netflix of heart-rate monitoring.”29 He also applauds the heart-monitoring apps released for smartphones and tablets. He makes a point of asking new patients what phones they’re using. If he doesn’t like what he sees, he gives them some unusual advice. “The prescription I give them is, ‘You need to get a new phone.’ ”
Gadgets and apps like these are proliferating rapidly. With inexpensive heart-rate monitoring and step-counting wristbands such as the Nike Fuelband and the Fitbit becoming hits with consumers in recent years, inventors and entrepreneurs are flooding the market with all manner of self-tracking tools. Such devices—from the June, a stylish bracelet for women that monitors ultraviolet light exposure, to the Muse, a thin headband that monitors brainwave activity and advises on relaxation techniques—are overrunning CES. Sensoria’s Fitness Socks gauge how well you walk, your individual footfalls and gait, then give you an overall sense of your foot health. Kolibree’s connected toothbrush tracks how well you brush your teeth, meaning that you don’t have to wait until the dentist’s office to know how poorly you’re doing it. The HapiFork tracks how quickly you eat and chew and buzzes if you’re going too fast. There’s scarcely a human function or activity that can’t be tracked, measured, and corrected. As Moore’s Law kicks in, all of these gizmos and apps will get better and cheaper.
The proliferation of self-tracking means doctors and individuals are assembling an increasing wealth of data, which inevitably will make health care more personalized. That development runs counter to the population-based method of medicine administration, in which pharmaceutical companies and doctors effectively guess based on sample groups. For much of the modern era, pharmaceutical companies have manufactured drugs after testing them on relatively small groups of people who often don’t represent the patients who eventually take them. Like a fingerprint, every person’s biological makeup is unique, complete with its own nuances and combinations of health conditions. That’s why so many mass-market drugs either don’t work or come with a terrifyingly long list of possible side effects. With better data and the spread of individualized health information, pharmaceutical companies increasingly can specialize and improve drugs and treatments for smaller groups of people, as they have with certain types of cancer and cystic fibrosis.30 Topol thinks diabetes, for example, can be broken into twenty-five different subtypes, with different drugs for each. Such medicine will be more effective and carry fewer side effects.
On the diagnosis side, supercomputer assistants—some derived from the likes of IBM’s Jeopardy champion, Watson—are aiding doctors in crunching all that data to generate better assessments.31 Put all those pieces together, and those longer lives that people are experiencing don’t have to teem with pain and misery. “People don’t want to make it to a certain year, they want to make it to a certain quality of life,” says Topol. “Decreasing the burden of chronic diseases, that’s where it’s at. This will transcend the old dinosaur era of medicine.”
The term “moonshot”—originally used to describe NASA’s Apollo program, which put Neil Armstrong and Buzz Aldrin on the moon—has over the past few years become a metaphor for undertakings by companies and institutions to solve humanity’s biggest problems. In 2013, Google announced one such moonshot: the launch of the California Life Company, Calico for short, aimed at extending life and improving its general quality. It’s an unusual move for a company that makes most of its money from Internet advertising, but Google’s founders insist that they have a responsibility to use their wealth to improve humanity. Calico’s researchers will be looking at health concerns from a different perspective. Google cofounder and chief executive Larry Page puts it this way:
Are people really focused on the right things? One of the things I thought was amazing is that if you solve cancer, you’d add about three years to people’s average life expectancy. We think of solving cancer as this huge thing that’ll totally change the world. But when you really take a step back and look at it, yeah, there are many, many tragic cases of cancer, and it’s very, very sad, but in the aggregate, it’s not as big an advance as you might think.32
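The reason eliminating a major disease moves the average so little is that people saved from one cause of death soon face all the others, mostly at older ages. A rough life-table sketch illustrates the effect; the per-decade death risks below are invented round numbers for illustration, not real mortality data:

```python
# Toy cohort life table: hypothetical chance of dying in each decade
# of life (ages 0-100), split into "cancer" and "everything else."
def life_expectancy(decade_hazards):
    alive, years = 1.0, 0.0
    for h in decade_hazards:
        years += alive * 10 * (1 - h / 2)  # decedents average ~5 years
        alive *= 1 - h
    return years

other = [0.01, 0.01, 0.02, 0.03, 0.05, 0.10, 0.20, 0.40, 0.70, 0.95]
cancer = [0.00, 0.00, 0.01, 0.01, 0.02, 0.05, 0.08, 0.10, 0.10, 0.00]
all_causes = [min(o + c, 1.0) for o, c in zip(other, cancer)]

print(f"{life_expectancy(all_causes):.1f}")  # baseline: ~67 years
print(f"{life_expectancy(other):.1f}")       # cancer "solved": ~72 years
```

Even wiping out every cancer death in this made-up cohort adds only a handful of years to the average, which is the shape of Page’s point.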
As with just about everything Google does, Calico may have entirely unexpected results. Who knew, when Google first launched its search engine in 1997, that it ultimately would become an advertising behemoth? And who knew that Google Maps would eventually spawn Street View, a tool with which we can virtually view the world?
Whether we’re living better is one of the most subjective questions we can ask ourselves since so many factors come into play. Age and era are the biggest. If you had asked a thirty-year-old in the eighteenth century to rate her quality of life, her answer would have differed widely from that of a similarly aged person living in the developed world right now. Yet a ninety-year-old today might feel the same as that thirty-year-old three centuries prior. Neither person would have the necessary context to consider the other’s life.
LIFE AS A COMMODITY
“We’re seeing death in a new way,” says Rice. “Instead of taking it for granted, the people I know see it as a personal catastrophe. I get e-mails from people who are actually surprised that someone has died. They regard it as an injustice. I understand their feelings, I get it, but this is a fairly new perspective on death. Nobody in the 1900s would have regarded death as a personal catastrophe. They would have mourned and might have been grief stricken, but they saw death all around them.”
In that sense, death as an event is increasing in importance, which in turn means that the value of human life is also rising. Imagine how valuable human life will become if we do manage to clone ourselves digitally, as futurists believe we will. How tragic would the loss of a person be then, wiped out by some sort of computer virus when his or her entire being could otherwise have been preserved forever?
In economics, a commodity is more valuable the rarer it is, which is why a finite resource such as oil can fetch top dollar. Human life, measured as time on Earth, works in the same way—but it also doesn’t. If we can expect to live a long time, we may not treasure individual years as much. We might even waste some by indulging in extraneous pursuits, such as sailing around the world or mastering the ukulele. On the other hand, if we live for many years, the value we have to other people, such as friends and family members, tends to increase.
Life differs most from commodities in the value it has for the person possessing it. If an individual has some oil but doesn’t like it, he or she can sell it for a nice profit because other people or entities do value it. An individual’s life, however, doesn’t have the same transferable value. Relatives and loved ones may treasure your life, but ultimately it isn’t worth much if you don’t value it yourself, which is where quality comes in.
A host of factors determines quality of life, beginning with a biological baseline. As the old cliché goes, you don’t have anything if you don’t have your health. But obviously what we do with our lives, the relationships we have, and the degree to which we meet our goals and dreams determine the value we place on them.
At the beginning of Interview with the Vampire, Lestat is a confident and happy undead monster in late-eighteenth-century New Orleans. He creates a vampire family of sorts by siring an aristocrat named Louis and then a young girl named Claudia. They live together happily for a while, but eventually his protégés turn on him and flee. Near the end of the book, in modern times, Lestat is living in squalor, barely alive. Despite his immortality and the automatic fulfillment of his biological baseline—as long as he drinks blood—he’s miserable after years of being alone with the memory of his family’s rejection. The message is clear: It’s not enough simply to exist. Other necessary factors bring value to life.
Most vampire fiction falls firmly into the realm of horror, but Rice’s books read more like psychological case studies; the vampire characters comment on the human effects of technology and progress, echoing themes and ideas discussed in previous chapters. With technology extending our lives and improving our health, the analogy fits better today than ever before: Humans may not be vicious psychopaths who drink the blood of innocents, but in our lengthening lifespans we are becoming more akin to vampires. Age affects Rice’s characters in different ways. Some become wise and contented while others grow vain and egotistical. Which path are we treading as we inch toward immortality?
As a trope, vampires represent a subconscious desire for immortality. Some of us think about dying from time to time and even plan for it, but we’re not good at imagining what realistically might happen when it does occur. “We can’t conceive of our own death,” says Rice. “It’s just not possible, and yet we’re living mortality every moment that we’re alive. The more I see people die and experience the death of those I’ve loved and the more time I spend in sick rooms seeing people die, the more I’m aware that death takes people unawares. There’s no way you can prepare for going into another country of existence or winking out as if somebody simply pinched a candle flame out. We like to imagine we’re vampires because we feel really comfortable doing that.”