IN 1908, ONLY FIVE YEARS after the Wright brothers began flying, H. G. Wells published a book titled The War in the Air, in which Germany starts an air war against Great Britain, France, and the United States.
To say Wells had a penchant for prescience would be a vast understatement.
In 1914, the Institute of International Law tried to ban the dropping of bombs from flying machines,1 but it was too little, too late. Giant German “Gotha” planes began bombing Great Britain in 1917. That year, 180 miles west of London, a baby was born, named Arthur, who would come to be regarded as the preeminent science fiction writer of the twentieth century. As he became more famous, Arthur C. Clarke increasingly regarded predicting the future as a “discouraging, hazardous occupation.” That may be true, but Clarke was awfully good at it, anticipating satellites, home computers, email, the internet, Google, live-streaming TV, Skype, and smart watches.
Clarke had some strong opinions about scientists: a physicist in his 30s is already too old to be useful. In other scientific disciplines, a 40-year-old has likely experienced “senile decay.” And scientists over 50 are “good for nothing but board meetings, and should at all costs be kept out of the laboratory!”
Toward the end of his life, Clarke gave a series of interviews. Most of them were recorded and edited because he had halting speech caused by postpolio syndrome. In one interview, he revealed that he had a use for washed-up scientists: “When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.”2
I’m a scientist who is now 50. Some people might call me distinguished. And my students definitely don’t want me in the laboratory. So although I can’t say I’m certain about my predictions, I’m apparently well qualified to make them.
I’ve been asked on occasion, by members of the US Congress and the like, for predictions of technological breakthroughs and how they might be used for good or evil purposes. A few years ago, I gave an opinion on the top five future advances in the biological sciences that would be of relevance to national security. Though I can’t reveal what I said, I expect that most people would have thought they were science fiction. My best estimate was that they would happen sometime before 2030. Within six months, two of them became science fact.
I don’t know precisely when the first individual will cross the threshold of 125 years, but he or she will certainly be an outlier, as pioneers always are. In just a few years, he or she will be joined by another. Then dozens more. Then hundreds. Then the fact won’t be worth noting. Even longer lifespans will become more and more common. The world may see the first sesquicentenarian sometime in the twenty-second century. (If you think that’s far off, consider that some researchers believe half of all American children born today will celebrate New Year’s Eve 2120. Not outliers—half.)3
Those who think all this is impossible are ignorant of the science. Or they are denying it. Either way, they are almost certainly wrong. And because things are moving so fast, many of them might even live to realize they are wrong.
No biological law says there is a limit to how long we can live; there is no scientific mandate that the average age at death must be 80 years, nor any God-given decree that we must die after fourscore years. Indeed, in Genesis 35:28, Isaac is claimed to have lived “one hundred and fourscore.”4
Thanks to the technologies I’ve described, a prolonged, healthier human lifespan is inevitable. How and when we’ll achieve it is a bit less certain, although the general path is quite clear. The evidence of the effectiveness of AMPK activators, TOR inhibitors, and sirtuin activators is deep and wide. On top of what we already know about metformin, NAD boosters, rapalogs, and senolytics, every day the odds increase that an even more effective molecule or gene therapy will be discovered, as brilliant researchers around the world join the global fight to treat aging, the mother of all diseases.
All of that comes on top of the other innovations that are on track to further lengthen our lives and strengthen our health, such as senolytics and cellular reprogramming. Add to that the power of truly personalized care to keep our bodies running, prevent disease, and get ahead of problems that could be troublesome down the road. That’s not to mention the very easy steps we can all take right now to engage our longevity genes in ways that will provide us with more good years.
With significantly prolonged vitality an inexorable part of our future world, what do you want that world to look like?
Are you comfortable with a future in which the rich live much longer than the poor and in doing so get richer with every passing year? Do you want to live in a world in which an ever-increasing population greedily scraps for every last remaining resource while the world becomes less and less habitable?
If so, there is nothing left for you to do. The status quo will get us there—regardless, in fact, of whether we prolong human life or not. You can sit back, relax, and watch the world burn.
There is another potential future, though, one in which prolonged youthfulness is the torch that lights the way to greater universal prosperity, sustainability, and human decency. This is a future in which tremendous resources are freed from a medical-industrial complex based on battling diseases one by one, creating vast opportunities to tackle other challenges. It is a future in which people who have lived on this planet for a long time are revered for their knowledge and skills. It is a future of global Good Samaritanism.
It’s also a future for which we must fight, for it is in no way guaranteed.
To get there, we have some work to do.
I am a serial entrepreneur, a disciple of innovation, and a grateful beneficiary of people’s investment in me and the teams I’ve brought together to solve hard problems.
I also recognize, however, that the free market doesn’t magically produce good science or equitable outcomes when it comes to health care. In any research endeavor, a balance of public and private funding is vital to producing the conditions that encourage unfettered scientific exploration, investment in early discoveries, and a degree of common ownership that better ensures that the benefits of newfound knowledge will become available to the greatest possible number of people.
That balance has become ever more precarious in recent years. Starting in 2017, for the first time since World War II, the US federal government was no longer the majority source of funding for basic scientific research in the United States.
Federal funding for science in the United States began in the 1880s, when the Marine Hospital Service, the predecessor of the National Institutes of Health, was charged by Congress with examining passengers on arriving ships for clinical signs of infectious diseases, such as cholera.5 In 1901, a routine supplemental appropriations act, providing $35,000 for a new building, became the founding legislation of the NIH. Congress was not convinced that the money would go to good use, so it ensured that every year, funding would be at the discretion of Congress, and so it remains. Hopefully Congress remains convinced that NIH funding, which provides hundreds of competitive grants to scientists around the country, is money well spent, because without NIH-funded research, the majority of the medicines and medical technologies we rely on would never have been discovered, not to mention the thousands of new medicines still waiting to be discovered.
At least for now, the federal government still makes up a large share of total funding for medical research at hospitals and universities, ensuring that R&D isn’t driven by profit alone. This is important: it allows scientists like me to run with our imaginations and instincts, sometimes for a decade, before any commercial applications are apparent and long before any investor would consider supporting the work to help it survive the innovation “valley of death.”
Clearly, government is essential in this ecosystem, but in a world in which there is more competition for overall research funding than ever before, good scientists investigating aging are having to seek more and more private financial support for their work; world-changing research certainly isn’t cheap, and when it’s funded by a company with short-term goals, it isn’t free, either. That’s why it’s important that we reverse the decline in public funding for medical research, which fell by 11 percent in real dollars from 2003 to 2018.6
The situation is particularly hard for researchers studying aging. Research into the “biology of aging” receives less than 1 percent of the total US medical research budget.7 With an aging population and ever-increasing health care costs, why aren’t governments dramatically increasing funding for aging research to keep people healthier for longer?
The reason is, in nearly every nation in the world that has made a civic investment in medical research, that research is tied back, if not completely tied up, by the definition of disease.
If you are a scientist with an idea for a novel way to slow the progression of cancer or a researcher with an inventive idea for ending Alzheimer’s disease, the NIH and similar national research funding agencies around the world are there to help. The NIH isn’t simply a bunch of buildings in Bethesda, Maryland. It allocates more than 80 percent of its budget to almost 50,000 competitive grants to about 300,000 researchers at more than 2,500 universities and research institutions. Medical research would almost grind to a halt without that money.
It’s worth drilling down into the NIH budget to see which of the 285 diseases that are being researched get the most attention.8
• Heart disease gets $1.7 billion for a disease that affects 11.7 percent of the population.
• Cancer gets $6.3 billion for diseases that affect 8.7 percent.
• Alzheimer’s disease gets $3 billion for a disease that affects 3 percent—at most.9
How much does obesity, which affects 30 percent of the population and reduces lifespan by over a decade, get? Less than a billion dollars.
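One way to make this mismatch concrete is to normalize each budget by the number of people affected. Here is a minimal back-of-the-envelope sketch in Python; the US population figure (roughly 327 million, the 2018 estimate) and the flat $1 billion for obesity are my assumptions for illustration:

```python
# Back-of-the-envelope: NIH dollars per affected American per year.
US_POPULATION = 327_000_000  # assumed 2018 figure, for illustration only

# disease -> (annual NIH funding in dollars, share of population affected)
budgets = {
    "heart disease": (1.7e9, 0.117),
    "cancer":        (6.3e9, 0.087),
    "Alzheimer's":   (3.0e9, 0.030),
    "obesity":       (1.0e9, 0.300),  # "less than a billion"; 1e9 is an upper bound
}

for disease, (funding, share) in budgets.items():
    per_person = funding / (US_POPULATION * share)
    print(f"{disease:>14}: ~${per_person:,.0f} per affected person")
```

Under those assumptions, the arithmetic works out to roughly $44 per heart disease patient, $221 per cancer patient, and $306 per Alzheimer’s patient each year, against about $10, at best, per person affected by obesity.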
Don’t get me wrong. Compared to how the government spends a lot of its money—the cost of a single F-22 Raptor fighter jet is upward of $335 million, for instance—this is all money well spent. To put it into even greater perspective, though, consider this: US consumers spend more than $300 billion per year on coffee.10
To be fair, life without coffee might not be worth living. But if you are a researcher who wants to make life even better—by slowing or reversing diseases of aging—you have a bit of a problem. There just isn’t that much public money being spent on that area of science.
In 2018, Congress appropriated $3.5 billion for research into aging, but if you were to dig into the budgetary documents, you’d see that the money went almost entirely to research Alzheimer’s disease, perform clinical trials of hormone replacement therapy, and study the lives of the elderly. Less than 3 percent of the funding for “aging research” was actually for the study of the biology of aging.
Aging disables 93 percent of people over the age of 50, but in 2018 the NIH spent less than a tenth as much on aging as it spent on cancer research.11
One scientist who is particularly annoyed by the budgetary focus on individual diseases is Leonard Hayflick, the scientist who first discovered that human cells in a dish have a limited capacity to divide, eventually senescing once they reach what is now called the Hayflick Limit.
“The resolution of Alzheimer’s disease as a cause of death will add about 19 days onto human life expectancy,” he noted in 2016.12 Hayflick has suggested that the name of the National Institute on Aging, a division of the NIH, might as well be changed to the National Institute on Alzheimer’s Disease.
“Not that I support ending research on Alzheimer’s disease, I do not,” he said, “but the study of Alzheimer’s disease and even its resolution will tell us nothing about the fundamental biology of aging.”
The relatively tiny amount the United States spends on research into aging, however, is generous compared to the amounts spent by most other advanced nations, which invest next to nothing. There is no doubt that this situation is a direct result of the establishment view that aging is an inevitable part of life rather than what it actually is: a disease that kills about 90 percent of the population.
Aging is a disease. This is so clear that it seems almost insane that those four words need to be repeated again and again. But I’ll do so anyway: aging is a disease. And not only is it a disease, but it is the mother of all diseases, the one we all suffer from.
Paradoxically, no public funding agency around the world classifies aging as a disease. Why? Because, if we live long enough, we all suffer from it. And thus for now, the pool of public funding available for research aimed at prolonged vitality is rather paltry; the biggest checks are still being written to support initiatives aimed at recognized diseases. And at the moment I am writing these words, aging isn’t recognized as a disease. Not in any nation.
There are several ways to speed innovation to find and develop medicines and technologies that prolong healthy lifespan, but the easiest is also the simplest: define aging as a disease. Nothing else needs to change. Researchers working on aging will compete on equal footing with researchers working to cure every other disease in the world. The science-based merits of grant proposals will dictate which research efforts are funded. And private investment will continue, as it should, to drive innovation and competition.
Labs like mine, focused specifically on developing innovative therapies to treat, stop, and reverse aging, will no longer be rare. There will be one or more at every health science university in the world.
And there should be, because there is no shortage of scientists lining up to enlist in this army. Right now, I and other researchers who study aging are being besieged by eager, experienced, and absolutely brilliant youths who want nothing more than to devote their lives to the fight to stop aging. For lab heads like me, it’s a virtual buyer’s market. There are far more people who want to work in aging than there are labs they can work in. What this means is that there are a lot of people who, despite being wicked smart and raring to tackle the aging problem, are having to work in other fields or professions. This will soon change.
The first nations to define aging as a disease, both in custom and on paper, will change the course of the future. The first places to provide large amounts of public funding to augment the fast-growing private investments in this field will prosper in kind. It will be their citizens who benefit first. Doctors will feel comfortable prescribing medicines, such as metformin, to their patients before they become irreversibly frail. Jobs will be created. Scientists and drug makers will flock to those countries. Industries will thrive. Their national budgets will see a significant return on investment. Their leaders’ names will be in the history books.
And the holders of the patents, the universities and the companies, will have more money than they know what to do with.
I’m proud to say that Australia is leading the charge to define aging as a treatable disease. I recently made a trip to Canberra to meet with Greg Hunt, the minister for health, and Deputy Secretary Professor John Skerritt of the Therapeutic Goods Administration, along with about 15 of Australia’s top aging researchers. I learned that developing a drug for aging may be far easier in my native land than in the United States. While the United States expects evidence that a disease is cured or alleviated, in Australia it is possible for a drug to receive approval for “influencing, inhibiting or modifying a physiological process in persons.” In the aging field, we know how to do that!
Singapore and the United States are among the nations that are also seriously considering a regulatory shift. Whichever does so first will be making a historically important decision, one that will benefit itself first and foremost.
There’s a reason why the United States virtually owns the aerospace sector—exporting products worth more than $131 billion in 2017, or more than the next three national exporters combined. “First in Flight” isn’t just a good slogan for North Carolina license plates; it’s a statement about why being out front matters. Americans retain the pioneering spirit of their ancestors: anything is possible. More than a century after the Wright brothers flew the first planes at Kitty Hawk, and after almost losing out to the French and British, the United States is still ahead in the flight game. It has the world’s most powerful air force. It got to the moon first. And it has a big lead in the development of public and private initiatives to put people on Mars.
But none of that will impact human history as much as being the first nation to declare aging a disease.
At a bare minimum, governments have a vested interest in making sure the innovations we develop to protect human life are used wisely and for our collective benefit. The time to talk about ethics and how personal privacy will be impacted by these coming technologies is now, for once the genie is out of the bottle, it will be exceedingly hard to put it back. DNA-based technologies that enable the detection of specific pathogens, for example, could also be used to search for specific people. Technology now exists to create humans who are stronger and longer lived. Will parents choose to give their children “the best possible start”?13 Will the United Nations outlaw the genetic improvement of citizens and soldiers?
To create a future worth living in, it won’t be enough to simply fund research that lengthens and protects people’s lives and ban its misuse. We must also ensure that everyone benefits together.
The dentist looked bored. “Your teeth are fine,” she told me as she peered into my mouth. “Just the normal wear and tear. I’ll send the hygienist in for your cleaning, and we’ll get you on your way.”
It seemed as though she were turning away before her fingers were even out of my mouth.
“Doctor, if you can spare a moment,” I said, “can you tell me what you mean by ‘normal wear and tear’?”
“You’re getting older, and your teeth are showing that,” she said. “Your two front teeth are worn down. Totally normal. If you were a teenager, we would probably fix them but—”
“Well, then,” I said. “I’d like to have them fixed.”
Eventually the dentist relented, although not before I told her what I did for a living and explained to her that I was hoping to use my teeth for a very long time to come. I also assured her that I would be happy to pay for the procedure even if my insurance wouldn’t.
Her resistance was understandable. When dentists look into the mouths of patients who are in their 40s and 50s, they have long been looking at teeth that are halfway done with their jobs. But that’s no longer the case. Our teeth—like all of our other body parts—are going to have to last a lot longer now.
My experience at the dentist was a microcosm of the way middle-aged people are treated in every facet of the health care system. When a doctor looks at a 50-year-old person right now, his or her goal is to keep the patient “less sick,” not to ensure that he or she will be healthy and happy for decades to come. Who among us over the age of 40 has not heard a doctor say the words “Well, you’re not twenty anymore”?
There are two things that guide medical treatments more than anything else: age and economics. The first often limits what doctors are even willing to discuss in terms of treatment options, because they assume that people are supposed to slow down, begin dealing with a bit of pain, and gradually experience the degradation of various body parts and functions. The second dictates these discussions even more, because regardless of how much potential a procedure may have to improve a patient’s life, it is pointless, and even heartless, to tell someone about care he or she can’t afford.
Indeed, our medical system is built on ageism. When we are young, we don’t get treatments that could keep us healthy as we grow old. When we are old, we don’t get the treatments that are routinely used on the young.
This all has to change. The quality of our medical care should not be predicated on age or income. A 90-year-old and a 30-year-old should be treated with the same enthusiasm and support. There will be enough money to pay for this because of the trillions of dollars that won’t have to be spent by insurance companies or the government, and hence ourselves, on treating the chronically ill.14 Everyone should be entitled to treatments and therapies that improve quality of life, no matter what the date on his or her birth certificate is. As we move toward a world in which the number of our birth years indicates less about us than ever, we will need to adjust the assumptions, rules, and laws that govern what medical treatments people can receive.
Equitable access to medical care, no matter how long life may be, is a terrifying idea to many people, because it sounds awfully expensive. That’s understandable, because as it stands, social medical programs across the globe are straining under the ever-increasing cost of treatment, especially treatments that are provided to those who are very sick, very old, and likely to get nothing more than a few extra years—if that—out of the deal.
That’s not what the future of medical care needs to look like. Right now, the overwhelming majority of the money we spend on medical care is spent fighting diseases. But when we are able to treat aging, we will be tackling the biggest driver of all disease. Effective longevity drugs will cost pennies on the dollar compared to the cost of treating the diseases they will prevent.
In 2005, a study by Dana Goldman and his colleagues at RAND in Santa Monica put some numbers on this. They estimated the value that new discoveries would add to society and the cost to society to extend a human life by one year.15 The cost of an innovative medicine to prevent diabetes: $147,199. Of a cancer treatment: $498,809. Of a pacemaker: $1,403,740. Of an “antiaging compound” that would extend healthy years by a decade: a mere $8,790. Goldman’s numbers support an idea that should be common sense: that there is no cheaper way to address the health care crisis than to address aging at its core.
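Goldman’s figures are easiest to appreciate side by side. Here is a minimal sketch, using only the four numbers quoted above, that compares each intervention’s cost per added healthy life-year with the antiaging compound:

```python
# Cost per additional healthy life-year, from Goldman et al. (2005),
# with each intervention compared against the antiaging compound.
costs = {
    "diabetes-prevention medicine": 147_199,
    "cancer treatment":             498_809,
    "pacemaker":                  1_403_740,
    "antiaging compound":             8_790,
}

baseline = costs["antiaging compound"]
for intervention, cost in costs.items():
    ratio = cost / baseline
    print(f"{intervention:>28}: ${cost:>9,} (~{ratio:.0f}x the antiaging compound)")
```

By that yardstick, targeting aging itself buys a healthy year at roughly one-seventeenth the cost of preventing diabetes and about one-one-hundred-sixtieth the cost of a pacemaker.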
But what if the drugs don’t keep people healthy? What if they simply prolong life, like many cancer chemotherapy drugs, which are approved based on their ability to provide a longer life, not a higher quality of life? Society should debate whether longevity medicines that don’t keep us healthier should ever be approved. If they were to be allowed, there would be even more elderly people with disease and disability, and, according to Goldman, health care spending in thirty years’ time would be 70 percent higher.
Fortunately, the science suggests this nightmare scenario is not going to happen. When we have safe and effective drugs to slow aging, they will also extend our healthspans. What will be left will be medical maintenance, which is exceptionally cheap; emergency medicine, which is costly but rare; and communicable diseases, which we’ll be able to track, treat, and prevent with far greater efficiency and effectiveness. It’s similar to making the switch from gasoline-powered cars that need oil, belts, tuning, and regular maintenance to electric cars that tell you occasionally to top up the windscreen washer fluid.
Having lived in Australia, the United Kingdom, and the United States—three countries with an intertwined history, language, culture, and trade—I’ve found it interesting to see how similar they are in some ways and how different they are in others.16 One big difference is that Australians and Britons rarely assume that their way of doing things is the best. Americans, however, often believe that their way of doing things is assuredly the best.
I’m not saying the United States doesn’t do a lot well and shouldn’t continue to blaze its own path in many areas of domestic and global policy, but I’ve long been perplexed by the American resistance to studying what actually works elsewhere.
In science we call this experimentation, and it’s what propels our civilization forward. The more experiments are conducted, the better informed we are. And some experiments work really well.
Seeded as a prison colony, Australia is one of the least religious countries in the world, but when it comes to providing for its citizens, it is a city upon a hill.17 Like the United States, Australia has its problems: traffic snarls, a high cost of living, and strict rules aimed at saving lives, even if those rules often take the fun out of life.
There is a statistic, though, of which Australians are increasingly proud, the product of a fifty-year-long experiment to protect and preserve every citizen, regardless of social position, education, or income. Deaths from car accidents and smoking are the lowest in the world, thanks to strict laws and hefty fines. Even before those laws passed, there was a bigger change afoot. In the mid-1970s, a universal health care system was enacted, one of the first ever, and life expectancy in Australia began to shoot up. As in the United States in the 2010s, a subsequent government tried to limit the scope of that progressive reform but ultimately failed.
A controversial right-wing politician, Bronwyn Bishop, helped create an independent Australian Federal Department of Health and Ageing, which lasted from 2002 to 2013 with a budget of around AU$36 billion, focusing on health promotion, disease prevention, and services and caregivers for the aged.
During this time, Australia continued on an upward trajectory, using its wealth to create more health and productivity in its workforce, and its health and productivity to create more wealth, a virtuous cycle of the highest moral order.
Between 1970 and 2018, Australian men gained an extra twelve years of life. Their healthy life expectancy is 73 years, ten years higher than the global average, thanks to a significant decline in the percentage of people suffering from disabling health conditions.18
The elderly in Australia are being less elderly, less of a burden, and much more productive than their counterparts in other nations. If you visit Australia, the difference between its fit, active elderly and those in the United States who are saddled with obesity, diabetes, and disability is noticeable.
My father thought he was headed for the grave. Instead, he’s most often headed to concerts or the mountains. He spends several nights a week eating out with friends. He’s adept at computers and new high-tech gadgets and was one of the first people in Australia to have a smart speaker with a virtual assistant at home. He’s unbothered by international travel, so we get to see him frequently. He’s gone back to work. Physically and mentally, he is at least thirty years younger than his mother was at his age.
His remarkable health may or may not be due to the molecules he takes—the coming years of his life will be an indicator, whereas scientific proof will come only in the form of double-blind placebo-controlled trials—but he is also helped by frequent exercise, access to excellent medical care, and a system that believes in disease prevention, not just late-stage treatment. He is a shining example of a new generation of Australians in their 70s and 80s who are not just living longer but living far better than any of their forebears. In 2018, Australia ranked seventh on the global Human Capital Index, a measure of the knowledge, skills, and health that people in a nation accumulate over their lives, just behind Singapore, Korea, Japan, Hong Kong, Finland, and Ireland. The United States ranked twenty-fourth. China ranked twenty-fifth.
The trajectory for Australia is up, and the Aussies aren’t looking back.
Having seen what works, other, mostly European countries have adopted similar health care systems. Australia now has reciprocal agreements with the United Kingdom, Sweden, the Netherlands, Belgium, Finland, Italy, Ireland, New Zealand, Malta, Norway, and Slovenia, which means that citizens from those countries can receive the same medical care in Australia as they can at home, and vice versa. Imagine an entire world like that.
Meanwhile, some countries are being left behind. And one, in particular, is moving backward.
Thanks to the burgeoning addiction to calories and opioids, and a health care system that is inadequate for, if not completely inaccessible to, one-third of its population, the United States recently experienced a decline in life expectancy for the first time since the early 1960s. That decline may soon exceed the decline in life expectancy caused by the Spanish flu epidemic in 1918. This is happening despite the fact that the United States spends 17 percent of its GDP on health care, nearly double that of Australia.
I don’t mean to disparage the country in which I live—it has been very generous to my family and me. But I am frustrated. Ever since I arrived in the country that actually put humans on the moon, it has been a shock to see opportunities to help more people for less money wasted over and over again.
The United States has been a leader in both public and private investment into lifesaving medical research. And although it can be hard to track the origin of every drug in this increasingly interconnected world, by one estimate 57 percent of all medications are developed in the United States. Other nations, especially those that don’t invest as heavily in medical research, should be grateful to the United States for discovering and developing most of the drugs that ensure their increasingly long lives.
In a just world, the citizens of the United States would be the greatest beneficiaries of the medical breakthroughs they subsidize and produce. They’re not.
Australians are. Britons are. As are the Swedes, the Dutch, the Irish, and the Slovenians. They’re all benefiting in terms of lifespans and healthspans, because they have the sort of universal access to health care that 15 percent of registered Democrats and half of all Republicans in the United States have come to fear.19 That the average American lifespan is just four years shorter than Australia’s20 belies the fact that citizens in the poorest regions of the United States live a decade less than even that average.21
As the Australian example proves, when everyone is living longer and healthier, everyone does better. So why isn’t this a topic of discussion in the United States? Why aren’t people charging Capitol Hill with protest signs and the proverbial pitchforks, demanding more investment, universal access to medicines, and the healthiest lifespan on the planet? As other countries enjoy increasingly longer, healthier lives, perhaps Americans will wake up and smell the disparity. But I suspect they won’t. Though the World Health Organization ranks the United States at number 37, below Dominica, Morocco, and Costa Rica and one up from Slovenia,22 it’s still common to hear US politicians say, without any justification, that the United States has the best health care system in the world, and millions of people believe it.23
The alternative to a universal right to be treated—regardless of age and regardless of the ability to pay—is a world in which rich people increasingly benefit from even longer and even healthier lives than they already enjoy, while poor people suffer through short, disease-ridden existences. This is a terrible idea for rich and poor alike.
My line of work has put me into contact with some of the wealthiest people in the world, who are understandably interested in learning the secrets of longer and healthier lives. I’ve yet to meet a single one who wishes to see such a divide come to pass. In that direction, after all, lie the seeds of revolution—and revolt seldom goes well for the ruling class. As the venture capitalist and “very large yacht” owner Nick Hanauer wrote in a memo to “My Fellow Zillionaires” in 2014, “there is no example in human history where wealth accumulated like this and the pitchforks didn’t eventually come out. You show me a highly unequal society, and I will show you a police state. Or an uprising. There are no counterexamples. None. . . . We will not be able to predict when, and it will be terrible—for everybody. But especially for us.”24
Hanauer’s warning came before longevity genes were on most people’s radar and long before most people had so much as contemplated what significantly lengthened lifespans and healthspans could do to the rich-poor divide.
Universal access to technologies that prolong vitality won’t fix every problem associated with income inequality, but it’s a crucial start.
By cosmic standards, this region of the Milky Way isn’t a horribly inhospitable place for life to evolve in. We’re here, after all. And the outer edges of spiral galaxies like ours seem to hold reasonably good promise for a few life-sustaining planets to materialize,25 far better than the dwarf galaxies that are the most abundant type of galaxy in the universe.
The way the astronomer Pratika Dayal sees it, however, the most likely places for life to form and thrive in are the rarer, metal-rich, giant elliptical galaxies—twice as big as the Milky Way and often much bigger, holding as many as ten times the number of stars and perhaps 10,000 times as many habitable planets.26 By the way, if you’re under the misconception that if we screw up this planet we can just travel to a new one, consider that the closest known habitable exoplanet is twelve light-years away, as the crow flies. That sounds close, but barring the discovery of a space wormhole or light sailing of tiny cargoes at near light speed, it would take at least 10,000 years to get a few humans there27 (which, I’ve argued, is another good reason to figure out how to extend human lifespan).
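To see where that 10,000-year floor comes from, consider the arithmetic. This is a minimal sketch; Voyager 1’s speed of about 17 kilometers per second is real, while the faster probe is an illustrative assumption:

```python
# Back-of-the-envelope travel times to an exoplanet 12 light-years away.
LIGHT_YEAR_KM = 9.461e12      # kilometers in one light-year
SECONDS_PER_YEAR = 3.156e7

distance_km = 12 * LIGHT_YEAR_KM

# Probe speeds in km/s; the second entry is a hypothetical fast probe.
probes = {
    "Voyager 1 (~17 km/s)": 17,
    "hypothetical probe, 20x Voyager (~340 km/s)": 340,
}

for name, speed_km_s in probes.items():
    years = distance_km / speed_km_s / SECONDS_PER_YEAR
    print(f"{name}: ~{years:,.0f} years")
```

Even a craft sustaining twenty times Voyager 1’s speed would need more than 10,000 years; Voyager itself would take over 200,000.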
The closest giant elliptical galaxy is Maffei 1, which is about 10 million light-years away. We can assume that if explorers from Maffei 1 ever make the trip to visit us, they’ll be from an exceptionally advanced society. I expect they’ll have a few questions, for they will want to know how far we have advanced, too.
First, I believe, they’ll be curious about the easy things: Have we figured out pi to a million decimal places? The speed of light? The fact that mass and energy are the same thing? Quantum entanglement? The age of the universe? Evolution?
Next they’ll ask us about some of the harder stuff: Have we learned to use the resources available on our planet wisely? We’ll get passing marks on that one, I suppose, as long as we don’t mention lead pipes, nuclear bombs, and Furbys. Have we done so sustainably? “Um, pass.”
Then they’ll likely want to hear about what other worlds we’ve visited. “We sent twelve guys to Luna,” we’ll say. “Where’s that?” they’ll ask. We’ll point to the big white orb in our night sky. “Hmmm,” they’ll say. “Just the men of your species?” We’ll nod, and they’ll roll their eyes.
After that, they’ll want to know about our lifespan. Have we figured out how to live far beyond the time given to us by evolution? “Er, we didn’t know that was a thing worth studying until a few years ago.” They’ll offer a bit of overly enthusiastic encouragement, as a human adult might do for a baby who is learning to eat solid food.
The next question will be a rather grave one: “How do you die?” they’ll ask. And how we answer that question is going to be an important indicator of just how advanced we truly are.
Right now, as my mother’s death exemplified, the way most of us die is barbaric. We go through a long period of decline, and we’ve come up with ways to extend that period of pain, grief, confusion, and fear so that we must experience even more pain, even more grief, even more confusion, and even more fear. The sorrow, sacrifice, and turmoil this creates for our families and friends are protracted and traumatic, so that when we finally pass on, it often comes as a relief to those who love us.
The most popular means to the end, of course, are diseases—which can strike in the prime of life. Heart disease at 50. Cancer at 55. Stroke at 60. Younger-onset Alzheimer’s at 65. Way too frequently, what is said at funerals is that someone left this life “way too early.” Or the diseases don’t kill, and the fight to beat them back again and again is a decades-long exercise in suffering.
These are terrible answers to the question of how we die. The answer we should strive for—just as much as we strive for prolonged vitality—is “when we are ready, then quickly and painlessly.”
Fortunately, the science of longevity shows that the longer we make rodents live, the faster they tend to die. They still die of the same diseases, but, perhaps because they are very old and already on the brink, they tend to suffer for days rather than months before keeling over.
This is not the only way we should meet our end, though.
“Physician-assisted suicide.” “Death with dignity.” “Elective euthanasia.” Whatever we call it, we need to end the patchwork of laws and customs that force people to travel great distances, often when they are already suffering in one way or many, to bring their lives to a peaceful end.
These are the sorts of barriers that the eminent ecologist David Goodall faced in 2018, at the age of 104, when he was forced to leave his home in Australia, where physician-assisted suicide is illegal, and journey to a clinic in Switzerland, where it is lawful and safe. No one should have to choose between dying in a foreign land and committing a crime as his or her last act on Earth.
Thus no one with a sound mind who is over the age of 40—about the age at which one has paid back the initial societal investment in his or her education—should be denied the right to die on his or her own terms. And anyone, at any age, with a terminal diagnosis or painful chronic illness should have the same right.
Yes, there should be rules. Certainly there should be counseling involved and a waiting period. It should never be easy to take one’s life on a whim rather than taking arms against a sea of troubles. If it were, I and many others would probably not have made it through our teenage years. But we should not presume to leverage guilt and shame upon sane adults who wish to control the day of their final breath.
Nearly every day, and often multiple times in a day, someone tells me that they have no interest in living to 100, let alone many decades longer.
“If I get to a hundred, just shoot me,” they say.
“I think that seventy-five healthy years sounds about right,” they say.
“I just can’t imagine having to live with my husband for even longer than I already have to,” one rather distinguished scientist once told me.
That’s fine.
Indeed, there seems to be little appetite for the idea of living in perpetuity. I recently gave a talk to a general audience of about a hundred people spread across ages 20 to 90, a good cross section of the local community. The main donor to the institute was late, so I had to fill in time. I grabbed the microphone and did a little experiment.
“How long do each of you want to live?” I asked.
By a show of hands, a third said they’d be happy with 80 years; I told that group that they should apologize to all the audience members who were older than 80. That got a laugh.
Another third indicated that they’d like to see 120. “That’s a good goal,” I said, “and probably not an unrealistic one.”
About a quarter wanted to make it to 150. “That’s not a silly thing to dream about anymore,” I said.
Only a few people wanted to live “forever.”
The numbers were similar at a recent dinner at Harvard for scientists who study aging. Very few of the attendees said they were gunning for immortality.
I’ve talked to hundreds of people about this topic. Most people who want immortality are not afraid of death. They just love life. They love their family. They love their careers. They would love to see what the future holds.
I’m not a fan of death, either. It’s not because I’m afraid of being dead. I can say this without reservation. On a plane, my wife, Sandra, clings to my arm at the first sign of turbulence, whereas my pulse doesn’t change. I travel enough to have experienced mechanical troubles on planes more than a few times, so I know how I react when faced with possible death. If the plane goes down, I die. Letting go of that fear was one of the best things I ever did.
Here’s where things get really interesting: when I do this little survey and then tell the audience that they could retain their health no matter how many years they live, the numbers of those who say they’d like to live forever shoots way up. Almost everyone wants that.
It turns out that most people aren’t afraid of losing their lives; they are afraid of losing their humanity.
And they should be. My wife’s grandfather was sick for many years before he died in his early 70s. At that point, he’d been in a vegetative state for several years—a truly horrible fate—but he had a pacemaker, and so, whenever his body would try to die, he’d be zapped back to life.
Not back to health, mind you. Back to life. There’s a big difference.
In my mind, there are few sins so egregious as extending life without health. This is important. It does not matter if we can extend lifespans if we cannot extend healthspans to an equal extent. And so if we’re going to do the former, we have an absolute moral obligation to do the latter.
Like most people, I don’t want unlimited years, just ones filled with less sickness and more love. And for most of those I know who are engaged in this work, the fight against aging isn’t about ending death; it’s about prolonging healthy life and giving more people the chance to meet death on far better terms—indeed, on their own terms. Quickly and painlessly. When they are ready.
Whether by refusing the treatments and therapies that offer a prolonged healthy life or by accepting those interventions and then deciding to leave when the time is right, no one who has returned what he or she has been given should have to stay on this planet unwillingly. And we need to begin the process of developing the cultural, ethical, and legal principles that will allow that to happen.
The environmental writer and activist George Monbiot is among those who have observed that when it comes to the future health of our planet, people are overly preoccupied with the number of humans on Earth while ignoring the fact that consumption “bears twice as much responsibility for pressure on resources and ecosystems as population growth.”28 Monbiot, who is on the far left, isn’t right about everything, but he’s certainly right about that. The problem isn’t population; it is consumption.
We know that humans can live healthily and quite happily while consuming far less than most do in the developed world. But we don’t know if they will. It is for this reason that, among scientists who subscribe to the idea that our planet has an absolute limit of people it can sustain, those who offer generous estimates of the Earth’s carrying capacity are the ones who assume that our species will be capable of making more from less, even, perhaps, as we raise the standard of living of billions of people. The more pessimistic predictors, meanwhile, generally assume a global “tragedy of the commons” in which we greedily consume ourselves to death at an all-you-can-eat buffet of natural resources. People will be people, so which way we head will largely be determined by politics and by technology.
At least in one regard—the “stuff factor,” so to speak—technology is already driving a tremendous and positive change, a global process of “dematerialization” that has replaced billions of tons of goods with digital products and human services. Thus it is that wall-to-wall shelves dedicated to records and compact discs have been replaced by streaming music services; people who once needed vehicles for once-in-a-while travel now open an app on their phones to request a ride share; and entire wings of hospitals once used for storing patients’ records have been supplanted by handheld cloud-connected tablet computers.
As Steven Pinker has pointed out, a lot of the time, energy, and money we once spent making “stuff” is now “directed toward cleaner air, safer cars and drugs for ‘orphan diseases.’ ”29 Meanwhile, the “experiences, not things” movements and the like are transforming the ways in which we save and spend money—and leaving us with less crap in our basements. After a century of movement toward McMansions, the latter half of the 2010s saw a significant drop in the square footage of new homes and increasing demand for smaller apartments,30 continuing a centuries-long migration from farm-based rural living to smaller, shared urban spaces. As the global success of WeWork proves, today’s young adults are not only comfortable with much smaller working and living quarters, with shared community spaces such as offices, kitchens, gyms, laundries, and lounges, but increasingly are demanding them.31
The slow death of stuff is not the end of consumption, though. We’re as addicted as ever to wasting food, water, and energy. As it stands, the United Nations has warned, we are polluting water far faster than nature can recycle and purify it. We literally throw away half of the world’s edible food each year, more than a billion tons of it, even as millions of people are left hungry or malnourished.32
At the current pace of population growth and economic mobility, the United Nations estimates, by 2050 it will take the equivalent of nearly three Earths to sustain our lifestyles for one year. Yet the United Nations spends surprisingly little time debating consumption, let alone forging international agreements that would help build a world in which no society consumes more than its share of what the Earth can produce under contemporary technological conditions.
That last part is important: just as it is helping us reduce our “stuff” addiction, technology absolutely has to play a role in solving these other consumption problems—for there is no free nation in the world that can unilaterally force its citizens to consume less while others on the planet consume more. Laws can encourage businesses to conform, but we also have to make it attractive and easy for individuals to consume less.
Therefore, we must invest in research that allows us to grow more healthy food and transport it more effectively. And please make no mistake: that includes accepting genetically modified crops (those engineered to include a trait that doesn’t occur in the plant’s wild form, such as resistance to insects, tolerance to drought, greater vitamin A production, or more efficient use of sunlight to convert CO2 to sugar) as an absolutely necessary part of our food future. With more efficient plants, we could feed up to 200 million additional people just from plants grown in the US Midwest.33
These crops have gotten a bad rap for being “unnatural,” although many people who hold this view don’t recognize that most of the food we think of as “natural” has already been subject to significant genetic manipulation. The ears of corn you see at the grocery store look nothing like the wild plant from which modern corn came; over the course of nine thousand years, the spindly finger-length grass known as teosinte was cultivated to evolve larger cobs and more rows of plump, soft, sugary kernels, a process of modification that significantly altered the plant’s genome.34 The apples we’ve grown accustomed to eating have a bit more resemblance to their small, wild ancestors, but good luck finding one of those ancestors; they have been nearly wiped off the planet, and that’s no great loss to our diet, since the biggest genetic contributor to modern apples, Malus sylvestris, is so tart it’s darn near inedible.35
In 2016, the National Academy of Sciences, in a sweeping report on genetically engineered crops, noted that lab-modified plants could be vital for feeding the planet’s growing human population if global warming threatens traditional farm products. And since numerous other reports over the past few decades had not been enough to assuage continuing public concern, the report’s authors once again reaffirmed the academy’s position that GMO crops are safe for both human consumption and the environment.
There is nothing wrong with skepticism, but after thousands of studies, the evidence is irrefutable: if you believe climate change is a threat, you can’t say that GMOs are, because the evidence that GMOs are safe is stronger than the evidence that climate change is occurring.
The World Health Organization, the American Association for the Advancement of Science, and the American Medical Association have also affirmed that, as WHO puts it, “no effects on human health have been shown as a result of the consumption of such foods by the general population.” Moreover, these foods could be vital to meeting the challenge of feeding the billions of people who are already going hungry in our world and the additional billions who will be joining us on this planet in coming years.
If we are to feed the world now and in the future, we need to embrace safe new technologies.
According to UNICEF, up to 2 million deaths each year could be prevented if poor families had access to more vitamin A in their diets through crops that are perfectly safe.36 Vitamin A supplements aren’t working as well as is needed. Between 2015 and 2016, vitamin A supplementation coverage dropped by more than half in the five countries with the highest child mortality rates.
An open letter signed by more than a hundred Nobel Prize winners called on governments to approve genetically modified organisms: “How many poor people in the world must die before we consider this a ‘crime against humanity’?” they asked. We could feed a billion more people with more nutritious food. With climate change, we may have no choice.
To decrease the impact of humans, there’s also a tremendous need to figure out how to satiate the global demand for protein without the enormous environmental costs of farmed animal meat. Innovations that are giving us damn-near-close-to-meat products, made with 99 percent less water, 93 percent less land, and 90 percent fewer greenhouse gas emissions, complete with plant “leghemoglobin” that “bleeds” and some good old-fashioned mad science, are booming and will need to continue to boom if we are to feed our appetite for tasty protein without further degrading our planet.
There’s no question that one of the greatest technological advances in this century has been the discovery of precise, programmable “genome editing.” As with most other breakthroughs, there were dozens of brilliant people involved in the lead-up to it,37 but Emmanuelle Charpentier, then at the Laboratory for Molecular Infection Medicine in Sweden, and Jennifer Doudna at UC Berkeley have garnered the most fame for their remarkable discovery that the bacterial Cas9 protein is a DNA-cutting enzyme with an RNA-based “GPS” or “guide.”38 The next year, Feng Zhang at MIT and George Church at Harvard proved that the system could be used to edit human cells. They, too, garnered fame—and some very valuable patents.39 News of the discovery spread quickly down the hall to my lab. It seemed too good to be true—except it was.
The technology is colloquially known as CRISPR, for “clustered regularly interspaced short palindromic repeats,” which are the natural DNA targets of Cas9 cutting in bacteria. Cas9, and now dozens of other DNA-editing enzymes from other bacteria, can alter plant genes with accuracy, without using any foreign DNA. They can create exactly the same kind of alterations that occur naturally. Using CRISPR is far more “natural” than bombarding seeds with radiation, a treatment that is not banned.
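To make the “GPS” metaphor concrete, here is a toy sketch of the targeting logic. The DNA and guide sequences are invented for illustration, and real guides are about 20 nucleotides long, but it captures the two-part requirement: the guide must match the DNA, and Cas9 itself must see an adjacent “NGG” motif (the PAM) before it will cut:

```python
import re

# Toy model of Cas9 target recognition; sequences are hypothetical.
genome = "TTACGGATCGTACGATCGAGGCTTAGATCGTACGATCGTTGCA"
guide = "GATCGTACGATCG"  # shortened guide (real ones are ~20 nt)

for m in re.finditer(guide, genome):
    pam = genome[m.end():m.end() + 3]           # 3 bases just after the match
    if len(pam) == 3 and pam.endswith("GG"):    # Cas9 requires an "NGG" PAM
        print(f"cut near position {m.end()}: PAM = {pam}")
    else:
        print(f"match at {m.end()} skipped: no NGG PAM ({pam})")
```

Run on this made-up genome, the first guide match sits next to an “AGG” and would be cut; the second match lacks the PAM and is left alone, which is why a guide alone is not enough to direct the enzyme.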
That’s why the decision by the Court of Justice of the European Union in 2018 was so unexpected and upsetting to the United States. The court ruled in favor of Confédération Paysanne, a French agricultural union that defends the interests of small-scale farming, and eight other groups, to ban CRISPR-made foods.40
The ruling defies science. It bans healthy foods that could relieve the environmental burden, increase the health of the poor, and allow Europe to cope better with global warming. The ruling also scared developing nations away from CRISPR-modified crops, which could make a positive impact on both their people’s lives and their land.
The text of the ruling makes it clear that it was not a decision to protect consumers from the dangers of GMO; it was part of a global trade war to prevent US-patented products from entering the European Union. The US secretary of agriculture, Sonny Perdue, made this abundantly clear in his response: “Government policies should encourage scientific innovation without creating unnecessary barriers or unjustifiably stigmatizing new technologies. Unfortunately, this week’s ECJ ruling is a setback in this regard in that it narrowly considers newer genome editing methods to be within the scope of the European Union’s regressive and outdated regulations governing genetically modified organisms.”41
Of course nations should be able to help local farmers whose livelihoods are threatened, but there are other ways to do that. It’s ultimately hurtful to everyone on the planet, and especially to those who need the new technology most, to use the cover of “dangerous science” to justify trade restrictions.
We also need to solve the shortage of fresh, drinkable water. Cities such as Las Vegas, a very thirsty town in the middle of the driest place in the United States, have demonstrated that marrying conservation and innovation can make efficient water recycling not only possible but profitable; whereas metro Vegas grew by half a million people from 2000 to 2016, its total water use fell by a third.
We often adopt new technologies way too slowly, but when we finally do, they can solve some of our biggest problems. It was back in 1962 that the scientist Nick Holonyak, Jr., created the first practical visible light-emitting diode at General Electric, where they called it “the magic one.” It took another half century to develop an LED house bulb, and even then, many US consumers revolted, preferring to slow the phase-out of incandescent bulbs even as other nations moved forward with the LED revolution. Eventually, a combination of tax incentives and laws that outlawed the Edison lightbulb forced the adoption of LED lighting. Today’s LED lights use 75 percent less energy than incandescent lighting and last fifty times as long, which in a typical home is about two decades.
Widespread use of LEDs in the United States is set to save the equivalent of the annual output of forty-four large electric power plants, worth about $30 billion a year.42 To put this into perspective, that money could double the budget of the National Institutes of Health and set forty thousand scientists to work on lifesaving medicines. Human ingenuity is not a zero-sum game.
Longer, healthier lives will do us little good if we consume ourselves into oblivion. The imperative is clear: whether or not we increase human longevity, our survival depends on consuming less, innovating more, and bringing balance to our relationship with the bounty of our natural world.
That might seem like a tall order. Indeed, it is a tall order. But I believe we can stand tall—and together—to meet it.
In many ways, we are already doing so.
At the 2018 Global Climate Action Summit, for instance, it was announced that twenty-seven cities had reached peak emission levels. A peak, not a plateau. All of those places were seeing steep emission declines. Among that group of cities was Los Angeles, which was once definable by its ubiquitous smog. It had cut its emissions by 11 percent. In one year.43
Yes, there are more people than ever in the cities of North America, South America, Europe, and Asia, but today the impact of each human in those regions is declining. We’re rapidly moving from petroleum to natural gas, solar power, and electricity. When I first visited Bangkok, I experienced respiratory distress. Now, more days than not, there is blue sky. When I arrived in Boston in 1995, a splash of water from the harbor could land you in hospital—or in the grave. Now it is safe for swimming.44 The same is true of Sydney Harbour, the Rhine River, and the Great Lakes.
Going backward or even staying put is not a viable solution to the current crisis. The only path forward is one in which we embrace human capital and ingenuity.
One of the best examples comes from a tiny town in South Australia. After the state’s last coal-fired power station closed in 2016, investors built Sundrop Farms on the barren coast and hired 175 people who had recently lost their jobs.45 The farm uses free energy from the sun to turn seawater into 180 Olympic-sized swimming pools’ worth of fresh water per year, work that in the past would have burned a million gallons of diesel fuel. Today, 33,000 pounds of fresh organic tomatoes are shipped each year from the port where coal used to come in.
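To put that output in more familiar units, here is a small conversion sketch; the 2,500-cubic-meter Olympic-pool volume is a standard approximation I am assuming, while the 180-pool figure comes from the text.

```python
# Converting Sundrop's freshwater output into everyday units. The
# 2,500 m^3 Olympic-pool volume is an assumed standard approximation.

pool_m3 = 2_500            # approximate volume of an Olympic pool (assumed)
pools_per_year = 180       # from the text
m3_to_us_gal = 264.17      # cubic meters to US gallons

fresh_m3 = pools_per_year * pool_m3
fresh_gal = fresh_m3 * m3_to_us_gal
print(f"Fresh water: {fresh_m3:,} m^3/yr (~{fresh_gal / 1e6:.0f} million US gallons)")
# -> 450,000 m^3, about 119 million gallons, produced by sunlight
#    instead of a million gallons of diesel.
```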
Sundrop is an example of a Schumpeterian “gale of creative destruction,” the type of technological paradigm shift we will need to usher in the age of longevity and prosperity. For this to happen, we need more visionary scientists, engineers, and investors. We need more smart legislation to speed, not impede, the adoption of Earth-saving technologies. Doing so will free up money and human capital that are currently wasted, and that capital must be reinvested in people and technologies, not in meaningless “stuff,” to ensure that humanity and the Earth endure—indeed thrive—together.
The University of Pennsylvania was a wonderful school at which to study theology and the classics. It had recently launched a medical school, too. As a native Philadelphian, Joseph Wharton was proud of the local college. But the millionaire industrialist also believed the university was missing something essential.
“With industry now powered by steam and steel, we can no longer rely on apprenticeship alone to create future generations versed in business,” he wrote to friends and associates on December 6, 1880, just months before officially opening the world’s first business college, the Wharton School. “There needs to be institutions to instill a sense of the coming strife of the business life and of the immense swings upward or downward that await the incompetent soldier in this modern strife.”46
But Wharton could scarcely have predicted the extent of the “strife” that was on the horizon: a nascent labor movement in Europe would soon go global, bringing with it revolutionary changes in workers’ rights.
Among those changes was something that had never existed in the history of labor: the weekend. We tend to take the five-day workweek as a given, but it’s an exceptionally recent innovation. It didn’t exist as a concept—or even a phrase—until the late 1800s.47 The same can be said of legal limits on daily working hours, the abolition of child labor, medical benefits, and health and safety regulations. All of this was a response to the needs and demands of labor—and, indeed, the best interests of business owners such as Wharton.
The global Schumpeterian transformation now at hand will reshape the world as profoundly as the Industrial Revolution. Every business school in the world should be preparing its students for what is coming—and labor advocates should be doing the same. The idea of connecting retirement to a person’s chronological age will be an anachronism soon enough. And just like Social Security, the structures that support labor pensions will need to be reevaluated.
Skillbaticals, which could take the shape of a government-supported paid year off for every ten years worked, might ultimately become cultural and even legal requisites, just as many of the labor innovations of the twentieth century have. In this way, those who are tired of “working harder” would be afforded every opportunity to “work smarter” by returning to school or to a vocational training program paid for by employers or the government, a variation on the universal basic income now being discussed in the United States and in some European countries.
Meanwhile, those who believe they are happy and secure in their careers can enjoy what has come to be known as “a miniretirement”—a year off to travel, learn a language or musical instrument, volunteer, or refresh and reconsider the ways in which they are spending their lives.
This is not a particularly crazy scheme; sabbatical leave is common in higher education. Yet an idea like this might seem ridiculous to those who consider only the way the world works today. Who would pay for such a benefit? How would companies retain workers in the long term without the promise of a “gold watch retirement” at the end of decades of service?
But whoever engages in this discussion now will have the upper hand when we decide how to redistribute the resources freed by the elimination of ever-skyrocketing insurance premiums and pyramid-scheme pensions. Yet few business professors are so much as thinking about this coming change, and even fewer courses are being taught on the subject in places such as the Wharton School. Labor leaders, meanwhile, are locked in an understandable but ultimately futile fight for retirement and benefits for workers who in the past would have labored for forty or fifty years, retired for a short spell, and then rather promptly died. Almost no one is fighting over what the world of work will look like when age is truly nothing more than a number.
But that era is coming. And it is coming sooner than most people and institutions realize.
“I sure am glad I won’t be around when that happens.”
I hear this a lot—mostly, it seems, from people who are in, or soon approaching, retirement. These are folks who have already decided that their lives are going to end in the next couple of decades. They’re certainly hoping to stay healthy during that time and maybe eke out a few extra good years if they can, but they don’t think they’re going to be around for much longer than that. To them, the middle of this century might as well be the next millennium. It’s not on their radar.
And that’s the world’s biggest problem: the future is seen as someone else’s concern.
In part, this stems from our relationship to the past. Very few of us had the opportunity to get to know our great-grandparents. Many of us don’t even know their names. That relationship is an abstraction. And so most of us don’t think about our great-grandkids as much more than a fuzzy, abstract idea.
Sure, we care about the world our children will live in because we love our children, but the conventional wisdom about aging and death tells us they’ll be gone a few decades after we are. And yes, we care about our grandkids, but by the time they come around, we’re often so close to the exit that it doesn’t seem as though there’s much we can do about their future anyway.
This is what I want to change—more than anything else in the world. I want everyone to expect that they will meet not only their grandchildren but their great-grandchildren and their great-great-grandchildren. Generations upon generations living together, working together, and making decisions together. We will be accountable—in this life—for the decisions we made in the past that will impact the future. We will have to look our family members, friends, and neighbors in the eye and account for the way we lived before they came along.
That, more than anything else, is how our new understanding of aging, and the prolonged vitality it makes possible, is going to change the world. It will compel us to confront challenges that we currently push down the road. To invest in research that won’t just benefit us now but will benefit people 100 years from now. To worry about the planet’s ecosystems and climate 200 years from now. To make the changes we need to make to ensure that the rich don’t enjoy an increasingly lavish way of life while the middle class begins to tumble toward poverty. To ensure that new leaders have a fair and legitimate opportunity to displace old ones. To bring our consumption and waste into balance with what the world can sustain today and many centuries into the future.
This isn’t going to be easy. The challenges are vast. We are not only going to have to “touch the third rail” of politics—Social Security—but to douse ourselves in water and lie down upon it, adjusting our expectations about work, retirement, and who deserves what and when. We’re no longer going to be able to wait for prejudiced people to die; we’re going to have to confront them and work to soften their hearts and change their minds. We can’t just allow the Anthropocene extinction to continue at a rate thousands of times higher than the natural one; we need to slow it dramatically and, if we can, stop it altogether.
To build the next century, we’re going to have to figure out where everyone is going to live, how they are going to live, and under what rules they are going to live. We’re going to have to ensure that the vast social and economic dividends we receive from prolonging people’s lives are spent wisely.
We’re going to have to be more empathetic, more compassionate, more forgiving, and more just.
My friends, we’re going to have to be more human.