Science & Medicine

Near Geneva, a scientist peers into the Compact Muon Solenoid of the Large Hadron Collider, where the Higgs boson was found

Photograph by Maximilien Brice

Dr. Jonas Salk

March 29, 1954

Portrait by Boris Artzybasheff

Through war and peace, the rise and fall of empires, seismic migrations from agrarian villages to thrumming cities and all the other epic transformations of the past 90 years, there has been one unbroken rhythm: the march of scientific progress. In fact, it has been less a steady drumbeat than a near-constant acceleration. Propelled by the genius of Albert Einstein—TIME’s Person of the Century—who published his general theory of relativity in 1915, the early decades of the 20th century produced a burst of insights into the fundamental questions facing humanity: What is the universe made of? How did it begin? When and how will it end?

Einstein’s revelations that space and time stretch and curve and that mass and energy are two sides of one coin brought an end to the order and certainty of the Newtonian universe. In Einstein’s wake came a range of even more startling ideas, many of which are now part of our general vocabulary: Erwin Schrödinger’s 1926 equation for quantum mechanics (not to mention his cat!), Werner Heisenberg’s 1927 uncertainty principle, Edwin Hubble’s discoveries in 1924 and 1929 that there are galaxies beyond our Milky Way and that they are flying ever farther apart, and Georges Lemaître’s theory—now scientific dogma—that everything we know and everything that ever was began with the Big Bang.

In subsequent decades, Einstein’s heirs, employing ever larger accelerators to smash atomic particles and ever more powerful telescopes to look further back in time, have discovered black holes and pulsars and hunted down the quirks and quarks of the standard model of particle physics right up through the 2012 discovery of the long-predicted Higgs boson. Along the way, they found evidence that much of the universe is made of mysterious dark matter and dark energy—the latter first proposed and then rejected by Einstein. The theoretical work also set the stage for practical efforts to harness the power of the atom, with decidedly mixed results: it brought us nuclear power and nuclear bombs.

If the theory of relativity shifted paradigms in physics, grasping the double-helix structure of DNA did the equivalent for the life sciences. “We have found the secret of life,” boasted Francis Crick in a Cambridge pub in 1953 after he and James Watson made the discovery. It’s hard to imagine any of today’s work on neuroscience, genetics, cancer, HIV/AIDS and so many other biological fields without the fundamental understanding of how DNA is transcribed into RNA to produce the essential proteins of life.

From that moment, molecular biology seemed to move at warp speed, as TIME noted in its 1971 cover story on “The New Genetics.” Breathless though it was, it could not anticipate that, just 50 years after Watson and Crick’s eureka moment, the entire human genome—some 3 billion base pairs of A’s and T’s, C’s and G’s—would be sequenced: a book of life, a recipe for a human being!

While the benefits of the Human Genome Project—personalized medicine, gene therapy, and other wonders—will more likely be realized in the next 90 years, other breakthroughs in biology and medicine came through more traditional methodologies. Old-fashioned crossbreeding of wheat cultivars in Mexico led to a great humanitarian triumph: the 1960s Green Revolution of high-yield, disease-resistant grains. For this achievement, which helped rescue the Indian subcontinent from famine, Norman Borlaug (who never graced a TIME cover) earned the 1970 Nobel Peace Prize and the epithet “the man who saved a billion lives.”

And it was good old bench science and keen powers of observation that enabled Alexander Fleming to discover the antibiotic properties of penicillin, a moldy contaminant that was killing off the Staphylococcus colonies in his petri dishes. The world’s first wonder drug saved thousands of troops from gangrene and death in World War II. It also laid the foundation of a new era of modern medicine, in which doctors were heroes. Among the most admired were Jonas Salk and Albert Sabin, who developed polio vaccines in the 1950s that proceeded to cut the worldwide number of cases from 500,000 in 1950 to about 200 in 2012.

Much of the medical magic of the postwar period was made possible by wondrous new machines that made inner anatomy visible: CT, PET, MRI and functional MRI. More recent decades brought the revolution in psychiatric drugs, leading TIME to ask in 1993, “Is Freud Dead?” They also brought the promise of stem-cell research, which may one day make organ transplants obsolete: better to grow a new liver than make do with someone else’s.

While physicists, medical researchers and other scientists were among the biggest heroes of the past 90 years, their pedestals have acquired some tarnish. Even the greatest innovations have a dark side. With the nuclear age came a clear and ever-present threat to human survival from stray bombs and power-plant meltdowns. With antibiotics and other wonder drugs came multi-drug-resistant strains of tuberculosis, malaria and other scourges. With new and improved seed crops came monoculture, rain-forest destruction, Big Agriculture and a pathway to today’s obesity pandemic. And with those cleared rain forests and the expanding use of carbon-based fuels came the global threat of climate change.

Our early 20th-century enthusiasm for scientific progress has been tempered by popular movements—and TIME cover topics—like environmentalism and alternative medicine. We head into the next 90 years a little less starry-eyed but still hopeful about tomorrow’s discoveries.

Sigmund Freud

October 27, 1924

Portrait by S. J. Woolf

The father of psychoanalysis was 68 and a towering—if controversial—figure when TIME published the first of five covers featuring his image.

It is difficult to analyse Freud’s doctrine of psychoanalysis. Is it a science or a philosophy? As there can be no science without a philosophy, it is both. Freud says that injuries are caused to the body by the mind (neurosis), not by the conscious mind, for no one is so foolish, but by the unconscious mind. The psychoanalyst’s job is, therefore, to bring into the conscious mind those factors which are disturbing the unconscious mind and so cause them to disappear.

The study of the problems of the unconscious mind led Freud to dream interpretation, which was to become the principal method of psychoanalysis. It was the quickest route to a patient’s unconscious mind. Freud, in his Interpretation of Dreams, goes deeply into the whole subject and, as he almost always uses his own dreams as examples, the book is also an autobiography. In theory, psychoanalysis is the philosophy of the unconscious mind; in practice it is a means by which mental disorders can be cured.

Albert Einstein

February 18, 1929

A week after 24¢ copies of his “Coherent Field Theory” reached the U.S. and a month before his 50th birthday, TIME’s future Person of the Century made the first of six cover appearances.

Einstein did not develop his conception of the world suddenly. He began by suspecting that nothing in the world was privileged, neither matter, nor motion, nor anything else. His suspicion led to the perception that there is one great physical law which describes everything.

First he inspected electrical and magnetic phenomena. Everyone knows, and has long known, that they are intimately related. Electricity flowing through a wire coiled around a piece of iron makes that iron magnetic. As a piece of wire passes between the prongs of a horseshoe magnet, an electric current is generated. James Clerk Maxwell showed that the laws of electricity and of magnetism were very much alike. Albert Einstein, in 1905, showed that the forces were different aspects of the same mother force.

Maxwell said that two orphan boys resembled each other very much. Albert Einstein hunted around until he found that they were brothers, sons of the same electromagnetic mother.

Dr. Alexander Fleming

May 15, 1944

Portrait by Ernest Hamlin Baker

Scottish biologist Alexander Fleming discovered penicillin in 1928, but it wasn’t until it was mass-produced during World War II that its miraculous, lifesaving powers were widely known and celebrated.

Last year penicillin patients were still rare enough to be front-page news. First such case was two-year-old Patricia Malone of Jackson Heights, Queens. The New York Journal-American, which begged enough penicillin from Dr. Keefer to save her life from staphylococcic septicemia, last week won the Pulitzer Prize for the story. After that, the whole nation watched one “hopeless” case after another get well.

There were the three doctors in the California mountains last winter who saved a seven-year-old girl when gas gangrene had forced repeated amputations of her left arm up to the shoulder: “As a last resort, penicillin was given after all hope had been abandoned for a recovery, which came like a miracle.” There was a doctor in Sioux Falls, S.D., who was astonished to save a man moribund with osteomyelitis and septicemia after sulfadiazine had failed: “This being the first case in which I have employed penicillin therapy, I feel that the results obtained, to say the least, were miraculous.”

Doctors now know in general which diseases penicillin helps, have worked out a tentative schedule of dosage.

Robert Oppenheimer

November 8, 1948

Portrait by Ernest Hamlin Baker

The first of two TIME covers on the father of the atomic bomb focused on his passionate postwar efforts to put the genie back in the bottle through nuclear regulation.

On July 16, 1945, all the long months at Los Alamos were put to the test in the New Mexico desert. Brigadier General Thomas F. Farrell was watching Oppenheimer when it happened: “He grew tenser as the last seconds ticked off. He scarcely breathed. He held on to a post to steady himself ... When the announcer shouted ‘Now!’ and there came this tremendous burst of light, followed ... by the deep-growling roar of the explosion, his face relaxed into an expression of tremendous relief.” Oppenheimer recalls that two lines of the Bhagavad-Gita flashed through his mind: “I am become death, the shatterer of worlds.”

Los Alamos and its aftermath left him with “a legacy of concern.” Two years later Oppenheimer told his fellow physicists that their weapon had “dramatized so mercilessly the inhumanity and evil of modern war. In some sort of crude sense which no vulgarity, no humor, no overstatement can quite extinguish, the physicists have known sin; and this is a knowledge which they cannot lose.”

Contraception

April 7, 1967

Photograph by Robert S. Crandall

Seven years after the Pill was approved for prescription use in the U.S., TIME hailed the way it had revolutionized “sex and family life” in America and would soon do so for millions abroad.

“The pill” is a miraculous tablet that contains as little as one thirty-thousandth of an ounce of chemical. It costs 11¢ to manufacture; a month’s supply now sells for $2.00 retail. It is little more trouble to take on schedule than a daily vitamin. Yet in a mere [seven] years it has changed and liberated the sex and family life of a large and still growing segment of the U.S. population: eventually, it promises to do the same for much of the world.

“The pill,” as oral contraceptives are now universally known, may well have as great an impact on the health of billions of people yet unborn as did the work of Pasteur in revealing the mechanism of infections, or of Lister in preventing them. For if the pill can defuse the population explosion, it will go far toward eliminating hunger, want and ignorance. So far, it has reached only a tiny fraction of the world’s 700 million women of childbearing age, but its potential is clear from U.S. experience. Of the 39 million American women capable of motherhood, 7,000,000 have already taken the pills; some 5,700,000 are on them now.

Dr. Christiaan Barnard

December 15, 1967

Portrait by Robert Vickrey

Performed in Cape Town, South Africa, the world’s first successful human heart transplant captivated the imagination of humankind. It opened a new era of organ transplantation even as it raised ethical questions about defining life and death.

Dr. Christiaan Barnard moved into the first operating room and cut eight blood vessels to free Denise Darvall’s heart; then he severed it from its ligament moorings. It was disconnected from the pump, and was carried to [Louis] Washkansky’s room, where it was connected to a small-capacity heart-lung machine. There it lay, chilled and perfused with oxygenated blood, while Surgeon Barnard removed most—but not quite all—of Washkansky’s heart ...

In painstaking sequence, Dr. Barnard stitched the donor heart in place. First the left auricle, then the right. He joined the stub of Denise’s aorta to Washkansky’s, her pulmonary artery to his. Finally, the veins ...

Now, almost four hours after the first incision, history’s first transplanted human heart was in place. But it had not been beating since Denise died. Would it work? Barnard stepped back and ordered electrodes placed on each side of the heart and the current (25 watt-seconds) applied. The heart leaped at the shock and began a swift beat. Dr. Barnard’s heart leaped too. Through his mask, he exclaimed unprofessionally but pardonably, “Christ, it’s going to work!” Work it did.

Richard Leakey

November 7, 1977

Photograph by Carl Fischer

A surge of early-hominid discoveries, many by Richard Leakey, along with the remarkable Lucy, unearthed in 1974, led TIME to take a fresh look at the ascent of man.

The most exciting of the recent discoveries have come from East Africa and Richard Leakey ...

The result of these findings is a radical revision of long-held views of evolution. As recently as a decade ago, scientists talked about a direct, unbranching line of descent—Australopithecus, Homo erectus, modern man—one following the other in logical order. Now all that has changed. “We can no longer talk of a great chain of being in the 19th century sense, from which there is a missing link,” says Phillip Tobias, 51, [Raymond] Dart’s successor as professor of anatomy at the University of the Witwatersrand medical school in Johannesburg. “We should think rather of multiple strands forming a network of evolving populations, diverging and converging, some strands disappearing, others giving rise to further evolutionary development.”

Anthropologists now believe that man’s family tree ... goes back to a primate called Dryopithecus, a true ape that appeared some 20 million years ago.

AIDS

August 12, 1985

Photograph by Erskine Palmer

The revelation that screen star Rock Hudson was battling AIDS gave the burgeoning epidemic a public face, catapulting it out of the closet and onto the cover of TIME.

More than one normally understated scientist has termed AIDS “the disease of the century.” Others have, in the tradition of divine justification, viewed it as God’s revenge on sodomites and junkies. There have been far more pervasive epidemics, certainly. In 1918 and ’19, Spanish flu killed more than 500,000 Americans and ultimately 20 million worldwide. A million Russians may have died of cholera in 1848 alone. But during these scourges there were always the possibility and hope that the fever would lift, strength would return, and life would go on. With AIDS, says Dr. Michael Gottlieb, the UCLA immunologist who is overseeing [Rock] Hudson’s care, “the word cure is not yet in the vocabulary.”

It is the virtual certainty of death from AIDS, once the syndrome has fully developed, that makes the disease so frightening, along with the uncertainty of nearly everything else about it ... In trying to understand AIDS, says Dr. William Haseltine, a leading investigator at Harvard’s Dana-Farber Cancer Institute, “we have moved from being explorers in a canoe to explorers with a small sail on the vast sea of what we do not know.”

Global Warming

October 19, 1987

Photograph by Elle Schuster

Frightening new findings about a hole in Earth’s ozone layer and mounting evidence for climate change prompted the first of eight cover stories on global warming, including 1989’s Planet of the Year.

Atmospheric scientists have long known that there are broad historical cycles of global warming and cooling; most experts believe that the earth’s surface gradually began warming after the last ice age peaked 18,000 years ago. But only recently has it dawned on scientists that these climatic cycles can be affected by man. Says Stephen Schneider, of the National Center for Atmospheric Research in Boulder: “Humans are altering the earth’s surface and changing the atmosphere at such a rate that we have become a competitor with natural forces that maintain our climate. What is new is the potential irreversibility of the changes that are now taking place.”

Indeed, if the ozone layer diminishes over populated areas—and there is some evidence that it has begun to do so, although nowhere as dramatically as in the Antarctic—the consequences could be dire ... Potentially more damaging than ozone depletion, and far harder to control, is the greenhouse effect, caused in large part by carbon dioxide (CO2).

Cloning

March 10, 1997

Photo-illustration by Arthur Hochstein

The successful cloning of a lamb, named Dolly, from an adult ewe’s udder cell was a biological breakthrough that provoked worries about Frankensteinian science and human cloning.

Dolly, the clone, is an epochal—a cataclysmic—creature. Not because of the technology that produced it. Transferring nuclei has been done a hundred times. But because of the science. Dolly is living proof that an adult cell can revert to embryonic stage and produce a full new being. This was not supposed to happen.

It doesn’t even happen in amphibians, those wondrously regenerative little creatures, some of which can regrow a cut-off limb or tail. Try to grow an organism from a frog cell, and what do you get? You get, to quote biologist Colin Stewart, “embryos rather ignominiously dying (croaking!) around the tadpole stage.”

And what hath [Dr. Ian] Wilmut wrought? A fully formed, perfectly healthy mammal—a mammal!—born from a single adult cell. Not since God took Adam’s rib and fashioned a helpmate for him has anything so fantastic occurred.

A scientist uses ultraviolet light to look at DNA-strand results at Washington University’s human-genome lab in St. Louis

Photograph by Karen Kasmauski

J. Craig Venter and Francis Collins

July 3, 2000

Photograph by Gregory Heisler

The first great scientific achievement of the 2000s was mapping the human genome. TIME’s cover story detailed the race between two rival teams and the potential they were unlocking.

After more than a decade of dreaming, planning and heroic number crunching, both groups [one led by Francis Collins and the other by J. Craig Venter] have deciphered essentially all the 3.1 billion biochemical “letters” of human DNA, the coded instructions for building and operating a fully functional human.

It’s impossible to overstate the significance of this achievement. Armed with the genetic code, scientists can now start teasing out the secrets of human health and disease at the molecular level—secrets that will lead at the very least to a revolution in diagnosing and treating everything from Alzheimer’s to heart disease to cancer, and more. In a matter of decades, the world of medicine will be utterly transformed, and history books will mark this week as the ceremonial start of the genomic era.

Obesity

June 7, 2004

Photo-illustration by Arthur Hochstein

With two-thirds of U.S. adults overweight and 30% of kids overweight or at risk, obesity was fast becoming the leading threat to American health. The big fat problem earned a special issue.

It’s natural to try to find something to blame—fast-food joints or food manufacturers or even ourselves for having too little willpower. But the ultimate reason for obesity may be rooted deep within our genes. Obedient to the inexorable laws of evolution, the human race adapted over millions of years to living in a world of scarcity, where it paid to eat every good-tasting thing in sight when you could find it.

Although our physiology has stayed pretty much the same for the past 50,000 years or so, we humans have utterly transformed our environment. Over the past century especially, technology has almost completely removed physical exercise from the day-to-day lives of most Americans. At the same time, it has filled supermarket shelves with cheap, mass-produced, good-tasting food that is packed with calories. And finally, technology has allowed advertisers to deliver constant, virtually irresistible messages that say “Eat this now” to everyone old enough to watch TV.

This artificial environment is most pervasive in the U.S. and other industrialized countries, and that’s exactly where the fat crisis is most acute.