We will now discuss in a little more detail the Struggle for Existence.
—CHARLES DARWIN
Sweet flowers are slow and weeds make haste.
—WILLIAM SHAKESPEARE
WE DREAM ABOUT progress. When we do, we imagine progress to be technological. We imagine the past to be lesser than the present, which is lesser still than the future. But when it comes to our management of the life around us, particularly the life in our homes, this may not be the case. Where we have controlled dangerous pathogens, it has been a huge step forward, but we went too far and killed the beneficial species too. Then we inadvertently built our houses in ways that favor problem species—fungi living in our walls, new pathogens lurking in our showerheads, and German cockroaches running beneath our doors. Meanwhile, there was always another way, another road. We could have, years ago, figured out how to favor species that benefit us in our homes. This may seem like a dangerous proposition. But it is less dangerous than the world we have made. What is more, it is a method that has already been tried. It was tried, of all places, on the skin of newborn babies. And it worked.
It all began in the late 1950s. A pathogen called Staphylococcus aureus type 80/81 was spreading rapidly among hospitals in the United States.1 It threatened people who visited the hospitals and then, when they went home, threatened their families too. It was especially dangerous to babies, for whom it was, as one study at the time noted, “responsible for more potentially serious infections in hospitals than any other microbe.”2
Staphylococcus aureus type 80/81 (hereafter just “80/81”) lodged itself in a person’s nose or belly button and, having done so, became essentially impossible to eradicate. It was resistant to the main antibiotic used at the time, penicillin. Penicillin, first available to the public in 1944, didn’t work for all kinds of pathogens (controlling the tuberculosis bacterium, Mycobacterium tuberculosis, for example, awaited the discovery of a new kind of antibiotic, streptomycin, produced by Streptomyces bacteria). But it did work on pathogenic strains of Staphylococcus aureus. Until, that is, strain 80/81 evolved. This strain was no longer killed by penicillin.3 Worse yet, it spread alarmingly quickly.
By 1959, Presbyterian Weill Cornell Hospital in New York was among the many hospitals in which 80/81 had become common in nurseries. In this, the hospital was ordinary. What was different about Presbyterian Weill Cornell was that two people—Heinz Eichenwald and Henry Shinefield—decided to find a solution to the problem of 80/81.4 Eichenwald was a doctor in the Department of Pediatric Diseases at the Cornell University Medical Center, and Shinefield was a newly appointed assistant professor in the same department. In their work together, the two men would usher in an entirely new kind of medicine and an entirely new way of managing the life indoors.
Eichenwald and Shinefield diligently studied the nurseries in Presbyterian Weill Cornell. Every day before they went home, they checked the nurseries in the hospital for the presence of 80/81. They couldn’t have quite described what they were searching for; they would know it when they saw it. It was tedious work, but the tedium became a kind of ritual. Then the ritual began to pay off with novel observations.
The first observation was that the nurseries at Presbyterian Weill Cornell with the most infections had all been visited by the same nurse, a nurse later confirmed to have 80/81 lodged in her nose (hereafter we’ll just call her “Nurse 80/81”). Infections seemed to follow Nurse 80/81 wherever she went, or nearly so. Nurse 80/81 was clearly to blame. Infections in hospital nurseries were very common; so too were infectious nurses. In most hospitals this nurse would have been dismissed and the story of her effects on the nursery would be over. Case closed. Initially, this is what happened. Nurse 80/81 was, as Shinefield and Eichenwald would later note, “removed.” But there was more to the story at Presbyterian Weill Cornell.
Nurse 80/81 had contact with a total of 68 babies, 37 on the day of their birth and 31 not until twenty-four hours after their birth, on their second day of life. Of the 37 babies that she handled in the first twenty-four hours of their lives, one-fourth were colonized by 80/81. However, of the 31 newborns that she handled after they had been alive for twenty-four hours, none were colonized by 80/81. Their noses were, instead, colonized by other bacterial strains, including other apparently harmless strains of Staphylococcus aureus. Herein was a mystery among the bodies and fates of newborns. Why did the babies who were held by Nurse 80/81 during the first twenty-four hours of their lives get colonized by 80/81, even though those who were held when they were just one day older did not? In comparing these two groups of babies, Eichenwald and Shinefield developed a hunch about what might be happening. A hunch can make a career; it can also ruin one.5
Eichenwald and Shinefield imagined two possible explanations for the pattern they observed. The first and more ordinary explanation was that age conferred some sort of immunological maturity that better enabled the newborns to defend themselves. The older babies simply rejected the 80/81. Their bodies killed the pathogen before it could establish itself. Let’s call this the tough baby hypothesis. Scientists aren’t supposed to discount hypotheses that they find boring and unfortunate, but they do; and this is how Eichenwald and Shinefield felt about the tough baby hypothesis. It was boring.
Eichenwald and Shinefield’s second hypothesis was outlandish and slightly wild but also far more interesting. They wondered whether the older babies had simply had more of a chance to be colonized by other strains. “Good” strains of Staphylococcus aureus might confer resistance to newly arriving pathogens (such as 80/81) much as if they were a kind of force field. If the latter hypothesis, which Shinefield termed “bacterial interference,” was right, it suggested a whole new world in which beneficial bacteria might be seeded onto bodies as well as hospital surfaces and homes.
In developing these ideas, the men knew, as they would note, that “colonization of the newborn with Staphylococcus aureus is a normal event,” an event that will “always take place sooner or later.”6 This was well established. A number of studies had, by then, shown that the skin of healthy adults hosted a shag carpet–like layer of microbes. In the nose, belly button, and a few other spots, those microbes nearly always included strains of Staphylococcus aureus living in dense biofilms. Other patches of skin—be they forearms or backs—were dominated by other species of Staphylococcus, by species of Corynebacterium and Micrococcus, and by other oligarchs of the flesh.7 It is the normal condition of mammals to be covered in a shaggy layer of bacteria (though we now know that just which species form this layer depends greatly on the identity of the mammal). Even when naked, we are cloaked, and the same is true for the surfaces in our houses. We also know that babies, in utero, do not have microbes on their skin (or in their guts or lungs); babies’ bodies are colonized during birth.
In this context, Eichenwald and Shinefield imagined that the cloak of newly established microbes on the skin of these day-old newborns, particularly in their noses and belly buttons, might prevent other microbes from colonizing or thriving. More specifically, they thought that beneficial strains of Staphylococcus aureus might outcompete pathogens by taking up space and food resources before the pathogens could gain a foothold.8 Ecologists call this scenario “exploitative competition.” It was also possible that in addition to preventing the establishment of pathogens through exploitative competition, the established species might produce kinds of antibiotics, called “bacteriocins,” that actively deter or even kill other, later-arriving bacteria.9 Ecologists call this “interference competition.”10 Both kinds of competition are common in nature and well documented out among the plants in grasslands or ants in rain forests, but the idea that they might be occurring among bacteria on bodies and in buildings was radical at the time. It was not without precedent, and yet even so, it was a marginal concept—not so much lunacy as heresy.
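To make the distinction concrete, here is a minimal sketch, in Python, of how an established resident strain can exclude a latecomer. It is a toy logistic competition model, not anything Shinefield and Eichenwald computed; the growth rate, carrying capacity, arrival time, and toxin strength are all invented for illustration.

```python
# Toy model of an established resident strain versus a late-arriving invader.
# Exploitative competition enters through the shared carrying capacity K;
# interference competition enters through a bacteriocin-like "toxin" term.
# All parameter values are illustrative assumptions.

def simulate(days=30, dt=0.01, toxin=0.0):
    r, K = 1.0, 1e6                  # growth rate (per day) and carrying capacity
    resident, invader = 1e5, 0.0     # the resident is seeded first
    for step in range(int(days / dt)):
        if step == int(5 / dt):      # the invader shows up on day 5
            invader = 1e3
        crowding = (resident + invader) / K
        resident += dt * r * resident * (1 - crowding)
        invader += dt * (r * invader * (1 - crowding)          # exploitative
                         - toxin * (resident / K) * invader)   # interference
        invader = max(invader, 0.0)
    return resident, invader

for toxin in (0.0, 2.0):
    resident, invader = simulate(toxin=toxin)
    print(f"toxin strength {toxin}: resident {resident:.3g}, invader {invader:.3g}")
```

With the toxin term set to zero, the invader arrives to find the niche full and simply fails to grow; that is exploitative competition. With the toxin term switched on, the resident actively drives the invader toward zero; that is interference competition.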
At that time, medicine, particularly medicine dealing with infection, was focused on killing bad species or strains when they began to cause problems. It had been so ever since John Snow found the contaminated well in Soho, London, ever since Louis Pasteur worked out that individual pathogenic species can cause disease (the germ theory). Almost no one searched for beneficial species or considered that illness might sometimes result from their absence.11 The focus was on pathogens. Pathogens and how to kill them. The culture was akin to that before the domestication of wild animals, a time when our ability to deal with large beasts relied entirely on our ability to avoid them or kill them. Eichenwald and Shinefield thought differently. They imagined that effective medicine, and human health more generally, could rely on a more holistic view of life.
They, along with their colleague John Ribble, devised an experiment. They wanted to see what would happen when newborns initially placed in a nursery with virtually no 80/81 were moved to the nursery in which more than half of all newborns were infected. Would the transferred babies be protected by their early colonization by bacteria other than 80/81? This experiment was carried out at Presbyterian Weill Cornell. The babies were put in a safe nursery, a nursery without 80/81, for sixteen hours. They were then transferred to a nursery in which 80/81 was not only present but ubiquitous. The result was unambiguous. The babies that started in the nursery without 80/81 were protected from 80/81, even though they were just a day old.12
This experiment was clever (though ethically dubious). It suggested a role for beneficial bacteria as a defense against pathogens; the beneficial bacteria appeared to outcompete or even to kill the pathogens. Yet it left open a range of other possible explanations, not least of which was the oh-so-very-boring tough baby hypothesis. Eichenwald and Shinefield decided to do the perfect experiment; they decided to garden the bodies of babies, to focus on intentionally favoring beneficial species rather than simply disfavoring pathogens.
The gardening would be carried out using a bacterial strain that Shinefield had isolated from another nurse, a nurse named Caroline Dittmar, who had visited the nursery in which the babies were not colonized by 80/81. Dittmar’s nose was colonized by a strain called Staphylococcus aureus 502A. Strain 502A was the same one found in forty of the babies in the healthy nursery, a strain Shinefield and Eichenwald believed to be both safe and capable of interference. The men studied Dittmar’s 502A for two years. It didn’t seem to be associated with disease of any kind, either in babies or in their families. It would later be revealed that the reason Dittmar’s 502A does not cause infections is that it lacks the ability to penetrate the mucosa of the nose to get into the bloodstream. When it finds a way into the bloodstream, it is just as pathogenic as any other strain of Staphylococcus aureus.13 Even while they were still studying 502A, Eichenwald and Shinefield began to use it to inoculate babies. They started with lower densities of the strain, and then, once it was clear that more was needed to get the bacterium to “take,” they used higher densities of about five hundred individual bacterial cells.14 Up to one year later, 502A still seemed to be present in the noses of most of the babies that had been inoculated (fewer in the belly buttons, for reasons that can only be guessed). What was more, the mothers of the babies also began to be colonized by 502A.15 Whatever Eichenwald and Shinefield had just done, it was apparently going to have a lasting effect. What was not yet known was whether the colonization by 502A would prevent colonization by 80/81.
Eichenwald and Shinefield were emboldened to take the next step. They found hospitals around the country in which 80/81 was both present and common. Or, rather, the hospitals found Eichenwald and Shinefield. The first was Cincinnati General, where Dr. James M. Sutherland, a neonatal specialist, was working. Sutherland called Eichenwald and Shinefield and asked for help. Sutherland’s hospital had been plagued by 80/81 during the fall of 1961. Forty percent of newborns were colonized by this harmful bacterial strain. Soon, Shinefield was on the road, with a sample of Caroline Dittmar’s 502A by his side, heading for Ohio. In Cincinnati, Shinefield and Sutherland inoculated the nostrils or umbilical stumps (or both) of half of the newborns in each of the nurseries at the hospital with 502A, the putative defender. The rest of the newborns were not inoculated. Which treatment each newborn received was random, as was where each newborn was placed among the hospital’s three nurseries. Shinefield and his colleagues then examined whether the individuals inoculated with the putatively beneficial Staphylococcus ran a reduced risk of infection with 80/81. They were planting one species—a crop—and hoping that it would ward off another—a weed. They were farming. Like farmers, they hoped to reap what they had sown. They hoped what they had just planted was not a garden of terrible weeds (and infected babies).
The results of this study were important. They were important to each newborn infected with 80/81 or any other pathogen in each hospital around the world; they were important to the houses into which the newborns would move after leaving the hospital. They were probably of relevance to, at that point, hundreds of thousands if not millions of lives, particularly in the United States, where as many as twenty-five out of every thousand babies died in the hospital or soon after at home, most often because of infection.
Sutherland and Shinefield did not have to wait long for results. Of the babies inoculated with the potentially beneficial Staphylococcus, 502A, only 7 percent became colonized with 80/81, the pathogen. None of these cases in which the pathogen 80/81 colonized the baby happened in the hospital; all happened after the babies went home, presumably from 80/81 bacteria living somewhere undetected in their houses. That 502A was unable to ward off the pathogen in 7 percent of the cases was not ideal (the ideal, of course, being zero), but the key was how this compared to the babies who were not inoculated with the beneficial 502A. Babies who were not inoculated with the beneficial 502A were much more likely to be colonized by the pathogen 80/81—five times more likely. Sutherland’s confidence in Eichenwald and Shinefield had been rewarded with concrete results.16 Babies gardened with 502A, a strain of bacteria cultured off an individual nurse, Caroline Dittmar, could ward off 80/81, the dangerous weed, most of the time.
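For readers who want the arithmetic spelled out, the comparison reduces to a relative risk. The sketch below uses only the two figures reported above; the roughly 35 percent it implies is a back-of-the-envelope reconstruction, not a number taken from the study.

```python
# Back-of-the-envelope check using only the two figures quoted above.
risk_with_502a = 0.07    # babies inoculated with 502A, later colonized by 80/81
relative_risk = 5.0      # uninoculated babies were about five times more likely
risk_without_502a = risk_with_502a * relative_risk
print(f"implied colonization risk without 502A: {risk_without_502a:.0%}")  # 35%
```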
Shinefield was soon back on the road; Eichenwald didn’t have the time to travel hospital to hospital, but Shinefield, as a new assistant professor, did. He would go on to repeat the study in Texas, where the results were similar, or perhaps even more promising. Just 4.3 percent of the babies inoculated with 502A were later colonized by the pathogen 80/81. In contrast, of the 143 infants in whom 502A was not seeded, 39.1 percent (nearly two-fifths) became infected with 80/81 or one of its close relatives. Just as in Cincinnati, the gardening seemed to work. Eichenwald and Shinefield would go on to repeat this experiment in Georgia (and write about it in a paper titled “The Georgia Epidemic”) and then in Louisiana (“The Louisiana Epidemic”).17
Gardening the body in self-defense seemed to work unambiguously. The strain 502A was an effective, safe defense against the most problematic pathogen in hospitals. But this was not enough. Shinefield and Eichenwald would try something else. This is where events could have gone very wrong. They noticed that after Shinefield’s studies in Cincinnati and Texas, 80/81 entirely disappeared, briefly, from the nurseries. They decided to see whether they could use interference to eradicate 80/81 from hospitals more permanently.
Shinefield traveled hospital to hospital seeding newborns with Staphylococcus aureus 502A. He no longer used a control group. Now, he was just trying to cure children or, rather, to prevent them from ever suffering from infections in the first place. The results were astonishing. By 1971, four thousand newborns across the country had been successfully colonized, gardened as it were, with 502A. Not only did this reduce the prevalence of 80/81 in hospitals; in some hospitals the team was able to eradicate the pathogen entirely. Gone. Done. On the basis of these results, Heinz F. Eichenwald concluded that “during the presence of a severe epidemic of staphylococcal disease, the use of 502A represents the most immediate, safest, and effective method of terminating the epidemic. I feel that we now have enough data, involving several thousand babies to indicate that this is a completely safe procedure.”18 Time would reveal just how the gardened beneficial strain Staphylococcus aureus 502A excludes pathogens such as Staphylococcus aureus 80/81. Beneficial strains of Staphylococcus produce enzymes that prevent the pathogens from forming biofilms; essentially, they prevent them from building their houses. They also produce bacteriocins toxic to other bacteria. The strain 502A uses bacteriocins to kill any species that tries to colonize where it has already become established.19 Finally, 502A may also (inadvertently) trigger the immune system of the host in such a way as to make colonization by any additional bacteria less likely.20
In the immediate aftermath of this work, excitement boiled. The approach seemed like one that might spread ward to ward around the world. It would spread to homes, where people and surfaces could be inoculated. Doctors even began to inoculate adults with 502A, adults who were suffering from problems with infectious Staphylococcus aureus. With adults, the procedure was trickier. Doctors had to first use antibiotics to kill all of the pathogens in the nose (akin to weeding before planting), and then they could inoculate the adults with 502A just as they had the newborns. It worked 80 percent of the time. Shinefield, Eichenwald, and colleagues had, with 502A, invented an entirely new approach to medicine. What was more, the idea of interference had implications far beyond the establishment of single species of bacteria on the skin of newborn babies.
In 1958 the British ecologist Charles Elton published a book called The Ecology of Invasions by Animals and Plants in which he argued (among other things) that the more diverse a grassland, forest, or lake, the less likely it was to be invaded by newly introduced weeds, pests, and pathogens.21 Evoking sentiments very similar to those of Shinefield and Eichenwald, Elton wrote that when animals invade diverse ecosystems they will “search for breeding sites and find them occupied, for food that other species are already eating, for cover that other animals are sheltering in, and they will bump into them and be bumped into—and often be bumped off.” And the more diverse the ecosystem, the greater the odds that an invading species would be “bumped off.” Elton also thought that the more diverse an ecosystem, the more likely it was that some predator or pathogen would be able to eat at and kill the invader. In general, Elton thought that more diverse ecosystems would be more resistant to invasion. Nearly sixty years of subsequent research have revealed that the pattern doesn’t always hold—in ecology there are always caveats. Yet it often does. It holds often enough for ecologists, a people not ordinarily prone to hyperbole, to describe the ability of biodiverse ecosystems to resist invasion as a core part of the “planetary life support system.”22 A patch of goldenrod flowers growing in an old field is harder to invade if that patch contains more varieties of goldenrod,23 or if it contains more kinds of herbivores. Elton’s hypothesis was meant to explain patterns among plants and mammals. But it should work on bodies and homes too. Interference, then, on skin or elsewhere in our daily life, might work even better if two species, or even dozens, were to be intentionally grown. Imagine the students and grand-students of Shinefield and Eichenwald growing a diverse garden on your newborn, or on you, or in your bedroom.
Of course, what works among mammals and goldenrod stems might not work among microbes. The most elegant way to test Elton’s hypothesis for microbes would be to construct microbial communities that differ in the number of species they contain. Such variation would mimic the natural differences in the diversity of microbial communities we see from one body to another, or one surface in a home to another. One could then introduce an invading species into such communities and see whether the invading species was less able to establish or persist in the more diverse communities. This didn’t happen during Elton’s lifetime (he died in 1991). But we can fast-forward a little and consider more recent research. Several years ago, a Dutch research group, led by the ecologist Jan Dirk van Elsas, carried out such a study. They did it in Petri dishes rather than on the skin of newborns, our perspectives on medical ethics having changed considerably since the 1960s.
Van Elsas and his colleagues filled flasks of sterilized soil with bacterial food and then populated those flasks with the same number of total cells of different numbers of strains of bacteria. The strains of bacteria were all isolated from the soil of grasslands in the Netherlands.24 One treatment had five strains of bacteria, another twenty strains. Another still had a hundred strains. Then, as a final treatment, van Elsas and his team simply had the wild, ferocious diversity of real soil, with thousands of species therein. Control communities had no bacteria—just bacterial food. Van Elsas and his colleagues then introduced a nonpathogenic strain of Escherichia coli (aka the infamous E. coli) into each community and watched what happened over sixty days. Like 80/81, E. coli was the invader. The prediction was that the more diverse the community, the tougher it would be for E. coli to establish and persist. Competition for space, for key resources, and even for resources produced by other bacteria would exist. Also, the more diverse the community, the more likely that some of the bacterial strains would be able to produce antibiotics and, in doing so, kill whatever newcomers happened to show up before they ever had a chance. The niches of the community would be either fully occupied by competing bacteria or toxic.
When van Elsas and his colleagues grew E. coli on its own, it flourished, as one would expect on a sterilized surface in your house later dusted with a little microbe food, be it cookies, dead skin, or whatever else. Over the sixty days of the experiment, the abundance of the E. coli remained steady and high. But when van Elsas added the E. coli to the soil in which five other strains of bacteria had been growing, E. coli grew more slowly and then began to disappear more quickly. When he added it to soil in which twenty or a hundred other strains were already growing, it disappeared more quickly still. When he added it to the wild diversity of the real soil, it became hard to even find the E. coli within the sample. The more diverse the bacteria, the harder it was for E. coli to thrive. In part, van Elsas would go on to show, this was because the numerous strains in the more diverse bacterial communities were using many kinds of resources more efficiently than were the fewer strains in less diverse communities.25 Less was left over for E. coli to consume. The effect was even more extreme when van Elsas used an alternate approach, one that made the experiment an even better match for what might really happen in the soil. He created communities full of thousands of species of bacteria from the soil along with whatever bacteria-killing viruses were present as well.
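The logic of the result can be captured in a back-of-the-envelope simulation, under the simplifying assumption that each resident strain monopolizes a random subset of resource types and the invader can use only what is left. The number of resource types and the niches per strain below are invented for illustration; they are not van Elsas’s measurements.

```python
import random

# Toy version of the diversity-invasion experiment: richer resident
# communities occupy more kinds of "bacterial food," leaving the invader
# (our stand-in for E. coli) less to eat. All numbers are illustrative.
random.seed(1)
RESOURCE_TYPES = 200  # distinct kinds of food in the soil

def food_left_for_invader(n_resident_strains, niches_per_strain=15, trials=100):
    shares = []
    for _ in range(trials):
        occupied = set()
        for _ in range(n_resident_strains):
            occupied.update(random.sample(range(RESOURCE_TYPES), niches_per_strain))
        shares.append(1 - len(occupied) / RESOURCE_TYPES)
    return sum(shares) / trials  # average fraction of food left open

for richness in (0, 5, 20, 100):
    share = food_left_for_invader(richness)
    print(f"{richness:>3} resident strains -> {share:.0%} of food left for the invader")
```

The share of food left open to the invader falls steeply as resident richness rises, from everything with no residents to essentially nothing with a hundred; this is the Eltonian pattern van Elsas observed in soil.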
The logical extrapolation of van Elsas’s results to our bodies and homes is to predict that pathogens establish more easily on surfaces that are less diverse, closer to sterile (and therefore offer the pathogens less competition). The caveat is that this holds only if microbial food is available (it always is) and the house isn’t totally devoid of life (no house ever is). What a radical idea! Here was a possible extension of the approach used by Shinefield and Eichenwald to the world around us. We could prevent the invasion of pathogens by favoring biodiversity on our bodies and in our homes. The same should apply to insects: having a greater diversity of insects in your home, be they spiders, parasitoid wasps, or centipedes, should be more likely to keep pests such as house flies or German cockroaches in check. What is more, it would have the added benefit of increasing our exposure to the biodiversity of bacteria that, according to the biodiversity hypothesis, our immune systems need to function well. Here was a direct practical application of Elton’s ecological insights.
IF SHINEFIELD AND EICHENWALD’S ecologically minded “Eltonian” approach worked and spread hospital to hospital and even into homes, you might be wondering why it sounds so foreign. You might be wondering why you have never heard of babies or homes being gardened. You have never heard of any of this because, beginning in the 1960s, modern medicine decided to take a different road.
After its initial success, Eichenwald and Shinefield’s idea enjoyed great popularity. It was to be the future! Then, it floundered. One fatality was associated with the accidental introduction of the “good” Staphylococcus aureus 502A into the blood of a newborn from a needle prick. Any bacteria able to make it into the bloodstream can cause an infection; once in the bloodstream, the ordinary rules of good and bad, friend and foe, disappear. Also, a few of the inoculations of beneficial Staphylococcus led to skin infections (about one in a hundred). The infections were treatable with antibiotics, but they were infections all the same. The question was not whether these cases were a problem; it was whether they were a worse problem than what would have happened otherwise. They were not.
Eichenwald had noted early on that he and Shinefield had picked one of several possible approaches. One could garden beneficial strains that might interfere with and prevent the establishment of pathogens. One could also rewild the body and try to ensure it was colonized with a diversity of bacteria like that which might have covered our ancestors (minus the pathogens). Or one might “use various eradicative measures” that would kill Staphylococcus (or other pathogens) when infections occurred. One could garden, rewild, or kill. The third approach, Eichenwald went on to note, would have two problems. The pathogens would eventually evolve resistance to the eradicative measure leveled against them. Also, any attempt at eradication of the pathogen would kill both the good and the bad and, in the long term, make it easier for the bad to reinvade.26 This is the situation we often face when we make a choice about how to manage the species around us.
In spite of Eichenwald and Shinefield’s studies, hospitals, doctors, and patients collectively chose the third approach: the killing. It seemed more sophisticated, part of a grand future in which we humans could control the world around us with ever more novel chemicals—be they antibiotics, pesticides, or herbicides. Though it had problems, we could figure them out in the future. The third approach also seemed, superficially, simpler. The antibiotic methicillin had become cheap and was easily available in hospitals. Using it was easy—nothing had to be grown, inoculated, or gardened. Methicillin was the first wave of a second generation of antibiotics, semisynthetic antibiotics engineered to be harder for the bacteria to evade. Methicillin was able to treat Staphylococcus aureus 80/81 infections.
But even in those earliest days, it was recognized—and not just by Eichenwald and Shinefield—that eventually bacteria would adapt even to the new antibiotics, just as pests and weeds would adapt to pesticides and herbicides. Alexander Fleming, the discoverer of the antibiotic penicillin, had pointed out as much in his Nobel Prize speech in 1945.27 It was widely recognized by scientists that using antibiotics, especially on newborns, tended to kill the pathogen but also favored a group of unusual bacteria that did not seem particularly likely to be beneficial. Shinefield had noted as much in his own work, but did so as if stating the obvious, something everyone should know. Thus, the early successes of using antibiotics were clear, but so too, to those who were paying attention, were the longer-term problems: antibiotics were easy to use, but they had negative side effects on other microbes, including those living beneficially on and in our bodies, and they would eventually become useless once resistance evolved. Resistance would take longer to evolve if antibiotics were used in moderation, when most necessary. Conversely, it would evolve more rapidly if antibiotics were used indiscriminately. In full awareness of all of this, the medical community took the eradication approach anyway. Antibiotics were used frequently and largely without regard for whether they were absolutely necessary.
When Fleming and others predicted the evolution of resistance, they understood that it was likely, but not how it would occur. We now understand well the ways in which bacteria adapt to antibiotics. In large populations of bacteria, some individuals are likely to have (or to develop) mutations that allow them to better survive antibiotics. Those bacteria need not be good competitors, just survivors, because the use of antibiotics kills the species with which they compete. The origin of such mutations and their increase in abundance in the presence of antibiotics can now be shown in the lab. In a recent experiment, for example, Michael Baym, Roy Kishony, and colleagues at Harvard Medical School set up a long tray (two feet by four feet) of bacterial food in a medium of agar. They then played a trick on the bacteria. They laced part of the agar in the rectangular Petri dish with antibiotics. On the left and right edges of the dish, where Baym and his colleagues placed the bacteria, no antibiotics were present. But moving toward the center of the dish, the concentration of antibiotics increased until, at the very center, it was far greater than that used clinically, a concentration that, in the microbial world, corresponds to the nuclear option in the larger world. The scientists then filmed what happened over time.
First, the bacterial strain grew over the agar where there were no antibiotics. It covered the agar completely—a lawn of life. But then the food in those areas became scarce. The bacteria stopped dividing. Just beyond this antibiotic-free zone lay a field of food, but it was laced with antibiotics. In this context, any bacteria with the ability to venture out and eat the antibiotic-laced food would be more likely to survive into the next generation and, having survived, to monopolize the food resources. They would do better even if they did not do as well as those bacteria that initially grew on the abundant food without antibiotics. None of the bacterial cells in the Petri dish at the beginning of the experiment had genes that would allow them to deal with antibiotics. Every single original bacterial cell was susceptible to these antibiotics. It could have stayed this way, and if it had, the bacteria would have stopped at the edge of the antibiotics and the experiment would have been over. But they did not stop.
In the short time in which the bacteria were growing, mutations emerged. Only a few emerged per generation. But the generations were so very fast in coming that soon there were bacteria that could grow on the low concentrations of antibiotics, a few strains that—via the dance of mutation, bacterial sex, and survival—had made it through. Quickly, they consumed food in the agar that had low concentrations of antibiotics, and, once more, they grew hungry. But, before long, a single bacterium developed a mutation that allowed it to colonize the agar filled with a higher concentration of antibiotics. Later, thanks to yet another mutation, the even higher-concentration agar was colonized as well, until the entire plate of agar was sucked of its food and covered in bacterial cells. All of this, the entire thing, an evolutionary masterwork of stunning genius and consequence, happened in eleven days. Eleven days.28
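To get a feel for why eleven days is plausible, consider a toy stepping-stone model of the plate: the colonized frontier advances one antibiotic band each time at least one cell in the enormous frontier population happens to mutate past the next threshold. The band count, mutation rate, and population size below are illustrative guesses, not the experiment's parameters.

```python
import random

# Toy stepping-stone version of the giant-plate experiment: the frontier
# advances one antibiotic band whenever at least one of the dividing cells
# picks up the next resistance mutation. All numbers here are guesses.
random.seed(42)
BANDS = 5               # from the antibiotic-free edge to the highest dose
MUTATION_RATE = 1e-11   # useful resistance mutations per cell division
POP_PER_BAND = 1e9      # cells dividing each generation at the frontier

def generations_to_cross():
    frontier, generations = 0, 0
    while frontier < BANDS - 1:
        generations += 1
        # probability that at least one cell mutates past the next threshold
        p_advance = 1 - (1 - MUTATION_RATE) ** POP_PER_BAND
        if random.random() < p_advance:
            frontier += 1
    return generations

for _ in range(3):
    g = generations_to_cross()
    print(f"{g} generations, roughly {g / 48:.1f} days "
          f"at one division every thirty minutes")
```

The toy leaves out much (most mutants grow poorly, and many lineages race in parallel), but the order of magnitude, days rather than years, comes through.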
Eleven days seems quick, but it is slow compared to what happens in hospitals. In hospitals (and homes), bacteria don’t have to wait for mutations. They can borrow resistance genes from other bacteria, a process called horizontal gene transfer. This is to say, in the real world all of this evolution would take far less time than eleven days. This is just what has happened again and again since the inoculation of babies with nonpathogenic bacteria was abandoned and the use of antibiotics became ever more common.
As a result of the overuse of antibiotics, the problems posed by resistant pathogens in hospitals are far worse than they were in the 1950s when 80/81 first appeared. They are worse not just among newborns but also more generally. Initially, some strains of 80/81 could be killed with penicillin (even if others couldn’t). By the late 1960s, virtually all infections with Staphylococcus aureus were due to strains resistant to penicillin. Not long thereafter, some Staphylococcus aureus strains also evolved resistance to methicillin as well as to other antibiotics. By 1987, 20 percent of infections with Staphylococcus aureus in the United States were due to strains resistant to both penicillin and methicillin. By 1997, more than 50 percent were; by 2005, 60 percent. Not only is the proportion of infections due to resistant organisms increasing but so too is the total number of infections. As the proportion of infections caused by antibiotic-resistant bacteria has increased, so too has the number of antibiotics to which bacteria are resistant, both in the United States and globally. Many infections are now caused by Staphylococcus strains that are resistant to all but the antibiotics of last resort, antibiotics such as the carbapenems that doctors hold back in case things go really bad.29 Some infections are now even resistant to the antibiotics of last resort. These infections cost the medical system billions of dollars a year in the United States alone and cause tens of thousands of human deaths per year.30 Nor is the United States an exception—the trends are similar in much of the world. Nor is Staphylococcus aureus an exception. Resistance is also ever more common in the bacterium that causes tuberculosis (Mycobacterium tuberculosis) as well as bacteria associated with intestinal infections, such as E. coli and Salmonella. In some cases, the rising incidence of resistance results primarily from the overuse of antibiotics by humans. In others, it is due both to the overuse in humans and the use in domestic animals, where antibiotics are given to make pigs and cows get fatter quicker.31
Even in light of this inevitable evolution and an increasing understanding of the consequences of the overuse of antibiotics, the response of many hospitals to the surge in resistant bacteria has been to step up the war on microbes, to run forward screaming a wild antimicrobial battle cry. Hand-washing regimens have been intensified, which is a good decision, or at least not a bad one. Hand washing with soap, as far as we know, has no effect on the layer of normal bacteria on the skin, but it does wash away newly arrived species, which in hospitals are likely to be pathogens. But the proactive use of antibiotics has also increased, in a no-holds-barred approach to fighting pathogens called “decolonization.” In decolonization, the nasal passages of patients going into surgery, dialysis, or the intensive care unit are blasted with antibiotics to get rid of any Staphylococcus aureus. In the short term, this approach has been heralded by the hospitals that employ it.32 In the long term, the consequences seem clear enough: decolonization will lead the noses of these patients to be colonized by hospital bacteria. It also favors more resistance. Medical history is being reenacted on the bodies of patients. But this time, something is different. Thanks to the ways in which we use antibiotics and fund research, the rate at which bacteria are evolving resistance to antibiotics is outpacing the discovery of new antibiotics, and this is unlikely to change.33 Bacteria are evolving resistance to our antibiotics faster than we can replace them. Yet the medical culture that has arisen since the work of Eichenwald and Shinefield sees little other way to control pathogens on bodies, in hospitals, or in homes. Nor is it alone. When it comes to the control of insects or fungi in homes, the story is similar. We need another way.
IT WOULD BE HARD to restart Shinefield and Eichenwald’s program. It would be harder still to start a more ambitious effort to garden our whole homes or hospitals. Our perspective on risk has changed in ways that emphasize the dangers of gardening and largely ignore the dangers of war. This is bad news. But there is good news.
Antibiotic-resistant bacteria, like pesticide-resistant insects, are poor competitors. In the wild, most of these resistant organisms are weaklings. They are what ecologists call “ruderal species,” species that survive only in environments where conditions are so chronically difficult that no other life-forms do well. Van Elsas showed that E. coli becomes less abundant and struggles to establish itself when soil microbial communities are diverse, but the E. coli that van Elsas studied was not resistant to antibiotics. If it had been, we have every reason to believe that it would have struggled even more to persist in diverse communities. Like German cockroaches, antibiotic-resistant bacteria have fine-tuned their biology to the modern conditions we have created. They grow fast and take over our bodies and homes in the absence of competitors, viruses, and predators and in the presence of antibiotics. In such settings, resistant organisms thrive. But the compounds produced by resistance genes tend to be expensive for the bacteria—they require the bacteria to spend energy that they might otherwise have used to grow and divide. If there is no competition, living a slower, more expensive life doesn’t matter. But where there is competition, this slower lifestyle puts resistant microbes at a disadvantage. This is one of the reasons that the very worst antibiotic-resistant bacteria are often confined to hospitals. In hospitals, they are constantly challenged by antibiotics. This quickly gets rid of any bacteria that aren’t antibiotic resistant and rids the surviving bacteria of any competition. Even when the antibiotics are no longer being used, the competition is still held at bay, and so in hospitals, resistant microbes grow like nowhere else, released, like indoor German cockroaches, from competition, released and resistant to our assaults. Faced with competition, such species fail. Confronted with diversity, they fail. It is only in the unusual situations we have created on our bodies and in our homes that they succeed.
This means that what we need to do to improve our situation may not necessarily be to fully garden the life around us; it may just be to rewild it a little. We need to find ways to tip the balance in our lives away from the pathogens and to let the biodiversity back in. We need to let biodiversity back in to help us fight deadly pathogens. We need to let biodiversity back in to help us fight chronic inflammatory disorders such as allergy and asthma. We need to let the biodiversity back in for these and many other reasons. And it can be simple, very simple. We’ve screwed up so badly that moderation can look like a panacea. We’ve screwed up so badly that inspiration for new ways forward may need to come from unlikely places and people, places such as kitchens, people such as bakers.