7

There is another way in which lateral gene transfer relates to the issue of food safety, although in this case, it has nothing to do with the food we actually consume. Instead, it is connected to the ways in which we raise our food.

I will preface this by mentioning that in 2008, a member of my extended family contracted drug-resistant Staphylococcus aureus, aka MRSA. This was an alarming but not particularly unusual occurrence; a 2007 report by the Association for Professionals in Infection Control and Epidemiology found that these infections were 8.6 times more prevalent than had previously been thought, and that antibiotic-resistant bacteria are found in all wards throughout the majority of US hospitals. In other words, if you walk into a hospital in modern-day America, either as a patient or a visitor, you have a very good chance of being exposed to an antibiotic-resistant bacterium that is difficult, if not impossible, to treat. Oh, and has the potential to kill you.

My relative didn’t die from MRSA. She was never even acutely sickened by it. Like the other 30 million or so Americans who carry the bug in their nasal passages, she didn’t know she’d contracted it and wouldn’t have known but for a routine test (because the bacteria have become so common in medical facilities, many do routine screening). Unless it gets into your blood, typically through an open wound, the condition is actually fairly harmless. It might make your nose itch a bit or cause it to feel as if your allergies were acting up, but you can trot around with the bug for the entirety of your natural life and never know it. Still, an estimated 18,000 Americans annually aren’t quite so lucky: In these people, the drug-resistant bacteria make their way into the interior of the body, where they can profoundly affect the vital organs, leading to sepsis, toxic shock syndrome, and flesh-eating pneumonia. All of these conditions are pretty much guaranteed to cause severe illness or death.

To understand how MRSA and the legions of other antibiotic-resistant bugs have evolved, we need to understand a bit about the interaction between antibiotics and bacteria. We also need to know that in modern America, most of the antibiotics produced aren’t utilized by the health care industry. In fact, nearly three-quarters of the antibiotics distributed in America aren’t even used to treat human beings (more on this in a moment).

Like so many of the other technologies that have redefined our food system and helped shape our definition of food safety (pasteurization, refrigeration, molecular biology), and that we largely take for granted, antibiotics are a relatively recent phenomenon. In fact, the word antibiotic wasn’t even coined until 1942, although the best-known of this new class of drugs, penicillin, had been around since 1928.

I was intrigued to learn that the word antibiotic evolved from the term antibiosis, which literally means “against life.” At first, I found this ironic; after all, antibiotics have saved millions of human lives and relieved untold suffering in less than a century. But then I began to consider how it is that antibiotics save lives: They kill. Put simply, antibiotics are poisons engineered to kill bacteria, including (hopefully) the bacteria making us sick. They do this by interfering, in a variety of ways depending on the class of drug, with the bacterial cell. For instance, penicillins belong to a class of antibiotics called beta-lactams, which interfere with the peptidoglycan cell wall structure of bacteria. Since human cells don’t have cell walls or peptidoglycan, they cannot be harmed, at least directly, by beta-lactam drugs. This also helps explain why antibiotics don’t work on viruses: Viruses are not independent, living cells, so they offer no cell wall or other cellular machinery for such a drug to target.

In July 1946, just as antibiotics were going mainstream within the human population, the Journal of Biological Chemistry published a research paper from the University of Wisconsin that would have a profound impact on how we feed livestock in this country and, ultimately, on the future of disease and our ability to fight it. It would do one other thing, too: dramatically increase profits for the pharmaceutical industry. The defining conclusion in the UW paper was this one: “Sulfasuxidine and streptomycin singly or in combination lead to increased growth responses in chicks receiving our basal diet supplemented with adequate amounts of folic acid.” In other words, if you supplemented a chicken’s basic diet with the antibiotics sulfasuxidine or streptomycin, it grew faster than if you didn’t.

This finding might not have made such an impact if it didn’t coincide with the advent of Concentrated Animal Feeding Operations (CAFOs), facilities where livestock are raised at extreme density and fed formulated, rationed diets intended to maximize growth while minimizing inputs and, therefore, increasing profits. The first animals to be subjected to the woes of confinement farming were chickens, thanks (or, from the chicken’s perspective, no thanks) to the discovery of vitamins in the early 1900s and the subsequent ability to synthesize them. This led to the realization that rather than keep the birds on grass and soil, where essential vitamins and minerals existed naturally, it would be oh so much more convenient to supplement the feed with these nutrients and bring the whole operation indoors. The opportunity to consolidate chicken-farming operations was simply too tempting to pass up, and in the 1920s, American poultry farmers began raising their chickens in warehouses, feeding them vitamin-fortified grain. And behold: The CAFO was born.

Not long after the University of Wisconsin released its groundbreaking study on supplementing livestock feed with antibiotics, the FDA approved the practice, so long as the dose was “subtherapeutic” (defined as less than 200 grams, or about half a pound, per ton of feed). Not surprisingly, livestock producers lobbied passionately for FDA approval and, on being granted that approval, were pleased to find that the benefits exceeded expectations. In fact, larger animals seemed to respond even more favorably than chickens: Pigs gained about 10 percent more on antibiotic-laced feed, and feed efficiency was improved by 5 percent; beef steers needed 4 percent less feed to reach the same weight in the same period of time.

There was another benefit. Livestock that received routine subtherapeutic doses of antibiotics experienced fewer acute illnesses, which led to fewer culls and condemned carcasses. Now that they needn’t fear communicable disease, farmers crowded their animals closer and closer together, and soon pork producers joined the chicken industry by bringing their livestock indoors. And just like that, in a span of barely a decade, animals that had been raised outdoors for thousands of years were shuffled into huge, windowless warehouses, where they’d spend the entirety of their short and brutish lives shoulder to shoulder, nose to tail, beak to butt, eating engineered food and generally being forced to exist in ways that are entirely counter to their nature.

Indeed, although the CAFO model of poultry production preceded the routine feeding of subtherapeutic doses of antibiotics, it is widely acknowledged that without antibiotic feed additives, commercial confinement livestock production could not have proliferated and would not have become the prevailing model for all species.

We are just now beginning to understand the glaring deficiencies of raising animals in this manner, deficiencies that range from the moral to the environmental to those of human health. Still, it must have seemed like a no-brainer at the time: Here we had a miracle drug, one that not only cured humans of infections that had recently been tantamount to a death sentence but also seemed to improve livestock health, even as it caused the animals to pack on more meat over a given time span. In an industry that has long operated on razor-thin margins, a 4 percent gain in feed-to-meat conversion must have felt like winning the lotto jackpot.

And that, in short, is what happens to the other 70 to 75 percent of our nation’s antibiotics, nearly 27 million pounds’ worth annually. It goes into animal feed, to be consumed by critters that are not acutely ill but nonetheless grow better when the bacterial diversity of their digestive tract has been reduced, creating a noncompetitive environment that allows every precious calorie to be absorbed and targeted for a singular purpose: creating flesh.

It doesn’t take a cynic to note that the inclusion of antibiotics in animal feed does not exclusively benefit producers. After all, if the annual US market for antibiotics destined for human consumption is $8.5 billion (which it is), and that spectacular figure represents barely one-quarter of the total national demand, well . . . someone’s making an awful lot of dough selling antibiotics to livestock feed producers.

This dandy arrangement—happy producers, happy big pharma, unwitting consumers—continued for a few decades until, in the early 1980s, the CDC caught wind of a sharp uptick in cases of drug-resistant salmonella. In 1984, a study in the New England Journal of Medicine drew a convincing line between these cases and the subtherapeutic use of antibiotics in livestock. Still, other than a few concerned science and microbiology geeks, no one got too worked up. After all, this was still the golden age of antibiotics, with new classes of drugs being developed and bacterial resistance—when it did arise—generally applying only to one particular antibiotic and not others. So what if a particular drug was no longer effective against a particular strain of bacteria when there were a half-dozen other drugs that could fill in? The public response was muted to nonexistent.

It should be noted that the mechanisms by which these salmonella bacteria were evolving to survive drugs that had previously rendered them harmless were not entirely unknown. Indeed, drug-resistant bacteria had been part of the national health care meme for decades. The first reported US case of methicillin-resistant Staphylococcus aureus (MRSA) showed up in 1968, and it’s interesting to consider that the only reason S. aureus became resistant to methicillin is because it had already become resistant to penicillin, necessitating a switch to another class of antibiotics.

The thing about bacteria is, basically, that they’re smart. Beat them over the head with one stick for long enough, and they’ll figure out how to avoid getting hit by that particular stick. So you pick up another, and that works for a while, until it doesn’t. And so on, until you end up where we are today, with an epidemic of MRSA that no longer stands for methicillin-resistant Staphylococcus aureus, because it has evolved to the point where it can evade almost anything we can throw at it. Which is why MRSA has recently become an acronym for multidrug-resistant Staphylococcus aureus. How convenient that both methicillin and multidrug begin with M.

Now, it has long been accepted wisdom that the evolution of drug-resistant bacteria has its roots in overconsumption of antibiotics by humans. Or, perversely, the underconsumption: The reason you’re supposed to finish a full course of antibiotics, even if you feel better many days before the pill bottle runs dry, is to ensure that the drug has completely wiped out the population of nasty bugs in your body. Otherwise, the remaining stragglers might have an opportunity to regroup. Only now, having survived the antibiotic onslaught, they may well be resistant to future treatment by that particular class of drugs. And, because many of their peers were exterminated by the drugs, they enjoy a competitive advantage.

In any event, it is certainly true that human misuse of antibiotics owns plenty of the blame for creating species of bacteria that no longer fear our current arsenal of treatments. But remember: People consume only about one-quarter of the antibiotics in this country. And as it turns out, the critters that consume the other 75 percent might have as much, if not more, to do with the proliferation of drug-resistant bacteria as we do.

As bad as all this sounds, there are some people who believe it’s about to get a whole lot worse. Ellen Silbergeld, a professor of environmental health sciences at the Johns Hopkins Bloomberg School of Public Health, is one of them. Silbergeld was a professor of epidemiology at the University of Maryland School of Medicine when, in 1999, she attended a presentation by a candidate for a faculty position at the university. The presentation was on hospital-acquired infections; according to the presenter, some of the drug-resistant infections floating around health care facilities came from food.

This piqued Silbergeld’s curiosity. She knew food could be a vector for pathogenic bacteria, but that didn’t explain how the bacteria had become drug resistant. But then, like most Americans, she wasn’t aware of the practice of feeding antibiotics to livestock. “When I realized antibiotics were being used as a feed additive, I had an immediate strong sense that there wasn’t enough known about this.”

But how could that be? I asked her. After all, this was only a dozen years ago, at a time when numerous studies had correlated antibiotics in livestock feed with drug-resistant bacteria. Given their critical role in public health, surely these studies would have provoked an immediate and strong response, right?

Well, sort of. In 2006 the European Union banned the feeding of antibiotics and related drugs to livestock for the purpose of promoting growth. But in the United States, the practice remains commonplace, and meat producers have done their level best to discredit the connection between stuffing their livestock with 27 million pounds of antibiotics annually and the drug-resistant bacteria that now claim, by some estimates, over 100,000 American lives each year. Silbergeld believes that our collective indifference is rooted more in the structure of our regulatory landscape than in simple ignorance. “I think it fell in a zone between food safety and environmental microbiology. I think the general sense was ‘This is not my job.’ ”

So she set out to make it hers. “The idea that it would be a major issue seemed pretty clear to me. The challenge was figuring out how to study it.” Silbergeld considered where she’d find the greatest concentration of bacteria coupled with the highest level of animal-to-human exposure. She didn’t have to look hard: The Maryland poultry industry produces about 300 million broiler chickens annually, ranking it seventh among all US states in poultry production, despite being the ninth-smallest state in the union. Chicken is such a big deal in Maryland that there’s even a recipe known as Chicken Maryland (pan-fried with cream gravy, if you’re wondering).

Silbergeld began by studying three groups of people who had direct or close contact with large-scale chicken farms. These included laborers who entered the barns to catch and transport the birds, the workers who handled live birds at the processing plant, and nine people who did not directly work in the industry but lived near these facilities. The results were sobering: Of the nine people living in the community, 100 percent had been colonized with Campylobacter jejuni, a bacterium that is harmless to chickens but pathogenic in humans and that is the second-leading cause of gastrointestinal distress in the United States. Occasionally, it causes long-term neurological damage and a form of arthritis called Reiter’s syndrome. Interestingly, Silbergeld found that the workers who had direct contact with the live birds fared somewhat better: Of the chicken catchers, only 41 percent were colonized with the bacterium, while the line workers had a colonization rate of 63 percent. Still, it was enough to confirm Silbergeld’s fears. “It was clear to me that we were dealing with a significant public health risk.”

And the more she looked, the clearer it became. Not long after her initial report, she and five co-workers published the first US-based study of poultry workers colonized by resistant microbes. The startling conclusion? Fully half of the surveyed workers played host to E. coli that demonstrated resistance to gentamicin, an antibiotic used to treat many types of bacterial infections. Among the general population, the rate of colonization was a mere 3 percent.

As terrifying as these findings were, Silbergeld’s most worrying study underscored the potential for these bugs to spread among an unwitting public. She loaded a passenger car with sampling equipment and lurked in a parking lot near an intersection frequented by poultry trucks. Whenever one of the trucks approached the intersection, Silbergeld would pull into its slipstream and tail it. What she could see was the back of the chicken truck, the birds hunched and scared, their feathers being torn out by the wind. What she couldn’t see, but what her sampling equipment would later show to be present: high levels of enterococci bacteria settling on the interior surfaces of her car.

Now, Enterococcus is a relatively benign group of bacteria; it’s known to cause urinary tract infections and very occasionally meningitis in humans. Okay, so maybe “benign” isn’t the right word, but the point is, it’s not exactly the next E. coli O157:H7. But the problem isn’t merely the bacterium itself; it’s that a quarter of the enterococci that landed in Silbergeld’s car during Operation Poultry Truck proved resistant to antibiotics.

Thanks to lateral gene transfer, the issue is more complicated than that because, as it turns out, the same genes that make a relatively harmless strain of bacteria resistant to antibiotics can make other bacteria resistant, too. “The way we need to think about this is not as the resistance of one particular bug,” said Silbergeld. “The real story is of the increased pool of resistance genes that’s available to the microbial pool.” In other words, the danger isn’t so much that you will someday be driving down the highway behind a poultry truck and come in contact with drug-resistant enterococci or some other superbug (not that I’d recommend cruising your convertible along the highways leading to poultry facilities). No, the danger is far broader and more sinister than that: As more and more “resistance genes” are released into the population, more opportunities for LGT are unleashed. And some of that transfer will involve resistance genes.

The question is: Would halting the use of subtherapeutic antibiotic supplementation in livestock feed have a profound impact on the drug-resistant gene pool? Funny you should ask, because after Denmark banned the use of antibiotics in livestock feed for the purpose of growth promotion in 1999, the World Health Organization convened an independent panel to review the consequences in terms of both human and animal health. Their finding? “Extensive data were available that showed the termination of antimicrobial growth promoters in Denmark has dramatically reduced the food animal reservoir of enterococci resistant to these growth promoters, and therefore reduced a reservoir of genetic determinants (resistance genes) that encode antimicrobial resistance to several clinically important antimicrobial agents in humans.”

In other words, when Denmark banned the feeding of antibiotics to livestock, the reservoir of resistance genes declined.

To help me understand the implications of all this, I called James Johnson, the senior associate director of the Infectious Diseases Fellowship Program at the University of Minnesota and an all-around expert on the subject of Really Scary Bacteria That Can Kill You. It was a phone call that, at least in relation to my ability to sleep at night in blissful ignorance of the realities of drug-resistant bacteria, I wish I’d never made.

According to Johnson, there is simply no question that feeding antibiotics to livestock presents a significant human health hazard and that this hazard is killing people. It’s impossible to say with exacting certainty how many Americans die every year from drug-resistant bugs, in part because those most susceptible are often suffering from other ailments that might be listed as cause of death, but Johnson puts the number in the high tens of thousands, perhaps even more. Of course, not every one of these deaths can be attributed to animal feed, but Johnson is certain of one thing: “It’s a lot.”

To understand why he’s so sure of this, you need to understand a bit about where many of these bugs prosper. Which, as it turns out, is in the guts of livestock. Because the bacteria live peaceably and asymptomatically in animals, and because the animals are constantly eating low levels of antibiotics, the bacteria slowly become resistant to the drugs. “These are bugs that simply don’t live long enough in humans to gain resistance,” Johnson explained. “If a human gets campylobacter, he gets sick, and the bacteria are quickly expunged from his body. But a cow or pig can live with campylobacter for its entire life and never get sick. And if the bacteria are constantly exposed to antibiotics . . . ,” he trailed off. “Look, it’s not like animals have their bugs and we have ours and never the twain shall meet.” In other words, the bacteria that build up resistance and become part of the normal flora inside the guts of livestock are the very same bacteria that can infect humans, with no options for treatment by antibiotics.

What’s amazing, notes Johnson, is that awareness of antibiotic misuse among the medical community has nearly reached a fever pitch. Indeed, preventive antibiotic use in the United States has largely been curtailed, even in situations where it has been shown to save lives. Why? Precisely because of fears over creating drug-resistant bacteria. “We know that we could save lives if we did widespread preventive prescribing of antibiotics to cancer patients, but we don’t because of the concern that we’ll create a monster. And yet, here’s the meat industry feeding them to our food—not to prevent deaths, mind you, but to make animals grow bigger and fatter. It’s so egregious, I don’t know how they can do it.” To put it even more bluntly: As a culture, we have decided to let some people die so we don’t create an environment where drug-resistant bacteria can evolve. Concurrent with this decision, we have created an almost ideal breeding ground for those very same bacteria, so that the meat and drug industries might enjoy increased profits. It boggles the mind, really.

Johnson underscored Ellen Silbergeld’s point about resistance genes, which by virtue of LGT can hopscotch between bugs pretty much on a whim. In other words, antibiotic resistance isn’t a unique function of one particular bacterium; rather, it is essentially a contagious condition that can be shared among different strains and species of bacteria. Even nonpathogenic strains of E. coli (to name but one) can carry resistance genes that, in Johnson’s words, are like “soldiers ready to jump out and attack disease-causing bacteria.”

If there’s a happy ending to this story, it’s this: The FDA is starting to pay attention. In June 2010, the FDA released a draft guidance paper that called the use of antibiotics solely for the purpose of boosting production “injudicious” (probably because they couldn’t say “really freakin’ stupid”) and hinted at phased-in limits on antibiotic use in food-producing animals.

Not surprisingly, the meat-producing councils did not respond with unbridled enthusiasm. The National Pork Producers Council issued a position statement asserting that existing FDA regulations “provide adequate safeguards against antibiotic resistance.” And the National Cattlemen’s Beef Association responded: “Antimicrobial resistance is a multi-faceted and extremely complex issue that cannot be adequately addressed by solely focusing on the use of these medications in animal agriculture.”

None of which does a whit to change Ellen Silbergeld’s mind or diminish her sense of urgency.

“I don’t know if this is fully appreciated by the lay public, but we are entering the postantibiotic era. We need to recognize that this is a problem now, not tomorrow or next year. Or we’re going to be in big trouble.”