The battle to remove Panalba and the other fixed-dose antibiotics had taken eleven years since the Kefauver subcommittee had zeroed in on the shortcomings of Pfizer’s Sigmamycin, the drug that Arthur Sackler had advertised with testimonials from eight phantom physicians. Much of the media coverage of the FDA’s victory focused on the millions of dollars the public would otherwise have spent on unnecessary and worthless drugs. The Supreme Court ruling meant the FDA did not have to ask for updated safety profiles before decertifying a drug. The agency was relieved; it considered that step unnecessary and too time-consuming. The unintended consequence of not requiring safety profiles for the pre-1962 drugs, however, was that the agency missed a chance to address the early stages of antibiotic drug resistance.
The FDA’s miscalculation, reinforced by a widespread belief in the biomedical community, was to dismiss reports of drug resistance as anomalies that affected few patients. Laboratory tests in the late 1940s and through the 1950s predicted that antibiotic resistance would rarely develop.1 The tests showed that bacteria expended so much energy acquiring resistance that they destroyed themselves in the process. And in the few instances in which bacteria did show evidence of resistance, their cultures grew slowly. That reassured researchers, who cited it as evidence that even if antibiotic resistance developed in humans, “it would be unstable and short-lived.”2
Two of the pioneers who developed penicillin were not as convinced. In his 1945 acceptance speech for the Nobel Prize in Medicine, Alexander Fleming warned that if penicillin were dispensed too liberally, it would become less capable of killing bacteria over time.
“There is the danger that the ignorant man may easily underdose himself and by exposing his microbes to non-lethal quantities of the drug make them resistant.”3
Ernst Chain had an even more ominous warning. He feared bacterial pathogens were so adaptable that once they began evolving to resist antibiotics, it was only a matter of time until no drug could stop them.
Most researchers and academicians rejected those dire forecasts as overdramatic and unproven.4 A mid-1950s report about a cluster of Japanese patients who did not respond to antibiotics because of resistant pathogens caused little concern among scientists, who dismissed it as an unexplainable anomaly; doctors did not have the technology to determine why those patients had not responded to drug treatment.5
The early scientists worked from two incorrect assumptions. First, they thought that antibiotic resistance could arise only if bacteria in the body developed it in response to a drug, rather than already carrying it. Second, they assumed that the results they obtained in controlled lab settings predicted what would happen when antibiotics were widely and repeatedly dispensed to millions of people.
It was several decades before researchers proved that by “random chance” the bacteria in some people’s bodies carry genes capable of natural resistance to an administered antibiotic. The odds of resistant bacteria emerging also increase when patients do not finish the full course of prescribed treatment, and when an antibiotic is dispensed for a viral infection such as the common cold. Although the drug is useless against viruses, introducing it into the body activates so-called resistance genes (R genes). Biologists now estimate that humans harbor about twenty thousand potential R genes.6 I 7
Beginning in the mid-1950s, doctors liberally dispensed powerful broad-spectrum antibiotics. Their shotgun dispersal in the body meant they killed off good microbes along with any pathogenic bacteria. The germs left behind were those with natural resistance R genes, and they became more potent from having battled and survived the drug. The introduction of fixed-dose combination antibiotics in 1956 added to the overprescribing epidemic. Physicians dispensed them for common colds, sinus infections, sore throats, and runny noses. Every one of those needless prescriptions increased the odds of more resistant bacteria. By the time the FDA removed them from the market in 1973, billions of doses of the combination antibiotics had been sold. Americans took more antibiotics every year in the 1950s and 1960s than all other prescription medications combined.8
Many people also got extra antibiotics from the 30 million pounds used annually since the 1950s to fatten chickens, cattle, turkeys, and other animals raised for meat, as well as to protect farmed fish from contracting diseases common in the overcrowded environments in which they were bred. Antibiotics also became prevalent in plant agriculture, sprayed every spring for a week on fruit and vegetable crops to suppress the proliferation of otherwise destructive pathogens.9 The constant exposure to antibiotics from different sources boosts the number of resistance genes and the speed at which they proliferate.10 II 11
What was the result of the overprescription frenzy of the postwar decades? Researchers have now determined that ground zero for antibiotic drug resistance was America in the early 1960s. A “core” population of resistant strains developed and took hold in susceptible patients. Scientists now realize that in some people, molecules of independently replicating DNA called plasmids develop (plasmid-mediated antibiotic resistance) and morph into supercharged biological carriers capable of spreading resistant bacteria. Those plasmids, scientists believe, are how antibiotic resistance spread from America to every continent, including places where antibiotics were rarely used.12 III 13
The grim ramifications of the FDA’s decision not to update the safety profiles of the pre-1962 antibiotics became apparent only a decade later. The first indications of a problem were news reports from developing countries of epidemic infection rates for a common form of pneumonia and a sexually transmitted disease, neither of which responded to drugs.14 During the mid- to late 1960s, about 35,000 American troops in Vietnam were infected with a resistant strain of malaria that Army doctors described as a “recent mutant of more common types.” It sparked a crash program by the National Institute of Science, the Walter Reed Army Institute of Research, and leading tropical medicine physicians that failed to develop a treatment (today that mutant strain, falciparum malaria, is still the world’s deadliest).15
It took thirty years before leading microbiologists concluded that “the use of antibiotics by humans can be seen as an evolutionary experiment of enormous magnitude.”16 As is now evident, it was an experiment in which the odds, beginning in the 1960s in America, were unwittingly stacked in favor of the pathogens.
I. In the late 1990s, the World Health Organization published an anonymous doggerel, titled “The History of Medicine,” that summarized the evolutionary view that germs will always prevail: “2000 B.C. – Here, eat this root; A.D. 1000 – That root is heathen. Here, say this prayer; A.D. 1850 – That prayer is superstition. Here, drink this potion; A.D. 1920 – That potion is snake oil. Here, swallow this pill; A.D. 1945 – That pill is ineffective. Here, take this penicillin; A.D. 1955 – Oops… bugs mutated. Here, take this tetracycline; 1960–1999 – 39 more ‘oops.’ Here, take this more powerful antibiotic; A.D. 2000 – The bugs have won! Here, eat this root.”
II. The CDC and WHO have over the decades repeatedly warned doctors to be judicious and limit their antibiotic dispensing. Still, the CDC estimates that about 30 percent of the quarter billion antibiotic prescriptions written annually in the U.S. are unnecessary. An additional $50 billion worth of antibiotics is improperly administered in nursing homes and hospitals.
III. It was 2010 before British scientists concluded that MRSA, an opportunistic superbug infection caused by a resistant bacterial strain, also developed in the early 1960s.