EIGHT

Antibiotics

Amputating a man’s arm is a gut-wrenching and shocking act. Regardless of the clinical justification and no matter the years of practice, severing a limb from a body requires stubborn resolve and intense personal subordination. Perhaps some surgeons grow callous to cutting off a limb—I never have.

I am a surgical intern at Pennsylvania State University and all I want is a couple hours sleep. I figure, if I can lie down now, I’ll instantly fall asleep, and will get enough shut-eye before 4:00 A.M. to last me through another day of grunt work. But on this winter night, just as my body twitches and shocks itself to slumber, my pager vibrates me to reality. Like all surgical interns, I am taking call while “in-house,” staying in the hospital all night fielding phone calls from the Emergency Room, the hospital floor nurses, and outside patients.

In the darkness, I fumble for my little black Motorola pager on the nightstand next to my head. Checking the four-number code, I dazedly recognize “6550” as one of the extension numbers to the Medical Intermediate Care Unit. We don’t get many calls to that number, and I hope that I have been paged incorrectly. Without turning on the lights, I prop up on my left elbow and punch the number on the institutional green AT&T office phone.

A nurse answers my call, informing me of an urgent surgical consult on a seventy-eight-year-old man with elbow pain. She explains that he had been admitted hours before with heart attack–like symptoms, but that all preliminary tests were ruling out an MI (myocardial infarction, or heart attack). Oftentimes an MI patient complains of crushing chest pain, with associated left arm or jaw pain; alert ER personnel hear these complaints and immediately begin testing the patient for a “cardiac event.” Although all initial tests were ruling out an MI, the severity of his symptoms warranted a hospital admission. As the hours progressed, his arm pain worsened, and by 2:00 A.M., the medical team was getting nervous. The aged patient was developing blisters and “ecchymoses” (bruises) on his left arm. Instead of an MI, they are now considering some ominous issue with his musculoskeletal system.

I am a know-nothing surgical intern, just months removed from medical school graduation, but I agree to come evaluate the patient as a first-line responder for my surgical team. I sit up in bed, take a deep breath, and slide on my day-worn, slightly smelly socks. After fumbling for my shoes, my thoughts become more organized, and I’m already starting to generate a “differential diagnosis,” the list of possible causes behind this man’s presentation. While trying not to wake my bunkmate and fellow intern, I slip out of the night call room and jog up the echoic stairway to the medical floor.

Briskly walking down the darkened hallway, I arrive in the Medical Unit, which is a beehive of activity. Nurses and aides are darting around, and are oddly relieved to see me. Typically, floor nurses in an academic medical center rightfully have disdain for interns, who arrive every July with their new MD degrees but are as helpless as a newly licensed motorist trying to drive a stick shift, uphill, for the first time. But these are medical nurses, adept at caring for cardiac patients yet greenhorns themselves when dealing with an odd musculoskeletal patient who would normally be a couple of floors down on the orthopedic floor.

A young nurse points to the corner bed, where a seventy-eight-year-old gentleman lies restlessly in his hospital bed, his left arm propped up on pillows. I immediately see the bruises the nurse was telling me about, and I can also see that his forearm is swollen. I ask, “Mr. Louis, does your arm hurt?”

This aged man is truly sick, and can only mumble a feeble, “yes.” Growing concerned, I approach his bedside, and focus on his arm. There are dark, splotchy patches of bruises the color of grape jelly. I lean over the bed, inspecting the inside aspect of his elbow. There are several raised burgundy-colored blisters above his elbow, and I am starting to feel out of my league. What am I looking at?

I reach for his wrist to lift his arm, and instantly feel the crackle of air under the forearm skin, like squeezing a bag full of wet Rice Krispies. My stomach drops, and while I don’t have much experience or judgment in the practice of surgery, I know this is gas gangrene, the byproduct of “flesh-eating” bacteria. There are classes of bacteria that are infamous for causing rapid infections that result in the death of the body’s soft tissues, so-called “necrotizing fasciitis,” with the occasional byproduct of “subcutaneous emphysema,” or gas underneath the skin. The physical exam finding of subcutaneous emphysema is frightening, to say the least.

I gently place Mr. Louis’s arm back on the pillows, knowing that I am seeing my first case of “nec fasc”—pronounced “neck fash,” in common parlance. (This is how residency works—you can read all about subcutaneous emphysema and necrotizing fasciitis, but until you have someone’s limb in your hands with crunchy air underneath the skin, you have not been properly initiated. Somehow the numbers work out: though the condition is relatively rare, every surgery resident has seen nec fasc.)

I turn to the nurse and say, “necrotizing fasciitis.” All conversation stops and everyone freezes.

“Really?” she says.

“Yes. I’m going to call Dr. Moulton, my senior resident.”

Connecting with Mark Moulton, I explain the details of the case. Getting to the point, he asks me, “Are we early enough to save his arm or will we have to amputate?” I confess to Mark that I really don’t know, that I don’t have any experience. Mark tells me to get the patient rushed to the operating room immediately. We will try and save Mr. Louis’s life, if not his arm.

A flurry of phone calls to the operating room and the anesthesia team achieves the impossible, and we are rushing to the OR within half an hour. Life is on the line. The rest of the orthopedic team has made its way to the hospital by 3:00 A.M., and my boss, Dr. Spence Reid, quickly concludes that an amputation is mandatory. In the pre-op holding area we get a portable X-ray that reveals air going all the way to the shoulder. Typical for necrotizing fasciitis, the bacteria are on a warlike march, leaving a plume of air in their wake, and before the bugs get to the chest, daring surgery must be performed. Not only do we need to amputate his entire arm; the collarbone and shoulder blade must also be removed, a so-called “forequarter” amputation (as opposed to hindquarter, or lower limb).

Even before transporting the patient to the OR suites, we give Mr. Louis a large dose of penicillin, but necrotizing fasciitis is notorious for not responding to antibiotics in the emergency setting. Penicillin helps, but surgical magnificence is demanded if the patient is to live another hour.

Once we urgently transfer the patient to the operating room and the anesthesia team intubates him, we rapidly position him on the surgical table. Racing to save his life, we prop him on his side and swathe his entire left side and arm with greenish-blue surgical drapes. Dr. Reid works very quickly, making a dramatic, football-shaped incision around the shoulder blade and chest. In nonemergency situations, this dissection would likely take ninety minutes, but under the circumstances, it is done at lightning speed, in barely a dozen minutes. The collarbone, the shoulder blade, the entire arm, and all the muscles attached to those bones are rapidly cut away. The nerves emanating from the neck and the large blood vessels emerging from the chest cavity must all be tied off and cut.

As a resident at the beginning of my training, I know I would kill this patient if I attempted to do the operation. I just don’t have the skills yet. Dr. Reid is a superb surgeon, a master craftsman with unique understanding, adept hands, supernormal concentration and stamina, and most important right now, heroic courage. Moments like this will kindle all these attributes in me for the rest of my life, and Dr. Reid’s greatest gift to me will be the gift of confidence, the ability to take on impossible shoulder and elbow cases in the future. Surgeons are criticized for arrogance and brashness; this critique is probably fair, but at this moment, a fearlessness nurtured from deep self-assurance is mandatory.

A surgeon can perceive whether he has outflanked a flesh-eating bacterial infection—there is no crackly air in the layers of soft tissue that he is cutting through. A cocktail of life-supporting medicines continues to be injected into Mr. Louis’s IVs as our team completes the final steps of the forequarter amputation. Cutting-edge antibiotics, in addition to penicillin, are being pumped into his body even as the team races to detach the limb.

The moment of liberation of the putrefied appendage finally occurs, leaving a gaping wound over the rib cage. There is a simultaneous sense of triumph over the bacterial horde and an acquiescence to the power of microorganisms as the limb is separated from the thorax and dropped into a hazardous waste trash bag. Aggressive irrigation with antibiotic-laden saline is performed, and a palpable optimism flickers to life in our operating room.

Mr. Louis, although bizarrely disfigured with no arm and no shoulder, will live.

Mr. Louis’s life was saved by surgery and by penicillin. I have posed the question many times to friends and patients: How many years ago was the first dose of penicillin given? In ancient times, or five hundred years ago, or during the Revolutionary War, or after the First World War? Few people realize that the first clinical administration of penicillin in a small English hospital was only seventy-five years ago.

The pioneering work of Pasteur, Lister, and Koch convinced scientists and physicians that germs were real. As Robert Koch microscopically elucidated their life cycles and interactions with humans, the dark veil of ignorance regarding infectious diseases was lifted. Semmelweis and Lister, among others, were able to show the advantages of handwashing and sterilization, and it is not surprising that public health institutions were created in the years after John Snow helped create epidemiology and Florence Nightingale influenced hospital design. Although improved sanitation and cleanliness dramatically decreased epidemics, there was still no answer for acute or chronic infections in individual patients.

The advent of modern chemistry coincided with the triumph of germ theory during the 1880s, in no small part because manufactured dyes provided contrast and color to an otherwise drab and blurry microscopic world. The burgeoning German industrial chemical companies began as dye manufacturers, only later turning to fertilizers, perfumes, photography, and pharmaceuticals. Paul Ehrlich (1854–1915), a Prussian-Jewish physician-scientist, continued the proud German tradition of perfecting the art of histological staining, eventually gaining fame for differentiating the component cells in peripheral blood.1 A contemporary of Robert Koch, Ehrlich had a breakthrough insight when he considered the chemical processes that were occurring during the staining of tissues and bacteria. There was a primitive understanding that certain dyes had a special affinity for certain cells (and their constitutive parts); further trial-and-error testing with dyes by the Danish physician Hans Christian Gram (1853–1938) yielded the most important finding in the history of bacteriological microscopic analysis—that bacteria could be grouped into two main classes of cells that either stained purple (“Gram-positive”) or red (“Gram-negative”) in response to a series of staining steps with crystal violet and safranin stains.

Paul Ehrlich was intrigued by why different dyes were attracted to particular species of bacteria, but, handicapped by primitive research tools, he had no way of formulating a scientific answer. However, demonstrating the type of keen insight that geniuses possess, Ehrlich skipped several steps ahead and wondered if the dye materials could be manipulated not just to embellish a slide but to kill bacteria. If a staining material could be identified that targets and binds with a particular class of bacteria, it made sense to the pioneering scientist that a dye could be used as a weapon.

Ehrlich traveled to London in 1907 to lecture to Britain’s Royal Institute of Public Health, delivering a lecture for the ages. He dreamed that one day there could be a “targeted drug, one that would attack a disease-causing microbe without harming the host suffering from the disease.”2 Ehrlich conceived of chemical compounds that would serve as magic bullets, just decades after researchers had finally proven the germ theory. Barely fifty years removed from John Snow’s revolutionary epidemiological research during the cholera outbreak of 1854, Ehrlich returned to the very London neighborhood that had been (literally) awash in diarrhea, stumping for magic bullets.

By the time Paul Ehrlich had traveled to London, he was already well on his way in the quest for the magic bullet. Modern chemistry was in full bloom, with Dmitri Mendeleev’s periodic table coming into focus and a developing appreciation of how atoms bind together to form complex molecules. For an extremely insightful researcher like Ehrlich, the mysteries of simple chemical compounds were beginning to dissolve at the turn of the 20th century, and as one of the fathers of histological staining, he not surprisingly turned to synthetic dyes like methylene blue, Congo red, and alizarin yellow in the search for a chemical breakthrough. Since the mid-1880s, Ehrlich had experimented with these dyes as potential therapeutic agents, and although he was inadvertently turning his patients’ eyes and urine various colors of the rainbow, he and his lab partners were able to show a response in malaria patients.

Aniline-derived dyes—like the mauveine discovered by William Perkin in 1856—are chemically stable and difficult to alter; Ehrlich and his colleagues were hoping to find another substance that acted like a dye (showing a propensity to bind with certain bacteria) but was less chemically stable and easier to manipulate in the lab. Ehrlich knew of a chemical compound named atoxyl that had been shown to kill trypanosomes, single-cell parasites that cause diseases like African sleeping sickness. He was intrigued by atoxyl, particularly once he realized that it was a chemically unstable arsenic-based molecule and not a true aniline dye.

And so the testing began. Ehrlich and his colleagues Alfred Bertheim and Sahachiro Hata began to chemically modify atoxyl in 1907, feverishly altering the composition of the molecule bit by bit. Different versions were further modified, and a numbering system was generated based upon these modifications. The eighteenth version of the fourth compound (number 418) was effective in curing sleeping sickness, but was causing blindness in some of Hata’s lab animals and was therefore abandoned. By the summer of 1910, in what can only be described as crude experimental processes, Compound 606 had been created and tested. The sixth version of the sixth compound (606, arsphenamine) showed tremendous success in lab animals with various diseases, including syphilis.3

Syphilis likely was not present in Europe before explorers brought it back from the New World in 1495, and it raged for four hundred years across the continent with its slow-motion terror of blisters, aching testicles, sore throat, raised skin rash, and in its final stages, facial deformities and brain infections. With no effective treatment, mankind was defenseless against the corkscrew-shaped bacterium. Until Compound 606.

The German chemical company Hoechst AG, also located in the Frankfurt area, began marketing Compound 606 in 1910 as “Salvarsan.” Through trial and error, Paul Ehrlich had created a molecule that was part stain, part poison. The dye portion of arsphenamine would bind to the surface of the syphilis bacterium, whereas the arsenate portion killed it. In so doing, he had developed the world’s first synthetic chemotherapeutic agent. For good measure, Ehrlich coined the term “chemotherapy.”

Salvarsan rapidly became the most prescribed medicine in the world, leading to hopes that it would have broad application among many different types of bacteria. Unfortunately, Salvarsan, and its improved version, Neosalvarsan, had extremely narrow efficacy across the microbial world. This, paired with its significant side effects, made it a qualified success. More significantly, the development of Salvarsan was a false lead, as all future antibiotics (after the sulfonamides) would be “natural” molecules gleaned from nature—from fungi or bacteria—and not synthetically created from dyes or other simple chemical molecules. When sophisticated chemical engineering is performed by pharmaceutical companies in the search for a new antibiotic, it is upon naturally occurring chemicals already being produced by living organisms.

World War I (1914–1918) introduced horrific methods of combat, and while there were the predictable medical advances achieved from the theater of war, there was a transitory disruption in the German pharmaceutical industrial machine. The German biochemical revolution was fueled by rigorous academic programs at decentralized universities, a cultural identification with industriousness, and the creation of durable funding that was the envy of Germany’s European neighbors.4 There was a grand consolidation among German chemical and dye businesses following the conflagration, setting in motion powerful chemical, agricultural, and pharmaceutical manufacturing enterprises. Familiar names like Bayer, Agfa, BASF, and Hoechst combined to form IG Farben in 1925, resulting in the largest chemical company in the world.5 As will be seen, the German chemical corporations’ involvement in World War II was much more diabolical and vastly more damaging.

In the years leading up to World War II, the Teutonic drive for innovation in chemistry had led to great breakthroughs in fertilizer development, which even today accounts for half of the world’s crop production.6 Assembly-line manufacturing, pioneered by Henry Ford, was fundamental to the next wave of the Industrial Revolution in the early 20th century, but instead of making vehicles, the German research machine would use mass production organization to tackle scientific challenges with brute force. The testing of prospective chemical compounds was formalized on a grand scale, exposing huge numbers of potential drugs to various bacteria in what was described as an “endless combination game [utilizing] scientific mass labor.”7

Paul Ehrlich, the father of histological staining, immunology (he was the first to grasp antibodies), and chemotherapy, died in 1915, as World War I raged. Wartime disruptions and the vacuum left by the loss of his visionary leadership led to a lull in chemotherapy discovery. The formation of IG Farben in 1925 and the arrival of Gerhard Domagk (1895–1964) at Bayer in 1927 set the stage for a muscular approach in the quest for a true antibacterial medicine. “If Ehrlich had tested dozens of different recipes in order to find the antisyphilis treatment, Bayer would try hundreds. Or thousands.”8 In a foreshadowing of the petrochemical polymer industry, Bayer chemists began producing thousands of chemical compounds from coal tar, the thick liquid that is a by-product of the production of coke and coal gas from coal.

Domagk, as a pathologist and bacteriologist (and a wounded soldier in World War I), had gained a specialized understanding of the microbial enemy, and he was critical in constructing the experimental framework, having identified a particularly virulent strain of Streptococcus (Gram-positive cocci that link in twisted chains). Streptococcus, the pathogen famous for throat infections, pneumonia, meningitis, and necrotizing fasciitis, was an ideal test bacterium, not only because it was common, but because it killed laboratory animals so terrifyingly efficiently. Domagk, like his famous German predecessor Robert Koch, intentionally infected laboratory white mice with his test bacteria. Thousands of diseased mice died over the first few years of the project, helplessly succumbing to Strep despite being injected with myriad coal-tar derivatives from the Bayer chemists.

Trudging along, as science demands, the scientists continued tinkering with the azo dyes, chemically modifying the compounds with the addition of chlorine atoms, then arsenic, then iodine. Years of failure and almost no hope demanded a resiliency that was perhaps battle born, but a breakthrough did finally occur in 1932, when the team began linking the azo dyes with a sulfa-based molecule. The protocol that Domagk had practiced for years yielded a monotonous outcome: injecting live Strep cultures into the abdomen of a mouse would result in death within a day or two. But in late 1932, outside Düsseldorf, Germany, twelve mice were administered a new drug—an azo dye amalgamated with sulfanilamide—shortly after being injected with the deadly bacteria. Concurrently, fourteen mice were injected with the same bacteria but were not given any medicine. All fourteen of these control animals were dead within days, while all twelve that had received the new compound, KL-730, had lived. The Bayer scientists had stubbornly forged ahead as the carcasses of rodents piled up, but in 1932, the world’s first antibacterial magic bullet had finally been crafted.

Bayer knew that their new medicine, KL-730, which they would name “Prontosil,” was effective against bacteria because of the unique marriage between the azo dye and sulfanilamide. Except that it wasn’t. What the Germans had never performed was an isolated test of sulfanilamide alone. A group of French scientists at the Institut Pasteur in Paris repeated an experiment with various sulfanilamides on a group of forty mice, including a treatment group with sulfanilamide alone and no azo dye.

After a few days, the Parisian team evaluated the response among the test animals. Almost all of the mice treated with the newer azo-sulfanilamide combinations died, but all of the mice treated with Prontosil, Rubiazol, or sulfanilamide alone lived. The Bayer scientists had assiduously labored to protect their patent rights over Prontosil, sure that it represented a bonanza, but they had never considered that sulfanilamide alone might be the subjugator. At about the same time that the Institut Pasteur scientists made their discovery, the Bayer group was unearthing the same sobering fact. While it was a tremendous moment for mankind, it was a financial catastrophe for Bayer; the sulfanilamide molecule had been discovered (and patented) in 1908 by Viennese chemist Paul Gelmo, and was now in the public domain. The financial goldmine had evaporated before their eyes.

Bayer did profit from sulfanilamide. They marketed it around the world as Prontosil, even after realizing that sulfanilamide alone was the effective agent, without the need for the azo dye. (This also explains why Prontosil was only effective in vivo and not in vitro. In a test tube full of bacteria, Prontosil posed no threat to the microbes, because only animals have the enzyme that separates the dye from the sulfanilamide. If testing had occurred only in test tubes, and not in animals, Prontosil would have appeared to be a failure, and it was this and other drugs that taught the early pharmaceutical manufacturers that “pro-drugs” were a real phenomenon. At times, pro-drugs are ideal—a pro-drug is intentionally manufactured so it can survive digestion, turning into the active metabolite once in the bloodstream.)

Prontosil and other forms of sulfanilamide hit the world market in 1935, immediately making an impact. “Virtually overnight, mortality from childbed fever [Strep pyogenes] fell from 20 to 30% to 4.7%.”9 Physicians across the United States and Europe embraced the new drug, but the American public became intimately acquainted with the new sulfa drug in 1936 when Franklin Delano Roosevelt Jr., while a student at Harvard College, contracted a life-threatening streptococcal throat infection. Physicians in Boston administered the new magic bullet, saving his life, and in the process, helped propel America into the modern age. The New York Times trumpeted the news on its front page, helping ignite a “sulfa craze” across the country, even leading to patients asking their physicians for the new wonder drug by name (a first). Even at the outset of the antibiotic revolution, overprescribing was a temptation.

The European quest for synthetic chemotherapeutic molecules was in full launch mode as the world tilted toward a second Great War. Chemists were obsessed with a haphazard survey of chemicals, believing that the new man-made molecules could outsmart the bacterial enemy. While the modern pharmaceutical industry has created, de novo, chemicals that lower blood pressure, increase blood flow, and alter cholesterol levels, the source of antibiotics would be mother nature, not the minds of scientists. Unbeknownst to the chemists, several years before sulfanilamide was given to a human, an accidental discovery in London had already opened the vistas of future medical care.

Alexander Fleming was a young Scottish physician working at St. Mary’s Hospital in London, and although he was trained as a physician and surgeon, his talents in laboratory research had led him to an eventual career as a bacteriologist. Small and slight, Fleming had joined the inoculation department at St. Mary’s in 1906, soon turning his attention to Paul Ehrlich’s Salvarsan.

Bacterial researchers have always followed the pioneering example of Robert Koch, studying the lives and sensitivities of microbes by growing colonies of bacteria in Petri dishes in a nurturing environment. Fleming and his colleagues focused on important pathologic bacteria like staphylococcus and streptococcus, culturing the bacteria and evaluating the conditions that altered colony formation. In 1922, Fleming and a lab assistant were cleaning up Petri dishes that had been seeded with bacterial colonies when they noticed an odd pattern. Typically, in a Petri dish of bacterial colonies, there is widespread, even growth of bacteria across the dish; instead of seeing such growth, Fleming noticed that there were blank areas of no bacterial colonies. In a victory for everyone who has suffered from the common cold and a drippy nose, Fleming recalled that mucus from his own nose had dripped onto the culture dish days earlier, and he rapidly surmised that his own nasal drippings had somehow hindered the growth of bacteria. The shy and reticent researcher concluded that there must be a substance in the nasal discharge that had inhibitory powers, naming it lysozyme. For the first time in world history, a purely organic substance had been characterized as having antibacterial properties.

Lysozyme became a fascination for Fleming, albeit a research dead-end. In time, researchers were able to show how lysozymes function to weaken the cell walls of bacteria, but more important, the recognition of a molecule that inhibited, or killed, microbes prepared Fleming’s mind for his revolutionary observation in 1928.

As summer turned to fall in 1928, Alexander Fleming returned to London from a holiday by the sea. When he arrived at his petite laboratory at St. Mary’s Hospital (preserved today as a memorial to the man and his momentous discovery on that September 3), a jumbled stack of Petri dishes was on a tabletop, including a dish that had fallen off its perch and lost its lid. The story goes that he glanced at the Petri dish and quickly did a double take—dozens of round spots of staphylococci carpeted the dish, but their spread was limited by a large island of white mold on one side of the dish. He recognized a pattern similar to what he had seen five years earlier: the blotch of mold had a surrounding beltway, a demilitarized zone of sorts, where there were no bacterial colonies and no fungus.

Fleming muttered softly to himself, “That’s odd.”

For thousands of years, humans had unwittingly harnessed yeasts to make wine and beer and bacteria and molds to make cheese. Fewer than one hundred years before Fleming’s discovery, Louis Pasteur had solved the riddle of fermentation, and less than half a century before, Koch had demonstrated that bacteria were real. Fleming had already concluded five years earlier that lysozymes from human fluids retained antibacterial properties, and now, perched in his little lab above Praed Street, he began conceptualizing that the mold itself was making a substance that was deadly to the staphylococcus.

The name of the mold? Penicillium. (Read that carefully. It doesn’t say “Penicillin.”)

The Penicillium mold was likely a contaminant from somewhere in the building or from the air through an open window. There has been much conjecture about the source of the mold—was it from a nearby lab, was its presence a hallmark of sloppy research, did it taint the bacterial culture because Fleming’s assistant was slovenly?—but in the final analysis, Penicillium is a common mold that has been making its own special chemical as a defense, likely for millions of years. How it got into that lab is not important, but the fact that Fleming paused to consider its actions is significant.

Correctly ascertaining that Penicillium was producing a substance that inhibited bacterial encroachment, Fleming and his assistant Stuart Craddock (initially) became obsessed with farming Penicillium and harvesting the resultant “mold juice.” Fleming then tested this concentrate on other bacterial samples and found that it was effective against staphylococci and streptococci, finally settling on “penicillin” as the name of the substance that would make him world famous. In March 1929, Fleming published an article titled “On the Antibacterial Action of Cultures of a Penicillium, with Special Reference to Their Use in the Isolation of B. Influenzae.” This predates, by several years, the German discovery of sulfanilamides, but Fleming and his team lost out on the designation as providers of the first antibiotic because they could never adequately cultivate the finicky mold in sufficient quantities to make it clinically significant.

In fact, Penicillium was so persnickety that Fleming gave up. It is hard today to make sense of Fleming’s abandonment of the development of (arguably) the most significant drug ever discovered, but the lack of sophisticated research tools, lab space, manpower, and, most important, intense drive to corral the fungus meant that it would be up to another team, more than a decade later, to harness the power of Penicillium. Amazingly, Alexander Fleming walked away from Penicillium and never published on it again.

Eight years passed after Alexander Fleming’s publication with no success, by Fleming or any other researcher, in cultivating Penicillium and producing penicillin. While several scientists had been inspired by Fleming’s 1929 article, none could overcome the same technical challenges in understanding its actions, including Georges Dreyer of the Oxford University Dunn School of Pathology. “The Dunn” had been founded in 1922 with a £100,000 gift from Sir William Dunn, a Scottish merchant banker and politician who had made a fortune in South Africa. The institution would become world famous for research into disease processes and bacteriology, and by the time the building was completed in 1927, an impressive roster of resourceful minds was being assembled.

Two industrious, ingenious, fatherless, and indomitable researchers arrived at Oxford in the mid-1930s, one from Australia and the other from Germany. Together, they would tame Penicillium, perfect the production of penicillin, and conspire with researchers around the globe to make the breakthrough medicine available just when the world was bent on collapse.

Howard Florey initially voyaged to Oxford as a Rhodes Scholar, having just graduated from medical school in Adelaide. His father had died a few years earlier, and the ambitious young Australian made the first of many career moves when he began a three-year study program in pathology. Florey was awarded numerous scholarships during his academic training; in addition to the Rhodes scholarship, he was awarded the Rockefeller Foundation fellowship, which led to intermittent research trips to New York, Chicago, and Philadelphia during his graduate work. Brief stints in Copenhagen, Vienna, and Madrid, combined with an eventual doctorate from Cambridge in 1927, provided him with an unmatched educational background. In 1935, he was named the second director of the Dunn School of Pathology, turning his attention to bacterial gut impermeability and investigating whether or not lysozymes were involved in the protection of the gastrointestinal tract against bacteria. Florey was homing in on an area of expertise, having demonstrated prodigious drive, intelligence, and leadership skills; all he needed was a comrade with similarly soaring ambition and talents.

Ernst Chain was born in Berlin in 1906 to Russian-Jewish immigrant parents. Like Florey’s, Chain’s father died while he was still in school (in Chain’s case, when he was thirteen). And where Florey excelled at athletics (tennis, cricket, and football), Chain was a piano virtuoso who gave concerts on several continents. Chain graduated from Friedrich Wilhelm University (now Humboldt University of Berlin) and the Institute of Pathology at Berlin’s Charité Hospital in 1930. In photos, Chain bears a striking resemblance to Albert Einstein, and the young pathologist, looking in every way like a true genius (he was), began work in the chemical pathology lab at University College Hospital in April 1930. A few years later Chain landed a research position in Cambridge, and after a couple of years there, was offered the job of biochemist at the Dunn School, working under Howard Florey.

Howard Florey had succeeded in hiring a world-class-trained scientist (read: a German chemist) who could help him investigate the biological aspects of infection and immunity. He could not have found a better colleague—Chain later wrote that his “principal motivating principle … was always to look for an interesting biological phenomenon which could be explained on a chemical or biochemical basis, and attempting to isolate the active substances responsible for the phenomenon and/or studying their mode of action.”10

Those who work in a research laboratory understand that research meetings, usually held weekly, are the lifeblood of the investigative effort. During that meeting, the lab director will ask for updates on specific experiments, and will invite commentary from various members of the team as to the meaning of the results. Unanticipated results are a major area of focus, because they represent either possible failure or potential new avenues of examination. Another occasional agenda item in the weekly meeting is the consideration of a completely new area of investigation, usually based upon a newly published article or podium presentation. A fresh research prospect is scintillating to a lab that craves a breakthrough, and sometimes the best spark for a new idea is to dig up old research publications and dust off an inadequately explored concept.

The stories about the penicillin pioneers seem a bit apocryphal at times, but in a well-remembered afternoon tea discussion among lab workers at the Dunn, Florey and Chain discussed Fleming’s dead-end paper from 1929. While no research team had achieved success in investigating the byproducts of Penicillium, it had been contemplated by Florey’s predecessor, who had frozen away samples of the Penicillium and other microorganisms as potential sources of antibacterial substances. So in 1937, one year after FDR Jr. had received his lifesaving sulfa medicine, Florey and Chain began the nearly impossible task of efficiently growing Penicillium and producing penicillin. The gauntlet was thrown down to the team: the task might well prove impossible, but an inspired effort was demanded if they hoped to perceive the mechanism of defense of the fluffy-white mold. Impromptu or not, it was a lab meeting for the ages.

Norman Heatley was a contemporary of Ernst Chain’s at Cambridge, and although he was a PhD scientist, his exceptional skill was in building laboratory equipment out of random parts and castoffs, a second coming of Robert Hooke. With the lab budget so severely limited (“to call the Dunn experimental program impoverished is to flatter”11), Heatley was essential. No one in the world knew how to successfully grow Penicillium, and it was going to take creativity, stubbornness, keen insight, and luck. Heatley—humble, elegant, tall, and slender—did not waste time in deciphering the ideal conditions for Penicillium cultivation.

The life cycle of Penicillium came into clear view in 1939. The Penicillium culture would grow a thin white carpet over the agar, and as it matured, the branchlike mycelia would grow and generate penicillin-rich droplets that yellowed as they dried. These droplets could be harvested with a pipette, but harvesting too early limited the yield; waiting too long oversaturated the fungus, and further growth was squelched.

The mold grew adequately on agar, but precious little “mold juice” was produced without extra nutritive substances. Heatley turned to different growing containers and altered the temperature. Fertilization with nitrates, salts, sugars, glycerol, and meat extracts, combined with enriching the air with oxygen and CO2, was performed. Brewer’s yeast was added, and when one reads of Heatley’s maneuverings, one is left wondering whether he felt more like a chef, horticulturist, brewer, or scientist.

Adding urgency to the effort was Hitler’s invasion of Poland on September 1, 1939. Chain could no longer travel back to Germany and was unable to rescue his mother and sister, both of whom would perish in Nazi concentration camps. Great Britain and France declared war on Germany within days of the invasion, further heightening the urgency of penicillin production and testing. Refinement of the growing process and the expansion of penicillin production continued into 1940, but no testing of the finished product had been performed.

On March 19, 1940, the first suitable batch of penicillin was finally processed and tested for stability. Chain, as the expert chemist, set to work on determining what type of molecule penicillin was. Primitive testing conditions notwithstanding, to Chain’s great surprise, the molecule was not a protein, but exactly what it was remained unclear. The first step in animal testing was to inject two mice in the abdomen with the entire amount of collected penicillin. To the team’s great relief, the mice tolerated the injections without incident and, more amazingly, excreted it in their urine, unaltered.

Production of penicillin continued, and within two months, on May 25, 1940, an experiment with groups of infected mice was carried out at Oxford’s Dunn School. Eight mice were infected with streptococcus, with four treated with a series of penicillin injections and four untreated as controls. By the next morning, all four of the untreated mice were dead, while all four of the mice who received penicillin were alive and well. The magic bullet had been derived from nature and none too soon. The following day, the evacuation of Dunkirk began, and it didn’t take too much imagination to consider how a war effort could be facilitated with an antibiotic that, for the first time, was widely tolerated and highly effective.

Penicillin production would continue to be a great logistical challenge for the Oxford team, particularly considering how little material support the English had as the Nazi noose tightened around the British Isles. By the onset of 1941, the primitive apparatus at the Dunn School had ramped up creation of the antibiotic to levels sufficient to test on the first human guinea pigs.

In the fall of 1940, an Oxford policeman, Albert Alexander, scratched his face with a thorn from one of his rose bushes. Simple cleansing of the wound proved useless, and a secondary infection with Gram-positive bacteria developed in his face and scalp. As the English winter, with its low gray clouds and short days of light, dragged on, Albert’s infection spread to his torso, arms, lungs, and left eye. Treatment with sulfa medicines was ineffective. Abscesses with oozing pus had cropped up all over his body, and surgery to remove his left eye was mandated. After months of suffering, and with death imminent, Mr. Alexander became the first person in the world to receive penicillin for an infection, on February 12, 1941.

The intravenous injection of penicillin was started in the morning, with doses given every three hours. By the next day, the patient’s face was no longer swollen and his fever had normalized. The gush of pus had slowed almost immediately, leaving everyone ecstatic and the policeman able to eat. It must have seemed like a miracle, but the sobering reality of their production failings tempered the sense of triumph, particularly when a second patient, a fifteen-year-old boy named Arthur Jones, contracted a life-threatening infection after a hip operation. Alexander had been given a five-day course of penicillin, essentially exhausting the stockpile held by Florey and Chain; another ten days passed with his condition remaining stable.

Both patients had received injections around the clock; dire shortages of the medicine mandated that Alexander’s urine be collected and reprocessed. A bicycle brigade between the Radcliffe Infirmary (now the site of the Radcliffe Humanities, a teaching space that houses faculty offices, a library, and classrooms, located on Woodstock Road) and the Dunn laboratory maintained the lifeline for the first recipients. Arthur Jones was given penicillin recovered from Alexander’s urine and the newly harvested penicillin from Heatley’s manufacturing contraptions.12 After a month’s struggle, Albert Alexander succumbed to his ancient enemy, but not without demonstrating a profound response to penicillin. On the other hand, young Arthur Jones lived.

Florey and Chain rightly concluded that their little molecule might be a stupendous breakthrough, the type of discovery that earns fame and trips to Stockholm. But a more pressing demand was to achieve large-scale manufacturing. The British Commonwealth was incapable of meeting the demand; Germany, Japan, and Italy were enemies.

The United States, by contrast, was rapidly becoming the world’s lone superpower, even before World War II. Manufacturing giants in America had transformed the relatively young country into a GDP colossus, and while the United States was relatively inexperienced in chemistry and science (compared to Germany), “the sophistication and productivity of America’s agricultural sector was like nothing else in the world.”13 So, in June 1941, Florey and Heatley flew from England to Lisbon, Portugal, and after three days’ layover, boarded a Pan American Boeing Dixie Clipper for the transatlantic flight to LaGuardia Airport’s Marine Air Terminal, landing on July 2.

The voyage to the United States, in retrospect, was an out-and-out triumph. The young pharmaceutical companies in the greater New York City area, like Pfizer, Squibb, and Merck, were eager to meet with Florey and Heatley, having read the August 1940 publication, “Penicillin as a Chemotherapeutic Agent,” in the Lancet. But the greatest collaborative relationship would be with the scientists at the United States Department of Agriculture (USDA) research labs.

The USDA labs were (and still are) tasked with improving agricultural production while ensuring the safety and wholesomeness of crops, meat, poultry, and eggs. The Northern Lab of the USDA is based in Peoria, Illinois, and prior to the arrival of Florey and Heatley, had received dozens of samples of Penicillium mold from around the world. Although the Oxford team was an assemblage of masterminds, none were mycologists (fungus scientists). The challenge of identifying the most potent strain of Penicillium and the most efficient means of production of penicillin was entirely in the wheelhouse of the Peoria USDA lab, and in a matter of months, production of penicillin had improved a thousandfold. After rounds of testing, a strain of Penicillium was isolated that became the “ancestral source for virtually all of the world’s penicillin,”14 originating from a cantaloupe from a local Peoria market.15

The USDA lab succeeded in discovering an ideal strain of Penicillium and also in improving fermentation. They prevailed in the “better seed, better soil, better cultivation and harvesting”16 contest—and it would fall upon the pharmaceutical companies in the United States and England to utilize those techniques to meet the demand. To entice drug companies to participate in the penicillin production challenge, the US government developed an unprecedented system of financial support and patent protection for the burgeoning corporations that established a firm foundation for their explosive growth in the immediate postwar period. The wartime Office of Scientific Research and Development (OSRD) and Committee on Medical Research (CMR) undertook a comprehensive program to confront the medical problems of the war, in essence, weaponizing American science against the Axis powers. In the 1950s, a dizzying confluence of government funding, research sophistication, new hospital construction, and surgical know-how launched the modern medicine revolution; all of these developments can, in part, be traced to penicillin’s mandated industrial cultivation.

The American production of penicillin expanded exponentially from 1942 to 1945, but oddly, German development of antibiotics was virtually nonexistent. Decades before, Lister’s carbolic acid treatment of wounds (and the German adoption of that method) had changed the balance of power in the Franco-Prussian War. German soldiers survived their battle wounds; French soldiers didn’t. However, as war raged on during World War II, tens of thousands of German soldiers died from septic wounds, while American manufacturing ramped up penicillin production in preparation for D-Day.

Why did the Germans spend so little time and money on antibiotic development? Were they not the greatest chemists in the world?

Part of the answer lies in the need for fuel. Outside of Romania, there wasn’t a decent-sized oil field between the Atlantic and the Urals, mandating that the Nazi state exhaust its scientific resources on the development of synthetic oil and synthetic rubber manufacturing. The Germans clung to a partially effective class of drugs, the sulfa-based antibacterials, and spent the rest of their assets and energies on propping up their war machine.

Another major reason for the Nazi scientific failure was the reversal of the educational system that had made German science the envy of the world in the first place. The scientific autonomy that had been earned over the course of decades suddenly vanished under Nazi control, while “American scientists, universities, and the medical profession … performed under minimal control primarily in independent institutions, rather than in government laboratories.”17 In addition, the loss of so many gifted Jewish scientists, either through murder or defection, weakened the talent pool of the formerly proud German institutions.

They have never recovered their world leadership in chemistry and biology.

On December 10, 1945, with Europe in tatters after the conclusion of World War II, Alexander Fleming, Ernst Chain, and Howard Florey were awarded the Nobel Prize in Physiology or Medicine for their discovery and development of penicillin. Although sulfanilamide was antibacterial, it was not a molecule made by an organism. Sulfa drugs are therefore not “antibiotics,” a term coined by scientist Selman Waksman, who defined it as “a chemical substance produced by microorganisms.”18 One year before the Nobel ceremony in Stockholm, a tuberculosis patient at the Mineral Springs Sanatorium (near the Rochester, Minnesota, site of the Mayo Clinic) received the first dose of streptomycin, forever changing the treatment of tuberculosis, the development of antibiotics, and the world.

Streptomycin was discovered by Selman Waksman and Albert Schatz, soil scientists who specialized in the study of actinomycetes, a sub-order of bacteria that make the soil their home and have mold-like branching filaments. Additionally, actinomycetes were presumed to secrete antibiotic-like molecules, due to their ability to fight off other bacteria in their loamy world. Waksman and his colleagues at Rutgers University had spent the 1920s and 1930s collecting soil samples and testing the thousands of bacteria that were dredged up. A teaspoon of soil might contain billions of bacteria—competing for scarce resources and evolving molecular weapons to defend themselves from other bacteria and various members of the plant and animal kingdoms.

The first compounds isolated from soil-borne bacteria, in 1939 by French-American René Dubos (who later won the 1969 Pulitzer Prize for his book So Human an Animal), were effective against other bacteria, but were also toxic to mammalian cells. It would become obvious (in time) that the most effective antibiotics would target structures and machinery peculiar to bacteria, sparing animal tissues. Although the 1939 substances were clinical failures, they did inspire the microbe hunters to continue the quest. The challenge seemed overwhelming in scale: investigate thousands and thousands of bacterial species and somewhat arbitrarily test for antibacterial properties.

Despite the daunting task, Waksman and his followers developed a protocol for isolating a bacterium that produced a minimally toxic, yet clinically effective drug. Years later, he said, “We isolated one hundred thousand strains of streptomycetes (as actinomycetes were then known). Ten thousand were active on agar media, one thousand were active in broth culture, one hundred were active in animals, ten had activity against experimental TB, and one turned out to produce streptomycin.”19 While these numbers were rough approximations, they do capture the concentric circles of possible antidotes to disease; more intriguingly, although Waksman is the genius who pioneered antibiotic research (and rightly earned the Nobel Prize in 1952), the quote is very likely inaccurate, as it seems that Schatz himself carried out the critical research on streptomycin in isolation, from June to October 1943.
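Waksman’s funnel is easiest to appreciate as plain arithmetic. The short Python sketch below is purely illustrative, using only the round numbers from Waksman’s quote (not the lab’s actual records), and simply prints the attrition from one screening stage to the next.

    # Illustrative only: the round figures from Waksman's quote, not laboratory data.
    stages = [
        ("strains isolated", 100_000),
        ("active on agar media", 10_000),
        ("active in broth culture", 1_000),
        ("active in animals", 100),
        ("active against experimental TB", 10),
        ("yielded streptomycin", 1),
    ]

    previous = None
    for label, count in stages:
        if previous is None:
            print(f"{label}: {count:,}")
        else:
            # Each stage keeps roughly one in ten candidates from the stage before.
            print(f"{label}: {count:,} ({count / previous:.0%} of the prior stage)")
        previous = count

    # Overall: one useful drug per hundred thousand strains screened.
    print(f"overall hit rate: {stages[-1][1] / stages[0][1]:.3%}")

By these figures, each stage discards roughly ninety percent of the survivors of the stage before, and the overall yield is one drug in a hundred thousand strains.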

The Merck-funded research at Rutgers came to the attention of Mayo Clinic researchers William Feldman and Corwin Hinshaw. The Mayo Clinic had transformed from a father-and-sons practice in the tiny town of Rochester in the 1880s into one of the world’s greatest research institutions. This was accomplished by embracing Listerism, implementing modern cellular pathology, nurturing collaboration among scientists and physicians, and reorganizing unselfishly into a not-for-profit charity.

At the Mayo Clinic, William Feldman was a world-class veterinary pathologist and Corwin Hinshaw was a physician with an interest in bacteriology. Feldman and Hinshaw were obsessed with lung disease and in particular tuberculosis, the deadliest infectious disease in history. TB had killed one-seventh of all human beings—roughly fifteen billion people. The cure that had eluded Robert Koch became the obsession for Feldman and Hinshaw. The two Mayo researchers communicated with Waksman after reading his initial 1941 streptothricin paper20 (which proved to be an experimental disaster after it was realized that streptothricin was dangerously toxic to kidneys), hoping to collaborate on the testing of any future antibiotic discoveries.

Albert Schatz’s discovery of streptomycin is the story of stubborn dedication and personal subordination. While working in seclusion in a basement lab in Waksman’s research building at Rutgers, scouring soil samples for a bacterium that could defeat TB—specifically the most dangerous strain that his Mayo colleagues could provide—Schatz isolated two variants of Streptomyces griseus. One sample was from a heavily manured field soil, and the other was swabbed from the throat of a chicken. Both samples of Streptomyces griseus were antagonistic to TB in vitro, but in vivo testing would clarify whether streptomycin was an effective and safe antibiotic.

Feldman and Hinshaw were among the first to receive an advance copy of Schatz’s and Waksman’s famous 1944 paper trumpeting the arrival of streptomycin,21 and by April 1944, the two Mayo researchers began testing streptomycin in guinea pigs infected with a variety of diseases: bubonic plague, tularemia, shigellosis, and TB. By late June 1944 it was obvious that streptomycin was a miracle drug; it was curing every guinea pig of every disease, including TB. Additional testing was performed in the following months, and by the fall of 1944, Hinshaw was prepared to administer the first dose of streptomycin to a human. On November 15, 1944, Patricia Thomas became the first patient to receive the wonder drug. Severely infected with TB, and with no hope of survival, Patricia received five courses of streptomycin over the next five months, with dosages based upon a patchwork of early science and guesswork. Not only did Patricia Thomas live, she married and had three children, living another twenty-two years.

To really determine if streptomycin worked as well as was initially believed, a groundbreaking analysis was needed. While there had been simple trials comparing diets and primitive drugs (dating all the way back to the story of Daniel in the Hebrew Bible), including the important 1747 experiment of Scottish naval surgeon James Lind, in which a controlled trial showed citrus fruit to be effective against scurvy, a true randomized controlled trial had never been performed prior to 1948.

“Alternate allocation” trials, in which every other patient is given the experimental remedy, are prone to error because clinicians can predict which treatment the next enrollee will receive and, consciously or not, steer certain patients toward one arm or the other; no matter how stringent the assignment scheme appears, that selection bias cannot be undone. The British epidemiologist-statistician Austin Bradford Hill realized the shortcomings of previous experimental designs, concluding that the only sound way of evaluating a drug would be to blind both the clinicians and the patients. A trial for streptomycin was designed in early 1947 with a triple-blinded design: patients, treating clinicians, and evaluators would all be ignorant of whether a given patient received the actual antibiotic or placebo medicine.
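The crux of Hill’s insight can be sketched in a few lines of code. The snippet below is a hypothetical illustration, not the MRC’s actual procedure: the point is simply that assignments are fixed in advance by chance and kept concealed, so the clinician enrolling the next patient cannot predict, or steer, which arm that patient will join.

    import random

    def make_allocation_list(n_patients, seed=1948):
        # Hypothetical sketch of concealed random allocation (not the MRC's method).
        # Because the list is generated by chance and kept sealed, no clinician can
        # anticipate the next assignment -- the failing of alternate allocation.
        rng = random.Random(seed)
        return [rng.choice(["streptomycin", "control"]) for _ in range(n_patients)]

    allocations = make_allocation_list(107)  # 107 patients, as in the totals cited below
    print(allocations[:10])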

With the war having just ended and British funds at paltry levels, there simply weren’t the resources to treat a large number of patients. In fact, with supplies of streptomycin almost nonexistent and without much research financing, a randomized trial of patients getting no medicine was not only scientifically intriguing, it was necessary. Hill later wrote, “… in this situation it would not be immoral to do a trial—it would be immoral not to, since the opportunity would never come again.”22 The world’s first randomized controlled trial, intentional but fortuitous, was concluded six months later, and the results were undeniable. Of the fifty-five patients who received streptomycin, only four had died (and twenty-eight improved); in the control group of fifty-two patients who received only placebo drug, fourteen had died. (A follow-up study demonstrated a reversal of the trend, and researchers would later conclude that resistance was building to streptomycin. Later studies showed improved outcomes if para-aminosalicylic acid, a chemical cousin of aspirin, was concomitantly given, thus bolstering the case for streptomycin.)
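Taking the published counts at face value, the arithmetic behind “undeniable” is short. The back-of-the-envelope Python check below is my own illustration, not the trial’s own statistical analysis:

    # Deaths over the trial period, as reported in the figures quoted above.
    strep_deaths, strep_total = 4, 55
    control_deaths, control_total = 14, 52

    strep_mortality = strep_deaths / strep_total        # roughly 7 percent
    control_mortality = control_deaths / control_total  # roughly 27 percent

    print(f"streptomycin mortality: {strep_mortality:.1%}")
    print(f"control mortality:      {control_mortality:.1%}")
    print(f"risk ratio:             {strep_mortality / control_mortality:.2f}")

By these numbers, a patient in the streptomycin arm was roughly a quarter as likely to die over the trial period as a patient in the control arm.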

Streptomycin became an immense success story but not without its controversies, including who should receive the credit for its development. The most important wrinkle in the story of streptomycin is that it paved the way for future chemotherapeutic development—both for antibiotics and anticancer medicines. Most antibiotics are derived from bacteria from the soil (not industrial dyes and chemicals), and increasingly, from bizarre places in the world, including the ocean depths and from the air. But scientists learned that grinding hard work and a little good luck achieved something that physicians and scientists would have thought impossible a mere fifty years earlier: the control, if not cure, of TB, and the ability to address almost any infection that may arise … at least until drug resistance and bacterial evolution outfox modern intellectuals. Cockroaches have nothing on bacteria.

The ability to identify, stain, culture, and test bacteria helped lead to the formation of a new industrial behemoth: the pharmaceutical industry as we know it today. It is stunning to realize that “penicillin, streptomycin, the versions of tetracycline, chloramphenicol, and erythromycin had all been introduced between 1941 and 1948,”23 simply by working the soil. I’ve asked dozens of patients how drug companies bring new antibiotics to the market, and most draw a blank expression and say something like, “Don’t they design them in their corporate drug offices?” The fact is that pharmaceutical scientists rely on billions of years of evolution among the tiniest inhabitants of our world, deciphering which molecules have novel methods of defense and confrontation, and utilizing these newcomers in the battle against our attackers.

Given the stakes, both financial and of legacy, one would think that hundreds of drug companies would have discovered and modified thousands of antibiotics over the last seventy-five years, but from “1938 to 2013, only 155 antibacterial compounds received FDA approval. Because of resistance, toxicity, and replacement by a newer-generation derivative, only ninety-six antibiotics remain available today.”24 The partnership among microbiologists, chemists, statisticians, physicians, and businessmen has yielded a relatively limited palette of weapons to prevent and fight infections, but if modern man is incompletely garrisoned, we can glory in the fact that, over the last several generations, mankind is no longer vulnerable to the caprices of microbes. The antibiotic revolution, building on breakthroughs in the understanding of the organ and cellular basis of disease and on the founding of bacteriology, meant that, for the first time ever, it was worth going to a doctor when you were sick.

Pathetic treatments with Dr. Rush’s heavy-metal laxatives (so-called “thunderbolts”), snake oil patent medicines, arsenic poisoning, toxic industrial solvents, noxious animal feces, and lethal plant material were in their twilight by mid-century. (Though never gone—witness the never-say-die cottage industry of home remedies, alternative medicine “experts,” and television infomercials for cures the “medical establishment doesn’t want you to know about.”) As flimsy medical interventions crumbled under their own weight, the reputation of doctors began to grow, and as Paul Starr has so thoroughly described, the “social transformation of American medicine” meant that Americans were able to couple their new postwar prosperity with a renewed interest in health, with a shift from the (necessary) preoccupation with infectious diseases to a fixation on the betterment of chronic illnesses, like cancer, heart disease, neurosis, and arthritis.25

Infant mortality plummeted, life expectancy doubled, dread diseases were alleviated, and, on occasion, cancer could even be cured in the 1950s, the decade of the possible. Perhaps the antibiotic revolution’s greatest contribution to the transformation of the philosophical outlook of Westerners was not a loss of fear of infection but a growing acceptance among physicians (and their patients) that antimicrobial medicines were making it safe to implant foreign materials into human beings. “In 1950, about 230,000 physicians were practicing in the United States, and the overwhelming majority had left medical school well before the first antibiotics appeared.”26 Yet it was those physicians who pioneered the use of implants as their scientific partners simultaneously innovated implant materials, like alloys, plastics, and transistors.

A certain clairvoyance was shared among the healers and dreamers—an idea that started four hundred years ago—an inclination that a line and race of inventions could be synthesized for implantation into the fabric of the human body, under the auspices of antibiotics.