6 AN UNLIKELY TRIO

Post–World War II America was a country with a newfound self-assurance. The Great Depression that ended in 1939 was a distant memory as total victory over Germany and Japan and a booming domestic economy resulted in what Rutgers historian William L. O’Neill dubbed the “American High.” The fifteen-year span that began in 1945 was “our time of greatest confidence.”1 The pharma industry, too, emerged from the war with a renewed sense of self-assurance. Its turn-of-the-century peddling of addictive drugs was long forgotten in the glow of its collaborative wartime penicillin program. The drug business also benefited from a widespread perception that technology and science were on the verge of a historic golden age. Physicist Alvin M. Weinberg later coined the term “Big Science” to describe the belief that everything from routine space travel to eradicating all disease was possible.2

Inventions and new products came in a steady stream after the war: the microchip, the videotape recorder, the musical synthesizer, the bar code, the black box flight recorder, solar cells, optic fiber, supercomputers, and hard disks. Some of the seemingly least important inventions had the greatest impact on daily life. People raved about the household revolution wrought by Teflon-coated cookware, super glue, and halogen lamps. Power steering and radial tires made cars safer and easier to drive. Transistor radios meant that broadcast programs and music no longer had to be listened to at home on a radio the size of a piece of furniture. And technology also introduced a way for people to buy the new devices even if they lacked the cash to afford them: the first-ever credit card (Diners Club, 1950). And buy they did; in just five years after the war, Americans purchased 22 million cars, 20 million refrigerators, and 6 million stoves.

Medicine also flourished. Within ten years there would be a series of major breakthroughs: the first heart-lung machine; mechanical heart valve; discovery of DNA’s double-helix structure; open-heart surgery; introduction of a cobalt-ray machine to treat cancerous lung tumors; electric shock to revive a patient; kidney transplant and kidney dialysis equipment; microscopic brain surgery for seizures; ultrasound and coronary angiography; and the cardiac pacemaker.3

All the historic innovation created high expectations for the pharmaceutical industry. Antibiotics had reversed the public’s pre–World War II opinion that drug companies profited from mediocre-at-best medications. A growing number of Americans had survived heart valve and bone infections, meningitis, scarlet fever, and other previously fatal diseases. Combined with plenty of press coverage of the industry’s many new in-house drug research departments, that record made many hope pharma would use science to conquer intractable diseases such as cancer.

C. Everett Koop, decades later Ronald Reagan’s surgeon general, had been the custodian for the distribution of penicillin in Philadelphia when he was a surgical resident at the University of Pennsylvania in the late 1940s. He witnessed how penicillin saved lives: “I never lost the sense of wonder when I saw a youngster’s fever and infection controlled by the antibiotic.”4 As a surgeon he learned that almost all his patients thought pharmaceutical breakthroughs “appeared to arise from nowhere” and were accomplished “with ease.” Though he knew better, the misperception made Koop keenly aware that the industry’s wartime achievements had created unrealistic public expectations that the pace of drug innovations would continue unabated.

Harry Truman and Dwight Eisenhower did their best to ensure the federal government helped make the pharma narrative a reality. They directed large federal investments into biomedical and drug research, much of it into the National Institutes of Health (NIH).5 When Eisenhower created the Department of Health, Education, and Welfare in 1953, it was the first new cabinet-level department in forty years.6 HEW helped consolidate the NIH’s various research agencies.

Although penicillin and streptomycin had made the pharma industry popular with the public, the two drugs did little for the firms’ postwar bottom lines. Production of both outpaced demand, and prices went into free fall.7 A penicillin dose that cost $20 in 1944 plummeted to only 30 cents three years later. Without any patent protection, the companies making and selling penicillin could differentiate themselves only by price, and each undercut the others to land orders. Streptomycin soon followed suit, dropping seventy-fold in price between 1946 and 1950.8

Despite the intense competition, Squibb’s penicillin profit margins were 10 percent better than the average of its four top rivals. That was because it was the only vertically integrated firm, not only manufacturing the drug but also packaging it before selling it to hospitals, clinics, and physicians. Competitors only manufactured the drug, relying on packagers and wholesale distributors to get it to market. Starting in the late 1940s, Pfizer, Merck, Lilly, and Parke-Davis created internal divisions to eliminate the middlemen.

Savvy CEOs knew, however, that changing their corporate structure alone would not boost their bottom lines. What was needed were new patentable drugs that could fuel big sales at higher prices. That required a two-pronged campaign. First, the firms cooperated in lobbying Congress to pass laws that made it easier to obtain patent protection for laboratory discoveries. Second, they invested heavily in research and development as part of the frenzied race to find the next broad-spectrum antibiotic.

Merck hired some of the country’s leading biochemists.9 Pfizer and Lilly battled one another to recruit noted scientists.10 Pharma analysts predicted that one of the firms that had been at the forefront with penicillin and streptomycin—Merck, Squibb, Pfizer, Eli Lilly, Abbott, Upjohn, and Parke-Davis—would be the first with the next great antibiotic.

Scientists knew, however, that as penicillin had demonstrated, drug discovery sometimes required a bit of luck. History was about to be made at a place no one was paying attention to. New York–based Lederle was a small company that had been founded in 1906 by a former New York City health commissioner. Incorporated as Lederle Antitoxin Laboratories, it specialized in vaccines and antitoxins.11 Chemical conglomerate American Cyanamid bought Lederle in 1930 and gave it an unusual degree of independence. Lederle had played a small role in the wartime penicillin project, never having made the manufacturing changes required to produce large quantities. Its antitoxin background, however, allowed it to supply a quarter of the armed forces’ blood plasma, a third of their flu vaccine, half of their tetanus inoculations, and half of their gas gangrene antitoxin.

Lederle had one of the industry’s smallest and least celebrated research departments. No one there was a Nobel Laureate or distinguished scientist. The company’s researchers did not publish many articles in leading medical and pharma journals. Nor did Lederle have a formal relationship with the National Institutes of Health or any medical facility, such as the Mayo Clinic, by which promising lab discoveries could be tested clinically.

The three men ultimately responsible for the next drug breakthrough were quasi-outcasts: one a foreigner barred by U.S. immigration law from becoming an American citizen; another judged too old by his colleagues; and the third an African American surgeon at a time when the medical profession was nearly all white.

Lederle’s chief of research was an Indian-born physiologist and physician, Yellapragada SubbaRow.12 He was one of seven children from a poor family in Bhimavaram, an eastern Indian city best known as a major Hindu pilgrimage site. At thirteen, SubbaRow ran off to sell bananas to the visiting pilgrims. His father, a tax collector, brought him home.13 When SubbaRow was eighteen his father died, and he ran away again, this time to become a monk. His mother ordered him back to school, and local charities helped him afford Madras Medical College. But lack of money was always a problem. When he was twenty-four, he wed the fifteen-year-old granddaughter of a respectable merchant in an arranged marriage. His father-in-law paid for the last two years of tuition.14

SubbaRow was at school when the Gandhi-inspired movement for Indian independence from Great Britain gained momentum. He refused the British surgical gown given him at school and instead donned one made of khadi, a simple traditional homespun cotton. That act of defiance cost him the full medical degree necessary to enter the State Medical Service. Instead, he joined a local Ayurvedic college as an anatomy lecturer. It was there that an American doctor on a Rockefeller scholarship, working on an anti-hookworm campaign, encouraged SubbaRow to apply to Harvard’s School of Tropical Medicine.

Harvard considered his Ayurvedic work disqualifying and rejected him. In 1922, he tried again, but then withdrew his application to help his mother and siblings after two brothers died of tropical sprue, a rare digestive disease. Undeterred, he applied again the following year, emphasizing his anatomy training. Harvard accepted him.15

SubbaRow was twenty-eight when he left behind his wife, pregnant with their first child, and emigrated to the U.S. (their son died of a bacterial infection before his first birthday; SubbaRow never saw his child, and he never again saw his wife).16 He was ineligible for a scholarship because his Indian degree did not meet Harvard’s standards. The same was true when he applied for an internship at Boston hospitals. He finally landed odd jobs at one of them.17 Industrious and hardworking, he earned his diploma in Tropical Medicine in 1924 and started on his PhD in biochemistry at Harvard Medical School.

In 1929, he coauthored his first scientific paper, describing a simple color test he had developed to measure the amount of phosphorus in biological tissue.18 The following year he became the first Indian in Harvard’s history to earn a biochemistry PhD and began working as a teaching fellow at Harvard Medical School. Although it was common for a scientist of his talent to use research assistants and collaborate with peers in the lab, Harvard directed that he work alone. He made advances in phosphorus compounds connected to RNA synthesis but was not allowed to publish his results.19 In 1935, he had to disavow the extent of his role in the phosphorus color test, ceding the credit to his coauthor, who was being considered for promotion to a full Harvard professorship.20

Denied tenure and tired of his second-class status, in May 1940 he accepted an offer to become Lederle’s associate director of research at its lab in Pearl River, New York. When the previous director retired at the end of that year, SubbaRow became the chief.

SubbaRow had joined Lederle just in time to be part of the penicillin project. He was the company’s representative in dealings with the government and other pharma firms. Although the penicillin work left him little time in the laboratory, he became the first researcher to synthesize amethopterin, a chemical analog of folic acid. That research was prompted by a study showing that children with leukemia got significantly worse when fed a diet rich in folic acid. Scientists like SubbaRow wondered if an analog that blocked folic acid might have the opposite effect. Sidney Farber, a pathologist at Boston’s Children’s Hospital, made a breakthrough with SubbaRow’s analog, developing within a few years the first effective chemotherapy agent.21

In 1942, SubbaRow hired seventy-year-old Benjamin Duggar, a plant physiologist who had been forced to “retire” by the University of Wisconsin as “too old to teach.”22 Other pharma firms had politely turned Duggar away, saying there were no positions or that his specialty was not what they needed. Where others saw only age, SubbaRow saw experience; he thought that in Duggar he might have found a valuable “antibiotic hunter.”

At Lederle, coworkers came to know the Alabama native as amiable and eccentric. Every day at the lab Duggar slowly but methodically sifted through soil samples looking for antibiotic-producing fungi. At his suggestion, Lederle requested that the Army have soldiers returning home bring back a small amount of soil from wherever they had served. By 1944, a small shack outside the lab held thousands of samples from more than twenty countries on three continents. When Duggar isolated antibiotic-producing organisms from the samples, he and SubbaRow tested them in petri dishes against a broad range of bacterial pathogens.

Some days Duggar stayed away from the soil altogether and reverted to the professor he had been for many years, stopping by the lab only to lecture his younger colleagues.23 Most evenings Duggar left no later than 5:30 so he could play golf at a country club before dark. A chain-smoker, he also spent free time tending a makeshift garden he had created near an abandoned stable. He shared little about his personal life with coworkers.

The fourth of five sons born near the end of Reconstruction, Duggar was raised in a devout Episcopalian household. His father, Reuben Henry Duggar, was a prominent physician who had served on the state medical board that judged the fitness of volunteer doctors to serve as Confederate medical officers.24 No one in the Duggar family talked about the days before the “War of Northern Aggression.” Union troops had seized their large plantation, Frederickton, outside of Macon.

A savant at school, Duggar entered the University of Alabama at fourteen.25 After he graduated from Cornell with a PhD, his father shared with him the only story he ever told about the Civil War. During the last year of fighting, the elder Duggar had served as a surgeon at a field hospital in Talladega, Alabama. Malaria was rampant, and thousands of mosquitoes from adjacent marshlands plagued the camp. He ordered that all fires be built on the camp’s windward side and then be extinguished at dusk. The smoke blew over the camp, clearing out the mosquitoes. That was decades before science confirmed that mosquitoes were malaria carriers. The lesson, his father told him, was that even as a doctor, it was sometimes important to go with your intuition. Duggar recalled that lesson at Lederle.

One day in 1945, while extracting molds from soil samples taken from a dormant hayfield on the University of Missouri campus, he noticed one was an unusual gold color. Duggar had isolated hundreds of what he called “ultra-molds” during his three years at Lederle.26 He had a hunch this one was special. With SubbaRow overseeing his work, Duggar tested the mold he labeled A-377. To their elation, A-377 proved effective in halting the growth of both Gram-positive and Gram-negative bacteria, including the microbes responsible for bubonic plague, tuberculosis, typhus, and Rocky Mountain spotted fever.27 They had discovered the first broad-spectrum antibiotic since streptomycin.

Although SubbaRow and Duggar put it on a fast track, that did not translate into much speed given Lederle’s small staff and limited resources. It took three years of additional testing, until 1948, before Duggar was confident enough to publish a paper about his finding.28 In it, he dubbed the antibiotic organism he had discovered Streptomyces aureofaciens, the “gold maker.” Lederle executives liked that so much they used it for the drug’s brand name, Aureomycin (aureus is Latin for “golden”).

Before there could be certainty that Aureomycin was a wonder drug, Duggar had to produce enough of a purified version for human testing. SubbaRow and Duggar picked New York’s Louis Tompkins Wright to run clinical tests on the most important drug in the company’s history. A decade earlier Life had dubbed Wright the “most eminent Negro doctor in the U.S.… [the] surgical director of Harlem Hospital, [and] only colored Fellow of American College of Surgeons.”29 An Indian rejected for tenure at Harvard had selected the son of a Confederate officer and doctor, and together they picked the most famous black doctor in America—the son of a slave who had himself become a physician—to conduct the clinical trials.

The fifty-seven-year-old Wright grew up in rural Georgia, where his family set the example that the color of his skin made nothing impossible. Although both his grandfathers were white, his father was born a slave. When his father died not long after Louis’s birth, his mother met and married William Fletcher Penn, the first black graduate of Yale Medical School. Penn was one of only sixty-five black doctors in Georgia at the turn of the twentieth century.30 Doctors nationwide had to be members of the American Medical Association to practice at most hospitals, but the AMA left membership decisions to local chapters, and those in the Deep South refused to admit black doctors (the AMA did not change its national policy until 1950). The result was that many black doctors operated rudimentary clinics to serve local patients, sometimes even running them from their homes.31

After graduating as valedictorian of nearby Clark University, Wright took a train to Cambridge. He had sent his college transcript and a cover letter to the admissions department at Harvard Medical School, and an interview with the school’s dean was set. The dean mistakenly thought the Clark University on the transcript was the exclusive whites-only university in Worcester, Massachusetts. Upon meeting Wright and realizing his mistake, the dean sent him to the chair of the chemistry department, a tough, no-nonsense academic who, the dean thought, would disabuse Wright of any notion of attending Harvard Medical. The chemistry chair tried to get rid of him by giving him an oral exam on the spot. Wright secured his admission, however, by correctly answering every question.32

During his second year, he could not do his obstetrics clerkship at the traditional Boston hospital for Harvard medical students but instead had to complete it with a black physician. “That is the way all the colored men get their obstetrics,” he was told.33 Instead of going along quietly, he protested and rallied the support of classmates, and Harvard reversed itself. It was only one of many racial barriers he encountered. Despite graduating fourth in his 1915 class, he failed to get an internship at any of the city’s top hospitals.34 Wright did not want to intern at one of Boston’s black hospitals because he knew they had antiquated equipment and offered no opportunity to pursue clinical research, one of his passions.35 Ultimately, he had no choice: when no white hospital would take him, he left Boston for Washington, D.C.’s blacks-only Freedmen’s Hospital.36

Wright returned to Atlanta after his D.C. internship to work with his stepfather, who had recently cofounded the Atlanta chapter of the National Association for the Advancement of Colored People. Maybe the NAACP could help change things, Penn told Wright, who joined (for Wright it was the start of thirty-six years of storied activism with the NAACP).37 Frustrated in Georgia, though, Wright joined the Army Medical Corps and was dispatched to France for the remainder of World War I.I38

After two years of frontline surgical experience, he left the Army and opened his own practice in Harlem. In New York, Wright discovered layers of segregation more subtle than the South’s but no less ingrained and rigid. Jewish and black doctors had their own hospitals and were mostly barred from integrated ones.39 On January 1, 1920, Wright became the first African American to join the staff of Harlem Hospital. Although his job was the lowest possible for a physician—“clinical assistant in the Outpatient Department”—four white doctors resigned in protest. The person responsible for Wright’s appointment was demoted to the information booth at Bellevue Hospital.40

Determined to demonstrate how good he was, Wright excelled. Soon it seemed “the first” had become part of his name: the first African American police surgeon in New York (1928); the first admitted to the American College of Surgeons (1934); the first director of surgery at Harlem Hospital (1943); and the first president of the hospital’s board (1948).41

Lederle’s SubbaRow and Duggar probably cared little about Wright’s trailblazing when it came to race and medicine. They chose him to conduct Aureomycin’s clinical trials because they considered him eminently qualified. He had by then published nearly ninety papers in leading scientific journals, thirty-five of them about antibiotics.42 And Wright, who had just returned to work after a three-year leave of absence to recuperate from a severe bout of tuberculosis, was enthusiastic about doing the testing. He had long been interested in lymphogranuloma venereum (LGV), a sexually transmitted infection of the lymphatic system. Aureomycin was his chance to discover whether a drug could help patients with the painful, chronic condition. Over two months in the spring and early summer of 1948 Wright conducted the first human experiments. Lederle’s drug destroyed the chlamydia bacteria responsible for the disease. It was also effective on a nasty variant of pneumonia then thought to be viral. Wright uncovered few side effects and certainly nothing toxic. He reported to Lederle that its drug was ready for public release.43

By the time Aureomycin went on sale in December 1948, Lederle’s chief of research, Yellapragada SubbaRow, was not alive to savor it. He had died the previous August of a heart attack at the age of fifty-three.II SubbaRow would have liked Lederle’s strong launch, which advertised the drug as “the most versatile antibiotic yet discovered, with a wider range of activity than any other known remedy.”44 The company spent a record $2.4 million on promotion, including an industry first: 142,000 free samples sent to doctors nationwide.45 It took until September of the following year for the U.S. to issue Aureomycin patent number 2,482,055. That was what every pharma competitor had been waiting for. Lederle had proven that it was possible to obtain a patented monopoly on a broad-spectrum antibiotic. Its competitors were determined that Aureomycin would not have the market to itself for very long.

As for the three men responsible for discovering and testing Aureomycin, the public recognition they received for the accomplishment was uneven at best. After SubbaRow’s death, Lederle named a library block on its campus for him, as well as a fungus (Subbaromyces splendens). The Indian government issued a commemorative stamp in 1995, on the centennial of his birth. When Louis Wright was nominated in 1952 for a Distinguished Service Medal, he got only one vote from the National Medical Association. A few months later, when he died unexpectedly at the age of sixty-one, his death went mostly unnoticed except in the African American press and some medical journals. The New York Times did not even run a stand-alone obituary, instead listing him with other deaths for that day.

In contrast, when Benjamin Duggar died in 1956, his passing was covered widely because of his breakthrough drug discovery. This time the Times ran a story in its national section, titled “Dr. Benjamin Duggar Dies at 84; Led in Discovery of Aureomycin; Conducted Antibiotic Research After Being Retired from Teaching as Too Old.”46

I. Wright watched in frustration as the federal government established the Veterans’ Bureau in 1921 and opened a Veterans’ Hospital for blacks only in Tuskegee, Alabama (a decision protested by the KKK in large street demonstrations). It was impossible, Wright knew, for a single hospital to meet the medical needs of the 385,000 black soldiers, overwhelmingly from the South. Eleven years later, that Veterans’ hospital became one of three local clinics involved in the Public Health Service’s notorious “Tuskegee Study of Untreated Syphilis in the Negro Male.” It turned into a forty-year experiment documenting the ravaging effects of untreated syphilis on hundreds of black men. The men, many of them poor sharecroppers from Macon County, were never treated with any medications.

II. Few doctors then believed there was any connection between lifestyle and heart disease. Had the theory been more prevalent, someone might have noticed that SubbaRow had arrived in America a practicing Hindu, a nonsmoking vegetarian. In the U.S., he adopted an American diet heavy in saturated fat and red meat. He had also become a heavy smoker.