2 THE POISON SQUAD

The American pharmaceutical industry emerged in the mid-nineteenth century in response to an unprecedented surge in demand for antiseptics and painkillers for combat troops. The Mexican-American War that ended in 1848 had taught the United States a painful lesson. Adulterated medications meant that frontline soldiers died needlessly; untreated dysentery, yellow fever, infections, and cholera accounted for 87 percent of the war’s fatalities.1 Many who survived also suffered unnecessarily since the painkillers sent to treat battlefield wounds were often defective. No American company was then capable of large-scale manufacturing of morphine, the era’s most powerful painkiller.

It had only been forty years since a twenty-one-year-old German pharmacist’s apprentice had isolated the morphine alkaloid from the opium poppy. He named it for Morpheus, the Greek god of dreams, but his findings were mostly ignored after he published them in a little-read medical journal.2 It was a decade before a French chemist realized its importance and not until the 1820s that Heinrich Emanuel Merck sold a standardized dose of morphine at his Engel-Apotheke (Angel Pharmacy) in Darmstadt, Germany. Morphine was inexpensive to produce and it became a key product at several new family-run German companies, including Ernst Christian Friedrich Schering’s eponymous Berlin company and Friedrich Bayer’s chemical factory in Wuppertal.3

A year after the Mexican-American War, two German American cousins used $2,500 in savings and a $1,000 mortgage to launch Charles Pfizer and Company, a chemicals business in a two-story brick building on Brooklyn’s Bartlett Street.4 Their timing was good. Once the Civil War began, Pfizer had trouble keeping up with the demand for morphine.

Pfizer’s competition came from Edward Robinson Squibb, who had opened E. R. Squibb & Sons, a pharmaceutical manufacturing plant, also in Brooklyn. Squibb knew firsthand the importance of quality and consistency in drug production: as a wartime naval surgeon he had personally tossed overboard crates of substandard medications sent to the front.5 A year into the Civil War, pharmacist brothers John and Frank Wyeth opened a Philadelphia pharmacy and drug distributorship. The contract they won to supply medicine to the Union Army was so lucrative that after the war they sold their pharmacy and focused on mass-manufacturing drugs.6

Morphine was the most effective painkiller but not the only one. Dr. Samuel Pearce Duffield, chief of Detroit’s health department, sold an ether and alcohol solution to Union troops. When Duffield retired in 1871, an ex–copper miner turned investor, Hervey Coke Parke, and his firm’s twenty-six-year-old salesman, George Solomon Davis, incorporated Parke-Davis.

Eli Lilly, a chemist, missed the opportunity to cash in on the Civil War demand for morphine, but as a Union Army colonel he learned how critical medications were to the war effort. He left the military convinced that his future lay in pharmaceuticals; his eponymous laboratory began manufacturing drugs in 1876.7 Two other American pharmacists, Silas Mainville Burroughs and Henry Solomon Wellcome, also saw opportunity in the drug business. Deciding there was less competition in Britain than in the U.S., they launched Burroughs Wellcome in London in 1880. It manufactured everything from cod liver oil to malt preparations to face creams to opiate-based pain compounds.8

Those pioneers entered a drug industry in its infancy. The highly addictive nature of their products, coupled with an absence of government oversight and regulation, was good for sales. They also benefited from widespread ignorance about what caused illnesses and chronic diseases or how to treat them. It had been only a few decades since French chemist Louis Pasteur had proven, with a series of experiments on spoiled meat and sour milk, the existence of microbes too small for the human eye to see. The emergence of “germ theory,” the idea that invisible microbes might cause disease, was greeted with considerable skepticism in the nineteenth century. Even scientists who accepted it did not know how to counter bacterial pathogens.

It took until 1882 for a German bacteriologist to discover that a microorganism caused tuberculosis; until then it was considered a hereditary illness.9 In the U.S., cholera was thought to be a disease brought by immigrants, particularly the Irish. (The Immigration Act of 1891 addressed this by requiring a physical exam for all arriving migrants to exclude “all idiots, insane persons, paupers or persons likely to become public charges, [and] persons suffering from a loathsome or dangerous contagious disease.”)10 Before 1900, when an American Army surgeon, Walter Reed, demonstrated that mosquitoes spread yellow fever, it was thought to be passed only by contact with someone infected.11

Until 1900 there was no national medical licensing law; in most states anyone could call themselves a “doctor,” open a practice, and treat patients.12 The lack of basic medical knowledge meant there were few boundaries for promoting a drug. That was the case with the first inexpensive but powerful central nervous system stimulant, cocaine. It had been discovered by a German doctoral student whose chemistry dissertation described how he had isolated the pure alkaloid from coca leaves. He named the alkaloid cocaine, joining coca to the standard chemical suffix -ine. (That student later synthesized the compound that became World War I’s deadliest chemical warfare agent, mustard gas.)13

Merck, one of the first firms to concentrate on cocaine, touted it in products for everything from a numbing anesthetic to a cure for indigestion and hemorrhoids, even as an aid in eye surgery (it reduced bleeding by tightening blood vessels).14 I 15 Cocaine was the officially sanctioned remedy of the United States Hay Fever Association.16 The U.S. surgeon general said cocaine was effective for treating depression. Tobacconists sold cigars laced with 225 mg of cocaine for “soothing nerves,” while dentists peddled cocaine-infused lozenges for toothaches. Asthma sufferers bought inhalants that were pure cocaine and were instructed to “use them as needed.”17 A gram of pure cocaine cost on average 25 cents at any druggist.18 The largest mail order catalog of the era, Sears & Roebuck, sold a hypodermic syringe—a Scottish doctor had invented it only a few decades earlier—and a small amount of cocaine for $1.50.19

The boom in cocaine-based remedies meant that over a two-year span Merck went from producing less than a pound of cocaine annually to more than 180,000 pounds a year.20 Parke-Davis chemists patented a refined process that increased the drug’s purity and extended its shelf life. The company introduced coca cheroots, coca-leaf cigarettes, and an alcohol-and-cocaine syrup, all under the motto Medicamenta Vera (True Medicine).21 Squibb manufactured and sold one of the strongest cocaine concentrations, dissolved in a clear liquid base and used as a tincture. In less than a decade, cocaine became one of America’s top five selling drugs.

Cocaine’s popularity did not mean U.S. pharma companies had lost their enthusiasm for opiates. Merck boasted of the purity of its powdered morphine, and among its most popular drugs were opium-laced cough lozenges.22 Squibb and Pfizer sold nearly a dozen variations of opium tinctures.23

Drug firms such as Merck, Squibb, and Pfizer competed against one another. However, their stiffest competition came from so-called patent medicines, some fifty thousand homemade remedies marketed as miracle cures.24 They were not actually patented (the U.S. did not start issuing chemical patents until 1925).25

Since there was no legal requirement that drugs substantiate any purported benefit—like today’s supplement industry or legal cannabis markets, in which claims go unchecked—shameless pitches played on people’s worst fears and ignorance. Compounding the problem, there were no controls over ingredients, purity, or consistent dosing. While the nostrum makers used American trademark laws to protect their names, the shapes of their bottles, and even their label designs, each kept its formula secret. There was no requirement in the United States for a prescription for any medication. Nor was it necessary to see a doctor to get a drug.26

The best-selling nostrum companies bombarded consumers with sensational ads in newspapers and magazines touting phenomenal healing powers.27 Manufacturers secretly paid for breathless testimonials and advertised their “miracle elixirs” and tonics on tens of thousands of roadside billboards, posters strung along country fences, even makeshift yard signs.28

The owners of the most popular patent medicines earned huge personal fortunes. German immigrant William Radam became rich from his proprietary blend dubbed “Microbe Killer,” which he claimed would “Cure ALL Diseases.” The pink liquid was sulfuric acid diluted with red wine and returned a profit of 6,000 percent on each bottle.29 Equally successful was a Quaker abolitionist, Lydia Pinkham, whose remedy of ground herbs and alcohol was made in her cellar kitchen in Lynn, Massachusetts, and marketed as a wonder remedy for women.30 Dr. Jacob Hostetter’s best-selling home-brewed remedy, “Hostetter’s Celebrated Stomach Bitters,” took advantage of the false but widespread belief that whiskey killed bacteria.31 It was a vegetable extract in 94-proof whiskey that promised a thorough detox as well as protection from or a cure for dozens of illnesses.32 A Connecticut street peddler and former Texas farmhand used fire eaters and sharpshooter contests at traveling road shows to sell tens of thousands of bottles of “Indian Sagwa” to “purify the blood.” It was moonshine mixed with common garden herbs for flavoring.33

Parents were particularly susceptible to nostrum pitches since 20 percent of all children did not survive to age five.34 The company behind the top-selling “Kopp’s Baby Friend,” billed as “the King of Baby Soothers,” checked daily newspapers for birth announcements and sent free samples to new mothers. Its secret formula was a solution of one-third pure opium, and over time it was responsible for dozens of lethal infant overdoses.35

Most pharmacists and doctors denigrated patent elixirs as the province of snake oil salesmen and traveling medicine shows. Trade publications such as Druggists’ Circular exposed some of the most dangerous remedies. However, with a tiny circulation limited to medical professionals, those publications did nothing to slow demand among enthusiastic lay consumers.36 And despite the widespread scorn of physicians and druggists, the money to be made proved too tempting for some, who stocked top-selling nostrums and marked up the prices.37 Crowded and filthy slums, by-products of the fast-growing cities of the late nineteenth century, had become breeding grounds for a succession of epidemics, from smallpox, tuberculosis, typhus, and yellow fever to cholera. Each resulted in a deluge of profitable new nostrums, all promising instant cures.

In 1888, members of the American Pharmaceutical Association published the first National Formulary, a companion to the United States Pharmacopeia (together, the USP/NF). Advances in chemical testing and machine manufacturing had made it possible for American pharmaceutical firms to produce drugs of improved purity and generally reliable quality.38 The USP/NF was a somewhat rudimentary list of about two hundred “ethical pharmaceuticals” intended to be the gold standard for doctors and pharmacists.39 II 40 The list was an easy guide for those wanting to avoid useless nostrums.41

Traditional pharmaceutical companies looked with disdain on their patent competitors. Still, the general lack of knowledge about the medicines they sold meant that ethical drugs could also sometimes be disasters. There was no better example than Heroin, a trademarked drug developed by Germany’s Bayer. In 1898, the same Bayer research team credited with synthesizing acetylsalicylic acid (trademarked as Aspirin) added two acetyl groups to the morphine molecule and produced an opiate ten times as powerful. Bayer’s director of pharmacology insisted the company not select “too complicated a name,” so it chose one derived from the German heroisch, or “heroic.”42 Heroin went on sale in America in 1900 and was immediately listed on the USP/NF. Anyone over the age of eighteen could buy it. Bayer claimed it was much better at alleviating pain than morphine. It was ten times more effective for the relief of coughs and colds than codeine, contended Bayer, with only a tenth of codeine’s toxic side effects.43 The company also promoted it for treating epilepsy, stomach cancer, multiple sclerosis, asthma, and schizophrenia. Bayer’s advertisements claimed it was safe for children. The company even sold it as a fast cure for morphine addiction, which by then was becoming a problem.44

Some states that had passed laws to cover adulterated foods also addressed drugs, but only with generic provisions that barred nostrum makers from selling lethal poisons.45 The result was a hodgepodge of rules that were confusing, sometimes contradictory, and seldom enforced. It was impossible to address the booming interstate traffic without a federal law. Through the 1890s, Congress failed to pass a series of regulatory bills to empower federal oversight of both food and drugs.46 The driving force behind the movement for a national law was a chemist and physician, Harvey Washington Wiley, head of the Department of Agriculture’s Division of Chemistry (a predecessor agency of the Food and Drug Administration).47 Politicians and industry lobbyists mostly ignored his zealous appeals to address the unreported dangers of adulterated food, dismissing him as an inexperienced idealist, a powerless bureaucrat in an obscure government agency.

Wiley had, however, often been underestimated and had repeatedly defied expectations. He was the deeply religious son of a self-educated Indiana farmer, Preston Wiley, who was also an evangelical preacher for a revivalist nineteenth-century Christian sect and the headmaster of the tiny town’s single-room school.48 His mother, Lucinda, worked the farm and tended to her seven children, all of whom she had given birth to on the dirt floor of the family’s two-room log cabin (they had no running water, heat, or working toilet).49 Second-generation Americans of Irish and Scottish ancestry, his parents worked long days on farmland outside Kent, a town of several hundred poor whites along the Kentucky border. It was a bare-bones, tough existence that had pushed many of Wiley’s neighbors to the edge of desperation.50

Wiley grew up in a household where corporal punishment was meted out with a wooden rod for indulging in “devices of the devil,” such as playing with other kids, singing, dancing, or celebrating holidays.51 His parents expected him to take over the family farm when he turned eighteen. Instead, he surprised them by passing his college exams and even earning a scholarship. At twenty-six he received his medical degree, graduating near the top of his class at Indiana Medical College. When he moved east to study chemistry at Harvard he was captivated by the emerging science of diet and nutrition and by related questions about the safety of food additives and preservatives.52 In an era before refrigeration, as more food was shipped long distances from processing plants, producers constantly tested new preservatives. The food industry hired chemists to extend the transport and shelf life of perishable goods and to find chemicals that removed unpleasant odors and enhanced the color of food (red lead for beef, lead chromate for mustard, arsenic for green vegetables). Few scientists were studying the possible dangers of the new methods.53

Wiley had a reputation as a smart advocate for pure food. In a series of articles for Popular Science Monthly he highlighted potential risks in the U.S. food chain.54 When he became the chief at the Division of Chemistry in 1883, the department had only six employees and a paltry budget of $40,000.55 It seemed an unlikely place from which to launch a successful campaign for a pure food law or to wield influence to tame the unregulated pharma industry.

What the Division of Chemistry lacked in manpower and money, Wiley made up for with a talent for generating public attention for his campaign. He promoted his agenda in articles and in testimony before congressional committees. Wiley traveled the country, delivering fiery speeches to dozens of women’s clubs and social organizations warning about the dangers of adulterated food.56 At those events, Wiley seemed more an itinerant preacher than a scientist. The politicians and lobbyists who had dismissed him when he arrived in D.C. realized that his dramatic flair was a good complement to what he called his quest for “extensive and exhaustive investigations of adulterations and misbranding of foods.”57

It was four years before Wiley and the Division of Chemistry published the first volume of a series of reports titled “Foods and Food Adulterants.” The initial one focused on health risks in the nation’s dairy products.58 Half the milk samples tested had been thinned with water and chalk and were swarming with bacteria. Much of the butter sold contained no dairy at all. Over the next five years, Wiley and his small team of chemists issued nine more reports.59 Among their findings: nearly 90 percent of all ground coffee was adulterated, usually cut with sawdust or even dirt. “Embalmed beef” was sold in tin cans made in part from lead and infused with so many powerful preservative chemicals that it smelled like formaldehyde. There was a public furor when children in Nebraska and Indiana died from contaminated milk; some dairies had used formaldehyde to mask the odor of sour milk and had sold it to orphanages.60

Wiley’s reports—the first intensive federal investigation into potential health risks in the food supply—were a milestone. Flattering press coverage added to his reputation as an incorruptible champion for the common good.61 The Washington Times was typical: “When he took up the gauntlet thrown down by a crowd of greedy parasites who were making huge fortunes by selling to the public foods not what they seemed, he determined to quash their nefarious practices.”62 The media attention cemented Wiley’s public persona as “the pure food man,” a lone government crusader arrayed against vast and powerful special interests intent on putting profits above safety.63 He realized his popularity presented an opportunity to expand his influence. During his tenure, the secretary of agriculture upgraded his division to a bureau, a designation that imbued it with more autonomy. It grew from six to more than six hundred employees and its budget increased twenty-fold. The newly christened Bureau of Chemistry had its own building by 1902, and Wiley ran it as his personal fiefdom.64

That same year he convinced Congress to appropriate $5,000 to study potential health risks in common food preservatives and dyes.65 Determined that the results would not get buried in some little-read official report gathering dust on a back shelf of the Bureau of Chemistry, he planned a showstopper of a study. Citing inspiration from the biblical book of Daniel, Wiley decided to experiment not on animals but on humans.66 He created a twelve-man “Hygienic Table Trial.” Its volunteers, including a scientist, a former captain of a high school cadet regiment, and a Yale sprinter, were either employees of the Bureau of Chemistry or Georgetown Medical College students attracted by free room and board.67 Before Wiley approved each volunteer, staffers screened him for good moral character, little or no alcohol use, and abstinence from medicines. “I wanted young, robust fellows, with maximum resistance to deleterious effects of adulterated food,” he later noted.68 The recruits promised to stay at least a year and waived all rights to sue the government if the trial proved harmful or deadly.

Wiley built a kitchen and dining room in the Bureau of Chemistry’s basement. He bought all the food and drinks and served three meals daily to the twelve volunteers, all of whom dressed in formal attire for dinner. A chef, who boasted he had been the personal cook for the queen of Bavaria, prepared meals that included steadily increasing doses of the preservatives and coloring agents Wiley suspected were toxic.69

Wiley recorded the temperature and pulse of every man before each meal. He regularly checked their weight and collected urine and stool samples. He allowed some journalists from popular newspapers and magazines to observe the experiment. The ensuing coverage was sensational: “Young men of perfect physique and health” who were “martyrs of science” willingly ate potentially deadly food served by “a bespectacled scientist.” The volunteers adopted the motto “Only the Brave Dare Eat This Fare.” A Washington Post reporter gave the project a name that stuck: the Poison Squad.70 The element of danger captivated the public. Wiley worried that the popular frenzy might lead the scientific community to dismiss the seriousness of his tests.

The Poison Squad, however, was far more than a hit turn-of-the-century reality show. The startling results over several years confirmed Wiley’s worst fears about hidden dangers in America’s food supply. His volunteers suffered steadily worsening ailments.71 Preservatives such as borax and salicylic acid caused headaches and digestive problems. Formaldehyde, which prolonged the life of dairy products, caused weight loss, insomnia, and scarred kidneys. Benzoate caused severe heartburn and damaged blood vessels. Copper sulfate, which enhanced the color of canned vegetables, caused vomiting and liver damage.72 Sulfites, by-products of many preservatives used in wine, molasses, and cured meats, left the volunteers dizzy and with splitting headaches. In 1904, Wiley released the first of five reports titled “Influence of Food Preservatives and Artificial Colors on Digestion and Health.” Together they amounted to a damning one-thousand-page indictment.73

Wiley began winding down his audacious public spectacle in 1905 following the death of a weakened volunteer from tuberculosis. By then, however, the Poison Squad had earned an almost mythic place in American medical history. And Wiley knew it had reenergized his quest for a federal pure food law.

I. Many prominent public figures—Sigmund Freud, Pope Leo XIII, Robert Louis Stevenson, Queen Victoria, to name a few—waxed enthusiastic about the energy and fleeting euphoria cocaine produced. Its recreational use surged in the second half of the nineteenth century.

II. The term “ethical pharmaceuticals” made them sound as if they were more trustworthy medications than nostrums. The term later came to mean drugs not advertised to the public, a concept the American Medical Association pushed because it believed that ads directed at the public encouraged self-treatment and threatened the authority of doctors. Although there was no requirement yet for drug prescriptions, the AMA hoped that patients might seek the advice of physicians in selecting the right ethical drug.