We sit at a table delightfully spread
And teeming with good things to eat.
The sixth of seven children, Harvey Washington Wiley was born on April 16, 1844, in a log cabin on a small farm in Kent, Indiana, about a hundred miles northeast of the farm where Abraham Lincoln had grown up a few decades earlier.
The humble timber dwelling was an icon of authenticity for nineteenth-century Americans, particularly because Lincoln (born in adjacent Kentucky), as well as presidents such as Andrew Jackson and Zachary Taylor, had made their log-cabin beginnings a keystone of their respective political images. In later years Wiley liked to joke about his similarly modest origins. “I am not possessed with a common prejudice that a man must be born in a log cabin to attain greatness in the United States,” he said.
But like those political luminaries, Wiley grew up working the land. By age six he was driving the family cows back to the barn for milking each day. At ten he was behind a plow. His father, Preston, was one of the first Indiana farmers to grow sorghum cane, and his curious son Harvey, helping with the work, learned to boil down juice pressed from the grassy grain crop into a sweet syrup. That transformation helped spur his interest in food processing and in other types of sugars, one of the inspirations for his later career.
Preston Wiley had little schooling but valued learning, another strong influence on his second-youngest child. The father, who was a lay minister as well as a farmer, had even taught himself Greek. A fierce opponent of slavery—he made a point of gathering his children around for evening readings of the powerful abolitionist novel Uncle Tom’s Cabin—farmer Wiley also believed in acting upon one’s principles. Only three miles from the Ohio River, the family farm became Indiana’s southernmost stop on the Underground Railroad. Escaped slaves from Kentucky, once they’d made it across the water, knew to seek out Preston Wiley. Under cover of darkness, he would escort them safely to the next stop, eight miles northward.
Harvey Wiley was preparing for college when the Civil War broke out, and his parents, despite their antislavery stance, were determined that he continue with it. He enrolled at nearby Hanover College in 1863 but a year later decided he could no longer sit out the war. After joining the 137th Indiana Infantry, he was deployed to Tennessee and Alabama, where he guarded Union-held railroad lines and spent his spare hours studying anatomy, reciting daily from a textbook to a fellow soldier. It was only a few months later that he and many of his fellows fell ill in a plague of measles sweeping through the camp. He was still ailing when his regiment returned to Indianapolis that September. He received a discharge, went home to recuperate, and then returned to Hanover, where he earned a bachelor’s and then a master of arts degree in 1867. But he had by then, influenced by his army days, determined to become a physician. In his graduation address, he spoke of his then-chosen profession with typical over-the-top exuberance. The medical man, Wiley declaimed, “can not climb to Heaven and pull down immortality,” but he can help achieve a longer life “full of health and happiness and hope.”
To earn money for medical school, Wiley first taught Latin and Greek at a small Christian school in Indianapolis. He spent the following summer apprenticed to a country physician in Kentucky, then enrolled in Indiana Medical College, where he earned an MD, graduating in 1871. By that time, though, he’d learned that although he admired the work of medical practitioners, he did not enjoy caring for sick people. He accepted an offer to teach chemistry in the Indianapolis public high schools and there began to appreciate the insights offered by that branch of science or, as he came to see it, the “nobility and magnitude” of chemistry. Realizing he had a passion for the rapidly advancing field, he went back to school yet again, this time to study chemistry at Harvard University, which—as was typical at the time—awarded him a bachelor of science degree after only a few months of study. In 1874 he accepted a position at Indiana’s newly opened Purdue University as its first (and only) chemistry professor.
“I find so many things that I do not know as I pursue my studies,” he wrote in his diary during that first year at Purdue, as he struggled to assemble a working laboratory. “My own profession is still a wilderness.” During the following years, though, Wiley developed a reputation as the state’s go-to scientist for analyzing virtually anything—from water quality to rocks to soil samples—and especially foodstuffs. That reputation grew after a working sabbatical in 1878 in the newly united German Empire, considered the global leader in chemical research. He studied at one of the empire’s pioneering food-quality laboratories and attended lectures by world-renowned scientist August Wilhelm von Hofmann, who had been the first director of the Royal College of Chemistry in London. Von Hofmann was a pioneering industrial chemist. Famed for his 1866 discovery of formaldehyde, he would later do work leading to the development of industrial dyes that the food industry embraced in the late nineteenth century. When Wiley returned to Purdue, he brought back specialized instruments for analyzing food chemistry, acquired in Germany and paid for with his own savings when the university refused to do so.
European governments—especially those of Germany and Great Britain—had been far quicker than the U.S. government to recognize and to address problems of food adulteration. In 1820 a pioneering book by chemist Frederick Accum, titled A Treatise on Adulterations of Food, and Culinary Poisons, had aroused widespread public outrage when it was published in London. Accum minced no words: “Our pickles are made green by copper; our vinegar rendered sharp by sulphuric acid; our cream composed of rice powder or arrowroot in bad milk; our comfits mixed of sugar, starch and clay, and coloured with preparations of copper and lead; our catsup often formed of the dregs of distilled vinegar with a decoction of the outer green husk of walnuts, and seasoned with all-spice,” he wrote.
As Accum noted, the poisonous practices of his time dated back many years. Long before the nineteenth century’s new industrial dyes, merchants and processors used various colorful substances to make their wares look more enticing. Confectioners often turned to poisonous metallic elements and compounds. Green came from arsenic or copper, yellow from lead chromate, cheerful rose and pink tones from red lead. In 1830 an editorial in The Lancet, the British medical journal, complained that “millions of children are thus daily dosed” with lethal substances. But the practices continued, largely due to business pressures on would-be government regulators.
By midcentury, though, casualties were starting to mount in Britain. In 1847 three English children fell seriously ill after eating birthday cake decorated with arsenic-tinted green leaves. Five years later, two London brothers died after eating a cake whose frosting contained both arsenic and copper. In an 1854 report, London physician Arthur Hassall tracked forty cases of child poisoning caused by penny candies.
A few years later, in 1858, twenty-one people in Bradford, Yorkshire, died after consuming candy accidentally laced with deadly arsenic trioxide—“accidentally” because the confectioner meant to mix in plaster of paris instead. Although he had noticed his workers falling ill while mixing up the stuff, the business owner had put the candy on sale anyway. He was arrested and jailed, as was the pharmacist who’d mistakenly sold him the poison in place of plaster. But they could not be convicted of any crime. Britain had no law against making unsafe—or even lethal—food products.
Fury over the Bradford incident spurred the passage in 1860 of Britain’s Act for Preventing Adulteration in Food and Drink. Business interests managed to limit the fine for poisoning food to a mere £5, but at least it was a precedent.
Although not yet nearly loud enough to prod Congress, there were voices of outrage in America too, where journalists like John Mullaly railed against “milk-poison” and George Thorndike Angell, a Massachusetts lawyer and philanthropist better known for his work against cruelty to animals, loudly derided dishonest food producers. In an 1879 speech to the American Social Science Public Health Association, Angell recited a disgusting list of commercially sold foods that included diseased and parasite-ridden meat and processed animal fat passed off as butter and cheese.
“They poison and cheat the consumer; affect, and in many cases destroy, the health not only of the rich but of the poor,” Angell charged, blasting dishonest food producers as little better than “the pirates who plunder our ships on the ocean or the highwaymen who rob and murder on the land.” For good measure, he mailed the text of his speech to newspapers nationwide—and to the dismay of food processors, it received prominent display. The American Grocer, a trade publication, dismissed him as sensationalist and “doing a disservice to consumers.” But the Grocer acknowledged that some problems were real, especially the too-often poisonous nature of milk and colored candy, and the damage fraudsters did to honest producers’ reputations. That year, in response to Angell’s concerns, Virginia congressman Richard Lee T. Beale introduced legislation that would have banned interstate commerce in chemically altered foods. A report to the Committee on Manufactures warned: “Not only are substances of less value commingled with those of greater, but such as are injurious to health, and we have no doubt often destructive of life, are freely used in manufacturing and preparing for consuming the necessaries and luxuries of life.” The bill was reported out of committee but promptly died for lack of further action. But an uneasy awareness of a troubled food supply was starting to grow.
In 1881 the Indiana State Board of Health asked Wiley to examine the purity of commercially sold sweet substances, particularly honey and maple syrup. At Purdue, Wiley had been studying potential new crops and methods for making sweeteners. Inspired by his father’s venture into sorghum, he had even worked out an improved process for getting syrup from its woody stalks. After his studies in Germany, Wiley possessed the right training and tools to conduct the study, and through presentations at the American Association for the Advancement of Science he’d gained a reputation as one of the country’s leading sugar chemists. The investigation requested by the state—and the political fallout—would serve to plunge him further into his life’s work. It was, as he later remembered, “my first participation in the fray.”
A report from the National Academy of Sciences had already warned that jars labeled “honey” were often tinted corn syrup, with a scrap of honeycomb tossed in to complete the deception. Corn syrup—not the much later “high-fructose” version—was a nineteenth-century innovation. Russian chemist Gottlieb Kirchhoff had in 1812 devised an inexpensive process for turning cornstarch into the sugar glucose. He combined the starch with diluted hydrochloric acid and heated the mixture under pressure. The process proved hugely profitable in the corn-rich United States. By 1881 almost two dozen factories were operating in the Midwest, turning 25,000 bushels of corn a day into sugary products. “The manufacture of sirup and sugar from corn-starch is an industry which in this country is scarcely a dozen years old and yet is one of no inconsiderable magnitude,” Wiley wrote in his report.
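As a rough modern aside, not a calculation drawn from Wiley’s report, the chemistry behind Kirchhoff’s process can be sketched with simple stoichiometry: acid hydrolysis splits the starch chain by adding a water molecule at each link, so every 162-gram glucose unit in starch can yield about 180 grams of glucose. The short Python sketch below works out that theoretical yield; the molar masses are standard values, and the function name is purely illustrative.

```python
# Back-of-the-envelope stoichiometry for acid hydrolysis of starch into glucose.
# This is an illustrative modern aside, not a calculation from Wiley's report.
# Each glucose unit in the starch chain, (C6H10O5)n, takes up one molecule of
# water when the chain is split: C6H10O5 + H2O -> C6H12O6.

STARCH_UNIT_G_PER_MOL = 162.14   # molar mass of the C6H10O5 repeating unit
GLUCOSE_G_PER_MOL = 180.16       # molar mass of glucose, C6H12O6


def theoretical_glucose_yield(starch_grams: float) -> float:
    """Maximum grams of glucose obtainable from a given mass of pure starch."""
    return starch_grams * GLUCOSE_G_PER_MOL / STARCH_UNIT_G_PER_MOL


if __name__ == "__main__":
    # 100 g of pure starch can yield at most about 111 g of glucose,
    # because the water consumed in hydrolysis ends up in the product.
    print(round(theoretical_glucose_yield(100.0), 1))  # -> 111.1
```

The extra mass comes from the water taken up during hydrolysis; the commercial appeal, as the passage above notes, came mostly from how cheap and abundant midwestern corn was.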
In much of the English-speaking world, “corn” could mean any kind of grain crop—barley or wheat, for example. But in English-speaking North America it had long meant maize, a staple of indigenous people in the Western Hemisphere for thousands of years. When Europeans arrived, they called it Indian corn and began growing it for themselves. By the mid-1800s corn had become a primary crop of farmlands from Pennsylvania to Nebraska, from Minnesota to Missouri and beyond—and engendered a whole new array of manufactured food products.
“Corn, the new American king,” Wiley wrote, “now supplies us with bread, meat, and sugar, which we need, as well as with the whiskey we could do without.” He estimated that corn-derived glucose had about two-thirds the sweetening power of cane sugar; it was also far cheaper, produced for less than half the cost.
Those who made and sold the sweetener often labeled it either “corn sugar” or “corn syrup.” This followed European practice. Germany had “potato sugar,” for example, and France produced a “grape sugar.” But Wiley, always a stickler for accuracy (a trait that would over the years irritate more plainspoken colleagues, including President Roosevelt), thought the corn-based product should be called glucose or glucose syrup. This, he emphasized, was both the technically accurate term and one that clearly differentiated the product from traditional sugars made from cane or beet. (In the twenty-first century, amid a diabetes epidemic, many think of “glucose” in terms of human blood sugar levels. But the sugar product derived from cornstarch—or from wheat, potatoes, and other starches—does bear the same molecular signature.)
In Wiley’s day such scientific precision could seem essential to maintaining a sense of order in research. Chemists of the mid-nineteenth century had only begun to tease out the nature of molecular bonds. In the late 1850s the German chemist Friedrich August Kekulé put forth the first theory of how atoms come together to form a molecule. The chemistry superstar von Hofmann, then at the University of Berlin, made the first ball-and-stick models of molecules in the 1860s. In Germany, Wiley had learned to respect such precision, a point illustrated by the instruments he’d brought home with him. One of his favorites was called a polariscope (or polarimeter). At Purdue he used it to tell the difference between types of sugars by passing polarized light through sweetened substances and measuring the angle at which the light rotated. “Glucose presents several anomalies when examined with polarized light,” Wiley explained, compared with the true sugars.
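For readers curious about the arithmetic behind such an instrument, here is a minimal modern sketch of the principle, not anything taken from Wiley’s own notes: an observed rotation angle, divided by the tube length and the sugar concentration, gives a specific rotation that can be compared with published values for common sugars. The reference numbers are standard modern figures, and the helper functions and sample reading are illustrative assumptions.

```python
# Illustrative sketch of polarimetry, the principle behind a polariscope.
# Reference values are standard published specific rotations; nothing here
# comes from Wiley's own measurements.

# Approximate specific rotations, in degrees, for common sugars in water.
REFERENCE_ROTATIONS = {
    "sucrose (cane or beet sugar)": +66.5,
    "glucose (corn 'sugar')": +52.7,
    "fructose": -92.4,
}


def specific_rotation(observed_deg: float, path_dm: float, conc_g_per_ml: float) -> float:
    """Convert an observed rotation angle into a specific rotation.

    path_dm is the tube length in decimeters; conc_g_per_ml is grams of
    sugar per milliliter of solution.
    """
    return observed_deg / (path_dm * conc_g_per_ml)


def closest_sugar(observed_deg: float, path_dm: float, conc_g_per_ml: float) -> str:
    """Name the reference sugar whose specific rotation best matches the reading."""
    measured = specific_rotation(observed_deg, path_dm, conc_g_per_ml)
    return min(REFERENCE_ROTATIONS, key=lambda name: abs(REFERENCE_ROTATIONS[name] - measured))


if __name__ == "__main__":
    # Hypothetical reading: a 1 dm tube, a 0.10 g/mL solution, and an observed
    # rotation of +5.3 degrees give a specific rotation of +53, closest to glucose.
    print(closest_sugar(observed_deg=5.3, path_dm=1.0, conc_g_per_ml=0.10))
```

Because sucrose, glucose, and fructose rotate polarized light to characteristically different degrees, a careful reading of this kind can distinguish a genuine cane or maple syrup from a glucose-based imitation.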
He was not shocked when his tests showed that a full 90 percent of his syrup samples were fakes. Shop owners had told him that these new syrups were so sweet and inexpensive that they had almost “driven all others out of the market.” Testing of honey samples also turned up rampant fakery. He somewhat mockingly referred to the counterfeit product as “entirely free of bee mediation,” noting that even the bit of honeycomb that producers stuck in the jar was phony, made from paraffin. In his report, Wiley found no fault with corn syrup per se—it was, after all, a natural sweetener—but he thought that a food or ingredient ought to be called what it was. To fill a bottle with “glucose” and label it as more-expensive maple syrup was to deceive the consumer. In addition to finding corn syrup masquerading as other sugars, the study turned up impurities left by the manufacturing process. There was copper from mixing tubs and some chemical remnants of charred animal bones (used as a charcoal filter), and in some samples he detected sulfuric acid.
Wiley’s report, published both in the state record and in Popular Science in the summer of 1881, gratified those in the real maple syrup business, but it annoyed corn growers, corn-syrup manufacturers, and the bottlers of the mislabeled products—which, put together, made a far larger and more influential interest group than that consisting of maple tree tappers. Wiley, as he would for the rest of his career, had begun making powerful enemies.
Surprisingly, the group that seemed most bothered by his report was beekeepers. Instead of thanking him for exposing what the chemist called “the injury done to the honey industry” by the corn-based fakes, that industry’s trade journals denounced him and the study, referring to it as “Wiley’s Lie.” The honey producers worried about damage to their reputations. But it became obvious as well that there were “beekeepers” who had not, of late, been bothering to keep bees.
Wiley, characteristically, doubled down. He wrote a more in-depth report for the Indiana Board of Health, stressing the importance of truthful labeling. The second report included instructions for how to detect adulterations, and it strongly recommended that Indiana set purity standards for sugar products produced and sold in the state. “The dangers of adulteration are underrated,” he wrote, “when it is for a moment supposed that any counterfeit food can be tolerated without depraving the public taste, and impairing the public safeguards of human life.”
Despite the political pushback, he closed with a firm call for action. It was high time, he wrote, that “the demand for honest food should be heard in terms making no denial.” He wasn’t afraid, as he would say throughout his career, to stand up for what he thought right. After all, he’d been raised that way.
Like Harvey Wiley, Peter Collier, chief chemist of the Department of Agriculture, was fascinated by the science of sugars and the plants from which they could be produced. Even more enthusiastic about sorghum cane than Wiley was, the Yale-educated Collier saw it as a crop of the future. He envisioned glowing copper-and-green sorghum fields across the country, a potential source of sugar as bountiful as corn or even sugarcane.
His boss, the pragmatic George Loring, did not share Collier’s vision. Commissioner Loring (titled “commissioner” because the USDA was not yet a cabinet-level department) was a former Massachusetts physician with a special interest in treating the often-crippling diseases of farm animals. The sorghum disagreement between Collier and Loring might have stayed a matter of internal discussion except for one problem. Whenever Collier felt aggrieved, he had a habit of complaining about the commissioner to the Washington press. Exasperated by newspaper stories in which his chief chemist suggested that he was an idiot, Loring in 1882 sought and received permission from President Chester A. Arthur to replace Collier with a more amiable scientist. Later that year, at a December meeting of Mississippi sugarcane growers, Loring heard a speech by Harvey Wiley, who had been invited to present an overview of sugar-producing crops. It was, thought Loring, a balanced presentation. It was objective. Unlike Collier, who had become increasingly fanatic about the dreamy future of sorghum, Wiley gave each crop its due. The Purdue scientist impressed the commissioner as the reasonable man he was looking for.
Two months later, Loring offered Wiley the chief chemist job. The timing was perfect. Wiley had been feeling increasingly stifled and unappreciated at Purdue. Conservative members of the university’s board of trustees hadn’t cared for the negative attention his state honey and syrup study had drawn, especially from the influential corn industry. One trustee had declared publicly that scientific progress was “the devil’s tool.” The board even publicly disapproved of Wiley’s personal life, including his regular baseball games with the college students and the high-wheel bicycle that he rode to campus daily, dressed in knee breeches. Trustees had called him into a meeting to upbraid him for making a spectacle of himself, even comparing him to a circus monkey. Wiley, as he wrote in his diary, would have taken insult if he hadn’t found the scolding so amusing. Yet he admitted frustration. He had just that year been considered, but passed over, for the post of university president. For the thirty-nine-year-old bachelor, Loring’s offer seemed a lifeline out of a job that increasingly felt like a trap.
But he had not anticipated that the ever-combative Collier would turn his attention from Loring to him. Furious at the loss of his job and status, Collier promptly engineered a series of attacks on his designated successor. His well-placed allies wrote to farm trade journals, denigrating the Indiana sugar studies and suggesting that their author was an inferior scientist. Collier also persuaded the senators from his home state of Vermont to visit President Arthur, demanding that Wiley be denied the position. The aggressive campaign only irritated the president and it did not win Collier his job back. But it was successful in embarrassing Wiley.
“These were the first public attacks on me and they cut to the quick,” Wiley later wrote. “I felt hurt to be the victim of such insinuations and misstatements.” He wrote to the same publications, attempting to defend and justify his work. Collier’s faction, in turn, accused him of bragging. The best way to respond to such attacks, he would gradually come to believe, “is to go about one’s business and let enemies do their worst.” He began packing up for the move to Washington, DC.
In 1883, the Agriculture Department’s sprawling campus was situated between the Smithsonian Institution’s redbrick castle and the almost-completed Washington Monument. The grounds boasted experimental gardens, greenhouses, conservatories, and a grand, modern main building, built in the 1870s, with a stylish mansard roof. The tiny Division of Chemistry, however, was tucked into what Wiley called a “damp, illy-ventilated, and wholly unsuitable” basement. One of the first acts of the new chief chemist was to ban smoking. Not only was the laboratory air already stale, but a stray spark, he feared, could turn the place into a bonfire.
For his living quarters, Wiley rented a bedroom from a Washington family, with whom he would happily stay for the next twenty years. Treated as a well-liked member of the family, he frequently spent evenings helping the children with their homework. Social by nature, he accepted an invitation to join the prestigious Cosmos Club, a men-only, intellectually inclined organization whose members included Alexander Graham Bell and Mark Twain. He also joined the more casual Six O’Clock Club, which by contrast did admit women and boasted American Red Cross founder Clara Barton on its executive committee.
New to the charged political climate of Washington, Wiley scored an early coup in 1885, when Grover Cleveland became president. A dedicated Republican, Wiley knew that his job security could well depend on Democrat Cleveland’s choice to replace Loring as commissioner of agriculture. The chemist started a letter-writing campaign to influential friends, urging the appointment of Norman J. Colman, a Missouri Democrat and a longtime publisher of farm trade journals, who approved of Wiley’s research. The campaign worked, and Wiley was blessed with a grateful and supportive new boss.
Colman, who would help create a national network of agricultural experiment stations, also believed that it was a public servant’s duty to champion the public interest. In fact, he wanted the chief chemist to be more aggressive in tackling food safety issues—something Wiley too had been advocating. Colman even had a suggestion for some timely official investigations. He recommended that the Chemistry Division report on the quality and healthfulness of commercially sold milk. The scientists, he proposed, also should investigate dairy products such as butter and evaluate the new and highly suspect industry of butter substitutes.
The problems of the dairy industry had continued to fester basically unchecked. Mullaly had written in 1853 about the practice wherein distillers housed dairy cows in stinking urban warehouses, each animal tethered immobile and fed on the spent mash, or “swill,” from the fermentation process used in making whiskey. The arrangement enriched the owners but was linked to a host of public health problems.
In the 1850s Frank Leslie’s Illustrated Newspaper had exposed these fly-ridden, maggot-infested milk factories, where the animals stood in their own waste, subsisting on the warm swill, which still contained residual sugar and alcohol but little nutrition. Over the cow’s short, miserable life, its teeth tended to rot out before the animal stopped giving milk and was sent to slaughter—or dropped dead in the stall. Pediatricians linked swill milk to a list of childhood symptoms of ill health. “I have every year grown more suspicious of distillery milk,” one doctor wrote, “whenever I have seen a child presenting a sickly appearance, loose flabby flesh, weak joints, capricious appetite, frequent retchings and occasional vomiting, irregular bowels with tendency to diarrhea and fetid breath.”
The notoriously corrupt Tammany Hall government of New York City resisted reform, but finally, in 1862, passed a city ordinance outlawing swill milk, to little effect. Difficult to enforce even in the city, the new law did nothing to help manage poor dairy practices beyond its boundaries. More than two decades later a study published in the Journal of the American Chemical Society looked at swill milk still being produced just across the Hudson River in New Jersey and found “so numerous a proportion of liquefying colonies [of bacteria] that further counting was discontinued.” A subsequent report in Indiana by that state’s board of health added that a random sampling of milk found “sticks, hairs, insects, blood, pus and filth.”
Under Wiley, the Agriculture Department’s first detailed examination of food products, a technical report titled Foods and Food Adulterants (Bulletin No. 13), was published in three parts in 1887. It revealed, as expected, that little had improved with regard to how milk was produced and what it contained. Wiley’s investigating chemists had found a routinely thinned product, dirty and whitened with chalk. It wasn’t just bacteria swimming in the milk. At least one of the samples that Wiley’s crew tested had worms wriggling in the bottom of the bottle. The Division of Chemistry’s findings about other dairy products were more eye-opening. Much of the “butter” that the scientists found on the market had nothing to do with dairy products at all except for the fictitious name on the product.
The ability of producers to so mislead resulted from the work of several French chemists, including one of the nineteenth century’s greatest, Michel Eugène Chevreul. He drew from the Greek word margarites, meaning pearl, and added the Latin for oil, oleum, to coin the term oléomargarine, which is what he called a glossy, whitish semisolid that two colleagues had derived from olive oil. In 1869 inventor Hippolyte Mège-Mouriès appropriated Chevreul’s terminology and applied it to a butter substitute he made from beef tallow and finely ground animal stomachs. That invention became the basis of a host of butter substitutes embraced by American food processors, which began manufacturing an inventive range of such products in 1876.
Eager to expand a new market, U.S. innovators competed to improve oleomargarine, seeking patents for variations such as “suine” (from suet) and “lardine” (made from pork fat). The industry especially took off after the powerful meatpacking interests realized the potential for profit from the by-products of slaughterhouses and canneries. Barely had the idea of oleomargarine reached the fast-growing Chicago stockyards when some processors decided that if they added just a dab of actual milk to the product, they might cast off its meaty association. Trying for a more appealing name, meatpackers like the Armour brothers and Gustavus Swift borrowed another term for margarine that was in use in Britain, one that at least sounded dairy based: “butterine.” Other manufacturers didn’t even bother with that terminology; they simply called their oleomargarine “butter.”
In his 1883 book Life on the Mississippi, Mark Twain recounted overheard comments made by an oleomargarine salesman from Ohio. “You can’t tell it from butter,” the salesman said. “By George, an EXPERT can’t. . . . You are going to see the day, pretty soon, when you can’t find an ounce of butter to bless yourself with, in any hotel in the Mississippi and Ohio Valleys, outside of the biggest cities. Why, we are turning out oleomargarine NOW by the thousands of tons. And we can sell it so dirt-cheap that the whole country has GOT to take it—can’t get around it, you see. Butter don’t stand any show. . . . Butter’s had its DAY.”
The dairy industry, not surprisingly, disagreed. And furiously. Dairy organizations petitioned members of Congress, demanding action and protection from such deceptive practices. The resulting hearings in both the U.S. Senate and the House of Representatives in 1885 reflected that bitterness, taking up the issue of whether margarine should even be allowed for sale in the United States.
“We face a new situation in history. Ingenuity, striking hands with cunning trickery, compounds a substance to counterfeit an article of food,” charged U.S. congressman Robert La Follette. A Wisconsin Republican, La Follette was firmly in the corner of that state’s numerous dairymen. They objected especially to the practice of coloring oleomargarine to make it look like butter. La Follette conveniently overlooked the fact that butter itself, when produced in the winter from cows fed on hay rather than pasture grass, turns out more white than yellow—and that in addition to diluting and adulterating milk, some dairies routinely added golden coloring to their pale butter. The new, nondairy spreads were nothing better than “counterfeit butter,” the congressman charged. Congressman William Grout, Republican of Vermont, went further, dubbing the products “bastard butter.” Without regulation, who knew what might be in the stuff? Grout called it “the mystery of mysteries.”
Patent applications for margarine listed such ingredients as nitric acid, sulfate of lime, and even sugar of lead. Congressman Charles O’Ferrall, a Virginia Democrat, decried the inclusion of bromo-chloralum, a disinfectant also used to treat smallpox. O’Ferrall charged that the disinfectant’s purpose in margarine was “to destroy the smell and prevent detection of the putrid mass” of ground-up sheep, cow, and pig stomachs used in many recipes. Lawmakers wanted to know if other leftover bits of dead animals were finding their way into the recipes. “You do not think that you could make good oleomargarine out of a dead cat or dog?” asked Senator James K. Jones, a Democrat from Arkansas, questioning an industry representative. “It has reached the point in the history of the country where the city scavenger butters your bread,” declared Congressman David B. Henderson, an Iowa Republican. Witness L. W. Morton protested. “An ounce of stale fat put into a ton of good fresh fat will spoil the whole,” Morton testified, pointing out that it was common knowledge that butter also went bad.
The hearings led to the Butter Act of 1886, which passed with support from both parties and was signed by President Cleveland. But thanks to intervention from the meatpackers, the law was less than hard-hitting, imposing a tax of merely two cents a pound on margarine, leaving the imitation still cheaper to produce than the real thing. The law did define butter as “made exclusively from milk or cream” (with the possible addition of salt or dye), meaning that products like butterine had to be labeled “oleomargarine.” False labelers could be fined up to $1,000—assuming they could be caught.
Members of Wiley’s staff had been witnesses at the hearings, but their findings in the new Bulletin 13 series weren’t issued until the next year, 1887, which made the report an anticlimax of sorts. The studies by the agriculture chemists clearly established, however, that at least a third of what was sold commercially as farm-fresh butter was oleomargarine. The bulletin also noted that thirty-seven American factories were producing more than three million pounds of oleomargarine from animal fats every month. The quality varied widely and there was at least a possibility that some animal parasites could survive the manufacturing process and be present in the spread that consumers purchased. “It is undoubtedly true that a great deal of artificial butter has been thrown on the market that is carelessly made,” Wiley wrote.
Still, the Agriculture Department did not offer a blanket condemnation. The division chemists found that if animal-fat oleomargarine was made with care, the product was in many ways comparable to butter, with “nearly the same chemical composition in digestibility. There may be a slight balance in favor of butter but for healthy persons this difference can hardly be of any considerable consequence.”
The primary health concerns, the investigation found, derived from dyes used to improve the look of butter and margarine. Traditional butter dyes had been vegetable products: annatto (from the fruit of a South American tree), turmeric, saffron, marigold, and even carrot juice—all benign if pure. But suppliers were adulterating the dyes. Annatto, the most popular, often had brick dust, chalk, and traces of red ocher mixed into it. Processors were also using industrial dyes such as chromate of lead, already notorious for instances of lead poisoning from eating yellow candy. Similar problems occurred in cheese, where manufacturers used red lead to enrich color. In all food products, the report warned, “the use of mineral coloring like chromate of lead is highly reprehensible.”
The Division of Chemistry included in the report descriptions of several methods for testing products. With the use of a microscope and a little knowledge of what to look for, it was easy to tell if a spread was butter or margarine. Under magnification, butter displayed long, delicate, needlelike crystalline structures. Melted, it appeared as shorter needles gathered in bundles. Beef fat crystals, by contrast, appeared as spiky, needle-studded globes, like a “sea urchin or hedgehog.” Oleomargarine was a messy tumble of crystalline clumps resembling flattened cauliflowers. Complete with photos, these were handy instructions for anyone with access to a microscope but of little use to the average consumer in 1887.
That same year a New York chemist, Jesse Park Battershall, published a book called Food Adulteration and Its Detection, which offered easier home tests. Some, such as one to detect adulterations in tea, could be conducted in any home kitchen. Battershall recommended simply putting the “tea” into a cylinder containing cold water, capping it, and shaking it hard. Ingredients other than tea would form either a scum on the top or a sludge on the bottom. “In this way, Prussian blue (cyanide, used as dye), indigo (another dye), soapstone, gypsum, sand, and turmeric can be separated,” Battershall explained. And, he added, housewives should not be too surprised to find them there.
Against the backdrop of rising public concern, and with Commissioner Colman’s support, Wiley resolved to continue raising awareness about impurities and fakery in American food products. The 1887 issues of Bulletin 13 examined three broad areas of food and beverage manufacture, dairy being only the first. The second subject had gotten far less attention—certainly nothing like congressional hearings, let alone a regulatory law—but it concerned products even more rife with fakery. “Could only a portion of the unfortunate dislike for oleomargarine be directed toward the spices?” Wiley wrote in an official letter to his boss.