7

The Sinister Side of Progress

Science Finds, Industry Applies, Man Conforms.

—Motto of the 1933 World’s Fair in Chicago

“When I think of all the deaths we could have avoided in the factories if we’d taken measures as soon as we found out about the toxicity of a number of chemical products, I’m truly revolted . . .” I met with Peter Infante one day in October 2009 in his home in the Washington, DC, suburbs. He is an American epidemiologist who fought his entire career to defend a cause that was “mistreated by the ideology of progress”: public health and occupational safety. “Blue collars, that is to say workers, have paid a heavy price to manufacture all the magnificent objects that consumer society provides us with every day,” he explained, his voice heavy with emotion. “At the very least, public authorities should do everything they can to limit workers’ exposure to dangerous chemical substances as much as possible, while guaranteeing them compensation when they become ill. Unfortunately, industry has systematically crushed all efforts to go in that direction.”1

Peter Infante and David Michaels Versus the Chemical Industry Lobbies

At sixty-nine, Peter Infante knows what he’s talking about. For twenty-four years, he worked at the Occupational Safety and Health Administration (OSHA),2 the agency in charge of health and safety in the workplace, which was created at the same time as the Environmental Protection Agency (EPA) in 1970. It was an era when, mindful of the concerns provoked by Rachel Carson’s Silent Spring, America was paving the way. “I came to OSHA in 1978, at a time when the agency was doing its job well,” he explained. “Under the direction of Eula Bingham, a toxicologist who had been nominated by President Jimmy Carter, we had succeeded in considerably reducing the Occupational Exposure Limits for lead, benzene, and cotton dust. Then Ronald Reagan, who swore by deregulation, was elected to the White House. Manufacturers had taken control of OSHA, so to speak, and I nearly lost my job.”

The epidemiologist showed me a letter sent by Al Gore,3 then chairman of the Subcommittee on Investigations and Oversight in the Congressional Committee on Science and Technology, to “the Honorable Thorne Auchter,” assistant secretary for Occupational Safety and Health, Department of Labor. Written on July 1, 1981, it contested a dismissal notice for Peter Infante, whom his management reproached for having informed the International Agency for Research on Cancer (IARC) of the latest scientific work targeting formaldehyde—the IARC, an agency of the World Health Organization, is charged with classifying chemical products according to their degree of carcinogenicity (see Chapter 10). Also known as methanal, formaldehyde was on a list of priority substances the IARC had announced it was evaluating. This highly volatile organic compound is found (in solution) in a number of commonly used products, such as glue for plywood furniture, detergents, disinfectants, and cosmetics (nail polish, for example). As such, it is involved in a number of industrial and artisanal manufacturing processes. In November 1980, a group of scientists called upon by the National Toxicology Program concluded that it was “prudent to regard formaldehyde as posing a carcinogenic risk to humans.”4 Peter Infante decided to inform John Higginson, director of the IARC, which triggered the wrath of OSHA’s management.

In his letter, Al Gore did not mince his words: “I believe that a strong case can be made that your agency’s action is politically motivated. In your own statement of charges, you attach letters from the Formaldehyde Institute critical of Dr. Infante. I am highly suspicious of any personnel action that would have as its base a letter from an industrial group that obviously has a stake in finding that formaldehyde is not a carcinogen. [ . . . ] If OSHA succeeds in firing Dr. Infante, it will be a clear message to all civil servants who are charged with protecting the public health that those who do their jobs will lose their job.”

“In the end, you weren’t dismissed?” I asked, after reading the surprising letter.

“No! And the IARC classified formaldehyde as ‘carcinogenic for humans’ in 2006,” Infante replied. “But at OSHA, our dark period was just beginning. Under the Republican administrations, first Reagan, then Bush, Sr. and Jr., we were paralyzed. The number of products we regulated is ridiculous, barely two over the last fifteen years! In 2002, I left the agency to start working as an independent consultant.”

If the second part of this book starts with Peter Infante’s story—which we will come back to later—it is because it is indicative of the many maneuvers the chemical industry launched over the course of the twentieth century to keep highly toxic products on the market, at the risk of poisoning those who make or consume them. The American epidemiologist David Michaels brilliantly demonstrated this in his previously cited 2008 book Doubt Is Their Product: How Industry’s Assault on Science Threatens Your Health, which Peter Infante highly recommended to me. And for good reason: not long before I interviewed Infante in October 2009, President Barack Obama nominated David Michaels to be the head of OSHA. I would very much have liked to meet the renowned epidemiologist, a professor of environmental and occupational health at George Washington University, but it was not possible.

When I tried to meet with him, he was very busy with his nomination, which triggered virulent opposition from industrial lobbies prepared to do whatever it took to block the indispensable green light from the Senate. To no avail. On December 3, 2009, David Michaels was confirmed for the position, which was without a doubt good news for the United States. Because if there is one reproach that cannot be leveled at the new OSHA head (and assistant secretary of labor), it is that of having supported, in any way whatsoever, the poison manufacturers. In his book (to which I will return), he shows how those manufacturers, backed up by lies, manipulations, as well as a disregard for human life, are at the origins of an unprecedented “assault” on our health, favoring the establishment of what Geneviève Barbier and Armand Farrachi call a “carcinogenic society.”5

Cancer, a Disease of “Civilization”

Before diving into the chemical industry’s nauseating and (it must be said) criminal history, which comprises one of the key elements of my investigation, I would like to briefly review the history of medicine, as it pertains to this issue. I spent a lot of time in the libraries of Paris consulting books and doctoral theses, trying to answer this fundamental question: Is cancer, as some claim, a “disease of civilization”? And more precisely, is its development linked to that of industrial activity? From my numerous readings, I concluded that cancer is, of course, a very old disease, but that it was extremely rare until the end of the nineteenth century.

As the authors of La Société cancérigène (Carcinogenic Society) explain, “no discovery has ever established that a man died of cancer before the appearance of agriculture. Infectious lesions, rickets, traumas have been detected, but no cancer.”6 For his part, Jean Guilaine, a specialist in prehistory and Neolithic civilizations, notes that the chapter on “neoplasia is reduced to nothing, as no case of authentic malignant neoplasia has been found.”7 Of course, he adds that “the absence of skeletal localizations proves nothing in terms of the possible existence of malignant tumors in soft tissue” and that it remains to be seen “whether prehistoric populations paid the same cancerous toll as today’s.”8

The consensus is that the “oldest description of cancer dates back to about 1600 BC,” as stated on the American Cancer Society website. It appears on an Egyptian papyrus, discovered by the British surgeon Edwin Smith in 1862, which describes eight cases of breast tumors and specifies that there was “no treatment.” According to British toxicologists John Newby and Vyvyan Howard, who have consulted a large portion of the available literature, “evidence of malignant melanoma” (skin cancer) has been found in a 2,500-year-old Incan mummy in Peru, while the discovery of traces of lymphoma in Homo erectus remains has been attributed to the Kenyan paleontologist Louis Leakey.9

Proving that the disease was duly identified during antiquity, the word “cancer” was invented by Hippocrates (460–370 BC), who, in observing the characteristic branching of tumors, associated their form with that of a crab (carcinos in Greek). In his treatises, the man nicknamed the “father of medicine” describes several types of cancer that he associates with an excess of “black bile.”10 The word carcinos was then translated into Latin by the Roman physician Celsus at the beginning of our era.

But while the disease was well known by the Ancients, it was nevertheless “remarkably rare or absent”11 in peoples isolated from industrial development, as clearly shown in the book Cancer: Disease of Civilization by Vilhjalmur Stefansson (1879–1962), a Canadian-born ethnologist of Icelandic descent and Arctic explorer who was a noted authority in the field.12 In the preface of his work, René Dubos, a professor of molecular biology at the Rockefeller Institute, notes that cancer is unknown in “certain primitive people . . . as long as nothing is changed in the ancestral ways of life.” This statement is confirmed by the numerous accounts of traveling physicians cited by Vilhjalmur Stefansson, such as that of Dr. John Lyman Bulkley, who reported in the journal Cancer in 1927 that “during a sojourn of about twelve years among several of the different tribes of Alaskan natives . . . he never discovered among them a single true case of carcinosis.”13 Similarly, Joseph Herman Romig, who was then “Alaska’s most famous doctor,”14 wrote in 1939 that “in his thirty-six years of contact with these people he had never seen a case of malignant disease among the truly primitive Eskimos and Indians, although it frequently occurs when they become modernized.”15 Stefansson also cites the accounts of Dr. Eugene Payne, who “examined approximately 60,000 individuals during a quarter of a century in certain parts of Brazil and Ecuador, [and] found no evidence of cancer.”16 He also cites those of Dr. Frederick Hoffman who, at the 1923 cancer congress in Brussels, said in reference to Bolivian women, “I was unable to trace a single authentic case of malignant disease. All of the physicians whom I interviewed on the subject were emphatically of the opinion that cancer of the breast among Indian women was never met with.”17

Observations made by Anglophone scientists were corroborated by their Francophone counterparts, such as Albert Schweitzer, who in his book On the Edge of the Primeval Forest comments on his experience “with the indigenous people of Equatorial Africa” in 1914: “In the first nine months of my work here I have had close on two thousand patients to examine, and I can affirm that most European diseases are represented here. [ . . . ] Cancer, however, and appendicitis I have never seen.”18 The authors of La Société cancérigène also cite the account of Professor de Bovis, “one of the first doctors to take an interest in the generalization of malignant tumors,” who wrote that at the beginning of the twentieth century, “primitive races were once unscathed, or nearly, by cancer. Since our civilization has penetrated into theirs, they have started to develop cancer. The word ‘cancerization’ has even been used in this context, in reference to primitive races.”19

To those who would object that “it is impossible to obtain convincing statistical data on the frequency of cancer among uncivilized races such as those in Africa and the Indians of North and South America,” Dr. Giuseppe Tallarico would rightly retort that “all of the doctors who have long practiced among these primitive races are unanimous in rarely or never witnessing cases of cancer.”20 And it isn’t for lack of searching! As the French historian Pierre Darmon reports, traveling physicians identified a certain number of “exotic cancers,” such as the so-called Kangri cancer, which affects the “epithelium of the anterior abdominal wall.” It is “very common in Kashmir, where inhabitants shield themselves from the cold by wearing a kangri—a sort of terracotta vase containing a charcoal fire, which causes burns and chronic irritation—under their tunics.” Similarly, “lip, tongue, and mouth cancers are relatively frequent in India, where women and men chew betel, a kind of mixture composed of betel leaves, tobacco, and lime.”21 Cancers linked to the chewing of betel are still common in the Indian state of Orissa, which I visited at the end of 2009, whereas all other cancers are nearly nonexistent there, though perhaps not for very much longer . . .

In reading these travel accounts written by men of science in the early twentieth century, I came to understand how they became evidence that some would work very hard to deny, or even ridicule as the “myth of the noble savage”: the absence of cancer observed in passing among “primitive peoples” stood in stark contrast to the situation then prevailing in “civilized” countries, where, in the wake of the industrial revolution, cancers were increasing at an astounding rate.

An Eighteenth-Century Precursor: Bernardino Ramazzini and Occupational Diseases

“The historical period of the fight against cancer starts in 1890, the year that a collective awareness of the scourge in all its scope set in,” writes the French historian Pierre Darmon, who points out the “statistical spike: 1880–1890.”22 Echoing the concerns of an era still characterized by the predominance of infectious disease,23 the historian notes that “the take-away from early investigations is overwhelming. Year after year, cancer was gaining ground. It’s clear that the raw data is incontrovertible. Between 1880 and 1900, the mortality rate of cancer per 100,000 inhabitants seems to have doubled in most countries,” such as the United Kingdom, Austria, Italy, Norway, and Prussia. In England, considered the cradle of the industrial revolution, the number of deaths attributed to cancer went from 2,786 in 1840 (or 177 deaths per million inhabitants) to 21,722 in 1884 (713 deaths per million inhabitants), according to a report published in 1896 in the British Medical Journal.24 “In the space of forty years, the virulence of disease thus quadrupled,” Darmon concludes. He also gives the example of the “little Swedish town of Follingsbro, where cancer deaths have been documented since the beginning of the 19th century—their number went from 2.1 to 108 per 100,000 inhabitants.”25 According to the (numerous) studies published at that time, cancer affected not only the industrialized countries in Europe, but also those of the New World. “If for the next ten years the relative death-rates are maintained, we shall find that ten years from now . . . there will be more deaths in New York State from cancer than from consumption, smallpox, and typhoid fever combined,” Professor Roswell Park wrote in the Medical News in 1899.26

It is interesting to note that in order to explain this troubling development, some commentators were already adopting arguments harshly criticized today by those who would like to deny the environmental origins of cancer, the prevalence of which has nonetheless increased unabated for a century. I will come back to this in more detail (see Chapter 10), but for now let’s turn to Pierre Darmon’s observations about the dramatic upsurge in malignant tumors at the dawn of the twentieth century: “Many authors blame longer life expectancies, flaws in old statistics and improvements in clinical medicine, which allowed an increasing number of cancers to be highlighted.” This is exactly what would be written a century later by preeminent oncologists—such as Professor Maurice Tubiana in France—who continually minimize the environmental factors in the etiology of cancers. Granted, it’s easy enough to blame “increased life expectancy”—it went from an average of forty-five in 1900 to nearly eighty in 2007. But as we will see, the only relevant data in measuring the unstoppable rise of cancers is the increase in prevalence rates among the general population, especially by age groups—a detail certain leading experts from the French Academy of Sciences seem to want to ignore.

These pseudo-arguments, as Pierre Darmon points out, “are often lost behind what a number of scholars consider the carcinogenic factor par excellence—the progress of civilization.”27 In fact, doctors started to establish a link between disease and certain professional activities as early as the mid-sixteenth century. For example, in 1556 the German doctor and geologist Georg Bauer (also called Georgius Agricola) published De re metallica, a monumental work in which he describes not only mining and metallurgic techniques, but also the many tumors and pulmonary ailments he observed in miners.28

However, we owe the first systematic study on the relationship between cancer and exposure to pollution or toxic substances to the Italian doctor Bernardino Ramazzini (1633–1714). In 1700, the University of Padua professor of medicine, considered the father of occupational medicine, published De morbis artificum diatriba (Diseases of Workers), a work in which he presents thirty or so guilds vulnerable to the development of occupational diseases, notably lung tumors. They included craftsmen working closely with coal, lead, arsenic, or metals—glassblowers, painters, goldsmiths, mirror dealers, potters, carpenters, tanners, weavers, blacksmiths, apothecaries, chemists, starch workers, fullers, bricklayers, printers, launderers, those exposed to sulfur vapors, and “those who anoint with mercurial ointment,” as well as those preparing and selling tobacco. In his seminal work, which would serve as a reference for over two centuries, Bernardino Ramazzini notes that nuns have a much lower incidence of uterine cancer than other women of the era, unknowingly emphasizing the role of certain sexually transmitted viruses in the malignant disease. He states that, in contrast, single women have breast cancer more often than married women, an observation that would be confirmed three centuries later by the discovery of the protective role played by breastfeeding against the hormone-dependent disease.

Ramazzini was a curious and precise man who, simultaneously playing the sociologist, journalist, and physician, did not hesitate to visit the factory floors. He was also a humanist capable of a rare compassion for those he called “patients of the working class.” In the preface to De morbis artificum diatriba, he cautions the physician that upon arriving “to attend some patient of the working class, he ought not to feel his pulse the moment he enters, as is nearly always done without regard to the circumstances of the man who lies sick; he should not remain standing while he considers what he ought to do, as though the fate of a human being were a mere trifle, rather let him condescend to sit down for a while with the air of a judge, if not on a gilded chair as one would in a rich man’s house, let him sit, be it on a three-legged stool or a side-table. He should look cheerful, question the patient carefully, and find out what the matter is. . . . There are many things that a doctor, on his first visit to a patient, ought to find out either from the patient or from those present. For so runs the oracle of our inspired teacher: ‘When you come to a patient’s house, you should ask him what sort of pains he has, what caused them, how many days he has been ill, whether his bowels are working and what sort of food he eats.’ So says Hippocrates in his work Affections. I may venture to add one more question: What occupation does he follow?”29

Ramazzini’s originality lies in demonstrating that a number of serious illnesses are caused by human activity, especially activity linked to burgeoning industry. Karl Marx recognized the import of the Italian doctor’s revolutionary work, and cited it in Das Kapital. According to Paul Blanc, Marx foresaw that “the production of illness could represent a hidden cost of industrial manufacture.”30 “Some crippling of body and mind is inseparable even from division of labor in society as a whole,” the theoretician of communist thought states in the first volume of The Process of Production of Capital. “Since, however, manufacture carries this social separation of branches of labor much further, and also, by its peculiar division, attacks the individual at the very roots of his life, it is the first to afford the materials for, and to give a start to, industrial pathology.”31 A note follows, referencing De morbis artificum diatriba.

The Industrial Revolution: Source of an Epidemic of Unknown Illnesses

Strangely, as Paul Blanc remarks, the concern with illnesses developed by laborers working in factories that flourished nearly everywhere in Europe and America in the nineteenth century was not shared by those considered “progressives” at the time or, to use a more Anglophone term, “liberals.” On the contrary, everything indicates that the progressive ideology that developed alongside the industrial revolution, meant to ultimately bring about universal well-being, relegated the health or environmental harm of factory activities to the background. Blanc cites the example of Harriet Martineau (1802–1876), a British feminist activist, abolitionist, journalist, and sociologist who, interestingly enough, translated the works of the positivist Auguste Comte. According to Martineau, the regulation of work safety was superfluous, as she believed it came under the sole responsibility of manufacturers, in the name of the liberal doctrine of “laissez-faire.” Often compared to Alexis de Tocqueville for a study she carried out in the United States, she became famous through her heated exchanges with Charles Dickens who, in contrast, advocated for state intervention to strengthen work safety.

The David Copperfield author, a committed writer, inveterate adversary of poverty and industrial exploitation, maintained close relationships with physicians, whose observations on the diseases commonly found among workers of Victorian and industrial England nourished his novels. In an article published in 2006 in the Journal of Clinical Neuroscience,32 Kerrie Schoffer, an Australian neurologist, demonstrates how precisely Dickens described the Parkinsonian syndrome of one of his characters, who was overcome by uncontrollable limb movements, at a time when “there was no name for that and no understanding of the biological basis of it.”33

But while the political classes remained generally impermeable to the health consequences of the industrial revolution, doctors did not stop trying to decode the new illnesses affecting the working class. They drew their inspiration from the pioneering work carried out by the English surgeon Percivall Pott (1714–1788), who in 1775 published a study on a then little-known disease—scrotal cancer. After examining a number of chimney sweeps in a London hospital, Pott observed that they frequently developed tumors of the scrotum, due to the soot deposited in this quite delicate part of the anatomy. Pott noted that German and Swedish chimney sweeps, who had the good idea to wear leather trousers, were less affected than their British colleagues.34 A century later, in 1892, Dr. Henry Butlin caused a sensation at a conference at the Royal College of Surgeons when he revealed that “chimney sweep cancer” also affected workers in naval shipyards who coated the hulls of ships with coal tar.35

But the long litany of harmful effects of coal by-products was only just beginning. Soon, various clinical reports and studies would show that laborers working in charcoal briquette factories (such as in Wales), or in workshops using creosote36 to treat wood, were also developing skin cancer, a disease so rare at the time that it prompted the powerful dockers’ union to request an official investigation. Published in 1912, this “sound epidemiological investigation,” the first of its kind, confirmed the excess of skin cancer among naval shipyard workers;37 what’s more, its findings “matched with an elegant set of animal experiments duplicating the same cancer link, some of the earliest laboratory work ever done in the field of chemical carcinogenesis,” to quote Paul Blanc.38

In truth, reading the medical literature from the early twentieth century is quite chilling. For example, one finds accounts of afflictions among men and women working in match factories in Germany, Austria, or the United States, where the phosphorus industry was flourishing. In 1830, ten years after the launch of this rather profitable activity, the first medical reports pointed out the appearance of a disease as terrible as it was new: osteonecrosis of the jaw, brought on by yellow phosphorus vapors, which manifests as extremely serious lesions of the mouth’s mucous membrane, erosion of the jawbone, and the progressive loss of teeth. As Paul Blanc emphasizes, the history of “phosphorus necrosis” perfectly illustrates the harmful effects of the laissez-faire attitude in the realm of occupational safety, as it would take until 1913 for yellow phosphorus to be banned in the production of matches, after which the industry developed less dangerous alternatives (such as red phosphorus-based solutions).

Driven Crazy by Poison

At the same time, neurological diseases were also receiving a lot of attention. Within this category—and this is but one case among many—the history of carbon disulfide is particularly terrifying. Paul Blanc devotes an entire chapter of his book to it, entitled “Going Crazy at Work,”39 in which he speaks at length about the cynical and criminal obliviousness underpinning the industrialization of so-called civilized countries. Used in chemistry to dissolve a number of organic compounds, carbon disulfide is a highly toxic solvent that acts as an intermediate for synthesis in the manufacture of vulcanized rubber products and of medications and pesticides (in the nineteenth century, it was used to combat grape phylloxera).40

In 1856, Auguste Delpech, a young Parisian doctor, gave a brief statement to the Imperial Academy of Medicine (Académie impériale de médecine), in which he presented a new disease he attributed to work in rubber factories. In it, he described the case of Victor Delacroix, a twenty-seven-year-old worker whose symptoms were very similar, he said, to lead poisoning: headaches, muscle stiffness and weakness, insomnia, memory problems, mental confusion, and sexual impotence.41 At the same time that Claude Bernard was preparing his lectures on the effects of toxic and medicinal substances, Delpech was testing the toxicity of carbon disulfide on two pigeons that died immediately, and a rabbit, which ended up paralyzed.42 As Paul Blanc underlines, “Delpech’s studies of carbon disulfide poisoning, matching narrative descriptions of human illness with an experimental model of the disease reproduced in the laboratory, fit particularly well with the scientific concerns and worldview of his medical contemporaries.”

This is true, but with the exception of a few “luminaries” (like the American scientists Alice Hamilton and Wilhelm Hueper) it was rare to find doctors willing to leave the confines of their scientific milieu to appear in the public arena and denounce the occupational diseases they were diagnosing in their practices or laboratories. On the contrary, everything seems to indicate that the horrors observed were generally accepted as inevitable collateral damage of a necessary process of industrialization—an opinion shared by the majority of contemporary journals. So, in 1863, Auguste Delpech published a lengthy article in which he detailed twenty-four cases of carbon disulfide poisoning among workers who manufactured inflatable balloons and condoms in a blown-rubber factory. The article revealed that most of them suffered from hysterical fits and periods of sexual excitation followed by impotence, and that one female worker ended up killing herself by inhaling the poison’s vapors.43 The London Times commented on the impressive work: “It is one of the most dangerous substances known in chemystry [sic], but unfortunately also one of the most useful.”44

Twenty-five years later, on November 6, 1888, the renowned professor Jean-Martin Charcot (1825–1893), at one of his equally renowned “lectures” organized every Tuesday at the Salpêtrière hospital, presented a patient who was the victim of acute carbon disulfide poisoning to a learned assembly of physicians in white coats. After working in a rubber factory for seventeen years, the young man had fallen into a coma after cleaning the vulcanization tanks. “This poor devil is an exceptional case of masculine hysteria,” the neurologist summed up, reminding his audience that hysteria was generally considered a feminine illness. Emphasizing the role of carbon disulfide in the illness’s etiology, he laconically explained (much like an expert examining an oddity) that, “Hygienists and clinicians are concerned with these industries because of certain accidents, principally neurological, to which its workers are subject.”45 The “lecture” would turn out to be historically significant, since a British medical dictionary in 1940 would qualify neurological problems brought on by carbon disulfide “gassing” as “Charcot’s carbon disulfide hysteria.”46

But, contrary to what might be believed, the accumulation of medical data would not bring about the prohibition, or even the regulation, of carbon disulfide use. In 1902, Dr. Thomas Oliver, a British doctor and one of Charcot’s disciples, tried to sound the alarm by denouncing the limits of laissez-faire, which held that occupational safety was the exclusive responsibility of manufacturers. In a very well-researched study, he describes the phenomenon of addiction that accompanies the hysterical and sexual problems rubber factory workers were experiencing: “In the morning they drag themselves to the factory feeling ill and headachy, and, like people who are accustomed to the intemperate use of alcohol, they only get relief and recover their nervous equilibrium by renewed inhalation of the vapors of carbon disulfide.”47

But this new publication would change nothing for working conditions in factories where poisons were used. Because in the meantime, the solvent’s uses had multiplied with the advent of a new miracle product: viscose, also called “artificial silk” or “rayon,” which had a bright future in store.48 The synthetic fiber was made from cellulose extracted from wood pulp, via a chemical process in which carbon disulfide was the major component. “Once again,” Paul Blanc notes, “medical reports very quickly identified the hazard. The blunted response to these findings, absent any effective controls for at least several decades, demonstrates the power that economic-political forces can successfully exert in retarding public health interventions in the industrial sector.”49

Brussels, 1936: The Congress on the Causes of Cancer

“It was perhaps the most momentous Cancer Congress ever held,” Isaac Berenblum (1903–2000), a biochemist and oncologist, would later say.50 “A veritable Manhattan Project on cancer,” wrote the American epidemiologist Devra Davis in 2007, in her previously cited book, The Secret History of the War on Cancer.51 The event was so significant that the magazine Nature decided to announce it as early as March 1936, six months before the congress opened in Brussels on September 20.52 On that day, two hundred of the world’s top cancer specialists converged on the Belgian capital. Coming from North America, South America, Japan, and all of Europe, often after long weeks of boat travel, the distinguished specialists exchanged their knowledge on a disease whose incidence was rising relentlessly.

“I was stunned to see how much was known about the social and environmental causes of cancer before World War II, seventy years ago,” commented Davis, who created the first center for environmental cancer research at the University of Pittsburgh. “The three volumes from this congress included surprisingly comprehensive laboratory and clinical reports showing that many widely used agents at that time were known to be carcinogenic to humans, including ionizing and solar radiation, arsenic, benzene, asbestos, synthetic dyes and hormones.”53

The conference participants included William Cramer (1878–1945), a Briton who, after comparing the medical history of identical twins (that is, twins from the same ovum and thus sharing strictly identical genetic material), concluded (already!) that “cancer as a disease is not inherited.”54 Furthermore, after studying death records in the United Kingdom, the researcher from the Imperial Cancer Research Fund noted that the incidence of the disease had risen by 30 percent since the beginning of the century. He also specified (already!) that he had arrived at this number after adjusting for population growth and increased life expectancy. On those grounds, considering that the development of tumors was the result of exposure that occurred twenty years earlier, he recommended limiting carcinogenic agents in the workplace while increasing experimental research, because, as he noted (already!), “cancer often develops in both rodents and humans in the same tissues.”

In Brussels, Angel Honorio Roffo (1882–1947), an Argentinian, was also present. He presented photographs of mice that had developed tumors after regular exposure to X-rays or ultraviolet rays (already!); the risk was heightened when rodents were exposed simultaneously to hydrocarbons. James Cook and Ernest Kennaway (1881–1958) were also in attendance: the two Britons from London’s Royal Cancer Hospital had carried out a meta-analysis of thirty or so studies showing (already!) that regular exposure to the hormone estrogen led to mammary cancer in male rodents.

“How did these scientists decide what was a cause of cancer in 1936?” Devra Davis asks. “They combined autopsies with medical, personal and workplace histories of people who had come down with cancer. They reasoned that if they found tar and soot in the lungs of those who had worked in mining and showed that these same things caused tumors when placed on the skin or into the lungs of animals, that was sufficient to deem these gooey residues a cause of cancer that should be controlled.”55

On paper, all of this seems crystal clear, or as they say, “just plain common sense.” But in reading these 1936 congress proceedings, a question logically arises: If all of these researchers already understood that the main cause of the cancer explosion was exposure to chemical agents and if, moreover, they already knew how to limit the damage caused by poisons, why did no one listen to them? The answer is as simple as the question: these researchers’ studies and recommendations were ignored because, starting in the 1930s, industry began strategizing how to control and manipulate research on the toxicity of its products, while waging a merciless war on all the scientists wishing to maintain their independence in the name of the defense of public health. The first victim of this David-and-Goliath battle was Wilhelm Hueper, a renowned American toxicologist of German descent, considered Bernardino Ramazzini’s successor, who participated in the Brussels congress a few months before being fired by his employer, the American chemical company DuPont de Nemours.

Wilhelm Hueper’s Solitary Battle

Wilhelm Hueper’s story is exemplary, because it captivatingly summarizes everything I discovered over the course of my lengthy inquiry. Born in Germany at the end of the nineteenth century, this young man was sent to the front at Verdun during World War I, where he saw the damage done by the poison gas invented by his fellow countryman Fritz Haber (see Chapter 2). From this experience was born an unwavering pacifism, which would remain with him his whole life. After finishing medical school, he immigrated to the United States in 1923. He worked at a Chicago medical school before joining the University of Pennsylvania’s laboratory for cancer research in Philadelphia, chiefly funded by DuPont, one of the biggest chemical companies of the time. In 1932, after learning that the Deepwater, New Jersey, plant was making benzidine and beta-naphthylamine (BNA), which was used in the production of synthetic dyes, he wrote a very candid letter to Irénée du Pont (1876–1963), the company’s owner, to inform him of the bladder cancer risks his workers were facing. His letter was never answered.

Wilhelm Hueper was quite familiar with the subject of synthetic dyes: an occupational health specialist, he very closely followed the medical reports that peppered the development of this booming activity, which was fortuitously born in a British laboratory. In 1856, William Henry Perkin, a chemistry student, discovered that he could transform coal tar—a by-product that had little value at the time and was obtained during the distillation of coal to produce gas for lighting—into a mauve solution he called “mauveine.” It was the first synthetic dye in history. Young Perkin’s discovery was monumental: the production of synthetic dyes would constitute the basis of the development of the organic chemistry industry, which would revolutionize the manufacture of medications (aspirin, syphilis treatment), explosives, adhesives and resins, pesticides and, of course, textiles, thanks to the use of aromatic amines, like benzidine and BNA. Very quickly, Germany muscled in on the synthetic dye market, filing hundreds of patents. However, in 1895, the German surgeon Ludwig Rehn reported that in a Griesheim factory where fuchsine (a magenta dye) was made, three workers out of forty-five had developed bladder cancer. Eleven years later, thirty-five of them had. Over the following decade, dozens of cases were reported all over Germany, and also in Switzerland.56 In 1921, using a number of clinical reports as evidence, the International Labor Office published a position paper on aromatic amines, including benzidine and BNA, recommending that “the most rigorous application of hygienic precautions should be required.”57

Once again, however, these reports did not accomplish much. At the end of World War I, the United States confiscated the patents held by vanquished Germany, and distributed them at low prices to American companies like American Cyanamid, Allied Chemical and Dye Corporation, and DuPont. The last of these immediately built its first organic chemical factory in Deepwater, called “Chambers Works,” where benzidine and BNA production began in 1919. According to internal documents consulted by David Michaels, the firm’s doctors detected the first instances of bladder cancer in 1931, not long before Wilhelm Hueper wrote his letter to Irénée du Pont. “For the next several years, these physicians documented the rapidly growing epidemic both at national conferences and in the scientific literature; at least 83 cases had been recognized by 1936,” writes Michaels in an article on bladder cancer of occupational origins.58

In fact, a study published in 1936—the same year as the Brussels congress—by Dr. Edgar Evans, the chief physician at DuPont, is testimony to the firm’s desire to promote transparency.59 Two years earlier, as a belated follow-up to Hueper’s letter, DuPont had even asked Hueper to join the new industrial toxicology laboratory it had created in Wilmington, Delaware, precisely to study bladder cancer. The researcher developed an experimental protocol to test the effects of BNA on dogs. The results were incontestable: regular exposure to aromatic amines resulted in bladder tumors, just as in humans. Deeply troubled by the human implications of his study, yet convinced of his employer’s good intentions, the toxicologist requested to visit Chambers Works in order to see how workers’ safety could be improved.

He detailed what followed in his memoirs:

The manager and some of his associates brought us first to the building housing this operation, which was located in a part of a much larger building. It was separated from other operations in the building by a large sliding-door allowing the ready spread of vapors, fumes and dust from the betanaphthylamine operation into the adjacent workrooms. Being impressed during this visit by the surprising cleanliness of the naphthylamine operation, which at that occasion was not actively working, I dropped back in the procession of visitors, until I caught up with the foreman at its end. When I told him ‘Your place is surprisingly clean,’ he looked at me and commented, ‘Doctor, you should have seen it last night; we worked all night to clean it up for you.’ The purpose of my visit was thereby almost completely destroyed. What I had been shown was a well-staged performance. I, therefore, approached the manager with the request to see the benzidine operation. After telling him what I had just been told, his initial reluctance to grant my request vanished and we were led a short distance up the road where the benzidine operation was housed in a separate small building. With one look at the place, it became immediately obvious how the workers became exposed. There was the white powdery benzidine on the road, the loading platform, the window sills, on the floor, etc. This revelation ended the visit. After coming back to Wilmington, I wrote a brief memorandum to Mr. Irenee Du Pont describing to him my experience and my disappointment with the attempted deception. There was no answer but I was never allowed again to visit the two operations.60

For Wilhelm Hueper, it was the beginning of the end. Soon after, he clashed with the company, which prohibited him from publishing his study on dogs. He was eventually fired in 1937, after the Brussels congress. Braving the wrath of DuPont, which threatened him with legal proceedings, he eventually published his study in a scientific periodical in 193861 and, four years later, in a book as important as Bernardino Ramazzini’s was in his time. Entitled Occupational Tumors and Allied Diseases, it focuses on the important research carried out for more than half a century on the link between cancer and exposure to chemical products. In his autobiography, Hueper says that he had first planned on dedicating his work to “the victims of cancer who made things for better living through chemistry.” It was an ironic allusion to DuPont’s slogan, launched in 1935: “Better living through chemistry.”62 Fearing retaliation, he ultimately opted for a less confrontational dedication: “To the memory of those of our fellow men who have died from occupational disease contracted while making better things for an improved living for others.”63

Despite DuPont’s defamation campaign, which included accusations that he was “a Nazi, and later a Communist sympathizer,”64 the scientist was recruited in 1948 by the prestigious National Cancer Institute, where he founded the first department for environmental cancer research. It was there that he met Rachel Carson, to whom he would open his archives to support her research for Silent Spring. As for the chemical company, it would continue to produce BNA until 1955 and benzidine until 1967, without ever truly modifying its manufacturing process. In a letter dating from June 1947 and addressed to Dr. Arthur Mangelsdorff, the medical director at American Cyanamid, Edgar Evans—the head doctor at Chambers Works and author of the 1936 study—plainly admitted: “The question of health control of employees in the manufacture of Beta Naphthylamine is indeed a grave one. [ . . . ] Of the original group, who began the production of this product, approximately 100% have developed tumors of the bladder.”65

It is impossible to know today how many victims the bladder cancer epidemic claimed and continues to claim, due to the use of aromatic amines—benzidine and BNA of course, but also ortho-toluidine (o-toluidine), an antioxidant widely used in the manufacture of rubber products, such as tires. Thus it was that American health authorities, alerted by unions in the early 1990s, identified a “cluster”—that is, an abnormal concentration—of bladder cancer in a Goodyear factory in Buffalo, whose ortho-toluidine stock came from DuPont.66 It goes without saying that this American manufacturer is far from an exception. From one product to the next, but also from one country to the next, the same story keeps repeating itself, following a pattern whose rules are invariably dictated by industry, with the tacit complicity of public powers who accept the death toll, acting only when “the human cost [becomes] so obvious that it [is] no longer acceptable,”67 to borrow a few words from David Michaels, the U.S. assistant secretary of labor since 2009.