4
Dying for a Living
They shrug at the pleas of workers whose health they destroy in order to save money. They hire experts—physicians and researchers—who purposely misdiagnose industrial diseases as the ordinary diseases of life, write biased reports, and divert research from vital questions. They fight against regulation as unnecessary and cry that it will bring ruination. They ravage the people as they have the land, causing millions to suffer needlessly and hundreds of thousands to die.
—Rachel Scott, Muscle and Blood1
 
 
No one knows how many people died in the Hawk’s Nest tragedy of the early 1930s, and no one ever will. The number of deaths is probably greater than the number who perished with the sinking of the Titanic, but there is no ship’s register or other list of names that can be used to tabulate the casualties, many of whom were buried in unmarked graves. Nor are you likely to read about Hawk’s Nest in history books, even though it is generally recognized by industrial health researchers as the worst industrial disaster in U.S. history. For a long time it was dangerous to talk about it in West Virginia, where the disaster occurred. In 1939, the governor of West Virginia refused to sanction a Federal Writers’ Project guide to his state until the writers toned down their lengthy and graphic discussion of Hawk’s Nest. Even in the 1960s, a West Virginia university professor received more than a dozen death threats when he set out to interview some of the survivors. In 1986, physician Martin Cherniack wrote a meticulously documented account of the disaster, titled The Hawk’s Nest Incident; although the book was praised by reviewers, it has since gone out of print and into obscurity.
The passengers on the Titanic included scions of wealthy families—people whose passing was deemed important enough to memorialize in books and movies. By contrast, the five thousand workers at Hawk’s Nest were poor, predominantly black, and considered expendable in the early years of the Great Depression. Drawn by promises of better pay and steady work, they left the coal mines and were put to work drilling a three-mile-long tunnel to divert water for a hydroelectric plant being constructed by the Union Carbide Company to provide power to its nearby petrochemical plant. They had no way of knowing that the dusty air in the tunnel would send as many as half of them to an early death. The tunnel still stands today, behind a commemorative plaque that describes it as an engineering marvel. In 1986, the state of West Virginia finally agreed to place a second marker at the site, a three-foot-square sign with a mere eleven lines of text dedicated to the memory of the men who died there.
The mountain through which the workers bored was made of almost pure silica, the hard, glassy mineral of which sand and quartz are composed. Inhalation of silica dust had been identified 15 years earlier as the cause of an often fatal disease that slowly suffocates its victims by destroying the ability of their lungs to absorb oxygen. Before scientists labeled it “silicosis,” the disease was called “miners’ phthisis,” “potters’ consumption,” or “grinders’ rot”—names associated with the professions that brought workers into contact with the dust. At Hawk’s Nest, Union Carbide’s management and engineers were mindful of the dangers associated with silica dust, and they wore face masks or respirators for self-protection when they entered the tunnel for periodic inspections. The workers themselves, who spent eight to ten hours a day breathing the dust, were not told about the hazard, nor were they given face masks. Wetting the job site would have reduced the amount of dust in the air, but this was not done either. “The company doctors were not allowed to tell the men what their trouble was,” one of the doctors would testify later. If a worker complained of difficulty breathing, he would be told that his condition was pneumonia or “tunnelitis.”2 For treatment, the doctors prescribed what came to be called “little black devils”—worthless pills made from sugar and baking soda.
In moderately dusty conditions, workers would be expected to contract silicosis after 20 or 30 years. For jobs such as sandblasting, accelerated silicosis might strike in 10 years. At Hawk’s Nest, conditions were so bad that workers were dying from acute silicosis within a single year. The road that led from the workers’ homes to the work site became known as the “death march.” On their way home, the workers would be covered in white rock dust, giving them a deathlike appearance. Many were disturbingly thin, sick, coughing and bleeding.3 “I can remember seeing the men, and you couldn’t tell a black man from a white man. They were just covered in white dust,” recalled a woman who lived near the Hawk’s Nest tunnel.
George Robison, a tunnel worker, said people were forced to live in company houses until they were too sick to work, at which time the sheriff would evict them. “Many of the men died in the tunnel camps,” Robison said. “They died in hospitals, under rocks, and every place else. A man named Finch, who was known to me, died under a rock from silicosis.”4 A local undertaker, paid by the company, buried 169 of the men in a mass grave in a nearby field.5 The widow of one worker had her husband’s body exhumed only to find that the man, buried by the company mere hours after his death, had three other men stacked on top of him.6 Some family members never found out what had happened to their loved ones. When they inquired, the company would just say that the worker had moved on.
It took a militant labor movement and Franklin Delano Roosevelt’s New Deal to bring the Hawk’s Nest scandal to national attention. Congressional hearings were held in 1935 to probe what one senator described as “American industry’s Black Hole of Calcutta.” At the hearings, a Union Carbide contractor admitted, “I knew I was going to kill those niggers, but I didn’t know it was going to be this soon.”7 The estimated number of deaths among the workers who labored in the tunnel ranged from a few hundred to two thousand. Worse yet, the Hawk’s Nest disaster was not an isolated incident. Thousands of other workers throughout the country were developing silicosis through occupational exposures in foundries, mines, potteries, and construction sites. With public interest aroused, popular and scientific magazines began to write about conditions in the “dusty trades.” Frances Perkins, Roosevelt’s secretary of labor, declared “war” on silicosis.
The response from industry set a pattern that would be repeated countless times in subsequent years when corporate interests faced similar crises. As science writer James Weeks observes, “Surprisingly similar stories—concerning the meaning of ‘scientific’ terms and attribution of responsibility—could be and have been told about asbestos-related diseases, ‘black lung,’ byssinosis [brown lung disease], cancers caused by occupational exposures, lead poisoning, and others.”8 In each case, the exposures that caused disease were only the symptoms of a deeper problem—corporate denial of the deadly risks that accompanied growing industrialization. The company doctors who lied to dying workers at Hawk’s Nest were following a new version of the Hippocratic Oath: “First, do no harm to the boss.” This willingness to subordinate health to profits was common and notorious among physicians who worked for industry. As physician and public health reformer Alice Hamilton observed, a doctor who left private practice to take such employment thereby earned “the contempt of his colleagues.” Company doctors were known as the least competent and least ethical members of their profession.

Hygiene Hijinks

Less than a week after the 1935 Hawk’s Nest hearings adjourned in Congress, a group of industrialists met privately at the Mellon Institute, a foundation that had been established by financiers Andrew and Richard Mellon in 1913 to “benefit American manufacturers through the practical cooperation of science and industry.” The meeting led to the formation of a new organization, headquartered at Mellon, called the Air Hygiene Foundation (AHF). “Because of recent misleading publicity about silicosis and the appointment of a Congressional committee to hold public hearings,” noted a confidential Mellon report, “the attention of much of the entire country has been focused on silicosis. It is more than probable that this publicity will result in a flood of claims, whether justified or unjustified, and will tend toward improperly considered proposals for legislation.” In order to fend off these feared laws and lawsuits, the Air Hygiene Foundation planned a public relations campaign that purported to “give everyone concerned an undistorted picture of the subject.”9
Leading scientists and public officials were appointed to serve as members and trustees of the foundation. Its spokesmen began to be widely quoted in popular trade publications. “Silicotics are rare compared with men driven from their jobs by shyster lawyers,” commented AHF representative Alfred C. Hirth. The AHF’s own “shyster lawyer,” Theodore C. Waters, accused doctors of fabricating claims of silicosis. “In many instances,” he stated, “employees have been advised by physicians, untrained and inexperienced in the diagnosis and effect of silicosis, that they have the disease and thereby have sustained disability. Acting on this advice, the employee, now concerned about his condition, leaves his employment, even though that trade may be the only one in which he is able to earn a living.”10
Companies did finally begin to limit the worst abuses, improving ventilation, wetting down the dust, offering respiratory masks, and using other methods to reduce silica exposures. Gross slaughters like Hawk’s Nest were easily preventable, and they generated headlines that were bad for business. Businesses were also aware of their increasing financial liability due to lawsuits. At the beginning of the twentieth century, the legal system was heavily biased to prevent workers from successfully suing their employers. By the 1930s, however, courts had become increasingly willing to hold employers liable for both actual and punitive damages. Driven by rising jury awards and insurance premiums, the “dusty trades” took their problem out of the courts by convincing state governments to incorporate silicosis into state workers’ compensation schedules.
With the Air Hygiene Foundation, industry had found an effective propaganda formula: a combination of partial reforms with reassuring “scientific” rhetoric, under the aegis of an organization with a benevolent, independent-sounding name. Even though the AHF was governed by and for the dusty trades, it had successfully become a vehicle for deployment of the “third party” technique. “A survey report from an outside, independent agency carries more weight in court or before a compensation commission than does a report prepared by your own people,” explained AHF membership committee chairman C. E. Ralston at the foundation’s fifth annual meeting. By 1940, the AHF had 225 member companies, representing such major polluters of the day as American Smelting and Refining, Johns-Manville, United States Steel, Union Carbide, and PPG Industries. In 1941, it changed its name to the Industrial Hygiene Foundation (and later still to the Industrial Health Foundation), broadening its agenda beyond dust-related diseases to encompass other industrial health issues. By the 1970s, it had more than 400 corporate sponsors, including Gulf Oil, Ford Motor Company, General Motors, Standard Oil of New Jersey, Kawecki Berylco Industries, Brush Beryllium, Consolidated Coal, Boeing, General Electric, General Mills, Goodyear, Western Electric, Owens-Corning Fiberglass, Mobil Oil, and Dow Chemical.11
In the mid-1930s, silicosis was regarded as the “king of occupational diseases,” as well known and notorious as asbestos would become in the 1990s. Thanks in large measure to the work of the AHF, however, it began to fade from the headlines by the end of the decade. The history of silicosis is documented in a book titled Deadly Dust by professors Gerald Markowitz and David Rosner, who study the history of occupational and public health policies. By the 1940s, they note, industry health analysts declared silicosis a “disease of the past,” and by the 1950s, it was “officially declared unimportant, and those who spoke about it found it necessary to apologize for ‘bringing up such a shopworn, dusty topic.’ ” Its disappearance from the headlines is arguably an even bigger scandal than the coverup at Hawk’s Nest, because the disease itself has not been eliminated, even though its cause is well understood and avoidable. In England and other parts of Europe, a ban on sandblasting has been in place since 1949. In the United States, however, the National Institute for Occupational Safety and Health (NIOSH) currently estimates that a million U.S. workers are at risk of developing silicosis, 100,000 of whom work in high-risk occupations—as miners, sandblasters, rock drillers, pottery and masonry workers, roof bolters, and foundry workers. NIOSH estimates that 59,000 of these workers will develop adverse health effects from silica exposure.
“Despite years of assurance that silicosis was a disease of the past and that workers could be adequately protected through proper ventilation, substitution of non-silica abrasives such as steel shot or garnet, and protective equipment, the reality is that during the postwar years workers continued to be exposed to excess amounts of silica and that silicosis never really vanished,” write Rosner and Markowitz. “However, it is virtually impossible to develop reliable statistics concerning its prevalence in the decades following World War II given the general complacency of industry and the industrial hygiene and medical communities regarding this disease and the fact that silicosis was often not listed on death certificates as a cause of death or contributing factor. In general, doctors were neither trained to diagnose this disease nor given reason to suspect its prevalence among industrial workers.”12
Recent cases that we do know about, culled from news stories and the Centers for Disease Control and Prevention,13 include the following:
• A 39-year-old man was diagnosed with silicosis and tuberculosis in April 1993 after working 22 years as a sandblaster, during which he typically spent six hours a day sandblasting. He had worn a charcoal filter respirator, but it failed to protect him.
• A male nonsmoker was diagnosed with advanced silicosis, emphysema, and asthma at age 49 after working 23 years as a tile installer. His work included polishing and drilling tile, and he was exposed to grout dust and sandblasting. He did not use a respirator, because information about dust control had not been made available to workers.
• A brick mason was diagnosed with silicosis, emphysema, and lung cancer at age 70 after 41 years on the job. He had worn a respirator while working in dusty conditions, but again it wasn’t enough protection.
• A 47-year-old man was diagnosed with severe silicosis in 1992 after working 22 years as a rock driller. He lingered for two years before dying in 1994. The drills he had used were equipped with dust controls, which were usually inoperable.
• Leslie Blevins, a 41-year-old coal miner, spent three months cutting through sandstone to get to a coal seam. On orders from the company, he helped conceal his sandstone mining from federal inspectors. “There’s a lot of things that wasn’t supposed to be done, but you either do it or you go home,” he explained. The mining machine he worked on was old, and its water sprays—used to suppress dust—were constantly breaking. The dust was suffocating. “Sometimes I’d have to shut the miner down and go back in the fresh air and puke,” he said. “My boss would come back and tell me to go back in.” A year later, he was diagnosed with severe silicosis. A doctor gave him two years to live, but he managed to hang on for three.14
Not everyone dies from silicosis. Some are permanently disabled and turn for help to the workers’ compensation system that industry helped put in place in the wake of the Hawk’s Nest scandal. Often, however, they must fight insurance companies to obtain benefits. “Even if they win, the payments they receive rarely equal their previous earnings and may end after a period of years,” notes Houston Chronicle reporter Jim Morris. “The maximum benefit for a ‘permanent, total’ disability in Texas, for example, is $438 per week for 401 weeks, or a little more than 7½ years. That’s hardly comforting to an incapacitated, 40-year-old silicosis victim who had expected to work another 25 years.”15

Rediscovering the Obvious

Even today, most states and the federal government make no serious attempt to track silicosis, which is not classified as a reportable disease. “If you look at the whole surveillance system, it’s been a joke,” says Dr. Kenneth Rosenman, an associate professor of medicine at Michigan State University, noting that government health officials “can’t even keep track of how many people actually die from falls and other trauma in the workplace. The Bureau of Labor Statistics probably has a 75 percent under-count of silicosis.”16 The workers of today, in other words, are not that much different from the workers at Hawk’s Nest. No one knows how many are dying from exposure to deadly dust, and perhaps no one ever will. Aside from the workers themselves and a few academics and isolated government officials, no one really seems to care.
The story of silicosis since Hawk’s Nest has unfolded as a series of episodes in which, every decade or so, the disease gets “discovered” all over again, followed by efforts at regulatory reform. Each time, these efforts to defeat the disease have been thwarted by industry campaigns modeled after the pattern set by the Air Hygiene Foundation. In the 1960s, for example, university researchers documented an epidemic of silicosis among shipyard workers in Louisiana. When similar reports in the 1970s prompted NIOSH to propose more stringent standards for worker silica exposure, the affected industries established a group called the Silica Safety Association (SSA). Like the AHF, it professed concern for worker safety, stating that its mission was to “investigate and report on possible health hazards involved in [the] use of silica products and to recommend adequate protective measures considered economically feasible.”17 The key phrase here, of course, is “economically feasible.” In reality, the SSA regarded any new policy measures to restrict silica exposure as unfeasible. After successfully lobbying to prevent the Occupational Safety and Health Administration (OSHA) from adopting the proposed new NIOSH standard, the SSA disbanded in 1982, its true mission accomplished. At about the same time, a new epidemic of silicosis emerged, this time among Texas oil workers who contracted the disease while sandblasting pipes and storage tanks. A six-month investigation by the Houston Chronicle in 1992 found that “silicosis is often misdiagnosed by doctors, disdained by industry officials and unknown to the very workers who stand the greatest chance of getting it. . . . Old warnings and medical studies have been ignored, products falsely advertised and government rules flouted—especially with regard to sandblasting, an activity so hazardous that NIOSH recommended its banning in 1974.” As late as 1996, NIOSH estimated that more than a million workers continued to be exposed to silica.18 A study by the Centers for Disease Control and Prevention found 14,824 silicosis-associated deaths between 1968 and 1994.
These disclosures prompted a National Conference to Eliminate Silicosis in 1997, which attracted more than 600 federal employees, industry representatives, union officials, and public health workers. New evidence has emerged suggesting that silica exposure may cause lung cancer in addition to silicosis. In May 1998, the official publication of the American Society of Safety Engineers dubbed crystalline silica “the new asbestos.” Once again, the dangers of silica exposure have been rediscovered. Nearly seven decades after Hawk’s Nest, silicosis has become a “new” disease all over again.
In response, industry has mobilized again. “Silica Scare Beginning to Hit Home,” complained the Aggman, a trade publication of the aggregates industry, which produces crushed stone, sand, and gravel. Writing in the same publication, Mark Savit, the industry’s lobbyist at the well-connected Washington law firm of Patton Boggs, accused “regulatory agencies, such as OSHA, the Mine Safety and Health Administration (MSHA) and the Environmental Protection Agency (EPA)” of going to “great lengths to whip up emotions regarding this issue,” which “could have a profound effect on the way in which our industry does business in the future.” He added, however, that industry would have “multiple opportunities to challenge the regulations that the agencies are trying to impose, and to expose the flawed science upon which they are based. . . . As a first step, my law firm, Patton Boggs, will sponsor ‘Silica in the Next Century—The Need for Sound Public Policy, Research and Liability Prevention’ on March 24, 1997, the day before the OSHA/MSHA meeting. Top scientists, industry and association executives, and attorneys will provide participants with the ammunition they need to defend themselves from the coming attack.”19
As a second step, the “dusty trades” created yet another group, this time called the Silica Coalition. “While the organization is ostensibly aimed at providing ‘sound science’ and legal resources to companies potentially affected by any change in government regulation of silica, it is also clear that increased awareness of the dangers of silica and the resulting threat of litigation hang over the heads of industry executives,” note Rosner and Markowitz.20

Different Disease, Same Story

We have chosen to detail the history of silicosis because it serves as an archetype for the way that government, industry, and public health authorities have reacted to countless similar health threats. Each year, more than 800,000 people develop occupational illnesses that, combined with on-the-job injuries, kill as many as 80,000. “The medical costs of occupational injuries and illnesses appear to be much larger than those of AIDS,” concluded a 1997 report in the Archives of Internal Medicine. “The total costs appear to be larger than those for Alzheimer’s disease and are of the same magnitude as those of cancer, of all circulatory disease and of all musculoskeletal conditions.”21 In 1991, former New York Times labor correspondent William Serrin noted that some 200,000 U.S. workers had been killed on the job since the passage of the Occupational Safety and Health Act of 1970, and as many as two million more had died from diseases caused by the conditions where they worked. “That’s 300 dead men, women and children a day. In fact, work kills more people each year than die from AIDS, drugs, or drunken driving and all other motor vehicle accidents,” he observed. “Moreover, another 1.4 million people have been permanently disabled in workplace accidents since the act became law. Yet in those twenty years, only fourteen people have been prosecuted by the Justice Department for workplace safety violations, and only one person, a South Dakota construction contractor who was convicted in the deaths of two workers in a trench cave-in, has gone to jail—for forty-five days.”22
In many cases, corporate and public officials have known for decades about life-threatening chemical hazards while failing to protect workers and publicly proclaiming their safety. The solvent benzene, for example, was recognized as dangerous as early as the 1920s and was linked to leukemia and other cancers in a 1948 toxicological review prepared for the American Petroleum Institute, which stated that “the only absolutely safe concentration for benzene is zero.” Yet benzene continues to be widely used and manufactured in refineries and chemical plants and is still present in the workplace today.
As early as 1918, asbestos was considered so hazardous that a medical statistician for Prudential Insurance Company advised against offering coverage to asbestos workers, “on account of the assumed health-injurious conditions of the industry.” The Metropolitan Life Insurance Company reached similar conclusions in 1922, linking asbestos to fibrosis of the lungs. Numerous articles about asbestosis and “industrial cancer” among asbestos workers appear in the 1930s files of the Industrial Hygiene Foundation. All of these industry sources were talking among themselves about the link between asbestos and cancer long before the Journal of the American Medical Association first reported in 1944 that asbestos was among “agents known to cause occupational cancer.” In 1948, the American Petroleum Institute’s Medical Advisory Committee spoke of the need to “aim at the complete elimination” of worker exposures to both asbestos and benzene. For public consumption, however, industry churned out one misleading study after another, such as a massive 1958 study funded by the Quebec Asbestos Mining Association that was widely cited as the largest epidemiological study done on asbestosis, involving some 6,000 asbestos miners. Performed by the Industrial Hygiene Foundation, the report looked impressive unless you happened to pay attention to its method. “Among numerous errors in method was one central, scientifically inexcusable flaw,” notes David Kotelchuck, director of the Center for Occupational and Environmental Health at Hunter College in New York:
The investigators, Daniel Braun and T. Truan, virtually ignored the 20-year time lag between exposure to an agent known to cause lung cancer and the first visible signs of disease (the so-called latent period). They studied a relatively young group of workers, two-thirds of whom were between 20 and 44 years of age. Only 30 percent of the workers had been employed for 20 or more years, the estimated latent period for lung cancer. With so many young people in the study, too young to have the disease although they might well be destined to develop it, Braun and Truan of course did not find a statistically significant increase in lung cancer among the miners. As became obvious later, they had drowned out a clear danger in a sea of misleading data.23
By 1960, 63 scientific papers on the subject of asbestosis had been published, 11 of which were sponsored by the asbestos industry, the other 52 coming from hospitals and medical schools. The 11 industry studies were unanimous in denying that asbestos caused lung cancer and in minimizing the seriousness of asbestosis—a position diametrically opposed to the conclusions reached in the nonindustry studies. In 1962, the Gulf Oil Company’s advice to workers, in a training manual for insulators, was that “the fibers of asbestos . . . are not injurious to the respiratory organs. Working with this material does not subject one to this hazard to one’s health.”24 As we all know today, this advice was not only a lie but a murderous lie. The history of industry denials was neatly summarized by David Ozonoff from Boston University, who served as a witness in asbestos litigation and described the series of defenses used by the asbestos industry:
Asbestos doesn’t hurt your health. OK, it does hurt your health but it doesn’t cause cancer. OK, asbestos can cause cancer but not our kind of asbestos. OK, our kind of asbestos can cause cancer, but not the kind this person got. OK, our kind of asbestos can cause cancer, but not at the doses to which this person was exposed. OK, asbestos does cause cancer, and at this dosage, but this person got his disease from something else, like smoking. OK, he was exposed to our asbestos and it did cause his cancer, but we did not know about the danger when we exposed him. OK, we knew about the danger when we exposed him, but the statute of limitations has run out. OK, the statute of limitations hasn’t run out, but if we’re guilty we’ll go out of business and everyone will be worse off. OK, we’ll agree to go out of business, but only if you let us keep part of our company intact, and only if you limit our liability for the harms we have caused.
Much the same story can be told with respect to brown lung disease, an affliction of cotton mill workers that was first observed in the early 1900s but, following the standard pattern, was barely studied for half a century after its discovery. In 1945, a report by the U.S. Department of Labor said brown lung disease was not a problem in American cotton mills. The extent of the problem came to light when a Yale researcher studying the health of prison inmates found a high incidence of the disease among inmates who worked in the cotton mills of the Federal Penitentiary in Atlanta.25 Similar histories of official neglect have been written about worker deaths from exposure to the metal beryllium; exposures to heavy metals such as lead, mercury, or cadmium; lung hazards such as fiberglass and coal dust; or chemicals such as chlordane and dioxin.

Without Propaganda, Pollution Would Be Impossible

As evidence began to mount in the 1970s about the harmful effects of chemicals such as DDT, PCBs, vinyl chloride, and benzene, companies—including Mobil Oil, Monsanto, and Union Carbide—launched massive advertising and public relations campaigns, using slogans like Monsanto’s “without chemicals, life itself would be impossible.” Union Carbide’s propaganda efforts alone involved some 200 company managers, coordinated by the company’s communications department as they pumped out speeches, tapes, canned editorials, educational films for public schools, and articles for newspapers and magazines.26
The propaganda effort relied heavily on questionable statistics designed to create the impression that excessive regulation was stifling American creativity and prosperity. Faced with proof that vinyl chloride caused a rare form of liver cancer, chemical manufacturers announced that a proposed federal standard for vinyl chloride exposure would cost two million jobs and $65 billion. “The standard is simply beyond compliance capability of the industry,” declared their trade association. After the screaming was over, the standard was adopted and the industry continued to flourish, without job losses and at 5 percent of the industry’s estimated cost.27
Information on occupational health hazards is rarely collected and even more rarely reported in the news. In the early part of the twentieth century, the concept of industrial safety was a novelty in the United States when Alice Hamilton, the country’s first industrial physician, began to investigate what she came to call “the dangerous trades.” In her autobiography, Hamilton described how she became aware of the problem: “It was also my experience at Hull House that aroused my interest in industrial diseases. Living in a working-class quarter, coming in contact with laborers and their wives, I could not fail to hear talk of the dangers that working men face, of cases of carbon-monoxide gassing in the great steel mills, of painters disabled by lead palsy, of pneumonia and rheumatism among the men in the stockyards.” Hamilton went to the library “to read everything I could find on the dangers to industrial workers, and what could be done to protect them. But it was all German, or British, Austrian, Dutch, Swiss, even Italian or Spanish—everything but American. In those countries, industrial medicine was a recognized branch of the medical sciences, in my own country it did not exist.”28
Decades later, Rachel Scott found the situation had not changed much when she set out to research her 1974 book, Muscle and Blood, which examined conditions affecting workers in steel foundries and other industrial settings. “At the library I had hoped to find some explanation of hazards to foundry workers—mortality studies, perhaps, which would shed some light on whether foundry employees showed higher incidences of diseases commonly associated with dusts and fumes, such as heart disease, respiratory disease, or lung cancer. I found French studies, Italian studies, German studies, and a few British studies, but in the American literature, nothing. . . . In spite of my failure at the library, I could not believe there were no studies of present-day American foundries. But calls to federal and state officials confirmed that, indeed, no one knew how foundry workers may be reacting to their often hazardous environment.”29
Even today, the situation is not much better. “We have better data on cattle slaughter in the United States than we do on work-related deaths and injuries,” says Joseph Kinney of the National Safe Workplace Institute in Chicago, which he founded in 1987 after his brother died in a workplace accident for which the employer was fined only $800.30
Industry-financed propaganda operations like the Air Hygiene Foundation have helped create this vacuum of information, along with the notion that other people’s problems are not our own and that the benefits of modern society outweigh the dangers. There is a cost, however, attached to this disregard for what happens to workers in their places of employment. Like coal-mine canaries, workers are often the first to encounter and recognize hazards in the broader environment that affects us all. Exposures to harmful chemicals are typically more severe and frequent in the workplace than elsewhere, and workers who fall sick often serve as early warnings that the solvent, metal, or pesticide with which they are working may be a threat to the broader community. Often, in fact, it has been workers themselves—not doctors, scientists, scholars, or government officials—who have discovered and raised the first alarm about a new health risk.

Lead and the “House of the Butterflies”

Given the long history of worker poisonings from exposure to lead, simple common sense should have been enough to avert the massive lead contamination that the United States and other industrial nations experienced during the twentieth century. After all, lead has been a known poison since antiquity. During the first century A.D., lead miners strapped animal bladders over their mouths as a way to avoid inhaling it. Benjamin Franklin wrote about the “mischievous effect from lead,” which he experienced firsthand in his work as a printer. “You will observe with concern,” he wrote, “how long a useful truth may be known, and exist, before it is generally received and practis’d on.” If he had lived on into modern times, Franklin would no doubt be amazed at the “scientific” arguments that corporate propagandists have mustered to prevent the “useful truth” of lead’s dangers from being “received and practis’d on.”
“Why had it taken so long to confirm that environmental lead was a legitimate hazard?” asks William Graebner, a professor of history at the State University of New York at Fredonia. “The single most important answer to that question is that the lead industries did not want to see the triumph of an environmental perspective; and the lead industries exercised enormous influence over the production and dissemination of knowledge about lead in the four decades after 1925. This influence might best be described as a kind of hegemony over scientific research and over perceptions of lead-related problems.”31
Lead exposure can cause anemia, kidney cancer, brain damage, abdominal pain, weight loss, weakness, reproductive system impairment, and miscarriage. Its effect on the brain can be severe and permanent, causing hallucinations, tremors, outright insanity, and even death. These effects were detailed in 1861 by novelist Charles Dickens, who exposed the horrors visited upon women who went mad working in lead factories. By the late nineteenth century, England was regulating workplace exposure to lead, and by the 1920s Australia and a number of European countries were regulating lead in paint, which was affecting painters and was becoming a common cause of poisoning among children. In the United States, however, the regulatory mechanisms moved in the opposite direction, thanks largely to America’s infatuation with the automobile and the discovery of “no-knock gasoline.”
In 1922, researchers for General Motors discovered that adding tetraethyl lead to gasoline prevented engine knock, allowing internal combustion engines to run at higher compression and deliver more power. By the end of the decade, this discovery helped GM displace Ford as the country’s number-one automaker. The downside, however, was instantly recognized by industrial hygienists. Lead in paint is bad enough, but dried paint at least fixes most of the lead in a solid form that requires some effort to ingest. Tetraethyl lead, however, is an oily liquid that is easily absorbed through the skin or inhaled as it evaporates. This makes it more “bioavailable” than lead in solid form, as a series of tragedies quickly demonstrated. The first to reach the attention of the public occurred at a tetraethyl lead processing plant owned by Standard Oil in Elizabeth, New Jersey. Within a five-day period beginning on October 24, 1924, five of the plant’s 49 workers died and 35 developed severe dementia and other neurological symptoms of lead poisoning. Several would spend the rest of their lives confined to insane asylums.
Following the all-too-familiar protocol in such cases, the company’s spokesmen responded to these poisonings by attempting to blame the workers for their own fate. The New York Times, which reported on the disaster, noted that a company doctor had suggested that “nothing ought to be said about this matter in the public interest.” The workers’ supervisors opined that “these men probably went insane because they worked too hard.”32 Reports soon emerged, however, of other cases in which workers had died while handling tetraethyl lead. A General Motors research site in Dayton, Ohio, saw worker deaths, as did a DuPont chemical plant at Deepwater, New Jersey. During a two-year time span, the Deepwater plant saw more than 300 cases of lead poisoning. Eighty percent of the workers at DuPont who handled tetraethyl lead during that period were poisoned, some repeatedly. Other employees took to calling the tetraethyl lead unit the “house of the butterflies,” a grim joke about the nonexistent insects that the exposed workers were seeing as part of their hallucinatory dementia.33
The political struggle over the introduction of leaded gasoline marked a historical watershed, a moment that would help define the future direction of technological development and corporate power in American society. The automobile was fast becoming the mechanical “chicken in every pot” that each American family craved as a symbol of personal financial success. Simultaneously it was coming to symbolize the idea that technological innovation marked the way forward for human freedom and progress. “No-knock” gasoline meant that automobile engines would have more power, more efficiency, more speed—in short, everything that modern society has come to see as desirable indicators of progress. The worker poisonings at several different locations suggested, however, that this progress might come at a great price. No one knew or could even imagine yet the sheer number of automobiles that would be racing down public highways 50 years hence, but it was clear that the lead going into gas tanks would exit through the tailpipe—not as a liquid but as an aerosol, making it almost entirely bioavailable from the moment it left the engine. It would float in the air, then gradually settle to the ground, contaminating streets and soil.
A proper, precautionary response to the lesson learned from worker exposures would have been to ban leaded gasoline. Instead, automakers and government officials preferred to assume that the amount of lead in leaded gasoline was so small as to present no danger. In a letter to the U.S. Surgeon General, DuPont’s chairman stated that this question “had been given very serious consideration . . . although no actual experimental data has been taken.” Even without data, he was confident that “the average street will probably be so free from lead that it will be impossible to detect it or its absorption.” In order to minimize public concern about the product’s potential hazard, leaded gasoline was given the brand name Ethyl, with the word “lead” deliberately omitted.
For help in generating a scientific rationale for the introduction of leaded gasoline, General Motors turned to the U.S. Bureau of Mines. As an official arm of the U.S. government, the Bureau of Mines purported to offer an “independent” and hence reliable assessment of the safety risks involved with leaded gas, but in fact its independence was compromised at multiple levels. Its history with respect to the safety of mineworkers had shown it to be a pliable tool of industry. In reality it was an institution that existed to promote and support the mining industry, and tetraethyl lead promised to create a huge new market for mined lead. Worst of all, GM was paying the Bureau to conduct its study on the safety of leaded gas—creating an obvious conflict of interest, as several prominent public health specialists pointed out to little avail. “It seems to me extremely unfortunate that the experts of the United States Government should be carrying out this investigation on a grant from General Motors,” wrote Dr. Yandell Henderson, a leading public health physiologist at Yale University, pointing to the “urgent need for an absolutely unbiased investigation.”34
Just as the Ethyl Gasoline Corporation had taken the word “lead” out of its own name, the Bureau of Mines went out of its way to omit references to lead in its internal correspondence regarding the GM-funded study. Questioned about this omission, a Bureau of Mines official replied that it was deliberate. “If it should happen to get some publicity accidentally, it would not be so bad if the word ‘lead’ were omitted as this term is apt to prejudice somewhat against its use,” he stated. Censoring the word “lead” out of research into lead toxicity strays considerably, of course, from what might be considered strict scientific rigor. Not surprisingly, the Bureau of Mines study produced a scientific whitewash, which was promptly released as “proof” that “there is no danger of acquiring lead poisoning through even prolonged exposure to exhaust gases of cars using Ethyl Gas.”
In addition to the Bureau of Mines, industry turned for scientific backing to the Charles F. Kettering Foundation and the Kettering Laboratory of Applied Physiology. Forerunners of today’s Sloan-Kettering Institute for Cancer Research, both the foundation and the laboratory were founded by Charles Kettering, a General Motors executive who had been directly involved in the company’s efforts to develop tetraethyl lead as a gasoline additive. The laboratory’s first director, Robert Kehoe, was the Ethyl Gasoline Corporation’s medical director. He quickly became the most vocal scientist in the United States on the subject of lead hazards. His writings, which remained influential well into the 1960s, claimed that lead occurs “naturally” in human beings and that the body “naturally” eliminates low-level lead exposures. At “natural” low levels, it was safe. The only exposures that mattered, he said, were acute exposures like the worker poisonings that had occurred at the “house of butterflies.” This formulation of the facts, which has since been conclusively refuted, provided the scientific weapon that industry needed to fight off the threat that lead poisoning might be environmentally defined and that tetraethyl lead might be banned.

God, Gas, and Civilization

On May 20, 1925, the U.S. Surgeon General convened a national conference that brought together representatives from labor, business, and the public health community to discuss the future of tetraethyl lead. “At this conference the ideologies of the different participants were clearly and repeatedly laid out and provide an important forum in which we can evaluate the scientific, political, economic, and intellectual issues surrounding this controversy,” Rosner and Markowitz observe. “In the words of one participant, the conference gathered together in one room ‘two diametrically opposed conceptions. The men engaged in industry, chemists, and engineers, take it as a matter of course that a little thing like industrial poisoning should not be allowed to stand in the way of a great industrial advance. On the other hand, the sanitary experts take it as a matter of course that the first consideration is the health of the people.’ ”35
Frank Howard of the Ethyl Gasoline Corporation provided industry’s viewpoint. “Our continued development of motor fuels is essential in our civilization,” he told the conference, describing the discovery of leaded gasoline as a “gift of God. . . . Because some animals die and some do not die in some experiments, shall we give this thing up entirely?” he asked. “I think it would be an unheard-of blunder if we should abandon a thing of this kind merely because of our fears.”36
Not everyone shared this faith, however. Yandell Henderson, the Yale physiologist who had criticized the Bureau of Mines study, warned presciently that as the automobile industry expanded, hundreds of thousands of pounds of lead would be deposited in the streets of every major city of America. “The conditions would grow worse so gradually and the development of lead poisoning will come on so insidiously . . . that leaded gasoline will be in nearly universal use and large numbers of cars will have been sold . . . before the public and the government awaken to the situation,” Henderson said.37
In fact, even Henderson’s warning turns out to be a gross underestimate. By the mid-1970s, 90 percent of the gasoline used for automobiles in the United States was formulated with tetraethyl lead. During the 60 years that leaded gasoline was used in the United States, some 30 million tons of lead was released from automobile exhausts. “When many cars were getting just ten miles to a gallon in stop-and-go traffic, a busy intersection might have gotten as much as four or five tons of lead dumped on it in a year,” notes Howard Mielke, an environmental toxicologist and lead expert at the College of Pharmacy at Xavier University of Louisiana, in New Orleans. “That’s roughly equal to having a lead smelter at every major intersection in the United States. As a result, there is a very, very large reservoir of lead in soil.”38
Industry trade associations, in particular the Lead Industry Association, vigilantly responded to research that might have alerted the public to lead’s environmental risks. In 1939, Dr. Randolph Byers, a pediatrician at Boston Children’s Hospital, tracked the development of 20 children who had been treated successfully for lead poisoning. He found that even though they had been cured of their acute symptoms, many were experiencing profound learning disabilities and showed evidence of personality disorders. The lead industry responded by threatening Byers with a million-dollar lawsuit. In 1955, a study of Philadelphia tenements revealed that the city’s children were becoming ill and dying from eating chips of lead-based paint. This, too, failed to have any appreciable impact on public perceptions or public policy. In the 1960s, the lead industry tried to have a scientist fired from the California Institute of Technology after his research indicated that leaded gasoline posed a risk to public health. “It really is a sorry track record of dirty tricks and dirty science to promote the broader use of lead,” says Don Ryan of the Alliance to End Childhood Lead Poisoning.39
The Industrial Hygiene Foundation, which had previously risen to the defense of the “dusty trades” in the matter of silicosis, also helped to disseminate the pro-lead writings of the Kettering Foundation’s Robert Kehoe. It argued against the need for government regulations on lead, and railed against those who “exaggerate and dramatize accidental occurrences and alleged injurious effects which have not been established.”40 IHF’s complaint reflected a common industry approach to environmental as opposed to occupational health risks. High-level, occupational exposures like the “house of butterflies” poisonings create obvious, acute responses. The effect of lower-level environmental exposures, however, is typically less obvious and harder to establish scientifically beyond all reasonable doubt. A commonsense precautionary approach would have aimed at preventing even low-level exposures, but in the absence of absolute proof of harm, industry preferred to characterize such precautions as extreme, unscientific, and unnecessary. Thanks to this industry campaign, the first U.S. government regulations on gasoline lead emissions were not issued until 1973. For children, whose developing bodies are much more sensitive to lead than are adults’, even those regulations would prove inadequate.

Faster Cars, Slower Kids

The dangers of lead exposure first came to the attention of Herbert Needleman in the 1950s, while he was still a student in medical school. To help cover his tuition, Needleman took a summer job as a day laborer at DuPont’s chemical plant in Deepwater, New Jersey. He noticed that one group of older workers kept to themselves, moving and speaking slowly and awkwardly, spending their breaks staring into space. His coworkers told him that they were the survivors of the “house of butterflies”—deeply damaged, but still able to work.
Needleman began reading the available literature on lead poisoning and was struck in particular by the work that Byers had done decades earlier showing long-term effects of lead on children. In 1974, he undertook his own study of 2,500 first- and second-graders. Lead tends to accumulate in bones and teeth, and by testing children’s “baby teeth,” he was able to determine which kids had experienced higher-than-average lead exposures. The results, published in the New England Journal of Medicine in 1979, were explosive, showing impaired mental development even at levels of exposure that had previously been considered safe. In addition to having lower intelligence, lead-exposed children are more likely to be hyperactive, suffer from attention deficits, or engage in violent behavior and delinquency.
“The paper was devastating to the lead industry and came at a critical time,” observes writer Thomas A. Lewis. “A federal ban on lead in household paint had taken effect in 1977. Exposure to lead in the workplace had come under strict monitoring and remediation requirements under 1978 Occupational Safety and Health Administration (OSHA) regulations. . . . Needleman’s work suggested that far more stringent regulations were needed.”
Industry’s experts, of course, disagreed—in particular, Dr. Claire Ernhart, a developmental psychologist at Case Western Reserve University who has received substantial grants from the industry-funded International Lead Zinc Research Organization. Ernhart also serves periodically as a courtroom “expert witness” for defendants in cases involving lead contamination and cleanup. In 1982, for example, she testified in favor of the lead industry before an EPA panel that was contemplating phasing out all leaded gasoline. More recently, she served as an expert witness for a landlord who was sued after a young girl developed severe brain damage as a result of ingesting lead paint.
In 1981, Ernhart formally accused Needleman of flawed research, leading to a two-year EPA investigation by a panel of six outside experts. After reviewing and reanalyzing his data, the panel found some inconsequential statistical errors and concluded that his data was insufficient to support the hypothesis that low levels of lead impaired children. The panel also concluded that Ernhart’s data was insufficient to refute Needleman’s hypothesis, but Ernhart had the benefit of a coordinated PR campaign on her side. The firm of Hill & Knowlton—then the world’s largest PR firm—“papered the world” with a draft copy of the EPA panel’s report, in the words of EPA senior scientist Joel Schwartz. Copies were sent to journalists throughout the United States, accompanied by a cover letter claiming that the EPA advisory panel had rejected Needleman’s findings. In fact, Needleman’s point-by-point response to the EPA panel’s criticisms was so persuasive that the agency ended up reversing its position and adopting his findings as part of the basis for restricting lead in gasoline. Hill & Knowlton stayed the course. “To this day they are circulating the draft report,” Schwartz noted in 1992.41
In 1991, Needleman was scheduled to testify against a lead smelter in a Superfund case involving the cleanup of lead tailings. Ernhart and another psychologist, Sandra Scarr, were hired as expert witnesses for the defense. The case was settled out of court but sparked a renewed attack on Needleman’s credibility. In a letter to the National Institutes of Health, Ernhart and Scarr charged him with scientific misconduct and threw in a new claim that Needleman had “failed to cooperate” with the earlier investigation. His university convened a new inquiry, and although it found “no evidence of fraud, falsification or plagiarism,” it added that it could not “exclude the possibility of research misconduct” and recommended further investigation. The process dragged into 1992, when Needleman requested and obtained an open hearing so that he could publicly confront his accusers. During two days of testimony, Needleman brought forth other scientists to testify on his behalf, including Joel Schwartz from the EPA.
The charges by Ernhart and Scarr were based on arcane statistical details. Essentially, they were claiming that he had manipulated variables in his data to produce a biased, anti-lead result. Needleman’s scientific defenders, however, showed that even when those variables were taken out of the analysis, the result would be essentially identical to the conclusion that Needleman had published in 1979—namely, that for every 10 parts per million increase of lead in a child’s tooth there was a two-point drop in IQ. After two months of deliberation, the full hearing board concluded that no evidence suggested scientific misconduct, although it added that Needleman’s research methods had been “sub-standard.” Outraged, Needleman filed a lawsuit to force the university to retract this finding.
The matter was referred to the federal Office of Research Integrity for yet another hearing. Nearly two years later, ORI found him innocent of intentional scientific misconduct, again noting that he had made “numerous errors and misstatements,” mostly of a statistical nature that did not affect his conclusions—the same result, in other words, as that of the previous 1981 investigation. After 13 years of harassment, he had managed, more or less, to clear his name. Nevertheless, he said in 1995, “The misrepresentation is still being used by people in the lead industry to try to discredit my work.”42
“When U.S. callers dial an (800) lead industry hotline, they are sent a thick packet of information, including a quasi-scientific paper that questions the work of [lead researcher] Ellen Silbergeld and others, and a Wall Street Journal story about the integrity charges brought against Needleman; a more recent Journal article by the same reporter, describing Needleman’s vindication before NIH, is not included,” noted Common Cause magazine in 1992. “The packets are issued by Edelman Public Relations Worldwide, which is under contract to the Lead Industries Association.”43

Winners and Losers

Herbert Needleman’s work is a success story, relatively speaking. His research has been confirmed by dozens of separate scientific studies conducted by other researchers and has become generally accepted. Thanks to federal regulations that followed from this research, the amount of lead in gasoline in the United States has dropped 99.8 percent from pre-1970s levels. The amount of lead found in the blood of Americans has also dropped dramatically.
Even today, however, the average North American carries between 100 and 500 times as much lead in his or her blood as our preindustrial ancestors. In cities where there has been a high density of automobile traffic, adults have blood-lead levels of about 20 to 25 micrograms per deciliter—roughly half the level at which lead exposure leads to impairment of peripheral nerves. No other toxic chemical has accumulated in humans to average levels that are this close to the threshold for overt chemical poisoning. How has this affected us? Has it made us less intelligent, less rational? As the lead industry will be the first to tell you, it is difficult if not impossible to answer these questions with any degree of scientific precision.
What we do know is that the lead industry continues to lobby, even today, against measures such as an excise tax on lead that would discourage its use and generate funds to help clean up its toxic legacy. Cleanup is needed because some three million tons of lead remain on the walls of homes that were built and painted prior to 1970. Another five million tons is found in the soil near busy roadways. Lead from batteries ends up in waste dumps and incinerators and enters people’s drinking water through the lead in plumbing fixtures. Opposition to a cleanup comes from a diverse array of economic forces: the National Association of Water Companies, which doesn’t want to replace lead pipes; the National Association of Realtors and the National Association of Home Builders, which want to avoid the costs of cleaning up lead-painted homes; the electronics, plumbing, and ceramics industries, all of which use lead in their products. “The war over lead, like so many consumer and environmental problems, is largely waged out of public view, in the bureaucratic and congressional trenches,” observes Common Cause. “It is at this unglamorous level that industry goes head to head with government rule makers, wearing down their resistance and often winning through brute persistence.” 44 It is a war, in other words, in which advocates for public health are perpetually outnumbered and outmaneuvered by expert hired guns whose mission, it seems, is literally to pump the public full of lead.