POLIO, MEASLES, AND THE “DIRTY DISEASE GANG”
In September 1962, about a month before Kennedy signed the Vaccination Assistance Act into law, a white eleven-year-old girl was admitted to a hospital in Sioux City, Iowa. Her face was swollen, her neck was inflamed, and her enlarged tonsils were coated with a “grayish membrane” that covered the upper part of her throat. The swelling obstructed her breathing, so doctors cut an incision in her trachea to bring oxygen to her lungs. But the girl only worsened: the wound turned purple and red and then bled, her kidneys failed, and she seized repeatedly. She fell into a coma, and on her fifth day in the hospital she died. Cultures of bacteria from her throat confirmed the cause of death: diphtheria.
Alarmed by the severity of the case and the prospect of an outbreak, scientists from the state health department and the CDC descended on Sioux City. The outbreak turned out to be mild: it caused seventeen cases, claimed no other lives, and was over in a couple of weeks. But investigators were stumped as to how it had begun in the first place. Ninety-one percent of children in the district were vaccinated, so it was surprising that an outbreak had taken hold. There was also no obvious source of infection. The bacterium wasn’t present in nearby schools where the affected children had siblings or friends. The outbreak wasn’t caused by raw milk, once a common source of diphtheria, because all of the city’s milk was pasteurized. Moreover, noted investigators, Sioux City had no “true skid-row section,” nor was there any “district catering to transients” nearby.1
The start of the 1960s was characterized by an optimism about the conquest of infectious disease, as described in the previous chapter. New vaccines and new federal resources had led a growing number of experts to predict that vaccine-preventable infections would soon be wiped out for good. But in Sioux City and elsewhere, outbreaks of preventable disease persisted. Health experts who attempted to explain the trend often revived age-old assumptions about the ignorance and disease-breeding proclivities of the poor; these were the very ideas insinuated by the Sioux City investigators’ comments about “skid row” and “transients.” This tendency to hold the poor accountable for outbreaks also reflected a new pattern of disease that emerged over the course of the sixties. With record numbers of middle- and upper-class parents vaccinating their children, preventable infections began to concentrate in new populations. This was particularly true for polio and measles, both targets of federally sponsored vaccination campaigns that overshadowed diphtheria prevention. In the wake of national immunization efforts, polio, once a middle-class disease, became a disease of the “slums” and, in some areas, of minorities. Measles, which once struck all children, became a disease of the disadvantaged.
The complicated relationship between vaccination and social class and the ways in which vaccination transformed disease in the sixties are the subject of this chapter. The decade’s campaigns against polio and measles took place in the context of a national war on poverty, widespread anxiety over the decline of American cities, and the civil rights movement; worries about poverty, urban transformation, and race were thus subtly inscribed upon the nation’s efforts to immunize against these infections. The decade was also marked by growing scientific enthusiasm for disease eradication, which inspired a push to not just vaccinate against diseases, but eliminate them entirely. The decade’s most high-profile vaccination campaigns both shifted their target diseases’ epidemiology—the pattern of who got sick where and when—and provoked changes in the diseases’ popular reputations. Measles eradication proponents, for instance, urged Americans to see measles not as a familiar part of childhood, but as a fate worse than polio, drawing upon middle-class anxieties about poverty and urban decay as they did so. As one health educator put it, measles immunization programs needed to highlight the disease’s “dramatic aspects” in order to make Americans fear the disease, for only then would the country stand a chance at wiping out a disease still harbored in its “ghettos” and “slums.”2 This approach reinscribed vaccination as a middle-class concern, even as the decade’s social welfare programs aimed to ensure vaccination’s equitable distribution across class lines.
POLIO AND POVERTY
In 1959 Surgeon General Leroy Burney penned a letter to health departments across the country, encouraging them to redouble their efforts against polio. “We in the Public Health Service share with you a deep concern that there was more paralytic poliomyelitis in 1958 than in the previous year,” he wrote. When Salk’s polio vaccine was first introduced in 1955, demand for it was so overwhelming, and vaccination rates climbed so quickly, that cases of the disease plummeted. But demand soon slackened, noted Burney. And while rates of polio were still far lower than they had been a decade before, the sudden decline in cases showed troubling signs of reversal. There were roughly 5,500 cases in 1957 and close to 6,000 cases in 1958. If health departments didn’t act quickly to halt the trend, things would only get worse, because more than half the population under forty was either unvaccinated or incompletely vaccinated, and close to a third of children under five weren’t protected at all against the disease.3
Burney’s letter also pointed out that polio wasn’t affecting the same segments of the population as it once had. Before 1955, children between the ages of five and nine were at greatest risk of the disease, but by the end of the decade, paralytic cases were concentrated in children under five, and attack rates were highest among one-year-olds. Cases of polio were also increasingly clustered in urban areas, with attack rates concentrated in the poorest of urban districts. When polio struck Rhode Island in the summer of 1960—the state’s first epidemic in five years—CDC epidemiologists noted that the “pattern of polio” was “quite different from that generally seen in the past.” Prior to the introduction of the Salk vaccine, polio cases were scattered throughout the capital city of Providence, “without preference for any socioeconomic group.” But in 1960, the cases were almost entirely confined to the city’s lower socioeconomic census tracts, especially in areas with housing projects, through which the disease seemed to spread without resistance. By stark contrast, noted investigators, “the degree to which the upper economic areas were spared is quite remarkable.”4
These observations were of no minor consequence; the fact that polio was now concentrated in urban areas and among the poor meant that its “epidemic pattern” had changed in the wake of the vaccine, according to epidemiologists.5 When outbreaks hit rural areas, the cases were “scattered” and the disease struck the usual school-age children. But in cities, where most outbreaks now occurred, epidemics were concentrated in “lower socioeconomic” areas and younger children were affected more frequently than older schoolkids.6 “A definite trend seems to be developing whereby poliomyelitis is appearing more in lower socio-economic groups and among pre-school children,” noted a Public Health Service fact sheet that described polio’s new predilection for urban areas.7 There was no mistaking that the trend had been sparked by the advent of Salk’s polio vaccine. But the trend was particularly troubling against the backdrop of the country’s changing urban landscape.
In the 1950s, most of the country’s largest cities had begun losing population and wealth to new developments of detached homes on their edges and outskirts. “The city is not growing; she is disintegrating: into metropolitan complexes, conurbations, statistical areas, or whatever one chooses to call them,” said Burney’s successor, Luther Terry, at a national meeting on health in American cities in 1961. He went on: “Those who love her despair as they perceive new gaps in the beloved façade; a raffishness in her grooming; her growing relief rolls; another respectable old neighborhood turned ‘honkey tonk’; a sudden show of violence. Those who despise her feel vindicated as they move their homes farther from these unpleasant surroundings in which, nevertheless, they work, or study, or trade.”8 In the decade and a half since World War II, millions of Americans with the means to do so had moved to new suburbs that had been built to address a national postwar housing shortage. Federal housing loans and investment in the interstate highway system accelerated the migration. By the start of the sixties, central cities were markedly poorer than their suburbs and continued to grow even more so.9 And in public health literature and correspondence on the new patterns of polio distribution, “urban” and “poor” often came to be used synonymously. The conflation of the two categories was a testament to changing realities and presumptions about American cities and the people who called those cities home.
The nation’s cities in this period were poor; they were also, increasingly, black. From the end of the Second World War through the 1950s, vast numbers of African Americans moved out of the rural South and into cities in the North and West. In total, some 3.6 million whites moved to the suburbs while 4.5 million non-whites moved to the nation’s largest cities over the course of the decade. By 1960 most Americans lived in the suburbs, but only 5 percent of African Americans did; the vast majority of them lived in these “disintegrating” areas marked by “raffishness,” violence, and poverty.10 Thus, when public health officials began examining new urban outbreaks of polio, they also documented rates among “Negroes” and other non-whites that far exceeded those among whites living in suburban settings. In the Providence, Rhode Island, outbreak, there were few “Negro” cases, but that was consistent, investigators noted, with the black population in the city and state as a whole.11 In Baltimore, by contrast, a polio outbreak affected “Negroes” at twice the rate of whites and left the suburbs untouched. In an epidemic in northern New Jersey, non-whites were six times as likely to contract polio compared to whites. In Detroit, non-whites were eighteen times as likely to contract paralytic polio compared to whites, and cases clustered in the center of the city, “paralleling the known distribution of negroes in the city.” Whites who got sick lived either on the edge of “Negro areas” or in “mixed areas,” investigators observed.12
The history of American medicine is riddled with instances of the poor and non-white being held culpable for their own infirmities and the diseases of their communities. In nineteenth-century New York, cholera epidemics were blamed on the intemperance, filth, and godlessness of the city’s poor. In turn-of-the-century California, responsibility for outbreaks of plague was pinned on the habits and customs of Chinese and other Asian immigrants. Not long after, public health workers blamed poor eastern and southern European immigrants for the spread of polio through American cities (its means of transmission still unknown). Through the middle of the twentieth century, scientists held that “syphilis soaked” blacks were uniquely susceptible to the disease and its effects because of racially determined inferiorities.13 In the late 1950s and early 1960s, however, a new, seemingly objective explanation was offered for the appearance and persistence of polio among urban non-white populations: the cities were populated by the poor, and the poor simply weren’t vaccinated, whether they were black or white. In Harrisburg, Pennsylvania, 65 percent of the city’s uppermost socioeconomic group was fully vaccinated, but just 35 percent of the city’s poorest residents were protected. In Atlanta, Georgia, 78 percent of the city’s wealthiest residents were immunized, and only 30 percent of the poor were.14 Immunization surveys in the city concluded that this differential had more to do with class than with race, as the “upper echelons in both the white and Negro population” had similarly robust rates of vaccination.15
Not surprisingly, the dramatic income gap between the vaccinated and the unvaccinated prompted many to speculate that “economics” was the root cause of failure to vaccinate; those who could afford to vaccinate did, and that was that.16 But thanks to the Polio Vaccination Assistance Act and the March of Dimes, polio vaccines had been administered free of cost in a great many communities—so there must have been some other root cause of the failure to vaccinate. Some observers fell upon explanations that blamed the poor and minority communities for their purported ignorance, apathy, and procrastination habits. Others pointed to their lack of motivation and imperviousness to the high-profile publicity of the polio immunization campaigns of the previous decade. Often this attribution of blame was subtle. With such widespread publicity and access to free vaccines, the poor’s failure to vaccinate was perceived to have been to some extent deliberate. “We need a doorbell ringing campaign in poorer sections where vaccination is ignored,” said Donald Henderson, head of surveillance at the CDC, in 1964.17
A growing body of research on the sociology of poverty influenced investigations as to why the poor “ignored” pleas to immunize. The Welfare Administration’s 1966 report Low-Income Life Styles epitomized the tone of such research; the report summarized the poor’s “typical attitudes” about life, courtship, marital relations, education, money management, and health. “Literally hundreds” of recent research studies, said one review, were showing how “poverty-linked attitudes and patterns of behavior . . . set the poor apart from other groups.”18 At the Public Health Service, a team of behavioral scientists argued that educated mothers with “white-collar” spouses were more likely than mothers with less education and “blue-collar” spouses to vaccinate their children because it was more convenient for them and because peers and community groups encouraged them. Moreover, when lower-income people read the papers, they read only about “crime, disaster, and sports” and therefore missed the news on public and social affairs, science, and politics—and, it followed, free polio vaccines.19 The vaccination status of the poor was thus chalked up to a combination of misplaced social pressures, inconvenience, and a lack of education that made them different from the middle-class norm. In short, those in poverty—and their “Life Styles”—were responsible for their newfound vulnerability to polio.20
In the early sixties, it was clear that polio vaccination campaigns had thus far simply “‘skimmed off the cream’ of people who are well motivated toward immunization programs,” as one health official put it.21 The families who lined up in droves to vaccinate their children early and completely were the middle-class families who, from the start, felt most threatened by the disease and most invested in the search for a means of treatment or prevention.22 Future immunization campaigns would have to try harder to reach those who had been missed the first time around, advised health officials. But even this advice had its limitations. An adherence to middle-class values and a sense of trepidation about the dangers of urban poverty areas were evident in federal guidelines that instructed health departments on how to identify the unvaccinated in their communities. The guidelines advocated going from house to house to identify pockets of the unvaccinated, even though it might be difficult to find people at home in areas where “both parents are working during the day.” In such cases, it was best to leave a note on the door and call back in the evening—though it was expressly not recommended to return to “slum areas” after dark.23
Such fear of the slums was driven by urban decay that would reach a crisis point in the mid-1960s, fueled by urban deprivation and sparked by movements demanding civil rights and economic justice. In the postwar period, the United States had become not only the most powerful but the wealthiest nation on earth. The proportion of people living in poverty had declined since the end of World War II, but in the 1950s the decline had slowed, the gulf between the rich and the poor had widened, and the poor had become increasingly segregated from the middle class and the wealthy; in the words of one prominent writer, they had become “invisible.” At the end of the decade, the issue of persistent mass poverty became the subject of intellectual and political debate. Social critics and economists declared American poverty “chronic,” a more intractable and widespread problem than most Americans realized or cared to admit.24 When Kennedy was elected in 1960, it was difficult to disagree. By most estimates, some 40 million Americans—22 percent of the population—were living in poverty, and for some segments of the population, the statistics were even worse: 40 percent of the elderly and more than half of all African Americans were poor.25
In light of such figures, the Kennedy administration promised prosperity, equality for all, an end to poverty—and better health. “Let it not be said that the world’s richest nation in all history failed to meet the people’s health needs,” said Luther Terry, Kennedy’s surgeon general.26 For health officials under Kennedy, and subsequently under President Lyndon B. Johnson, poverty, education, and disease were tightly intertwined; it was folly to address one while ignoring the others, and it was unjust that any of the three should exist in a nation that had come to be defined by its wealth. “Too many of our families are experiencing privation in the midst of plenty. We cannot accept poverty as inevitable, any more than we accept illiteracy or disease as inevitable,” said Secretary of Health, Education, and Welfare Anthony Celebrezze.27 The sentiment was embraced by Johnson, who believed it fell to government to “battle the ancient enemies of mankind, illiteracy and poverty and disease.”28
Kennedy’s promises of civil rights and an anti-poverty program were realized under Johnson, who rolled them into a vision of a “Great Society” built on “abundance and liberty for all” and an “end to poverty and racial injustice.” As Johnson made his case for massive societal investments, he frequently referenced the nation’s changing urban landscape, the “despoiled” suburbs and city centers marked by “decay” and “blight.” For Johnson, the achievement of a great future rested in large part on remedying the lack of jobs, opportunities, good schools, and health and social services in urban areas. “Our society will never be great,” he declared, “until our cities are great.” His Great Society would launch a war on poverty, improve education, control crime, clean up the environment, preserve voting rights, protect civil rights, support the arts, launch a “massive attack on crippling and killing diseases”—and “make the American city a better and more stimulating place to live.”29
As the Kennedy and Johnson administrations expounded on the trinity of poverty, education, and disease, health officials increasingly spoke of the connection between urbanization and health. Cities were losing population to suburbs, but the metropolitan areas they formed when combined were growing by leaps and bounds. By 1970 three-fourths of the U.S. population was projected to be living in “great complexes of city and suburb.”30 Early in the sixties, Surgeon General Terry had identified metropolitan growth as a major factor challenging health programs in the future. Later he cited the plight of American cities as justification for more federal involvement in addressing local problems. “The continued exodus to suburbia is draining the central city of resources and complicating the delivery of health services. . . . Great concentrations of people increase the hazards and complicate the control of communicable diseases,” he remarked.31 Under Johnson’s Great Society, both the Public Health Service and the Department of Health, Education, and Welfare launched “an attack on urban ills.” As the nation’s “urban crisis” devolved into inner-city riots, they would cite a new triumvirate in need of addressing: the “problems of poverty, the problems of the Negro, and the problems of the city,” in the words of Johnson’s Health, Education, and Welfare Secretary John W. Gardner.32 If “poor” and “urban” were synonymous before, now “black, urban, and problem” came to be used interchangeably, too.33 The shape of this “problem,” for public health specifically, had been sketched by the shift in polio epidemiology in the late fifties, when polio epidemics appeared concentrated among urban populations of color. In the sixties, urban decay and civil unrest etched the “problem” of race, urban poverty, and disease into stone. This view of the problem of vaccine-preventable disease persistence would be subtly reflected in the national anti-measles campaign launched later that decade.
EXPANDING THE VACCINATION ASSISTANCE ACT
Great Society legislation expanded federal involvement in American lives generally, and it paralleled an expansion of federal involvement in health that had been under way for more than a decade. Between 1953 and 1965, more than fifty new programs were added to the Public Health Service, and its budget increased almost tenfold, from $250 million to $2.4 billion. In 1966 Johnson reorganized the burgeoning Public Health Service, increasing its number of bureaus from three to five.34 This expansion was evident in the area of immunization, specifically. In 1965 the expiring Vaccination Assistance Act came up for renewal. Congress chose not only to extend it for another five years, but to dramatically broaden its scope. Now it covered not just “intensive” immunization programs but all “community” immunization programs. Measles was added to the act’s list of target diseases—along with any other infectious disease for which a vaccine became available in the future, provided the surgeon general deemed it a “major” problem and one “susceptible of practical elimination . . . through immunization.”35
Even before the Vaccination Assistance Act was expanded, it had an arguably impressive track record. Between 1962 and 1964, 50 million children and adults were vaccinated against polio, and 7 million children were immunized against diphtheria, pertussis, and tetanus. As a result, annual cases of polio had dropped from 910 to 121, and cases of diphtheria had fallen from 404 to 306. Communities had also established programs to ensure that rates continued to decline in the future.36
In its first two years, the Vaccination Assistance Act had supported a massive federally sponsored educational effort spearheaded by the CDC. At the creative heart of this effort was “Wellbee,” a smiling, round-faced cartoon bumblebee designed by a Hollywood artist. Wellbee was meant to be “the personification of good health,” a standardized symbol that state and local public health agencies across the country would use to communicate the importance of immunization widely. Wellbee urged children in Atlanta and Tampa to “drink the free polio vaccine” and appeared on billboards and pin-on buttons in Chicago. He posed with the Red Sox and with Mayor John F. Collins (a former polio patient) in Boston, went from school to school in Honolulu, rode a dogsled in Anchorage, and warned against “Illbee” in Dallas. His main cause was promoting oral polio vaccine: “Tastes Good • Works Fast • Prevents Polio,” read posters with the smiling bee.37 Local health departments improvised with Wellbee, but his message of preventive health and, above all, the importance of immunization was consistent across the nation.
The expansion of federal involvement in health generally was justified by the nation’s rapid growth and the changing demographics of the population. The numbers of the very old and very young were increasing particularly fast, and it was these groups that consumed the most health resources and medical attention.38 The expansion of federal involvement in immunization specifically—well captured by Wellbee’s far-flung appearances—was justified quite differently, in terms that made reference to a world in which germs knew no geographic boundaries, whether between two nations, two states, or between city and suburb. “Modern air travel” meant that communicable disease was “no longer a local problem,” noted the Chicago Board of Health (adding, “For this reason, we have welcomed the aid of the United States Public Health Service in our overall control program”).39 A Children’s Bureau film made more explicit reference to the diseases latent in poverty-stricken cities and the threat they posed to the suburbs: “When large numbers of unimmunized persons live together in close proximity, as they now tend to do in low-income neighborhoods—the danger is a concern to everyone. Public health officials warn that under such circumstances, an individual case could quickly grow into an epidemic—and once started, epidemics do not respect neighborhood boundaries.”40 The only way to protect everyone from an infectious disease was to make sure no one was at risk—and that was best done by eliminating the disease entirely. This philosophy lay at the heart of the Vaccination Assistance Act’s expansion to cover measles and the first federal attempt to eradicate the long-familiar childhood infection.
TARGETING MEASLES
In its pre-vaccine days, measles was inescapable: nearly all children caught the infection, which causes fever, fatigue, and often a cough or runny nose followed by a blotchy, full-body rash. Most children were protected for the first year of life by antibodies they received from their mothers, but when that protection wore off, they became susceptible to infection. In the 1930s and 1940s, children who were exposed to someone with measles sometimes received a shot of “convalescent serum,” blood serum taken from people who had previously had measles. The antibodies in the serum reduced children’s risk of coming down with the disease and lessened their symptoms if they did get sick.41 A measles vaccine containing live, weakened (attenuated) virus was licensed for use in 1963. A vaccine containing a killed version of the same strain of measles virus, the so-called Edmonston strain, was licensed that same year. Before the decade was through, two more live, further attenuated measles vaccines were also licensed for use.42
Within just a few years of the first measles vaccine hitting the market, 20 million doses had been administered to American children.43 The number of measles cases declined, but the disease’s national morbidity rate—the amount of total illness and disability it caused—didn’t budge nearly as much.44 In large part, that was because of who was getting the vaccine, and who wasn’t. From 1963 through 1965, most of the measles vaccine in the country had been administered by physicians in private practice, noted the CDC. Little was given to children living in poverty, and the consequence was a new pattern of disease. In Los Angeles, for example, measles declined in predominantly white, “middle class and upper socioeconomic areas” and “increased in areas with large populations of the lower socioeconomic group,” a UCLA epidemiology professor noted. As a result, “measles became . . . a disease of the Negro population and the population with Spanish surnames.”45 Across the country, measles-susceptible children were now concentrated “in a central core, lower socioeconomic area of the city,” said CDC epidemiologist Robert Warren. As with polio, the new vaccine had shifted the disease’s epidemiology. If the country was going to eradicate measles, noted Warren, it would have to ensure the vaccine was used more “homogenously” in the future.46
But even before the measles vaccine had become available, an Oklahoma physician was among those who had anticipated problems encouraging parents of any race or class to vaccinate their children against the disease. “Before the ultimate benefits from vaccination can be realized,” he said, “it is necessary that the public be educated to a realization of the great hazards of measles.”47 Measles had been a part of childhood for so long, explained a California physician, that the disease was widely considered as inevitable as “wornout shoes” and scraped knees.48 It was no wonder, then, that the vaccine was met with only a fraction of the enthusiasm shown for the polio vaccine. Measles affected all children, almost without exception, and most of them recovered without incident. That and the long-standing use of immune globulin to prevent outbreaks and serious cases meant that for most parents, measles wasn’t a prominent health worry. “It was only measles, doctor,” said a reportedly baffled parent whose infant daughter died after catching the disease from a sibling. “No one worries about measles anymore.”49
If that was true, it was also because measles was deeply overshadowed by polio. The threat of communicable diseases of childhood, measles included, was already fading when polio began to appear in truly epidemic proportions in the 1940s. By midcentury, parental anxieties, research dollars, and scientific attention all focused tightly on polio, at the expense of other (albeit still present) infections. The development of new measles vaccines, however, prompted both medical and public health professionals to take a new and closer look at the familiar infection. “Now that means are at hand to prevent measles and poliomyelitis, it is interesting to compare the mortality associated with the two,” wrote the editors of the Journal of the American Medical Association. The exercise may have been partly intellectual, but it ended up yielding a host of reasons for immunizing the population widely and “homogenously” against measles, as had been done against polio. In 1964 polio killed seven people; measles killed four hundred. Polio had the potential to cause paralysis, measles the potential to cause ear infections, bronchopneumonia, and encephalitis, which could result in deafness, brain damage, or death. “Physicians would do well to ponder these facts,” the editors concluded.50
Measles’ long-ignored risks and litany of serious complications formed a refrain that surfaced as the first new measles vaccines were still being tested. In 1961 Surgeon General Terry told physicians it was time to think differently about measles, which each year killed more children than any other childhood disease.51 In 1963 Merck, which had just licensed a live-virus measles vaccine, pointed out that the disease threatened 20 million Americans, a figure that, with the birth rate as high as it was, would grow by another 4 million each year.52 CDC scientists began to tally measles’ threats even more precisely: one in every thousand cases suffered encephalitis, one in three encephalitis cases died, and another third suffered permanent damage to the central nervous system.53 Such findings encouraged Merck to chime in on measles complications: “All too often, measles is a killer or the cause of mental crippling so severe that the victim survives only as a mental defective,” the vaccine maker propounded.54
Polio had caused national panic and roused national demand for a widely available vaccine, as noted in chapter 1. But measles was another story. Many doctors and health officials agreed that the disease had the potential to pose serious complications, but it was never a given that mass immunization was the appropriate response. Pediatricians debated their role in promoting the vaccine, the price of vaccine, how measles shots complicated the process of scheduling the growing number of vaccines for children, and whether measles was even “worth worrying about,” as one doctor put it.55 A New York physician griped about having to “sell” mothers on the prospect of more painful shots in their “baby’s tender bottom”; an Alabama physician, by contrast, lamented that vaccine maker Merck was the only one promoting the vaccine.56 At the CDC, immunization officials’ position on the matter evolved. In 1964 the Advisory Committee on Immunization Practices concluded that “rarely would there appear to be a need in the United States for mass community immunization programs” targeting measles.57 A year later, the committee revised its position: experience with the measles vaccine to date indicated a need for high levels of mass immunization, as well as a way to achieve such levels.58
OPTIMISM AND ERADICATION
The same year that Johnson renewed the Vaccination Assistance Act, he also signed a bill to greatly expand federal support for medical and health sciences research. On signing the bill, Johnson gave an “evangelistic” speech on the nation’s role in leading a “worldwide war on disease.” Polio had nearly been eliminated in the United States, and the new measles vaccine promised another victory over disease in the future. With the advances that scientists had already made against these diseases as well as cancer, kidney disease, and heart disease, a “staggering era for medicine” lay ahead.59 The president’s remarks captured the scientific spirit of the day. Scientists announced that the medical sciences were experiencing a revolution akin to the one in the physical sciences two decades before. It was an era that prompted scientists in government, the academy, and industry alike to expound on a future in which people would live longer, happier, healthier, and even more intelligent lives.60 American science had the proven capacity to conquer disease, and the nation was witnessing just the beginning of what it would ultimately achieve.
In this era of medical optimism, disease eradication superseded disease control. Control had already proven within reach, remarked American Society of Tropical Medicine and Hygiene president E. Harold Hinman. With the help of drugs, pesticides, and vaccines, developed countries including the United States had all but eliminated from within their borders a long list of diseases: typhoid fever, tuberculosis, diphtheria, smallpox, whooping cough, typhus, cholera, plague, malaria, yellow fever, and more.61 The next step was to move beyond national borders, and beyond mere control. In 1963 the World Health Assembly had pledged its support for a push to vaccinate 80 percent of the earth’s population against smallpox.62 In 1966 the assembly formally adopted a resolution to eliminate the disease from the face of the earth, and the CDC donated experts, equipment, and vaccine to the effort.63 A diplomatic plea from the Soviet minister of health played an important role in pushing the world’s nations to sign on to the project; in the USSR as in the United States, said the minister, smallpox had been eliminated, but the disease’s presence in other countries posed a perpetual threat of imported outbreaks.64 A few years into the program, containment of the disease in poor, populous countries with high rates of smallpox demonstrated that with cooperation and the right technology, the ancient and horrifying disease could be eradicated even “under the most difficult conditions expected.”65 The accomplishment suggested a model that the United States could follow in its own intractable urban poverty areas.
The United States’ commitment to smallpox eradication abroad was in fact mirrored by the campaign to eradicate measles at home, announced officially by President Johnson early the following year.66 Measles was markedly different from smallpox. It was a disease of “only mild severity,” which caused “infrequent complications” and only rarely caused deaths, noted CDC head David J. Sencer and chief epidemiologist Alexander Langmuir.67 Over the last half century, they continued, “man” had developed a “deep respect for the biological balance of the human race with measles virus.” But this accepted doctrine, they argued, was ready to be overturned. They outlined three bases for the disease’s eradication. The measles virus had been isolated and studied, and, like smallpox, was now known to infect only humans, creating neither chronic carriers (as diphtheria could) nor “inapparent” infections (as polio sometimes did). The isolation of the measles virus, furthermore, had led to several “potent” vaccines. Lastly, epidemiological studies of the disease had demonstrated both how it spread and how many “susceptibles” were needed to sustain an outbreak. As with smallpox, it was clear that the disease could be eliminated with less than 100 percent vaccination coverage. Based on this knowledge, the CDC officials confidently declared measles could—and would—be eradicated from the United States before the end of 1967.68
Scientific reasons for measles eradication partially belied cultural ones. In discussions about measles eradication (and disease eradication generally), attitudes toward “nature” (of which measles virus was a part) reflected a modern intolerance for ancient woes and challenges. Nature was something to be overcome with technological breakthroughs and scientific expertise; it was not something to be sentimentally respected. “To those who ask me, ‘Why do you wish to eradicate measles?’ I reply with the same answer that [Sir Edmund] Hillary used when asked why he wished to climb Mount Everest,” said Langmuir. “He said, ‘Because it is there.’ To this may be added, ‘. . . and it can be done.’”69 Tackling nature was the modern way. But it required an all-or-nothing approach. Experience had already shown that partial vaccination changed a disease’s epidemiology, concentrating cases in certain segments of the population. There were other concerns, too: members of a CDC immunization committee noted that because the research community was uncertain about how long the “artificial immunity” created by vaccines would last, it was possible that vaccination of children was creating a situation in which, “in the not too distant future, many adults . . . may become once again susceptible to such diseases as diphtheria, pertussis, and measles.” The prospect prompted the committee members to remark on the “sensitive balance of nature”—and to conclude that the only answer was to immunize early and repeatedly, until nature had effectively been conquered.70 In the 1960s, vaccination against disease created an imperative to eradicate disease. Modern confidence in humankind’s capacity to triumph over nature helped this idea take shape and take hold.71
The public, meanwhile, was given a very different reason why measles needed to be eradicated in 1967. Among themselves, health experts may have reflected on the balance of nature and referred to measles as “mild,” but publicly they described measles as a serious disease with horrific and sometimes fatal complications. The CDC-led anti-measles campaign—designed to reach 8 to 10 million unvaccinated children—produced public service announcements, billboard ads, films, comic strips, coloring books, and more with the message that measles was a treacherous threat. Radio and television stations across the country notified the public that “measles is a serious disease that sometimes causes pneumonia, deafness, encephalitis and even death.”72 Personalities from the surgeon general to Ann Landers spread the word that measles could leave children blind, deaf, and mentally impaired.73 The federal government didn’t go it alone in this effort; vaccine maker Merck, hoping to make good on the resources it had invested in developing its new measles vaccine, launched a marketing campaign with the slogan “Measles Only Gave Her Spots—Will Your Child Be As Lucky?”74 The national measles campaign, borrowing an idea from the 1950s polio campaigns, selected a white poster child, ten-year-old Kim Fisher, who at age four suffered a case of measles so severe it left her mentally impaired, partially blind, and partly deaf.75 Kim vividly illustrated the idea that measles could be serious, even if parents didn’t see it as such. And if the possibility of complications seemed remote, that was beside the point. Measles immunization was necessary because “one death, one brain-damaged child, or even one child who needs hospitalization is one too many,” as one supporter of the campaign put it.76
Notably, the national campaign also exploited urbanization anxieties of the suburban middle class. In “Spot Prevention,” a cartoon film and coloring book developed by the Department of Health, Education, and Welfare, “measles” was a red-faced, yellow-eyed fiend who popped out of metal garbage cans left on the sidewalk and climbed through nursery- and elementary-school windows, where innocent white children sat unaware and unprotected. At the end of the story, young “Billy” received his vaccine at Happyville Clinic, set in a wide expanse of lawn surrounded by trees, with the city where measles once lurked darkly visible in the background.77 In another cartoon developed for the campaign, young “Emmy Immunity” fought off the “Dirty Disease Gang,” a group of social misfits and offenders including Mean-ole Measles, Rolly Polio, and Dippy Diphtheria. “Can Emmy Immunity protect the ‘happy little family’ against the plotting, conniving, evil-doing ‘Disease Gang?’” ran the teaser.78 The cartoons explicitly linked disease and degeneracy; in the world they constructed, measles and other infections arose from filth and social decay. Health officials may have argued that the measles vaccine needed to be used more homogenously, but promotional materials clearly played to white middle- and upper-class anxieties about degenerate and disease-breeding urban cores.
MEAN-OLE MEASLES IN THE SLUMS
The public wasn’t asked to eradicate a mild disease in 1967; they were asked to eradicate a serious one. But they weren’t quick to see it that way, and neither were all health professionals. The disconnect between measles’ public image and its private one was addressed in a memo that CDC head James L. Goddard sent to rally immunization personnel around the country to the cause. “Measles has not suddenly become a more serious disease. It has always been a scourge of childhood,” Goddard wrote. “It commands special attention now because modern medical research has provided us with vaccines which can prevent the disease.”79 As in the case of diphtheria, tetanus, and pertussis, technology warranted renewed focus on a long-familiar infection.
With health experts unevenly enthusiastic, it only followed that the public responded similarly. In Delaware’s anti-measles campaign, the message “that measles can be a dangerous disease with widespread complications was met by general public apathy,” noted a state health officer. “Measles, unlike poliomyelitis, is not a ‘glamorous’ disease. . . . It was found that for the most part the public still considered it a minor, childhood disease.”80 Officials in Washington, D.C., and Atlanta may have believed that all the pieces were in place to conquer yet another childhood scourge, but the buy-in they needed from both local health professionals and the public was only halfhearted, at best.
Nonetheless, an internal report published by the CDC late in 1967 stated that measles cases and deaths had markedly declined. In addition, reporting of measles cases and outbreaks had greatly increased. As a result, a new picture of the disease began to form. Measles complications, once considered rare, now seemed common. Deaths, once thought to occur in 1 of every 100,000 cases, now seemed to occur in 1 of every 10,000 cases.81 Studies had also begun to suggest that even mild cases affected the brain, and that measles might increase susceptibility to polio, cause pulmonary emphysema, and harm the fetus in pregnant women.82 Indeed, the calculated representation of measles as a dangerous infection began to seem like a self-fulfilling prophecy. The measles vaccine became possible only with increased knowledge of the disease; with deployment of the vaccine (combined with disease surveillance) came even more knowledge about the disease and its dangers, which ultimately validated the impression that public health officials had cultivated in order to encourage even further vaccination. Within a decade, measles would no longer be referred to as a minor or mild disease, either by lay observers or health and medical professionals. And its vaccine would come to be widely accepted as a routine part of childhood health care.
In the meantime, measles was not eradicated, and at the end of the decade, any progress achieved toward eradication started to come undone.83 The number of measles cases nationwide fell from more than 260,000 in 1965 to just 22,000 in 1968, but soon thereafter the number began to tick upward again.84 At the end of the sixties, new vaccines against mumps and rubella drew efforts and attention away from measles, and immunization funding shortfalls compounded the problem. The Vaccination Assistance Act expired in 1969, and a 1970 extension was left unfunded by President Richard Nixon’s new administration.85 In the wake of the abandoned measles campaign, old patterns became visible again. “Things in Atlanta continue to go as usual,” wrote a CDC immunization officer to a colleague in 1970. “Measles is up 50 percent over last year with epidemics in a number of urban ghettos.”86 In Indiana, too, measles was up, much of it “coming from central city indigent areas,” reported a health officer in that state.87 The same was true in Los Angeles, Chicago, and elsewhere. The CDC was both “chagrined and concerned” about the measles resurgence and about the fact that much of the increase was due to “outbreaks in poverty areas of large cities.”88 Despite their best efforts, measles persisted as a disease of the slums.
Explanations for the upsurge in measles cases piled on top of one another. Some blamed the poor and the conditions in which they lived. Some blamed the web of social problems bound in place by poverty.89 In Chicago, where 40 percent of the population lived in the city’s “poverty areas,” measles outbreaks between 1967 and 1969 were attributed to the difficulties of delivering vaccine to “a highly mobile, poorly educated, and impoverished population.” Epidemiologists also speculated that urban renewal projects—in which tenements had been replaced with high-rise apartments—had facilitated measles’ spread, because “children were found to congregate with common babysitters, at recreational facilities, and in other crowded situations.”90 In urban areas, resurgent measles was one of a string of intractable health problems: mental illness, drug abuse, alcoholism, suicide, and new outbreaks of polio and diphtheria, too. CDC officials blamed the latter on low immunization rates, particularly among the youngest children living in urban poverty areas. This time, however, the root explanation was federal budget cuts under the Nixon administration, which were actively protested by a group of federal health employees. “The reason for these epidemics,” said a member of the CDC’s immunization branch bluntly, “is money.”91
There was one attractive solution to these problems: a “new and simpler” approach to immunization. Scientists could either demonstrate that it was safe and effective to give children all five recommended vaccines in a single visit to the doctor, or they could come up with a single vaccine that offered combined protection against multiple infections. Combined vaccines would address the problem of doctors who forgot to recommend vaccines and parents who failed to bring children in for required vaccines. They would also reduce the cost of immunizing all children in “a period of relatively strained health resources” and ensure full protection for the “less health-motivated segments” of the population.92 A combined vaccine against diphtheria, pertussis, and tetanus (as mentioned in the previous chapter) already existed. With three new vaccines against viral infections now available, scientists in government and industry began to press for a way to combine protection against polio, smallpox, measles, mumps, and rubella, too.
But all of these discussions—about the causes of persistent outbreaks and the potential to curb them with simultaneous vaccination—largely sidestepped limitations inherent to vaccines and vaccination. The measles vaccine didn’t offer 100 percent protection; no vaccine did. Up to 5 percent of children vaccinated against measles never produced measles antibodies. Another small fraction of children produced antibodies, but the antibodies failed to protect them against the disease. When children were vaccinated too early, antibodies inherited from their mothers interfered with their ability to mount an immune response, rendering the shots ineffective. At the CDC, public health officials suspected that “lapses in technique” in doctors’ offices—improper storage, poor record keeping, failure to give vaccines at the recommended age—might be compounding these problems further.93
Vaccines had side effects, too; because it was produced in chick cells, the measles vaccine was not recommended for children allergic to eggs. In 1969 a CDC physician expressed surprise that not a single serious allergic reaction to the vaccine had been published, even though some were expected. In a letter to Washington, he wrote of plans to corroborate this by examining death certificates. No copies of such certificates appear in CDC files from the time, but one letter describes a two-year-old “Negro Female” who died in an Atlanta hospital four days after receiving the live measles vaccine. The girl had developed a rash after receiving the polio and DPT vaccines the year before. When she was immunized against measles, her left arm swelled, she developed a fever, her parents took her to the hospital, and she died several hours later.94 No one noted whether the vaccine had played a role in her death, but CDC officials were interested enough to keep the case on file. On the whole, however, the side effects and limitations of vaccines received relatively little scientific attention. The goal of eradication made such issues moot—if a disease could be forced out of existence, and fast, the shortcomings of its vaccine wouldn’t matter in the end.
Entering the 1970s, immunization experts and officials took with them lessons learned from their experiences with measles and polio in the 1960s. With polio, fear of disease was universal; with measles, susceptibility to disease was universal. In neither case did all families fully vaccinate their children, and the poorer segments of society did not get vaccinated to the same extent as the well-to-do. Federal funds seemed the answer in the case of polio; federal funds and a campaign that fostered fear of the disease seemed the answer in the case of measles. Both campaigns were undertaken to reach the poor in largely urban “slums,” but neither was wholly successful. Persistent epidemics left health workers pondering legal and technological solutions to the problem. For the most part, they didn’t question whether vaccines, with their inherent limitations and their one-size-fits-all application to a complex and heterogeneous population, might not be a swift means of eradicating disease after all.
The new funding climate and persistent outbreaks involving preschoolers prompted a shift in focus in the 1970s, away from persuasion and toward compulsion; if families couldn’t be convinced to vaccinate their children, they’d be forced to, through laws requiring immunization for enrollment in school.95 “To reach the single preschool child in the slum is difficult, but to mount a campaign to attain really high levels of immunization in kindergartens and 1st, 2nd grades in school should be relatively easy,” said the head of the CDC’s immunization branch.96 The fact that rates of diseases were lower among such children didn’t matter; by protecting them, the rest of the community would be protected. As CDC head Sencer had pointed out, “If measles isn’t in the schools, it can’t be brought home.”97 Such thinking would prompt a push for more uniform and comprehensive school vaccination laws, which would reach an apex at the end of the seventies. At the same time, health officials would grapple with questions about how to use the newer vaccines, which protected against infections that were less serious than smallpox and polio, and most threatening to specific subsets of the population. Who should receive such vaccines, and when? Experience with one of these new vaccines—the vaccine against mumps—would produce a set of answers.